Merge branch 'develop' of https://github.com/LLNL/spack into qa/coding_standard
This commit is contained in:
commit 93c22dd978
140 changed files with 4465 additions and 813 deletions
@@ -59,7 +59,8 @@ can join it here:

 At the moment, contributing to Spack is relatively simple. Just send us
 a [pull request](https://help.github.com/articles/using-pull-requests/).
-When you send your request, make ``develop`` the destination branch.
+When you send your request, make ``develop`` the destination branch on the
+[Spack repository](https://github.com/LLNL/spack).

 Spack is using a rough approximation of the [Git
 Flow](http://nvie.com/posts/a-successful-git-branching-model/)
8 etc/spack/modules.yaml Normal file
@@ -0,0 +1,8 @@
+# -------------------------------------------------------------------------
+# This is the default configuration for Spack's module file generation.
+#
+# Changes to this file will affect all users of this spack install,
+# although users can override these settings in their ~/.spack/modules.yaml.
+# -------------------------------------------------------------------------
+modules:
+  enable: ['tcl', 'dotkit']
@@ -149,26 +149,46 @@ customize an installation in :ref:`sec-specs`.

 ``spack uninstall``
 ~~~~~~~~~~~~~~~~~~~

-To uninstall a package, type ``spack uninstall <package>``. This will
-completely remove the directory in which the package was installed.
+To uninstall a package, type ``spack uninstall <package>``. This will ask the
+user for confirmation and, if confirmed, completely remove the directory in
+which the package was installed.
+
+.. code-block:: sh
+
+   spack uninstall mpich

 If there are still installed packages that depend on the package to be
-uninstalled, spack will refuse to uninstall it. You can override this
-behavior with ``spack uninstall -f <package>``, but you risk breaking
-other installed packages. In general, it is safer to remove dependent
-packages *before* removing their dependencies.
+uninstalled, Spack will refuse to uninstall it.
+
+To uninstall a package and every package that depends on it, you may give the
+``--dependents`` option.
+
+.. code-block:: sh
+
+   spack uninstall --dependents mpich
+
+will display a list of all the packages that depend on ``mpich`` and, upon
+confirmation, will uninstall them in the right order.

-A line like ``spack uninstall mpich`` may be ambiguous, if multiple
-``mpich`` configurations are installed. For example, if both
+A line like
+
+.. code-block:: sh
+
+   spack uninstall mpich
+
+may be ambiguous if multiple ``mpich`` configurations are installed. For
+example, if both
 ``mpich@3.0.2`` and ``mpich@3.1`` are installed, ``mpich`` could refer
 to either one. Because it cannot determine which one to uninstall,
-Spack will ask you to provide a version number to remove the
-ambiguity. As an example, ``spack uninstall mpich@3.1`` is
-unambiguous in this scenario.
+Spack will ask you either to provide a version number to remove the
+ambiguity, or to use the ``--all`` option to uninstall all of the matching
+packages.
+
+You may force uninstall a package with the ``--force`` option
+
+.. code-block:: sh
+
+   spack uninstall --force mpich
+
+but you risk breaking other installed packages. In general, it is safer to
+remove dependent packages *before* removing their dependencies, or to use the
+``--dependents`` option.


 Seeing installed packages
@@ -774,6 +794,34 @@ Environment modules

 Spack provides some limited integration with environment module
 systems to make it easier to use the packages it provides.

+Installing Environment Modules
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+In order to use Spack's generated environment modules, you must have
+installed the *Environment Modules* package. On many Linux
+distributions, this can be installed from the vendor's repository.
+For example: ``yum install environment-modules``
+(Fedora/RHEL/CentOS). If your Linux distribution does not have
+Environment Modules, you can get it with Spack:
+
+1. Install with::
+
+       spack install environment-modules
+
+2. Activate with::
+
+       MODULES_HOME=`spack location -i environment-modules`
+       MODULES_VERSION=`ls -1 $MODULES_HOME/Modules | head -1`
+       ${MODULES_HOME}/Modules/${MODULES_VERSION}/bin/add.modules
+
+This adds to your ``.bashrc`` (or similar) files, enabling Environment
+Modules when you log in. It will ask your permission before changing
+any files.
+
 Spack and Environment Modules
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 You can enable shell support by sourcing some files in the
 ``/share/spack`` directory.
269 lib/spack/env/cc vendored
@@ -39,7 +39,7 @@
 #
 # This is the list of environment variables that need to be set before
 # the script runs.  They are set by routines in spack.build_environment
 # as part of spack.package.Package.do_install().
 parameters="
 SPACK_PREFIX
@@ -50,7 +50,7 @@ SPACK_SHORT_SPEC"

 # The compiler input variables are checked for sanity later:
 #   SPACK_CC, SPACK_CXX, SPACK_F77, SPACK_FC
-# Debug flag is optional; set to true for debug logging:
+# Debug flag is optional; set to "TRUE" for debug logging:
 #   SPACK_DEBUG
 # Test command is used to unit test the compiler script.
 #   SPACK_TEST_COMMAND
@@ -65,12 +65,11 @@ function die {
 }

 for param in $parameters; do
-    if [ -z "${!param}" ]; then
-        die "Spack compiler must be run from spack! Input $param was missing!"
+    if [[ -z ${!param} ]]; then
+        die "Spack compiler must be run from Spack! Input '$param' is missing."
     fi
 done

 #
 # Figure out the type of compiler, the language, and the mode so that
 # the compiler script knows what to do.
 #
@@ -78,14 +77,18 @@ done
 # 'command' is set based on the input command to $SPACK_[CC|CXX|F77|F90]
 #
 # 'mode' is set to one of:
-#    vcheck  version check
-#    cpp     preprocess
 #    cc      compile
 #    as      assemble
 #    ld      link
 #    ccld    compile & link
+#    cpp     preprocessor
+#    vcheck  version check
 #

 command=$(basename "$0")
 case "$command" in
+cpp)
+    mode=cpp
+    ;;
 cc|c89|c99|gcc|clang|icc|pgcc|xlc)
     command="$SPACK_CC"
     language="C"
@@ -102,9 +105,6 @@ case "$command" in
     command="$SPACK_F77"
     language="Fortran 77"
     ;;
-cpp)
-    mode=cpp
-    ;;
 ld)
     mode=ld
     ;;
@@ -113,10 +113,12 @@ case "$command" in
     ;;
 esac

-# If any of the arguments below is present then the mode is vcheck. In vcheck mode nothing is added in terms of extra search paths or libraries
-if [ -z "$mode" ]; then
+# If any of the arguments below are present, then the mode is vcheck.
+# In vcheck mode, nothing is added in terms of extra search paths or
+# libraries.
+if [[ -z $mode ]]; then
     for arg in "$@"; do
-        if [ "$arg" = -v -o "$arg" = -V -o "$arg" = --version -o "$arg" = -dumpversion ]; then
+        if [[ $arg == -v || $arg == -V || $arg == --version || $arg == -dumpversion ]]; then
             mode=vcheck
             break
         fi
@@ -124,14 +126,16 @@ if [ -z "$mode" ]; then
 fi

 # Finish setting up the mode.
-if [ -z "$mode" ]; then
+if [[ -z $mode ]]; then
     mode=ccld
     for arg in "$@"; do
-        if [ "$arg" = -E ]; then
+        if [[ $arg == -E ]]; then
             mode=cpp
             break
-        elif [ "$arg" = -c ]; then
+        elif [[ $arg == -S ]]; then
+            mode=as
+            break
+        elif [[ $arg == -c ]]; then
             mode=cc
             break
         fi
@@ -139,175 +143,76 @@ if [ -z "$mode" ]; then
 fi

 # Dump the version and exit if we're in testing mode.
-if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
+if [[ $SPACK_TEST_COMMAND == dump-mode ]]; then
     echo "$mode"
     exit
 fi

 # Check that at least one of the real commands was actually selected,
 # otherwise we don't know what to execute.
-if [ -z "$command" ]; then
+if [[ -z $command ]]; then
     die "ERROR: Compiler '$SPACK_COMPILER_SPEC' does not support compiling $language programs."
 fi

+if [[ $mode == vcheck ]]; then
+    exec ${command} "$@"
+fi
+
+# Darwin's linker has a -r argument that merges object files together.
+# It doesn't work with -rpath.
+# This variable controls whether they are added.
+add_rpaths=true
+if [[ $mode == ld && $OSTYPE == darwin* ]]; then
+    for arg in "$@"; do
+        if [[ $arg == -r ]]; then
+            add_rpaths=false
+            break
+        fi
+    done
+fi
+
 # Save original command for debug logging
 input_command="$@"

-if [ "$mode" == vcheck ] ; then
-    exec ${command} "$@"
-fi
-
-#
-# Now do real parsing of the command line args, trying hard to keep
-# non-rpath linker arguments in the proper order w.r.t. other command
-# line arguments.  This is important for things like groups.
-#
-includes=()
-libraries=()
-libs=()
-rpaths=()
-other_args=()
-
-while [ -n "$1" ]; do
-    case "$1" in
-    -I*)
-        arg="${1#-I}"
-        if [ -z "$arg" ]; then shift; arg="$1"; fi
-        includes+=("$arg")
-        ;;
-    -L*)
-        arg="${1#-L}"
-        if [ -z "$arg" ]; then shift; arg="$1"; fi
-        libraries+=("$arg")
-        ;;
-    -l*)
-        arg="${1#-l}"
-        if [ -z "$arg" ]; then shift; arg="$1"; fi
-        libs+=("$arg")
-        ;;
-    -Wl,*)
-        arg="${1#-Wl,}"
-        # TODO: Handle multiple -Wl, continuations of -Wl,-rpath
-        if [[ $arg == -rpath=* ]]; then
-            arg="${arg#-rpath=}"
-            for rpath in ${arg//,/ }; do
-                rpaths+=("$rpath")
-            done
-        elif [[ $arg == -rpath,* ]]; then
-            arg="${arg#-rpath,}"
-            for rpath in ${arg//,/ }; do
-                rpaths+=("$rpath")
-            done
-        elif [[ $arg == -rpath ]]; then
-            shift; arg="$1"
-            if [[ $arg != '-Wl,'* ]]; then
-                die "-Wl,-rpath was not followed by -Wl,*"
-            fi
-            arg="${arg#-Wl,}"
-            for rpath in ${arg//,/ }; do
-                rpaths+=("$rpath")
-            done
-        else
-            other_args+=("-Wl,$arg")
-        fi
-        ;;
-    -Xlinker)
-        shift; arg="$1";
-        if [[ $arg = -rpath=* ]]; then
-            rpaths+=("${arg#-rpath=}")
-        elif [[ $arg = -rpath ]]; then
-            shift; arg="$1"
-            if [[ $arg != -Xlinker ]]; then
-                die "-Xlinker -rpath was not followed by -Xlinker <arg>"
-            fi
-            shift; arg="$1"
-            rpaths+=("$arg")
-        else
-            other_args+=("-Xlinker")
-            other_args+=("$arg")
-        fi
-        ;;
-    *)
-        other_args+=("$1")
-        ;;
-    esac
-    shift
-done
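For clarity, the removed rpath-extraction cases above (`-Wl,-rpath=DIR,...`, `-Wl,-rpath,DIR,...`, and a bare `-Wl,-rpath` followed by a separate `-Wl,DIR` argument) can be modeled in Python. This is an illustrative sketch only, not part of the wrapper; the function name is hypothetical:

```python
def extract_rpaths(args):
    """Collect rpath directories from -Wl,... arguments, the way the
    removed parsing loop did; returns (rpaths, other_args)."""
    rpaths, other = [], []
    i = 0
    while i < len(args):
        a = args[i]
        if a.startswith("-Wl,"):
            body = a[len("-Wl,"):]
            if body.startswith("-rpath=") or body.startswith("-rpath,"):
                # comma-separated directories after -rpath= or -rpath,
                rpaths.extend(body[len("-rpath="):].split(","))
            elif body == "-rpath":
                # the directory arrives in the next -Wl, argument
                i += 1
                nxt = args[i]
                if not nxt.startswith("-Wl,"):
                    raise ValueError("-Wl,-rpath was not followed by -Wl,*")
                rpaths.extend(nxt[len("-Wl,"):].split(","))
            else:
                other.append(a)
        else:
            other.append(a)
        i += 1
    return rpaths, other
```

For example, `extract_rpaths(["-O2", "-Wl,-rpath=/a,/b", "-lm"])` yields the rpaths `["/a", "/b"]` while leaving `-O2` and `-lm` untouched.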
-# Dump parsed values for unit testing if asked for
-if [ -n "$SPACK_TEST_COMMAND" ]; then
-    IFS=$'\n'
-    case "$SPACK_TEST_COMMAND" in
-    dump-includes)   echo "${includes[*]}";;
-    dump-libraries)  echo "${libraries[*]}";;
-    dump-libs)       echo "${libs[*]}";;
-    dump-rpaths)     echo "${rpaths[*]}";;
-    dump-other-args) echo "${other_args[*]}";;
-    dump-all)
-        echo "INCLUDES:"
-        echo "${includes[*]}"
-        echo
-        echo "LIBRARIES:"
-        echo "${libraries[*]}"
-        echo
-        echo "LIBS:"
-        echo "${libs[*]}"
-        echo
-        echo "RPATHS:"
-        echo "${rpaths[*]}"
-        echo
-        echo "ARGS:"
-        echo "${other_args[*]}"
-        ;;
-    *)
-        echo "ERROR: Unknown test command"
-        exit 1 ;;
-    esac
-    exit
-fi
+args=("$@")

 # Read spack dependencies from the path environment variable
 IFS=':' read -ra deps <<< "$SPACK_DEPENDENCIES"
 for dep in "${deps[@]}"; do
-    if [ -d "$dep/include" ]; then
-        includes+=("$dep/include")
+    # Prepend include directories
+    if [[ -d $dep/include ]]; then
+        if [[ $mode == cpp || $mode == cc || $mode == as || $mode == ccld ]]; then
+            args=("-I$dep/include" "${args[@]}")
+        fi
     fi

-    if [ -d "$dep/lib" ]; then
-        libraries+=("$dep/lib")
-        rpaths+=("$dep/lib")
+    # Prepend lib and RPATH directories
+    if [[ -d $dep/lib ]]; then
+        if [[ $mode == ccld ]]; then
+            $add_rpaths && args=("-Wl,-rpath,$dep/lib" "${args[@]}")
+            args=("-L$dep/lib" "${args[@]}")
+        elif [[ $mode == ld ]]; then
+            $add_rpaths && args=("-rpath" "$dep/lib" "${args[@]}")
+            args=("-L$dep/lib" "${args[@]}")
+        fi
     fi

-    if [ -d "$dep/lib64" ]; then
-        libraries+=("$dep/lib64")
-        rpaths+=("$dep/lib64")
+    # Prepend lib64 and RPATH directories
+    if [[ -d $dep/lib64 ]]; then
+        if [[ $mode == ccld ]]; then
+            $add_rpaths && args=("-Wl,-rpath,$dep/lib64" "${args[@]}")
+            args=("-L$dep/lib64" "${args[@]}")
+        elif [[ $mode == ld ]]; then
+            $add_rpaths && args=("-rpath" "$dep/lib64" "${args[@]}")
+            args=("-L$dep/lib64" "${args[@]}")
+        fi
     fi
 done
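Because each dependency prepends its flags to `args`, the dependency processed last ends up earliest on the command line. A small Python model of that ordering effect (the function name is hypothetical, for illustration only):

```python
def prepend_dep_flags(args, deps):
    """Model of the dependency loop above: each dep's -I flag is
    prepended, so later deps end up earlier in the final list."""
    args = list(args)
    for dep in deps:
        args = ["-I%s/include" % dep] + args
    return args
```

So processing deps `/opt/a` then `/opt/b` puts `-I/opt/b/include` before `-I/opt/a/include` in the final argument list.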
-# Include all -L's and prefix/whatever dirs in rpath
-for dir in "${libraries[@]}"; do
-    [[ dir = $SPACK_INSTALL* ]] && rpaths+=("$dir")
-done
-rpaths+=("$SPACK_PREFIX/lib")
-rpaths+=("$SPACK_PREFIX/lib64")
-
-# Put the arguments together
-args=()
-for dir in "${includes[@]}";  do args+=("-I$dir"); done
-args+=("${other_args[@]}")
-for dir in "${libraries[@]}"; do args+=("-L$dir"); done
-for lib in "${libs[@]}";      do args+=("-l$lib"); done
-
-if [ "$mode" = ccld ]; then
-    for dir in "${rpaths[@]}"; do
-        args+=("-Wl,-rpath")
-        args+=("-Wl,$dir");
-    done
-elif [ "$mode" = ld ]; then
-    for dir in "${rpaths[@]}"; do
-        args+=("-rpath")
-        args+=("$dir");
-    done
-fi
+if [[ $mode == ccld ]]; then
+    $add_rpaths && args=("-Wl,-rpath,$SPACK_PREFIX/lib" "-Wl,-rpath,$SPACK_PREFIX/lib64" "${args[@]}")
+elif [[ $mode == ld ]]; then
+    $add_rpaths && args=("-rpath" "$SPACK_PREFIX/lib" "-rpath" "$SPACK_PREFIX/lib64" "${args[@]}")
+fi

 #
@@ -323,34 +228,40 @@ unset DYLD_LIBRARY_PATH
 #
 IFS=':' read -ra env_path <<< "$PATH"
 IFS=':' read -ra spack_env_dirs <<< "$SPACK_ENV_PATH"
-spack_env_dirs+=(".")
+spack_env_dirs+=("" ".")
 PATH=""
 for dir in "${env_path[@]}"; do
-    remove=""
-    for rm_dir in "${spack_env_dirs[@]}"; do
-        if [ "$dir" = "$rm_dir" ]; then remove=True; fi
-    done
-    if [ -z "$remove" ]; then
-        if [ -z "$PATH" ]; then
-            PATH="$dir"
-        else
-            PATH="$PATH:$dir"
-        fi
-    fi
+    addpath=true
+    for env_dir in "${spack_env_dirs[@]}"; do
+        if [[ $dir == $env_dir ]]; then
+            addpath=false
+            break
+        fi
+    done
+    if $addpath; then
+        PATH="${PATH:+$PATH:}$dir"
+    fi
 done
 export PATH
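The PATH scrubbing above drops any entry that matches one of Spack's wrapper directories, plus the empty entry and `.`, while preserving the order of everything else. A Python model of that filtering (illustrative only; the function name is hypothetical):

```python
def scrub_path(path, spack_env_dirs):
    """Model of the wrapper's PATH filtering: drop entries matching a
    spack env dir (plus '' and '.'), keep the rest in order."""
    skip = set(spack_env_dirs) | {"", "."}
    kept = [d for d in path.split(":") if d not in skip]
    return ":".join(kept)
```

For example, scrubbing `/spack/env:/usr/bin:.:/bin` with `["/spack/env"]` yields `/usr/bin:/bin`.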
-full_command=("$command")
-full_command+=("${args[@]}")
+full_command=("$command" "${args[@]}")
+
+# In test command mode, write out full command for Spack tests.
+if [[ $SPACK_TEST_COMMAND == dump-args ]]; then
+    echo "${full_command[@]}"
+    exit
+elif [[ -n $SPACK_TEST_COMMAND ]]; then
+    die "ERROR: Unknown test command"
+fi

 #
 # Write the input and output commands to debug logs if it's asked for.
 #
-if [ "$SPACK_DEBUG" = "TRUE" ]; then
+if [[ $SPACK_DEBUG == TRUE ]]; then
     input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_SHORT_SPEC.in.log"
     output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_SHORT_SPEC.out.log"
-    echo "$input_command" >> $input_log
-    echo "$mode ${full_command[@]}" >> $output_log
+    echo "[$mode] $command $input_command" >> $input_log
+    echo "[$mode] ${full_command[@]}" >> $output_log
 fi

 exec "${full_command[@]}"
@@ -27,9 +27,11 @@
            'force_remove', 'join_path', 'ancestor', 'can_access', 'filter_file',
            'FileFilter', 'change_sed_delimiter', 'is_exe', 'force_symlink',
            'set_executable', 'copy_mode', 'unset_executable_mode',
-           'remove_dead_links', 'remove_linked_tree']
+           'remove_dead_links', 'remove_linked_tree', 'find_library_path',
+           'fix_darwin_install_name']

 import os
+import glob
 import sys
 import re
 import shutil

@@ -38,6 +40,7 @@
 import getpass
 from contextlib import contextmanager, closing
 from tempfile import NamedTemporaryFile
+import subprocess

 import llnl.util.tty as tty
 from spack.util.compression import ALLOWED_ARCHIVE_TYPES
@@ -392,3 +395,44 @@ def remove_linked_tree(path):
             os.unlink(path)
         else:
             shutil.rmtree(path, True)
+
+
+def fix_darwin_install_name(path):
+    """Fix the install name of dynamic libraries on Darwin to use full paths.
+
+    There are two parts to this task:
+    (i) use install_name_tool -id to change the install name of a single lib;
+    (ii) use install_name_tool -change to fix the cross-linking between libs.
+    The function assumes that all libraries are in one folder and currently
+    won't follow subfolders.
+
+    Args:
+        path: directory in which the .dylib files are located
+    """
+    libs = glob.glob(join_path(path, "*.dylib"))
+    for lib in libs:
+        # fix install name first:
+        subprocess.Popen(["install_name_tool", "-id", lib, lib],
+                         stdout=subprocess.PIPE).communicate()[0]
+        long_deps = subprocess.Popen(["otool", "-L", lib],
+                                     stdout=subprocess.PIPE).communicate()[0].split('\n')
+        deps = [dep.partition(' ')[0][1:] for dep in long_deps[2:-1]]
+        # fix all dependencies:
+        for dep in deps:
+            for loc in libs:
+                if dep == os.path.basename(loc):
+                    subprocess.Popen(["install_name_tool", "-change", dep, loc, lib],
+                                     stdout=subprocess.PIPE).communicate()[0]
+                    break
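The `deps` comprehension above reduces each `otool -L` line to a bare library path: skip the first two lines (the file name and its own install name), take the first space-separated token, and strip the leading tab. A small standalone illustration (the function name and sample output are made up for the sketch):

```python
def parse_otool_deps(otool_output):
    """Extract dependency paths from `otool -L` text the way the
    comprehension does: skip the two header lines and the trailing
    empty line, take the first token, drop the leading tab."""
    lines = otool_output.split('\n')
    return [line.partition(' ')[0][1:] for line in lines[2:-1]]
```

Given typical `otool -L` output, this returns only the dependency paths, excluding the library's own id line.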
+
+
+def find_library_path(libname, *paths):
+    """Search for a file called <libname> in each of the given paths.
+
+    Return:
+        the directory where the library was found, if found; None otherwise.
+    """
+    for path in paths:
+        library = join_path(path, libname)
+        if os.path.exists(library):
+            return path
+    return None
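A standalone sketch of the same search logic, using `os.path.join` in place of Spack's `join_path`; the temporary-directory setup is purely illustrative:

```python
import os
import tempfile

def find_library_path(libname, *paths):
    """Return the first directory in *paths containing libname, else None."""
    for path in paths:
        if os.path.exists(os.path.join(path, libname)):
            return path
    return None

# Illustrative usage: a temp dir holding a fake library file.
tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "libexample.so"), "w").close()
found = find_library_path("libexample.so", "/nonexistent", tmp)
```

Here `found` is the temp directory, since the earlier path does not contain the file.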
@@ -117,7 +117,8 @@ def caller_locals():
        scope.  Yes, this is some black magic, and yes it's useful
        for implementing things like depends_on and provides.
     """
-    stack = inspect.stack()
+    # Passing zero here skips line context for speed.
+    stack = inspect.stack(0)
     try:
         return stack[2][0].f_locals
     finally:

@@ -128,7 +129,8 @@ def get_calling_module_name():
     """Make sure that the caller is a class definition, and return the
     enclosing module's name.
     """
-    stack = inspect.stack()
+    # Passing zero here skips line context for speed.
+    stack = inspect.stack(0)
     try:
         # Make sure locals contain __module__
         caller_locals = stack[2][0].f_locals
@@ -136,9 +136,7 @@
     # don't add a second username if it's already unique by user.
     if not _tmp_user in path:
         tmp_dirs.append(join_path(path, '%u', 'spack-stage'))
-
-for path in _tmp_candidates:
-    if not path in tmp_dirs:
+    else:
         tmp_dirs.append(join_path(path, 'spack-stage'))

 # Whether spack should allow installation of unsafe versions of
@@ -59,6 +59,11 @@
 SPACK_DEBUG_LOG_DIR = 'SPACK_DEBUG_LOG_DIR'


+# Platform-specific library suffix.
+dso_suffix = 'dylib' if sys.platform == 'darwin' else 'so'
+
+
 class MakeExecutable(Executable):
     """Special callable executable object for make so the user can
     specify parallel or not on a per-invocation basis.  Using

@@ -208,7 +213,7 @@ def set_module_variables_for_package(pkg, module):
     # TODO: of build dependencies, as opposed to link dependencies.
     # TODO: Currently, everything is a link dependency, but tools like
     # TODO: this shouldn't be.
-    m.cmake = which("cmake")
+    m.cmake = Executable('cmake')

     # standard CMake arguments
     m.std_cmake_args = ['-DCMAKE_INSTALL_PREFIX=%s' % pkg.prefix,

@@ -246,6 +251,9 @@ def set_module_variables_for_package(pkg, module):
     # a Prefix object.
     m.prefix = pkg.prefix

+    # Platform-specific library suffix.
+    m.dso_suffix = dso_suffix
+

 def get_rpaths(pkg):
     """Get a list of all the rpaths for a package."""
@@ -270,21 +278,6 @@ def parent_class_modules(cls):
     return result


-def setup_module_variables_for_dag(pkg):
-    """Set module-scope variables for all packages in the DAG."""
-    for spec in pkg.spec.traverse(order='post'):
-        # If a user makes their own package repo, e.g.
-        # spack.repos.mystuff.libelf.Libelf, and they inherit from
-        # an existing class like spack.repos.original.libelf.Libelf,
-        # then set the module variables for both classes so the
-        # parent class can still use them if it gets called.
-        spkg = spec.package
-        modules = parent_class_modules(spkg.__class__)
-        for mod in modules:
-            set_module_variables_for_package(spkg, mod)
-        set_module_variables_for_package(spkg, spkg.module)
-
-
 def setup_package(pkg):
     """Execute all environment setup routines."""
     spack_env = EnvironmentModifications()
@@ -308,20 +301,27 @@ def setup_package(pkg):

     set_compiler_environment_variables(pkg, spack_env)
     set_build_environment_variables(pkg, spack_env)
-    setup_module_variables_for_dag(pkg)

-    # Allow dependencies to modify the module
+    # traverse in postorder so package can use vars from its dependencies
     spec = pkg.spec
-    for dependency_spec in spec.traverse(root=False):
-        dpkg = dependency_spec.package
-        dpkg.setup_dependent_package(pkg.module, spec)
+    for dspec in pkg.spec.traverse(order='post', root=False):
+        # If a user makes their own package repo, e.g.
+        # spack.repos.mystuff.libelf.Libelf, and they inherit from
+        # an existing class like spack.repos.original.libelf.Libelf,
+        # then set the module variables for both classes so the
+        # parent class can still use them if it gets called.
+        spkg = dspec.package
+        modules = parent_class_modules(spkg.__class__)
+        for mod in modules:
+            set_module_variables_for_package(spkg, mod)
+        set_module_variables_for_package(spkg, spkg.module)

-    # Allow dependencies to set up environment as well
-    for dependency_spec in spec.traverse(root=False):
-        dpkg = dependency_spec.package
+        # Allow dependencies to modify the module
+        dpkg = dspec.package
+        dpkg.setup_dependent_package(pkg.module, spec)
         dpkg.setup_dependent_environment(spack_env, run_env, spec)

     # Allow the package to apply some settings.
+    set_module_variables_for_package(pkg, pkg.module)
     pkg.setup_environment(spack_env, run_env)

     # Make sure nothing's strange about the Spack environment.
@@ -52,7 +52,7 @@ def print_text_info(pkg):
     print "Safe versions:  "

     if not pkg.versions:
-        print("None")
+        print(" None")
     else:
         pad = padder(pkg.versions, 4)
         for v in reversed(sorted(pkg.versions)):

@@ -62,7 +62,7 @@ def print_text_info(pkg):
     print
     print "Variants:"
     if not pkg.variants:
-        print "None"
+        print " None"
     else:
         pad = padder(pkg.variants, 4)
@@ -22,21 +22,16 @@
 # along with this program; if not, write to the Free Software Foundation,
 # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
-import sys
 import os
 import shutil
 import argparse
+import sys

 import llnl.util.tty as tty
-from llnl.util.lang import partition_list
-from llnl.util.filesystem import mkdirp
-
 import spack.cmd
+from llnl.util.filesystem import mkdirp
 from spack.modules import module_types
 from spack.util.string import *
-
 from spack.spec import Spec

-description ="Manipulate modules and dotkits."
+description = "Manipulate modules and dotkits."

@@ -98,7 +93,6 @@ def module_refresh():
         cls(spec).write()


-
 def module(parser, args):
     if args.module_command == 'refresh':
         module_refresh()
@@ -77,7 +77,8 @@ def get_git():

 def list_packages(rev):
     git = get_git()
-    relpath = spack.packages_path[len(spack.prefix + os.path.sep):] + os.path.sep
+    pkgpath = os.path.join(spack.packages_path, 'packages')
+    relpath = pkgpath[len(spack.prefix + os.path.sep):] + os.path.sep
     output = git('ls-tree', '--full-tree', '--name-only', rev, relpath,
                  output=str)
     return sorted(line[len(relpath):] for line in output.split('\n') if line)
@@ -35,6 +35,9 @@ def setup_parser(subparser):
     subparser.add_argument(
         '-n', '--no-checksum', action='store_true', dest='no_checksum',
         help="Do not check downloaded packages against checksum")
+    subparser.add_argument(
+        '-p', '--path', dest='path',
+        help="Path to stage package, does not add to spack tree")

     subparser.add_argument(
         'specs', nargs=argparse.REMAINDER, help="specs of packages to stage")

@@ -50,4 +53,6 @@ def stage(parser, args):
     specs = spack.cmd.parse_specs(args.specs, concretize=True)
     for spec in specs:
         package = spack.repo.get(spec)
+        if args.path:
+            package.path = args.path
         package.do_stage()
@@ -23,19 +23,33 @@
 # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
 from __future__ import print_function
+import sys

 import argparse

 import llnl.util.tty as tty
+from llnl.util.tty.colify import colify

 import spack
 import spack.cmd
+import spack.repository
 from spack.cmd.find import display_specs
 from spack.package import PackageStillNeededError

-description="Remove an installed package"
+description = "Remove an installed package"
+
+error_message = """You can either:
+    a) Use a more specific spec, or
+    b) use spack uninstall -a to uninstall ALL matching specs.
+"""
+
+
+def ask_for_confirmation(message):
+    while True:
+        tty.msg(message + '[y/n]')
+        choice = raw_input().lower()
+        if choice == 'y':
+            break
+        elif choice == 'n':
+            raise SystemExit('Operation aborted')
+        tty.warn('Please reply either "y" or "n"')


 def setup_parser(subparser):
     subparser.add_argument(
@@ -44,10 +58,101 @@ def setup_parser(subparser):
     subparser.add_argument(
         '-a', '--all', action='store_true', dest='all',
         help="USE CAREFULLY. Remove ALL installed packages that match each " +
             "supplied spec. i.e., if you say uninstall libelf, ALL versions of " +
             "libelf are uninstalled. This is both useful and dangerous, like rm -r.")
     subparser.add_argument(
-        'packages', nargs=argparse.REMAINDER, help="specs of packages to uninstall")
+        '-d', '--dependents', action='store_true', dest='dependents',
+        help='Also uninstall any packages that depend on the ones given via command line.')
+    subparser.add_argument(
+        '-y', '--yes-to-all', action='store_true', dest='yes_to_all',
+        help='Assume "yes" is the answer to every confirmation asked to the user.')
+    subparser.add_argument(
+        'packages', nargs=argparse.REMAINDER, help="specs of packages to uninstall")
+
+
+def concretize_specs(specs, allow_multiple_matches=False, force=False):
+    """Return a list of specs matching the not necessarily concretized
+    specs given from the command line.
+
+    Args:
+        specs: list of specs to be matched against installed packages
+        allow_multiple_matches: if True, multiple matches for each item
+            in specs are admitted
+
+    Return:
+        list of specs
+    """
+    specs_from_cli = []  # List of specs that match expressions given via command line
+    has_errors = False
+    for spec in specs:
+        matching = spack.installed_db.query(spec)
+        # For each spec provided, make sure it refers to only one package.
+        # Fail and ask the user to be unambiguous if it doesn't.
+        if not allow_multiple_matches and len(matching) > 1:
+            tty.error("%s matches multiple packages:" % spec)
+            print()
+            display_specs(matching, long=True)
+            print()
+            has_errors = True
+
+        # No installed package matches the query
+        if len(matching) == 0 and not force:
+            tty.error("%s does not match any installed packages." % spec)
+            has_errors = True
+
+        specs_from_cli.extend(matching)
+    if has_errors:
+        tty.die(error_message)
+
+    return specs_from_cli
+
+
+def installed_dependents(specs):
+    """Return a dictionary that maps each spec to the list of its
+    installed dependents.
+
+    Args:
+        specs: list of specs to be checked for dependents
+
+    Returns:
+        dictionary of installed dependents
+    """
+    dependents = {}
+    for item in specs:
+        lst = [x for x in item.package.installed_dependents if x not in specs]
+        if lst:
+            lst = list(set(lst))
+            dependents[item] = lst
+    return dependents
+
+
+def do_uninstall(specs, force):
+    """Uninstall all the specs in a list.
+
+    Args:
+        specs: list of specs to be uninstalled
+        force: force uninstallation (boolean)
+    """
+    packages = []
+    for item in specs:
+        try:
+            # should work if package is known to spack
+            packages.append(item.package)
+        except spack.repository.UnknownPackageError as e:
+            # The package.py file has gone away -- but still
+            # want to uninstall.
+            spack.Package(item).do_uninstall(force=True)
+
+    # Sort packages to be uninstalled by the number of installed dependents.
+    # This ensures we do things in the right order.
+    def num_installed_deps(pkg):
+        return len(pkg.installed_dependents)
+
+    packages.sort(key=num_installed_deps)
+    for item in packages:
+        item.do_uninstall(force=force)
|
||||
|
||||
|
||||
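The ascending sort by dependent count matters: packages with no remaining installed dependents must be removed first, so that each uninstall leaves the database consistent. A toy illustration (package names and counts are made up):

```python
# Fake (name, number-of-installed-dependents) pairs.
packages = [('libelf', 2), ('callpath', 1), ('mpileaks', 0)]

# Ascending sort: leaves of the dependency graph go first.
packages.sort(key=lambda pkg: pkg[1])
order = [name for name, _ in packages]
```

`mpileaks` (no dependents) is uninstalled first and `libelf` last, mirroring `packages.sort(key=num_installed_deps)` above.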
@@ -56,50 +161,34 @@ def uninstall(parser, args):

     with spack.installed_db.write_transaction():
         specs = spack.cmd.parse_specs(args.packages)
-        # For each spec provided, make sure it refers to only one package.
-        # Fail and ask user to be unambiguous if it doesn't
-        pkgs = []
-        for spec in specs:
-            matching_specs = spack.installed_db.query(spec)
-            if not args.all and len(matching_specs) > 1:
-                tty.error("%s matches multiple packages:" % spec)
-                print()
-                display_specs(matching_specs, long=True)
-                print()
-                print("You can either:")
-                print("  a) Use a more specific spec, or")
-                print("  b) use spack uninstall -a to uninstall ALL matching specs.")
-                sys.exit(1)
-
-            if len(matching_specs) == 0:
-                if args.force: continue
-                tty.die("%s does not match any installed packages." % spec)
-
-            for s in matching_specs:
-                try:
-                    # should work if package is known to spack
-                    pkgs.append(s.package)
-                except spack.repository.UnknownPackageError as e:
-                    # The package.py file has gone away -- but still
-                    # want to uninstall.
-                    spack.Package(s).do_uninstall(force=True)
-
-        # Sort packages to be uninstalled by the number of installed dependents
-        # This ensures we do things in the right order
-        def num_installed_deps(pkg):
-            return len(pkg.installed_dependents)
-        pkgs.sort(key=num_installed_deps)
-
-        # Uninstall packages in order now.
-        for pkg in pkgs:
-            try:
-                pkg.do_uninstall(force=args.force)
-            except PackageStillNeededError as e:
-                tty.error("Will not uninstall %s" % e.spec.format("$_$@$%@$#", color=True))
-                print('')
-                print("The following packages depend on it:")
-                display_specs(e.dependents, long=True)
-                print('')
-                print("You can use spack uninstall -f to force this action.")
-                sys.exit(1)
+        # Gets the list of installed specs that match the ones given via cli
+        uninstall_list = concretize_specs(specs, args.all, args.force)  # takes care of '-a' in the cli
+        dependent_list = installed_dependents(uninstall_list)  # takes care of '-d'
+
+        # Process dependent_list and update uninstall_list
+        has_error = False
+        if dependent_list and not args.dependents and not args.force:
+            for spec, lst in dependent_list.items():
+                tty.error("Will not uninstall %s" % spec.format("$_$@$%@$#", color=True))
+                print('')
+                print("The following packages depend on it:")
+                display_specs(lst, long=True)
+                print('')
+                has_error = True
+        elif args.dependents:
+            for key, lst in dependent_list.items():
+                uninstall_list.extend(lst)
+            uninstall_list = list(set(uninstall_list))
+
+        if has_error:
+            tty.die('You can use spack uninstall --dependents to uninstall these dependencies as well')
+
+        if not args.yes_to_all:
+            tty.msg("The following packages will be uninstalled : ")
+            print('')
+            display_specs(uninstall_list, long=True)
+            print('')
+            ask_for_confirmation('Do you want to proceed ? ')
+
+        # Uninstall everything on the list
+        do_uninstall(uninstall_list, args.force)

@@ -159,6 +159,10 @@ def concretize_version(self, spec):
                                   if any(v.satisfies(sv) for sv in spec.versions)],
                                  cmp=cmp_versions)

+        def prefer_key(v):
+            return pkg.versions.get(Version(v)).get('preferred', False)
+        valid_versions.sort(key=prefer_key, reverse=True)
+
         if valid_versions:
             spec.versions = ver([valid_versions[0]])
         else:

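The stable sort with `reverse=True` floats preferred versions to the front while keeping the newest-first order among equally (non-)preferred entries. A self-contained sketch, with version strings and per-version metadata made up for illustration:

```python
# Hypothetical per-version metadata, as a package might declare it.
versions = {'3.5.1': {}, '3.3.6': {}, '2.7.11': {'preferred': True}}

# Assume candidates are already sorted newest-first (plain string sort
# happens to suffice for these particular version strings).
valid_versions = sorted(versions, reverse=True)

def prefer_key(v):
    return versions[v].get('preferred', False)

# Stable sort: '2.7.11' wins despite being older.
valid_versions.sort(key=prefer_key, reverse=True)
```

This is the behavior exercised by the new `test_concretize_preferred_version` further down: `python` concretizes to 2.7.11 unless a version is requested explicitly.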
@@ -237,7 +237,29 @@
             'type' : 'object',
             'default' : {},
          }
-    },},},},},}
+    },},},},},},
+    'modules': {
+        '$schema': 'http://json-schema.org/schema#',
+        'title': 'Spack module file configuration file schema',
+        'type': 'object',
+        'additionalProperties': False,
+        'patternProperties': {
+            r'modules:?': {
+                'type': 'object',
+                'default': {},
+                'additionalProperties': False,
+                'properties': {
+                    'enable': {
+                        'type': 'array',
+                        'default': [],
+                        'items': {
+                            'type': 'string'
+                        }
+                    }
+                }
+            },
+        },
+    },
+}

 """OrderedDict of config scopes keyed by name.

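The new `modules` section accepts exactly one key, `enable`, holding a list of strings (as in the shipped `etc/spack/modules.yaml`). A minimal hand-rolled check mirroring that schema, not using Spack's validator:

```python
def validate_modules_section(data):
    """Hand-rolled sketch of the schema above: data must be
    {'modules': {'enable': [str, ...]}} with no extra keys."""
    if set(data) != {'modules'}:
        return False
    section = data['modules']
    if not isinstance(section, dict) or set(section) - {'enable'}:
        return False
    enable = section.get('enable', [])
    return isinstance(enable, list) and all(isinstance(x, str) for x in enable)

ok = validate_modules_section({'modules': {'enable': ['tcl', 'dotkit']}})
bad = validate_modules_section({'modules': {'enable': 'tcl'}})  # not a list
```

In Spack itself the real validation runs through `validate_section` against the JSON schema; this sketch only illustrates what the schema accepts and rejects.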
@@ -405,11 +427,11 @@ def _read_config_file(filename, schema):
         validate_section(data, schema)
         return data

-    except MarkedYAMLError, e:
+    except MarkedYAMLError as e:
         raise ConfigFileError(
             "Error parsing yaml%s: %s" % (str(e.context_mark), e.problem))

-    except IOError, e:
+    except IOError as e:
         raise ConfigFileError(
             "Error reading configuration file %s: %s" % (filename, str(e)))

@@ -289,8 +289,14 @@ def reset(self):
         if not self.archive_file:
             raise NoArchiveFileError("Tried to reset URLFetchStrategy before fetching",
                                      "Failed on reset() for URL %s" % self.url)
+
+        # Remove everything but the archive from the stage
+        for filename in os.listdir(self.stage.path):
+            abspath = os.path.join(self.stage.path, filename)
+            if abspath != self.archive_file:
+                shutil.rmtree(abspath, ignore_errors=True)
+
+        # Expand the archive again
+        self.expand()

     def __repr__(self):

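The reset logic above keeps only the downloaded archive and deletes everything else in the stage directory before re-expanding. The same pattern in isolation (stage layout and file names are made up; the directory is temporary):

```python
import os
import shutil
import tempfile

# Build a fake stage: one archive plus an expanded source tree.
stage_path = tempfile.mkdtemp()
archive_file = os.path.join(stage_path, 'pkg-1.0.tar.gz')
open(archive_file, 'w').close()
os.makedirs(os.path.join(stage_path, 'pkg-1.0', 'src'))

# Remove everything but the archive from the stage.
for filename in os.listdir(stage_path):
    abspath = os.path.join(stage_path, filename)
    if abspath != archive_file:
        shutil.rmtree(abspath, ignore_errors=True)

remaining = os.listdir(stage_path)
shutil.rmtree(stage_path)  # clean up the demo directory
```

`shutil.rmtree(..., ignore_errors=True)` silently skips anything it cannot remove, so only the expanded tree goes away and the archive survives for re-expansion.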
@@ -48,6 +48,7 @@

 import llnl.util.tty as tty
 import spack
+import spack.config
 from llnl.util.filesystem import join_path, mkdirp
 from spack.environment import *


@@ -56,6 +57,8 @@
 # Registry of all types of modules. Entries created by EnvModule's metaclass
 module_types = {}

+CONFIGURATION = spack.config.get_config('modules')
+

 def print_help():
     """For use by commands to tell user how to activate shell support."""

@@ -115,7 +118,7 @@ class EnvModule(object):
     class __metaclass__(type):
         def __init__(cls, name, bases, dict):
             type.__init__(cls, name, bases, dict)
-            if cls.name != 'env_module':
+            if cls.name != 'env_module' and cls.name in CONFIGURATION['enable']:
                 module_types[cls.name] = cls

     def __init__(self, spec=None):

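The effect of the metaclass change is that only module types whose name appears under `enable` in the configuration get registered at class-creation time. A standalone Python 3 sketch of the same idea (the class names and the config dict are illustrative stand-ins for Spack's; Spack itself uses the Python 2 `__metaclass__` spelling):

```python
# Stand-in for spack.config.get_config('modules')
CONFIGURATION = {'enable': ['tcl']}

module_types = {}

class EnvModuleMeta(type):
    """Register only subclasses whose name is enabled in the config."""
    def __init__(cls, clsname, bases, dct):
        super().__init__(clsname, bases, dct)
        if cls.name != 'env_module' and cls.name in CONFIGURATION['enable']:
            module_types[cls.name] = cls

class EnvModule(metaclass=EnvModuleMeta):
    name = 'env_module'   # base class never registers itself

class TclModule(EnvModule):
    name = 'tcl'          # enabled, so registered on class creation

class Dotkit(EnvModule):
    name = 'dotkit'       # not enabled, so never registered
```

Disabled generators simply never enter `module_types`, so no dotkit files are written even though the class still exists.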
@@ -158,13 +161,18 @@ def write(self):

         # Let the extendee modify their extensions before asking for
         # package-specific modifications
-        for extendee in self.pkg.extendees:
-            extendee_spec = self.spec[extendee]
-            extendee_spec.package.modify_module(
-                self.pkg.module, extendee_spec, self.spec)
+        spack_env = EnvironmentModifications()
+        for item in self.pkg.extendees:
+            try:
+                package = self.spec[item].package
+                package.setup_dependent_package(self.pkg.module, self.spec)
+                package.setup_dependent_environment(spack_env, env, self.spec)
+            except:
+                # The extends was conditional, so it doesn't count here
+                # eg: extends('python', when='+python')
+                pass

         # Package-specific environment modifications
-        spack_env = EnvironmentModifications()
         self.spec.package.setup_environment(spack_env, env)

         # TODO : implement site-specific modifications and filters

@@ -203,7 +211,11 @@ def use_name(self):
     def remove(self):
         mod_file = self.file_name
         if os.path.exists(mod_file):
-            shutil.rmtree(mod_file, ignore_errors=True)
+            try:
+                os.remove(mod_file)  # Remove the module file
+                os.removedirs(os.path.dirname(mod_file))  # Remove all the empty directories from the leaf up
+            except OSError:
+                pass  # removedirs throws OSError on first non-empty directory found


 class Dotkit(EnvModule):

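`os.removedirs` prunes empty parent directories leaf-up and raises `OSError` as soon as it hits a non-empty one, which is why the bare `except OSError: pass` is enough here. A small demonstration on a throwaway tree (the paths are made up and temporary):

```python
import os
import tempfile

# Build <root>/a/b/module, then remove the file and prune empty dirs.
root = tempfile.mkdtemp()
mod_file = os.path.join(root, 'a', 'b', 'module')
os.makedirs(os.path.dirname(mod_file))
open(mod_file, 'w').close()

try:
    os.remove(mod_file)                       # Remove the module file
    os.removedirs(os.path.dirname(mod_file))  # Prune empty dirs leaf-up
except OSError:
    pass  # stops at the first non-empty directory (the system temp dir)

survived = os.path.exists(root)
```

Everything up to and including `root` is empty after the file removal, so the whole chain is pruned; the walk then stops harmlessly at the non-empty system temp directory.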
@@ -275,6 +287,6 @@ def write_header(self, module_file):
         # Long description
         if self.long_description:
             module_file.write('proc ModulesHelp { } {\n')
-            doc = re.sub(r'"', '\"', self.long_description)
-            module_file.write("puts stderr \"%s\"\n" % doc)
+            for line in textwrap.wrap(self.long_description, 72):
+                module_file.write("puts stderr \"%s\"\n" % line)
             module_file.write('}\n\n')

@@ -335,6 +335,9 @@ def __init__(self, spec):
         if '.' in self.name:
             self.name = self.name[self.name.rindex('.') + 1:]

+        # Allow custom staging paths for packages
+        self.path = None
+
         # Sanity check attributes required by Spack directives.
         spack.directives.ensure_dicts(type(self))

@@ -445,7 +448,8 @@ def _make_resource_stage(self, root_stage, fetcher, resource):
         resource_stage_folder = self._resource_stage(resource)
         resource_mirror = join_path(self.name, os.path.basename(fetcher.url))
         stage = ResourceStage(resource.fetcher, root=root_stage, resource=resource,
-                              name=resource_stage_folder, mirror_path=resource_mirror)
+                              name=resource_stage_folder, mirror_path=resource_mirror,
+                              path=self.path)
         return stage

     def _make_root_stage(self, fetcher):

@@ -455,7 +459,7 @@ def _make_root_stage(self, fetcher):
         s = self.spec
         stage_name = "%s-%s-%s" % (s.name, s.version, s.dag_hash())
         # Build the composite stage
-        stage = Stage(fetcher, mirror_path=mp, name=stage_name)
+        stage = Stage(fetcher, mirror_path=mp, name=stage_name, path=self.path)
         return stage

     def _make_stage(self):

@@ -709,7 +713,6 @@ def do_fetch(self, mirror_only=False):
         if spack.do_checksum and self.version in self.versions:
             self.stage.check()

     def do_stage(self, mirror_only=False):
         """Unpacks the fetched tarball, then changes into the expanded tarball
         directory."""

@@ -926,6 +929,9 @@ def build_process():
                 install(env_path, env_install_path)
                 dump_packages(self.spec, packages_dir)

+                # Run post install hooks before build stage is removed.
+                spack.hooks.post_install(self)
+
                 # Stop timer.
                 self._total_time = time.time() - start_time
                 build_time = self._total_time - self._fetch_time

@@ -954,9 +960,6 @@ def build_process():
             # the database, so that we don't need to re-read from file.
             spack.installed_db.add(self.spec, self.prefix)

-            # Once everything else is done, run post install hooks
-            spack.hooks.post_install(self)
-
     def sanity_check_prefix(self):
         """This function checks whether install succeeded."""

@@ -89,7 +89,7 @@ class Stage(object):
     """

     def __init__(self, url_or_fetch_strategy,
-                 name=None, mirror_path=None, keep=False):
+                 name=None, mirror_path=None, keep=False, path=None):
        """Create a stage object.
           Parameters:
             url_or_fetch_strategy

@@ -135,7 +135,10 @@ def __init__(self, url_or_fetch_strategy,

         # Try to construct here a temporary name for the stage directory
         # If this is a named stage, then construct a named path.
-        self.path = join_path(spack.stage_path, self.name)
+        if path is not None:
+            self.path = path
+        else:
+            self.path = join_path(spack.stage_path, self.name)

         # Flag to decide whether to delete the stage folder on exit or not
         self.keep = keep

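The `path` override threads through from `Package.path` into both `Stage` and `ResourceStage`: when it is set, the stage lives exactly there; otherwise the usual name under the global stage root is used. A minimal sketch of just that branch (class name, `stage_path` value, and paths are illustrative, not Spack's real `Stage`):

```python
import os

class StageSketch:
    """Minimal sketch of the path-override logic added above."""
    stage_path = '/var/spack/stage'  # assumed global stage root

    def __init__(self, name, path=None):
        # If a custom path is given, use it verbatim; otherwise build
        # the conventional named path under the stage root.
        if path is not None:
            self.path = path
        else:
            self.path = os.path.join(self.stage_path, name)
```

So `StageSketch('foo')` stages under the shared root, while `StageSketch('foo', path='/tmp/custom')` pins the stage to a caller-chosen directory, which is what "Allow custom staging paths for packages" means in `package.py`.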
@@ -67,7 +67,8 @@
              'namespace_trie',
              'yaml',
              'sbang',
-             'environment']
+             'environment',
+             'cmd.uninstall']


 def list_tests():

@@ -28,6 +28,8 @@
 """
 import os
 import unittest
+import tempfile
+import shutil

 from llnl.util.filesystem import *
 import spack

@@ -55,13 +57,40 @@ def setUp(self):
         self.ld = Executable(join_path(spack.build_env_path, "ld"))
         self.cpp = Executable(join_path(spack.build_env_path, "cpp"))

-        os.environ['SPACK_CC'] = "/bin/mycc"
-        os.environ['SPACK_PREFIX'] = "/usr"
+        self.realcc = "/bin/mycc"
+        self.prefix = "/spack-test-prefix"
+
+        os.environ['SPACK_CC'] = self.realcc
+        os.environ['SPACK_PREFIX'] = self.prefix
         os.environ['SPACK_ENV_PATH']="test"
         os.environ['SPACK_DEBUG_LOG_DIR'] = "."
         os.environ['SPACK_COMPILER_SPEC'] = "gcc@4.4.7"
         os.environ['SPACK_SHORT_SPEC'] = "foo@1.2"

+        # Make some fake dependencies
+        self.tmp_deps = tempfile.mkdtemp()
+        self.dep1 = join_path(self.tmp_deps, 'dep1')
+        self.dep2 = join_path(self.tmp_deps, 'dep2')
+        self.dep3 = join_path(self.tmp_deps, 'dep3')
+        self.dep4 = join_path(self.tmp_deps, 'dep4')
+
+        mkdirp(join_path(self.dep1, 'include'))
+        mkdirp(join_path(self.dep1, 'lib'))
+
+        mkdirp(join_path(self.dep2, 'lib64'))
+
+        mkdirp(join_path(self.dep3, 'include'))
+        mkdirp(join_path(self.dep3, 'lib64'))
+
+        mkdirp(join_path(self.dep4, 'include'))
+
+        if 'SPACK_DEPENDENCIES' in os.environ:
+            del os.environ['SPACK_DEPENDENCIES']
+
+
+    def tearDown(self):
+        shutil.rmtree(self.tmp_deps, True)


     def check_cc(self, command, args, expected):
         os.environ['SPACK_TEST_COMMAND'] = command

@@ -92,6 +121,10 @@ def test_cpp_mode(self):
         self.check_cpp('dump-mode', [], "cpp")


+    def test_as_mode(self):
+        self.check_cc('dump-mode', ['-S'], "as")
+
+
     def test_ccld_mode(self):
         self.check_cc('dump-mode', [], "ccld")
         self.check_cc('dump-mode', ['foo.c', '-o', 'foo'], "ccld")

@@ -104,27 +137,85 @@ def test_ld_mode(self):
         self.check_ld('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath,foo'], "ld")


-    def test_includes(self):
-        self.check_cc('dump-includes', test_command,
-                      "\n".join(["/test/include", "/other/include"]))
+    def test_dep_rpath(self):
+        """Ensure RPATHs for root package are added."""
+        self.check_cc('dump-args', test_command,
+                      self.realcc + ' ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib64 ' +
+                      ' '.join(test_command))


-    def test_libraries(self):
-        self.check_cc('dump-libraries', test_command,
-                      "\n".join(["/test/lib", "/other/lib"]))
+    def test_dep_include(self):
+        """Ensure a single dependency include directory is added."""
+        os.environ['SPACK_DEPENDENCIES'] = self.dep4
+        self.check_cc('dump-args', test_command,
+                      self.realcc + ' ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib64 ' +
+                      '-I' + self.dep4 + '/include ' +
+                      ' '.join(test_command))


-    def test_libs(self):
-        self.check_cc('dump-libs', test_command,
-                      "\n".join(["lib1", "lib2", "lib3", "lib4"]))
+    def test_dep_lib(self):
+        """Ensure a single dependency RPATH is added."""
+        os.environ['SPACK_DEPENDENCIES'] = self.dep2
+        self.check_cc('dump-args', test_command,
+                      self.realcc + ' ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib64 ' +
+                      '-L' + self.dep2 + '/lib64 ' +
+                      '-Wl,-rpath,' + self.dep2 + '/lib64 ' +
+                      ' '.join(test_command))


-    def test_rpaths(self):
-        self.check_cc('dump-rpaths', test_command,
-                      "\n".join(["/first/rpath", "/second/rpath", "/third/rpath", "/fourth/rpath"]))
+    def test_all_deps(self):
+        """Ensure includes and RPATHs for all deps are added. """
+        os.environ['SPACK_DEPENDENCIES'] = ':'.join([
+            self.dep1, self.dep2, self.dep3, self.dep4])
+
+        # This is probably more constrained than it needs to be; it
+        # checks order within prepended args and doesn't strictly have
+        # to.  We could loosen that if it becomes necessary
+        self.check_cc('dump-args', test_command,
+                      self.realcc + ' ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib ' +
+                      '-Wl,-rpath,' + self.prefix + '/lib64 ' +
+
+                      '-I' + self.dep4 + '/include ' +
+
+                      '-L' + self.dep3 + '/lib64 ' +
+                      '-Wl,-rpath,' + self.dep3 + '/lib64 ' +
+                      '-I' + self.dep3 + '/include ' +
+
+                      '-L' + self.dep2 + '/lib64 ' +
+                      '-Wl,-rpath,' + self.dep2 + '/lib64 ' +
+
+                      '-L' + self.dep1 + '/lib ' +
+                      '-Wl,-rpath,' + self.dep1 + '/lib ' +
+                      '-I' + self.dep1 + '/include ' +
+
+                      ' '.join(test_command))


-    def test_other_args(self):
-        self.check_cc('dump-other-args', test_command,
-                      "\n".join(["arg1", "-Wl,--start-group", "arg2", "arg3", "arg4",
-                                 "-Wl,--end-group", "arg5", "arg6"]))
+    def test_ld_deps(self):
+        """Ensure no (extra) -I args or -Wl, are passed in ld mode."""
+        os.environ['SPACK_DEPENDENCIES'] = ':'.join([
+            self.dep1, self.dep2, self.dep3, self.dep4])
+
+        self.check_ld('dump-args', test_command,
+                      'ld ' +
+                      '-rpath ' + self.prefix + '/lib ' +
+                      '-rpath ' + self.prefix + '/lib64 ' +
+
+                      '-L' + self.dep3 + '/lib64 ' +
+                      '-rpath ' + self.dep3 + '/lib64 ' +
+
+                      '-L' + self.dep2 + '/lib64 ' +
+                      '-rpath ' + self.dep2 + '/lib64 ' +
+
+                      '-L' + self.dep1 + '/lib ' +
+                      '-rpath ' + self.dep1 + '/lib ' +
+
+                      ' '.join(test_command))

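The expected strings in these tests encode the wrapper's flag-prepending scheme: the package's own prefix gets `-Wl,-rpath` entries first, then each dependency contributes `-L` plus an rpath flag for whichever lib dir it actually has, and an `-I` for its include dir, walking dependencies in reverse order. A rough sketch of that assembly (this is not Spack's actual `cc` shell script; the helper, its signature, and the paths are made up for illustration):

```python
def prepend_flags(prefix, deps, mode='ccld'):
    """Rough sketch of the flag assembly the cc-wrapper tests expect.
    `deps` is a list of (dependency_prefix, existing_subdirs) pairs."""
    rpath = '-Wl,-rpath,' if mode == 'ccld' else '-rpath '
    flags = [rpath + prefix + '/lib', rpath + prefix + '/lib64']
    for dep, subdirs in reversed(deps):
        if 'lib64' in subdirs:
            flags += ['-L' + dep + '/lib64', rpath + dep + '/lib64']
        elif 'lib' in subdirs:
            flags += ['-L' + dep + '/lib', rpath + dep + '/lib']
        if mode == 'ccld' and 'include' in subdirs:
            flags.append('-I' + dep + '/include')
    return flags

flags = prepend_flags('/spack-test-prefix',
                      [('/deps/dep1', ['include', 'lib'])])
```

In `ld` mode the `-I` flags are skipped and bare `-rpath` is used instead of `-Wl,-rpath,`, which is exactly the distinction `test_ld_deps` asserts.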
0   lib/spack/spack/test/cmd/__init__.py   Normal file
37  lib/spack/spack/test/cmd/uninstall.py  Normal file

@@ -0,0 +1,37 @@
+import spack.test.mock_database
+
+from spack.cmd.uninstall import uninstall
+
+
+class MockArgs(object):
+    def __init__(self, packages, all=False, force=False, dependents=False):
+        self.packages = packages
+        self.all = all
+        self.force = force
+        self.dependents = dependents
+        self.yes_to_all = True
+
+
+class TestUninstall(spack.test.mock_database.MockDatabase):
+    def test_uninstall(self):
+        parser = None
+        # Multiple matches
+        args = MockArgs(['mpileaks'])
+        self.assertRaises(SystemExit, uninstall, parser, args)
+        # Installed dependents
+        args = MockArgs(['libelf'])
+        self.assertRaises(SystemExit, uninstall, parser, args)
+        # Recursive uninstall
+        args = MockArgs(['callpath'], all=True, dependents=True)
+        uninstall(parser, args)
+
+        all_specs = spack.install_layout.all_specs()
+        self.assertEqual(len(all_specs), 7)
+        # query specs with multiple configurations
+        mpileaks_specs = [s for s in all_specs if s.satisfies('mpileaks')]
+        callpath_specs = [s for s in all_specs if s.satisfies('callpath')]
+        mpi_specs = [s for s in all_specs if s.satisfies('mpi')]
+
+        self.assertEqual(len(mpileaks_specs), 0)
+        self.assertEqual(len(callpath_specs), 0)
+        self.assertEqual(len(mpi_specs), 3)

@@ -24,6 +24,7 @@
 ##############################################################################
 import spack
 from spack.spec import Spec, CompilerSpec
+from spack.version import ver
 from spack.concretize import find_spec
 from spack.test.mock_packages_test import *


@@ -77,6 +78,14 @@ def test_concretize_variant(self):
         self.check_concretize('mpich')


+    def test_concretize_preferred_version(self):
+        spec = self.check_concretize('python')
+        self.assertEqual(spec.versions, ver('2.7.11'))
+
+        spec = self.check_concretize('python@3.5.1')
+        self.assertEqual(spec.versions, ver('3.5.1'))
+
+
     def test_concretize_with_virtual(self):
         self.check_concretize('mpileaks ^mpi')
         self.check_concretize('mpileaks ^mpi@:1.1')

@@ -28,16 +28,12 @@
 """
 import os.path
 import multiprocessing
-import shutil
-import tempfile

 import spack
 from llnl.util.filesystem import join_path
 from llnl.util.lock import *
 from llnl.util.tty.colify import colify
 from spack.database import Database
-from spack.directory_layout import YamlDirectoryLayout
 from spack.test.mock_packages_test import *
+from spack.test.mock_database import MockDatabase


 def _print_ref_counts():

@@ -75,80 +71,7 @@ def add_rec(spec):
     colify(recs, cols=3)


-class DatabaseTest(MockPackagesTest):
-
-    def _mock_install(self, spec):
-        s = Spec(spec)
-        s.concretize()
-        pkg = spack.repo.get(s)
-        pkg.do_install(fake=True)
-
-
-    def _mock_remove(self, spec):
-        specs = spack.installed_db.query(spec)
-        assert(len(specs) == 1)
-        spec = specs[0]
-        spec.package.do_uninstall(spec)
-
-
-    def setUp(self):
-        super(DatabaseTest, self).setUp()
-        #
-        # TODO: make the mockup below easier.
-        #
-
-        # Make a fake install directory
-        self.install_path = tempfile.mkdtemp()
-        self.spack_install_path = spack.install_path
-        spack.install_path = self.install_path
-
-        self.install_layout = YamlDirectoryLayout(self.install_path)
-        self.spack_install_layout = spack.install_layout
-        spack.install_layout = self.install_layout
-
-        # Make fake database and fake install directory.
-        self.installed_db = Database(self.install_path)
-        self.spack_installed_db = spack.installed_db
-        spack.installed_db = self.installed_db
-
-        # make a mock database with some packages installed note that
-        # the ref count for dyninst here will be 3, as it's recycled
-        # across each install.
-        #
-        # Here is what the mock DB looks like:
-        #
-        # o  mpileaks     o  mpileaks'    o  mpileaks''
-        # |\              |\              |\
-        # | o  callpath   | o  callpath'  | o  callpath''
-        # |/|             |/|             |/|
-        # o |  mpich      o |  mpich2     o |  zmpi
-        #   |               |             o |  fake
-        #   |               |             |
-        #   |               |______________/
-        #   | .____________/
-        #   |/
-        #   o  dyninst
-        #   |\
-        #   | o  libdwarf
-        #   |/
-        #   o  libelf
-        #
-
-        # Transaction used to avoid repeated writes.
-        with spack.installed_db.write_transaction():
-            self._mock_install('mpileaks ^mpich')
-            self._mock_install('mpileaks ^mpich2')
-            self._mock_install('mpileaks ^zmpi')
-
-
-    def tearDown(self):
-        super(DatabaseTest, self).tearDown()
-        shutil.rmtree(self.install_path)
-        spack.install_path = self.spack_install_path
-        spack.install_layout = self.spack_install_layout
-        spack.installed_db = self.spack_installed_db
-
-
+class DatabaseTest(MockDatabase):
     def test_005_db_exists(self):
         """Make sure db cache file exists after creating."""
         index_file = join_path(self.install_path, '.spack-db', 'index.yaml')

@@ -157,7 +80,6 @@ def test_005_db_exists(self):
         self.assertTrue(os.path.exists(index_file))
         self.assertTrue(os.path.exists(lock_file))

     def test_010_all_install_sanity(self):
         """Ensure that the install layout reflects what we think it does."""
         all_specs = spack.install_layout.all_specs()

@@ -64,7 +64,14 @@ def tearDown(self):
         shutil.rmtree(self.tmpdir, ignore_errors=True)


-    def test_install_and_uninstall(self):
+    def fake_fetchify(self, pkg):
+        """Fake the URL for a package so it downloads from a file."""
+        fetcher = FetchStrategyComposite()
+        fetcher.append(URLFetchStrategy(self.repo.url))
+        pkg.fetcher = fetcher
+
+
+    def ztest_install_and_uninstall(self):
         # Get a basic concrete spec for the trivial install package.
         spec = Spec('trivial_install_test_package')
         spec.concretize()

@@ -73,11 +80,7 @@ def test_install_and_uninstall(self):
         # Get the package
         pkg = spack.repo.get(spec)

-        # Fake the URL for the package so it downloads from a file.
-        fetcher = FetchStrategyComposite()
-        fetcher.append(URLFetchStrategy(self.repo.url))
-        pkg.fetcher = fetcher
+        self.fake_fetchify(pkg)

         try:
             pkg.do_install()

@@ -85,3 +88,17 @@ def test_install_and_uninstall(self):
         except Exception, e:
             pkg.remove_prefix()
             raise
+
+
+    def test_install_environment(self):
+        spec = Spec('cmake-client').concretized()
+
+        for s in spec.traverse():
+            self.fake_fetchify(s.package)
+
+        pkg = spec.package
+        try:
+            pkg.do_install()
+        except Exception, e:
+            pkg.remove_prefix()
+            raise

80  lib/spack/spack/test/mock_database.py  Normal file

@@ -0,0 +1,80 @@
+import shutil
+import tempfile
+
+import spack
+from spack.spec import Spec
+from spack.database import Database
+from spack.directory_layout import YamlDirectoryLayout
+from spack.test.mock_packages_test import MockPackagesTest
+
+
+class MockDatabase(MockPackagesTest):
+    def _mock_install(self, spec):
+        s = Spec(spec)
+        s.concretize()
+        pkg = spack.repo.get(s)
+        pkg.do_install(fake=True)
+
+    def _mock_remove(self, spec):
+        specs = spack.installed_db.query(spec)
+        assert len(specs) == 1
+        spec = specs[0]
+        spec.package.do_uninstall(spec)
+
+    def setUp(self):
+        super(MockDatabase, self).setUp()
+        #
+        # TODO: make the mockup below easier.
+        #
+
+        # Make a fake install directory
+        self.install_path = tempfile.mkdtemp()
+        self.spack_install_path = spack.install_path
+        spack.install_path = self.install_path
+
+        self.install_layout = YamlDirectoryLayout(self.install_path)
+        self.spack_install_layout = spack.install_layout
+        spack.install_layout = self.install_layout
+
+        # Make fake database and fake install directory.
+        self.installed_db = Database(self.install_path)
+        self.spack_installed_db = spack.installed_db
+        spack.installed_db = self.installed_db
+
+        # make a mock database with some packages installed note that
+        # the ref count for dyninst here will be 3, as it's recycled
+        # across each install.
+        #
+        # Here is what the mock DB looks like:
+        #
+        # o  mpileaks     o  mpileaks'    o  mpileaks''
+        # |\              |\              |\
+        # | o  callpath   | o  callpath'  | o  callpath''
+        # |/|             |/|             |/|
+        # o |  mpich      o |  mpich2     o |  zmpi
+        #   |               |             o |  fake
+        #   |               |             |
+        #   |               |______________/
+        #   | .____________/
+        #   |/
+        #   o  dyninst
+        #   |\
+        #   | o  libdwarf
+        #   |/
+        #   o  libelf
+        #
+
+        # Transaction used to avoid repeated writes.
+        with spack.installed_db.write_transaction():
+            self._mock_install('mpileaks ^mpich')
+            self._mock_install('mpileaks ^mpich2')
+            self._mock_install('mpileaks ^zmpi')
+
+    def tearDown(self):
+        for spec in spack.installed_db.query():
+            spec.package.do_uninstall(spec)
+        super(MockDatabase, self).tearDown()
+        shutil.rmtree(self.install_path)
+        spack.install_path = self.spack_install_path
+        spack.install_layout = self.spack_install_layout
+        spack.installed_db = self.spack_installed_db

@@ -0,0 +1,89 @@
+##############################################################################
+# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
+# Produced at the Lawrence Livermore National Laboratory.
+#
+# This file is part of Spack.
+# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
+# LLNL-CODE-647188
+#
+# For details, see https://github.com/llnl/spack
+# Please also see the LICENSE file for our notice and the LGPL.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License (as published by
+# the Free Software Foundation) version 2.1 dated February 1999.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
+# conditions of the GNU General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program; if not, write to the Free Software Foundation,
+# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+##############################################################################
+from spack import *
+import os
+
+def check(condition, msg):
+    """Raise an install error if condition is False."""
+    if not condition:
+        raise InstallError(msg)
+
+
+class CmakeClient(Package):
+    """A dummy package that uses cmake."""
+    homepage = 'https://www.example.com'
+    url      = 'https://www.example.com/cmake-client-1.0.tar.gz'
+
+    version('1.0', '4cb3ff35b2472aae70f542116d616e63')
+
+    depends_on('cmake')
+
+
+    def setup_environment(self, spack_env, run_env):
+        spack_cc  # Ensure spack module-scope variable is available
+        check(from_cmake == "from_cmake",
+              "setup_environment couldn't read global set by cmake.")
+
+        check(self.spec['cmake'].link_arg == "test link arg",
+              "link arg on dependency spec not readable from setup_environment.")
+
+
+    def setup_dependent_environment(self, spack_env, run_env, dspec):
+        spack_cc  # Ensure spack module-scope variable is available
+        check(from_cmake == "from_cmake",
+              "setup_dependent_environment couldn't read global set by cmake.")
+
+        check(self.spec['cmake'].link_arg == "test link arg",
+              "link arg on dependency spec not readable from setup_dependent_environment.")
+
+
+    def setup_dependent_package(self, module, dspec):
+        spack_cc  # Ensure spack module-scope variable is available
+        check(from_cmake == "from_cmake",
+              "setup_dependent_package couldn't read global set by cmake.")
+
+        check(self.spec['cmake'].link_arg == "test link arg",
+              "link arg on dependency spec not readable from setup_dependent_package.")
+
+
+    def install(self, spec, prefix):
+        # check that cmake is in the global scope.
+        global cmake
+        check(cmake is not None, "No cmake was in environment!")
+
+        # check that which('cmake') returns the right one.
+        cmake = which('cmake')
+        check(cmake.exe[0].startswith(spec['cmake'].prefix.bin),
+              "Wrong cmake was in environment: %s" % cmake)
+
+        check(from_cmake == "from_cmake",
+              "Couldn't read global set by cmake.")
+
+        check(os.environ['from_cmake'] == 'from_cmake',
+              "Couldn't read env var set in environment by dependency")
+
+        mkdirp(prefix.bin)
+        touch(join_path(prefix.bin, 'dummy'))

69  var/spack/repos/builtin.mock/packages/cmake/package.py  Normal file
|
@ -0,0 +1,69 @@
|
|||
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
import os


def check(condition, msg):
    """Raise an install error if condition is False."""
    if not condition:
        raise InstallError(msg)


class Cmake(Package):
    """A dummy package for the cmake build system."""
    homepage = 'https://www.cmake.org'
    url = 'https://cmake.org/files/v3.4/cmake-3.4.3.tar.gz'

    version('3.4.3', '4cb3ff35b2472aae70f542116d616e63',
            url='https://cmake.org/files/v3.4/cmake-3.4.3.tar.gz')

    def setup_environment(self, spack_env, run_env):
        spack_cc  # Ensure spack module-scope variable is available
        spack_env.set('for_install', 'for_install')

    def setup_dependent_environment(self, spack_env, run_env, dspec):
        spack_cc  # Ensure spack module-scope variable is available
        spack_env.set('from_cmake', 'from_cmake')

    def setup_dependent_package(self, module, dspec):
        spack_cc  # Ensure spack module-scope variable is available

        self.spec.from_cmake = "from_cmake"
        module.from_cmake = "from_cmake"

        self.spec.link_arg = "test link arg"

    def install(self, spec, prefix):
        mkdirp(prefix.bin)

        check(os.environ['for_install'] == 'for_install',
              "Couldn't read env var set in compile environment")

        cmake_exe = join_path(prefix.bin, 'cmake')
        touch(cmake_exe)
        set_executable(cmake_exe)
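The mock `cmake` package above hands values to its dependents through `spack_env.set(...)`, and the dependent's build checks that they arrived. A minimal standalone sketch of that flow, where `EnvModsSketch` is a hypothetical stand-in for Spack's real environment-modifications object (not Spack's actual class):

```python
class EnvModsSketch(object):
    """Hypothetical stand-in: only records set() calls in a dict."""

    def __init__(self):
        self.env = {}

    def set(self, name, value):
        # Record the variable; Spack would later apply it to os.environ
        # before running the dependent's install().
        self.env[name] = value


# A dependency's setup_dependent_environment makes exactly this kind of
# call, and the dependent's build (the checks above) then sees the value.
spack_env = EnvModsSketch()
spack_env.set('from_cmake', 'from_cmake')
```

The dependent never imports the dependency's module directly; the environment object is the only channel, which is what the mock packages are testing.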
43  var/spack/repos/builtin.mock/packages/python/package.py  (new file)

@@ -0,0 +1,43 @@
##############################################################################
# Copyright (c) 2013-2015, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *

class Python(Package):
    """Dummy Python package to demonstrate preferred versions."""
    homepage = "http://www.python.org"
    url = "http://www.python.org/ftp/python/2.7.8/Python-2.7.8.tgz"

    extendable = True

    version('3.5.1', 'be78e48cdfc1a7ad90efff146dce6cfe')
    version('3.5.0', 'a56c0c0b45d75a0ec9c6dee933c41c36')
    version('2.7.11', '6b6076ec9e93f05dd63e47eb9c15728b', preferred=True)
    version('2.7.10', 'd7547558fd673bd9d38e2108c6b42521')
    version('2.7.9', '5eebcaa0030dc4061156d3429657fb83')
    version('2.7.8', 'd4bca0159acb0b44a781292b5231936f')

    def install(self, spec, prefix):
        pass
44  var/spack/repos/builtin/packages/apr-util/package.py  (new file)

@@ -0,0 +1,44 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *

class AprUtil(Package):
    """Apache Portable Runtime Utility"""
    homepage = 'https://apr.apache.org/'
    url = 'http://archive.apache.org/dist/apr/apr-util-1.5.4.tar.gz'

    version('1.5.4', '866825c04da827c6e5f53daff5569f42')

    depends_on('apr')

    def install(self, spec, prefix):
        # configure, build, install:
        options = ['--prefix=%s' % prefix]
        options.append('--with-apr=%s' % spec['apr'].prefix)

        configure(*options)
        make()
        make('install')
38  var/spack/repos/builtin/packages/apr/package.py  (new file)

@@ -0,0 +1,38 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *

class Apr(Package):
    """Apache portable runtime."""
    homepage = 'https://apr.apache.org/'
    url = 'http://archive.apache.org/dist/apr/apr-1.5.2.tar.gz'

    version('1.5.2', '98492e965963f852ab29f9e61b2ad700')

    def install(self, spec, prefix):
        options = ['--prefix=%s' % prefix]
        configure(*options)
        make()
        make('install')
@@ -41,16 +41,26 @@ class ArpackNg(Package):
     depends_on('blas')
     depends_on('lapack')
+    depends_on('automake')
+    depends_on('autoconf')
+    depends_on('libtool@2.4.2:')
 
     depends_on('mpi', when='+mpi')
 
     def install(self, spec, prefix):
+        # Apparently autotools are not bootstrapped
+        # TODO: switch to use the CMake build in the next version
+        # rather than bootstrapping.
+        which('libtoolize')()
+        bootstrap = Executable('./bootstrap')
+
         options = ['--prefix=%s' % prefix]
 
         if '+mpi' in spec:
-            options.append('--enable-mpi')
+            options.extend([
+                '--enable-mpi',
+                'F77=mpif77'  # FIXME: avoid hardcoding MPI wrapper names
+            ])
 
         if '~shared' in spec:
             options.append('--enable-shared=no')
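The arpack-ng hunk above assembles configure arguments from the spec's variants. A rough standalone sketch of that assembly, with a plain set of enabled-variant names standing in for a real Spack spec (so `'shared' not in variants` approximates the recipe's `'~shared' in spec` test):

```python
def arpack_configure_options(prefix, variants):
    """Sketch of the option assembly in the arpack-ng hunk (not Spack's API).

    `variants` is a set of enabled variant names, e.g. {'mpi', 'shared'}.
    """
    options = ['--prefix=%s' % prefix]
    if 'mpi' in variants:
        options.extend([
            '--enable-mpi',
            'F77=mpif77',  # FIXME in the recipe: avoid hardcoding wrapper names
        ])
    if 'shared' not in variants:  # approximates '~shared' in spec
        options.append('--enable-shared=no')
    return options
```

With `{'mpi'}` this yields prefix, MPI, and `--enable-shared=no` flags; enabling `shared` drops the last one.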
17  var/spack/repos/builtin/packages/astyle/package.py  (new file)

@@ -0,0 +1,17 @@
from spack import *
import os

class Astyle(Package):
    """A Free, Fast, and Small Automatic Formatter for C, C++, C++/CLI, Objective-C, C#, and Java Source Code."""
    homepage = "http://astyle.sourceforge.net/"
    url = "http://downloads.sourceforge.net/project/astyle/astyle/astyle%202.04/astyle_2.04_linux.tar.gz"

    version('2.04', '30b1193a758b0909d06e7ee8dd9627f6')

    def install(self, spec, prefix):
        with working_dir('src'):
            make('-f',
                 join_path(self.stage.source_path, 'build', 'clang', 'Makefile'),
                 parallel=False)
            install(join_path(self.stage.source_path, 'src', 'bin', 'astyle'),
                    self.prefix.bin)
@@ -1,31 +1,36 @@
 from spack import *
 from spack.util.executable import Executable
 import os
 import os.path
 
 class Atlas(Package):
     """
-    Automatically Tuned Linear Algebra Software, generic shared
-    ATLAS is an approach for the automatic generation and optimization of
-    numerical software. Currently ATLAS supplies optimized versions for the
-    complete set of linear algebra kernels known as the Basic Linear Algebra
-    Subroutines (BLAS), and a subset of the linear algebra routines in the
-    LAPACK library.
+    Automatically Tuned Linear Algebra Software, generic shared ATLAS is an approach for the automatic generation and
+    optimization of numerical software. Currently ATLAS supplies optimized versions for the complete set of linear
+    algebra kernels known as the Basic Linear Algebra Subroutines (BLAS), and a subset of the linear algebra routines
+    in the LAPACK library.
     """
     homepage = "http://math-atlas.sourceforge.net/"
 
+    version('3.10.2', 'a4e21f343dec8f22e7415e339f09f6da',
+            url='http://downloads.sourceforge.net/project/math-atlas/Stable/3.10.2/atlas3.10.2.tar.bz2', preferred=True)
+    resource(name='lapack',
+             url='http://www.netlib.org/lapack/lapack-3.5.0.tgz',
+             md5='b1d3e3e425b2e44a06760ff173104bdf',
+             destination='spack-resource-lapack',
+             when='@3:')
+
     version('3.11.34', '0b6c5389c095c4c8785fd0f724ec6825',
             url='http://sourceforge.net/projects/math-atlas/files/Developer%20%28unstable%29/3.11.34/atlas3.11.34.tar.bz2/download')
-    version('3.10.2', 'a4e21f343dec8f22e7415e339f09f6da',
-            url='http://downloads.sourceforge.net/project/math-atlas/Stable/3.10.2/atlas3.10.2.tar.bz2')
 
     # TODO: make this provide BLAS once it works better. Create a way
     # TODO: to mark "beta" packages and require explicit invocation.
     variant('shared', default=True, description='Builds shared library')
 
-    # provides('blas')
+    provides('blas')
     provides('lapack')
 
     parallel = False
 
     def patch(self):
-        # Disable thraed check. LLNL's environment does not allow
+        # Disable thread check. LLNL's environment does not allow
         # disabling of CPU throttling in a way that ATLAS actually
         # understands.
         filter_file(r'^\s+if \(thrchk\) exit\(1\);', 'if (0) exit(1);',

@@ -33,26 +38,21 @@ def patch(self):
         # TODO: investigate a better way to add the check back in
         # TODO: using, say, MSRs. Or move this to a variant.
 
-    @when('@:3.10')
     def install(self, spec, prefix):
-        with working_dir('ATLAS-Build', create=True):
-            configure = Executable('../configure')
-            configure('--prefix=%s' % prefix, '-C', 'ic', 'cc', '-C', 'if', 'f77', "--dylibs")
-            make()
-            make('check')
-            make('ptcheck')
-            make('time')
-            make("install")
-
-    def install(self, spec, prefix):
-        with working_dir('ATLAS-Build', create=True):
-            configure = Executable('../configure')
-            configure('--incdir=%s' % prefix.include,
-                      '--libdir=%s' % prefix.lib,
-                      '--cc=cc',
-                      "--shared")
+        options = []
+        if '+shared' in spec:
+            options.append('--shared')
+
+        # Lapack resource
+        lapack_stage = self.stage[1]
+        lapack_tarfile = os.path.basename(lapack_stage.fetcher.url)
+        lapack_tarfile_path = join_path(lapack_stage.path, lapack_tarfile)
+        options.append('--with-netlib-lapack-tarfile=%s' % lapack_tarfile_path)
+
+        with working_dir('spack-build', create=True):
+            configure = Executable('../configure')
+            configure('--prefix=%s' % prefix, *options)
             make()
             make('check')
             make('ptcheck')
20  var/spack/repos/builtin/packages/bash/package.py  (new file)

@@ -0,0 +1,20 @@
from spack import *

class Bash(Package):
    """The GNU Project's Bourne Again SHell."""

    homepage = "https://www.gnu.org/software/bash/"
    url = "ftp://ftp.gnu.org/gnu/bash/bash-4.3.tar.gz"

    version('4.3', '81348932d5da294953e15d4814c74dd1')

    depends_on('readline')

    def install(self, spec, prefix):
        configure('--prefix=%s' % prefix,
                  '--with-curses',
                  '--with-installed-readline=%s' % spec['readline'].prefix)

        make()
        make("tests")
        make("install")
@@ -12,6 +12,10 @@ class Binutils(Package):
     version('2.23.2', '4f8fa651e35ef262edc01d60fb45702e')
     version('2.20.1', '2b9dc8f2b7dbd5ec5992c6e29de0b764')
 
+    depends_on('m4')
+    depends_on('flex')
+    depends_on('bison')
+
     # Add a patch that creates binutils libiberty_pic.a which is preferred by OpenSpeedShop and cbtf-krell
     variant('krellpatch', default=False, description="build with openspeedshop based patch.")
+    variant('gold', default=True, description="build the gold linker")
@@ -1,5 +1,9 @@
 from spack import *
+import spack
+import sys
 
+import os
+import sys
 
 class Boost(Package):
     """Boost provides free peer-reviewed portable C++ source

@@ -45,34 +49,34 @@ class Boost(Package):
     version('1.34.1', '2d938467e8a448a2c9763e0a9f8ca7e5')
     version('1.34.0', 'ed5b9291ffad776f8757a916e1726ad0')
 
     default_install_libs = set(['atomic',
                                 'chrono',
                                 'date_time',
                                 'filesystem',
                                 'graph',
                                 'iostreams',
                                 'locale',
                                 'log',
                                 'math',
                                 'program_options',
                                 'random',
                                 'regex',
                                 'serialization',
                                 'signals',
                                 'system',
                                 'test',
                                 'thread',
                                 'wave'])
 
     # mpi/python are not installed by default because they pull in many
     # dependencies and/or because there is a great deal of customization
     # possible (and it would be difficult to choose sensible defaults)
     default_noinstall_libs = set(['mpi', 'python'])
 
     all_libs = default_install_libs | default_noinstall_libs
 
     for lib in all_libs:
         variant(lib, default=(lib not in default_noinstall_libs),
                 description="Compile with {0} library".format(lib))
 
     variant('debug', default=False, description='Switch to the debug version of Boost')

@@ -124,9 +128,9 @@ def determine_bootstrap_options(self, spec, withLibs, options):
 
         with open('user-config.jam', 'w') as f:
             compiler_wrapper = join_path(spack.build_env_path, 'c++')
             f.write("using {0} : : {1} ;\n".format(boostToolsetId,
                                                    compiler_wrapper))
 
             if '+mpi' in spec:
                 f.write('using mpi : %s ;\n' %
                         join_path(spec['mpi'].prefix.bin, 'mpicxx'))

@@ -155,7 +159,7 @@ def determine_b2_options(self, spec, options):
         linkTypes = ['static']
         if '+shared' in spec:
             linkTypes.append('shared')
 
         threadingOpts = []
         if '+multithreaded' in spec:
             threadingOpts.append('multi')

@@ -163,28 +167,50 @@ def determine_b2_options(self, spec, options):
             threadingOpts.append('single')
         if not threadingOpts:
             raise RuntimeError("At least one of {singlethreaded, multithreaded} must be enabled")
 
         options.extend([
             'toolset=%s' % self.determine_toolset(spec),
             'link=%s' % ','.join(linkTypes),
             '--layout=tagged'])
 
         return threadingOpts
 
     def install(self, spec, prefix):
+        # On Darwin, Boost expects the Darwin libtool. However, one of the
+        # dependencies may have pulled in Spack's GNU libtool, and these two are
+        # not compatible. We thus create a symlink to Darwin's libtool and add
+        # it at the beginning of PATH.
+        if sys.platform == 'darwin':
+            newdir = os.path.abspath('darwin-libtool')
+            mkdirp(newdir)
+            force_symlink('/usr/bin/libtool', join_path(newdir, 'libtool'))
+            env['PATH'] = newdir + ':' + env['PATH']
+
         withLibs = list()
         for lib in Boost.all_libs:
             if "+{0}".format(lib) in spec:
                 withLibs.append(lib)
         if not withLibs:
             # if no libraries are specified for compilation, then you dont have
             # to configure/build anything, just copy over to the prefix directory.
             src = join_path(self.stage.source_path, 'boost')
             mkdirp(join_path(prefix, 'include'))
             dst = join_path(prefix, 'include', 'boost')
             install_tree(src, dst)
             return
 
+        # Remove libraries that the release version does not support
+        if not spec.satisfies('@1.54.0:'):
+            withLibs.remove('log')
+        if not spec.satisfies('@1.53.0:'):
+            withLibs.remove('atomic')
+        if not spec.satisfies('@1.48.0:'):
+            withLibs.remove('locale')
+        if not spec.satisfies('@1.47.0:'):
+            withLibs.remove('chrono')
+        if not spec.satisfies('@1.43.0:'):
+            withLibs.remove('random')
+
         # to make Boost find the user-config.jam
         env['BOOST_BUILD_PATH'] = './'

@@ -207,4 +233,7 @@ def install(self, spec, prefix):
         # Boost.MPI if the threading options are not separated.
         for threadingOpt in threadingOpts:
             b2('install', 'threading=%s' % threadingOpt, *b2_options)
 
+        # The shared libraries are not installed correctly on Darwin; correct this
+        if (sys.platform == 'darwin') and ('+shared' in spec):
+            fix_darwin_install_name(prefix.lib)
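The `determine_b2_options` hunks above turn variants into b2 link and threading options. A standalone sketch of that selection, using a plain set of enabled-variant names instead of a real spec, and omitting the toolset option (which needs a compiler lookup):

```python
def b2_link_and_threading(variants):
    """Sketch of Boost's b2 option selection (not Spack's real API).

    `variants` is a set of enabled variant names,
    e.g. {'shared', 'multithreaded'}.
    """
    link_types = ['static']
    if 'shared' in variants:
        link_types.append('shared')

    threading_opts = []
    if 'multithreaded' in variants:
        threading_opts.append('multi')
    if 'singlethreaded' in variants:
        threading_opts.append('single')
    if not threading_opts:
        raise RuntimeError(
            "At least one of {singlethreaded, multithreaded} must be enabled")

    options = ['link=%s' % ','.join(link_types), '--layout=tagged']
    return options, threading_opts


options, threads = b2_link_and_threading({'shared', 'multithreaded'})
```

Note that link types are joined into one `link=` option, while the threading values are returned separately: the install step runs b2 once per threading option, because (as the last hunk's comment says) b2 builds Boost.MPI incorrectly when they are combined.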
@@ -30,6 +30,7 @@ class Cmake(Package):
     homepage = 'https://www.cmake.org'
     url = 'https://cmake.org/files/v3.4/cmake-3.4.3.tar.gz'
 
+    version('3.5.1', 'ca051f4a66375c89d1a524e726da0296')
     version('3.5.0', '33c5d09d4c33d4ffcc63578a6ba8777e')
     version('3.4.3', '4cb3ff35b2472aae70f542116d616e63')
     version('3.4.0', 'cd3034e0a44256a0917e254167217fc8')

@@ -38,10 +39,12 @@ class Cmake(Package):
     version('2.8.10.2', '097278785da7182ec0aea8769d06860c')
 
     variant('ncurses', default=True, description='Enables the build of the ncurses gui')
     variant('openssl', default=True, description="Enables CMake's OpenSSL features")
     variant('qt', default=False, description='Enables the build of cmake-gui')
+    variant('doc', default=False, description='Enables the generation of html and man page documentation')
 
     depends_on('ncurses', when='+ncurses')
     depends_on('openssl', when='+openssl')
     depends_on('qt', when='+qt')
+    depends_on('python@2.7.11:', when='+doc')
+    depends_on('py-sphinx', when='+doc')

@@ -77,8 +80,9 @@ def install(self, spec, prefix):
             options.append('--sphinx-html')
             options.append('--sphinx-man')
 
-        options.append('--')
-        options.append('-DCMAKE_USE_OPENSSL=ON')
+        if '+openssl' in spec:
+            options.append('--')
+            options.append('-DCMAKE_USE_OPENSSL=ON')
 
         configure(*options)
         make()
@@ -8,8 +8,8 @@ class Cryptopp(Package):
     public-key encryption (RSA, DSA), and a few obsolete/historical encryption
     algorithms (MD5, Panama)."""
 
-    homepage = "http://www.cryptopp.com/"
-    url = "http://www.cryptopp.com/cryptopp563.zip"
+    homepage = "http://www.cryptopp.com"
+    base_url = "http://www.cryptopp.com"
 
     version('5.6.3', '3c5b70e2ec98b7a24988734446242d07')
     version('5.6.2', '7ed022585698df48e65ce9218f6c6a67')

@@ -25,7 +25,5 @@ def install(self, spec, prefix):
         install('libcryptopp.a', prefix.lib)
 
     def url_for_version(self, version):
-        version_tuple = tuple(v for v in iter(version))
-        version_string = reduce(lambda vs, nv: vs + str(nv), version_tuple, "")
-
-        return "%scryptopp%s.zip" % (Cryptopp.homepage, version_string)
+        version_string = str(version).replace('.', '')
+        return '%s/cryptopp%s.zip' % (Cryptopp.base_url, version_string)
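The simplified `url_for_version` above just strips the dots from the version number and splices the result into the download URL. A standalone sketch of that mapping (plain strings instead of Spack's Version object):

```python
def cryptopp_url_for_version(base_url, version):
    """Sketch of the url_for_version logic in the cryptopp hunk above.

    '5.6.3' -> '563' -> '<base_url>/cryptopp563.zip'
    """
    version_string = str(version).replace('.', '')
    return '%s/cryptopp%s.zip' % (base_url, version_string)


url = cryptopp_url_for_version('http://www.cryptopp.com', '5.6.3')
```

This also shows why the hunk renames `homepage` to a separate `base_url`: the old code concatenated `homepage` (which ended in `/`) directly, while the new form adds the slash explicitly.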
47  var/spack/repos/builtin/packages/cuda/package.py  (new file)

@@ -0,0 +1,47 @@
from spack import *
from glob import glob
import os

class Cuda(Package):
    """CUDA is a parallel computing platform and programming model invented by
    NVIDIA. It enables dramatic increases in computing performance by harnessing
    the power of the graphics processing unit (GPU).

    Note: NVIDIA does not provide a download URL for CUDA so you will need to
    download it yourself. Go to https://developer.nvidia.com/cuda-downloads
    and select your Operating System, Architecture, Distribution, and Version.
    For the Installer Type, select runfile and click Download. Spack will search
    your current directory for this file. Alternatively, add this file to a
    mirror so that Spack can find it. For instructions on how to set up a mirror,
    see http://software.llnl.gov/spack/mirrors.html

    Note: This package does not currently install the drivers necessary to run
    CUDA. These will need to be installed manually. See:
    http://docs.nvidia.com/cuda/cuda-getting-started-guide-for-linux for details."""

    homepage = "http://www.nvidia.com/object/cuda_home_new.html"

    version('7.5.18', '4b3bcecf0dfc35928a0898793cf3e4c6', expand=False,
            url="file://%s/cuda_7.5.18_linux.run" % os.getcwd())
    version('6.5.14', '90b1b8f77313600cc294d9271741f4da', expand=False,
            url="file://%s/cuda_6.5.14_linux_64.run" % os.getcwd())

    def install(self, spec, prefix):
        runfile = glob(os.path.join(self.stage.path, 'cuda*.run'))[0]
        chmod = which('chmod')
        chmod('+x', runfile)
        runfile = which(runfile)

        # Note: NVIDIA does not officially support many newer versions of compilers.
        # For example, on CentOS 6, you must use GCC 4.4.7 or older. See:
        # http://docs.nvidia.com/cuda/cuda-installation-guide-linux/#system-requirements
        # for details.

        runfile(
            '--silent',   # disable interactive prompts
            '--verbose',  # create verbose log file
            '--toolkit',  # install CUDA Toolkit
            '--toolkitpath=%s' % prefix
        )
@@ -13,12 +13,15 @@ class Dbus(Package):
     homepage = "http://dbus.freedesktop.org/"
     url = "http://dbus.freedesktop.org/releases/dbus/dbus-1.8.8.tar.gz"
 
+    version('1.11.2', '957a07f066f3730d2bb3ea0932f0081b')
     version('1.9.0', 'ec6895a4d5c0637b01f0d0e7689e2b36')
     version('1.8.8', 'b9f4a18ee3faa1e07c04aa1d83239c43')
     version('1.8.6', '6a08ba555d340e9dfe2d623b83c0eea8')
     version('1.8.4', '4717cb8ab5b80978fcadf2b4f2f72e1b')
     version('1.8.2', 'd6f709bbec0a022a1847c7caec9d6068')
 
+    depends_on('expat')
+
     def install(self, spec, prefix):
         configure(
             "--prefix=%s" % prefix,
253  var/spack/repos/builtin/packages/dealii/package.py  (new file)

@@ -0,0 +1,253 @@
from spack import *
import sys

class Dealii(Package):
    """C++ software library providing well-documented tools to build finite element codes for a broad variety of PDEs."""
    homepage = "https://www.dealii.org"
    url = "https://github.com/dealii/dealii/releases/download/v8.4.0/dealii-8.4.0.tar.gz"

    version('8.4.0', 'ac5dbf676096ff61e092ce98c80c2b00')
    version('dev', git='https://github.com/dealii/dealii.git')

    variant('mpi', default=True, description='Compile with MPI')
    variant('arpack', default=True, description='Compile with Arpack and PArpack (only with MPI)')
    variant('doc', default=False, description='Compile with documentation')
    variant('hdf5', default=True, description='Compile with HDF5 (only with MPI)')
    variant('metis', default=True, description='Compile with Metis')
    variant('netcdf', default=True, description='Compile with Netcdf (only with MPI)')
    variant('oce', default=True, description='Compile with OCE')
    variant('p4est', default=True, description='Compile with P4est (only with MPI)')
    variant('petsc', default=True, description='Compile with Petsc (only with MPI)')
    variant('slepc', default=True, description='Compile with Slepc (only with Petsc and MPI)')
    variant('trilinos', default=True, description='Compile with Trilinos (only with MPI)')

    # required dependencies, light version
    depends_on("blas")
    # Boost 1.58 is blacklisted, see https://github.com/dealii/dealii/issues/1591
    # require at least 1.59
    depends_on("boost@1.59.0:", when='~mpi')
    depends_on("boost@1.59.0:+mpi", when='+mpi')
    depends_on("bzip2")
    depends_on("cmake")
    depends_on("lapack")
    depends_on("muparser")
    depends_on("suite-sparse")
    depends_on("tbb")
    depends_on("zlib")

    # optional dependencies
    depends_on("mpi", when="+mpi")
    depends_on("arpack-ng+mpi", when='+arpack+mpi')
    depends_on("doxygen", when='+doc')
    depends_on("hdf5+mpi~cxx", when='+hdf5+mpi')  # FIXME NetCDF declares dependency with ~cxx, why?
    depends_on("metis@5:", when='+metis')
    depends_on("netcdf+mpi", when="+netcdf+mpi")
    depends_on("netcdf-cxx", when='+netcdf+mpi')
    depends_on("oce", when='+oce')
    depends_on("p4est", when='+p4est+mpi')
    depends_on("petsc+mpi", when='+petsc+mpi')
    depends_on("slepc", when='+slepc+petsc+mpi')
    depends_on("trilinos", when='+trilinos+mpi')

    # developer dependencies
    # depends_on("numdiff")  # FIXME
    # depends_on("astyle")   # FIXME

    def install(self, spec, prefix):
        options = []
        options.extend(std_cmake_args)

        # CMAKE_BUILD_TYPE should be DebugRelease | Debug | Release
        for word in options[:]:
            if word.startswith('-DCMAKE_BUILD_TYPE'):
                options.remove(word)

        dsuf = 'dylib' if sys.platform == 'darwin' else 'so'
        options.extend([
            '-DCMAKE_BUILD_TYPE=DebugRelease',
            '-DDEAL_II_COMPONENT_EXAMPLES=ON',
            '-DDEAL_II_WITH_THREADS:BOOL=ON',
            '-DBOOST_DIR=%s' % spec['boost'].prefix,
            '-DBZIP2_DIR=%s' % spec['bzip2'].prefix,
            # CMake's FindBlas/Lapack may pick up the system's blas/lapack instead of Spack's.
            # Be more specific to avoid this.
            # Note that both lapack and blas are provided in -DLAPACK_XYZ variables
            '-DLAPACK_FOUND=true',
            '-DLAPACK_INCLUDE_DIRS=%s;%s' %
            (spec['lapack'].prefix.include,
             spec['blas'].prefix.include),
            '-DLAPACK_LIBRARIES=%s;%s' %
            (join_path(spec['lapack'].prefix.lib, 'liblapack.%s' % dsuf),  # FIXME don't hardcode names
             join_path(spec['blas'].prefix.lib, 'libblas.%s' % dsuf)),     # FIXME don't hardcode names
            '-DMUPARSER_DIR=%s' % spec['muparser'].prefix,
            '-DP4EST_DIR=%s' % spec['p4est'].prefix,
            '-DUMFPACK_DIR=%s' % spec['suite-sparse'].prefix,
            '-DTBB_DIR=%s' % spec['tbb'].prefix,
            '-DZLIB_DIR=%s' % spec['zlib'].prefix
        ])

        # MPI
        if '+mpi' in spec:
            options.extend([
                '-DDEAL_II_WITH_MPI:BOOL=ON',
                '-DCMAKE_C_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpicc'),  # FIXME: avoid hardcoding mpi wrapper names
                '-DCMAKE_CXX_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpic++'),
                '-DCMAKE_Fortran_COMPILER=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif90'),
            ])
        else:
            options.extend([
                '-DDEAL_II_WITH_MPI:BOOL=OFF',
            ])

        # Optional dependencies for which library names are the same as CMake variables
        for library in ('hdf5', 'p4est', 'petsc', 'slepc', 'trilinos', 'metis'):
            if library in spec:
                options.extend([
                    '-D{library}_DIR={value}'.format(library=library.upper(), value=spec[library].prefix),
                    '-DDEAL_II_WITH_{library}:BOOL=ON'.format(library=library.upper())
                ])
            else:
                options.extend([
                    '-DDEAL_II_WITH_{library}:BOOL=OFF'.format(library=library.upper())
                ])

        # doxygen
        options.extend([
            '-DDEAL_II_COMPONENT_DOCUMENTATION=%s' % ('ON' if '+doc' in spec else 'OFF'),
        ])

        # arpack
        if '+arpack' in spec:
            options.extend([
                '-DARPACK_DIR=%s' % spec['arpack-ng'].prefix,
                '-DDEAL_II_WITH_ARPACK=ON',
                '-DDEAL_II_ARPACK_WITH_PARPACK=ON'
            ])
        else:
            options.extend([
                '-DDEAL_II_WITH_ARPACK=OFF'
            ])

        # since Netcdf is spread among two packages, need to do it by hand:
        if '+netcdf' in spec:
            options.extend([
                '-DNETCDF_FOUND=true',
                '-DNETCDF_LIBRARIES=%s;%s' %
                (join_path(spec['netcdf-cxx'].prefix.lib, 'libnetcdf_c++.%s' % dsuf),
                 join_path(spec['netcdf'].prefix.lib, 'libnetcdf.%s' % dsuf)),
                '-DNETCDF_INCLUDE_DIRS=%s;%s' %
                (spec['netcdf-cxx'].prefix.include,
                 spec['netcdf'].prefix.include),
            ])
        else:
            options.extend([
                '-DDEAL_II_WITH_NETCDF=OFF'
            ])

        # Open Cascade
        if '+oce' in spec:
            options.extend([
                '-DOPENCASCADE_DIR=%s' % spec['oce'].prefix,
                '-DDEAL_II_WITH_OPENCASCADE=ON'
            ])
        else:
            options.extend([
                '-DDEAL_II_WITH_OPENCASCADE=OFF'
            ])

        cmake('.', *options)

        make()
        make("test")
        make("install")

        # run some MPI examples with different solvers from PETSc and Trilinos
        env['DEAL_II_DIR'] = prefix
        print('=====================================')
        print('============ EXAMPLES ===============')
        print('=====================================')
        # take bare-bones step-3
        print('=====================================')
        print('============ Step-3 =================')
        print('=====================================')
        with working_dir('examples/step-3'):
            cmake('.')
            make('release')
            make('run', parallel=False)

        # An example which uses Metis + PETSc
        # FIXME: switch step-18 to MPI
        with working_dir('examples/step-18'):
            print('=====================================')
            print('============= Step-18 ===============')
            print('=====================================')
            # limit the number of cycles to speed up
            filter_file(r'(end_time = 10;)', ('end_time = 3;'), 'step-18.cc')
|
||||
if '^petsc' in spec and '^metis' in spec:
|
||||
cmake('.')
|
||||
make('release')
|
||||
make('run',parallel=False)
|
||||
|
||||
# take step-40 which can use both PETSc and Trilinos
|
||||
# FIXME: switch step-40 to MPI run
|
||||
with working_dir('examples/step-40'):
|
||||
print('=====================================')
|
||||
print('========== Step-40 PETSc ============')
|
||||
print('=====================================')
|
||||
# list the number of cycles to speed up
|
||||
filter_file(r'(const unsigned int n_cycles = 8;)', ('const unsigned int n_cycles = 2;'), 'step-40.cc')
|
||||
cmake('.')
|
||||
if '^petsc' in spec:
|
||||
make('release')
|
||||
make('run',parallel=False)
|
||||
|
||||
print('=====================================')
|
||||
print('========= Step-40 Trilinos ==========')
|
||||
print('=====================================')
|
||||
# change Linear Algebra to Trilinos
|
||||
filter_file(r'(\/\/ #define FORCE_USE_OF_TRILINOS.*)', ('#define FORCE_USE_OF_TRILINOS'), 'step-40.cc')
|
||||
if '^trilinos+hypre' in spec:
|
||||
make('release')
|
||||
make('run',parallel=False)
|
||||
|
||||
print('=====================================')
|
||||
print('=== Step-40 Trilinos SuperluDist ====')
|
||||
print('=====================================')
|
||||
# change to direct solvers
|
||||
filter_file(r'(LA::SolverCG solver\(solver_control\);)', ('TrilinosWrappers::SolverDirect::AdditionalData data(false,"Amesos_Superludist"); TrilinosWrappers::SolverDirect solver(solver_control,data);'), 'step-40.cc')
|
||||
filter_file(r'(LA::MPI::PreconditionAMG preconditioner;)', (''), 'step-40.cc')
|
||||
filter_file(r'(LA::MPI::PreconditionAMG::AdditionalData data;)', (''), 'step-40.cc')
|
||||
filter_file(r'(preconditioner.initialize\(system_matrix, data\);)', (''), 'step-40.cc')
|
||||
filter_file(r'(solver\.solve \(system_matrix, completely_distributed_solution, system_rhs,)', ('solver.solve (system_matrix, completely_distributed_solution, system_rhs);'), 'step-40.cc')
|
||||
filter_file(r'(preconditioner\);)', (''), 'step-40.cc')
|
||||
if '^trilinos+superlu-dist' in spec:
|
||||
make('release')
|
||||
make('run',paralle=False)
|
||||
|
||||
print('=====================================')
|
||||
print('====== Step-40 Trilinos MUMPS =======')
|
||||
print('=====================================')
|
||||
# switch to Mumps
|
||||
filter_file(r'(Amesos_Superludist)', ('Amesos_Mumps'), 'step-40.cc')
|
||||
if '^trilinos+mumps' in spec:
|
||||
make('release')
|
||||
make('run',parallel=False)
|
||||
|
||||
print('=====================================')
|
||||
print('============ Step-36 ================')
|
||||
print('=====================================')
|
||||
with working_dir('examples/step-36'):
|
||||
if 'slepc' in spec:
|
||||
cmake('.')
|
||||
make('release')
|
||||
make('run',parallel=False)
|
||||
|
||||
print('=====================================')
|
||||
print('============ Step-54 ================')
|
||||
print('=====================================')
|
||||
with working_dir('examples/step-54'):
|
||||
if 'oce' in spec:
|
||||
cmake('.')
|
||||
make('release')
|
||||
make('run',parallel=False)
|
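The loop over optional dependencies above reduces each library to a pair of CMake flags: a `-D<NAME>_DIR` plus an ON switch when the dependency is present, or an OFF switch otherwise. A minimal standalone sketch of that pattern, with a plain dict standing in for Spack's Spec object (an assumption for illustration):

```python
# Sketch of the variant-to-CMake-flag toggle used in the deal.II package
# above. `spec` here is a plain dict mapping dependency name -> prefix path,
# not spack's real Spec class.
def cmake_feature_flags(spec, libraries):
    """Return -D flags enabling present libraries and disabling absent ones."""
    options = []
    for library in libraries:
        if library in spec:
            # point CMake at the installed prefix and switch the feature on
            options.append('-D{0}_DIR={1}'.format(library.upper(), spec[library]))
            options.append('-DDEAL_II_WITH_{0}:BOOL=ON'.format(library.upper()))
        else:
            options.append('-DDEAL_II_WITH_{0}:BOOL=OFF'.format(library.upper()))
    return options

flags = cmake_feature_flags({'hdf5': '/opt/hdf5'}, ('hdf5', 'metis'))
```

The advantage of the loop form is that adding a new optional dependency only extends the tuple of names, as long as its Spack package name matches its CMake variable name.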
var/spack/repos/builtin/packages/dia/package.py (new file, 34 lines)

@@ -0,0 +1,34 @@
from spack import *

class Dia(Package):
    """Dia is a program for drawing structured diagrams."""
    homepage = 'https://wiki.gnome.org/Apps/Dia'
    url = 'https://ftp.gnome.org/pub/gnome/sources/dia/0.97/dia-0.97.3.tar.xz'

    version('0.97.3', '0e744a0f6a6c4cb6a089e4d955392c3c')

    depends_on('gtkplus@2.6.0:')
    depends_on('cairo')
    # depends_on('libart')  # optional dependency, not yet supported by spack.
    depends_on('libpng')
    depends_on('libxslt')
    depends_on('python')
    depends_on('swig')
    # depends_on('py-gtk')  # optional dependency, not yet supported by spack.

    def url_for_version(self, version):
        """Handle Dia's version-based custom URLs."""
        return 'https://ftp.gnome.org/pub/gnome/source/dia/%s/dia-%s.tar.xz' % (version.up_to(2), version)

    def install(self, spec, prefix):
        # configure, build, install:
        options = ['--prefix=%s' % prefix,
                   '--with-cairo',
                   '--with-xslt-prefix=%s' % spec['libxslt'].prefix,
                   '--with-python',
                   '--with-swig']

        configure(*options)
        make()
        make('install')
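Dia's `url_for_version` above encodes GNOME's download layout: the directory is the major.minor prefix of the full version. A sketch of that computation in plain Python, where the string slicing stands in for spack's `Version.up_to(2)` (an assumption, and `dia_url` is a hypothetical helper name):

```python
# Sketch of the two-component version URL scheme from the dia package above.
def dia_url(version):
    """Return the GNOME download URL for a dia version string like '0.97.3'."""
    up_to_2 = '.'.join(version.split('.')[:2])  # '0.97.3' -> '0.97'
    return ('https://ftp.gnome.org/pub/gnome/source/dia/%s/dia-%s.tar.xz'
            % (up_to_2, version))

url = dia_url('0.97.3')
```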
@@ -4,6 +4,7 @@
 #------------------------------------------------------------------------------
 
 from spack import *
+import sys
 
 class Doxygen(Package):
     """Doxygen is the de facto standard tool for generating documentation
@@ -17,6 +18,10 @@ class Doxygen(Package):
     version('1.8.10', '79767ccd986f12a0f949015efb5f058f')
 
+    depends_on("cmake@2.8.12:")
+    # flex does not build on OSX, but it's provided there anyway
+    depends_on("flex", sys.platform != 'darwin')
+    depends_on("bison", sys.platform != 'darwin')
 
     def install(self, spec, prefix):
         cmake('.', *std_cmake_args)
@@ -31,6 +31,8 @@ class Dyninst(Package):
     url = "http://www.dyninst.org/sites/default/files/downloads/dyninst/8.1.2/DyninstAPI-8.1.2.tgz"
     list_url = "http://www.dyninst.org/downloads/dyninst-8.x"
 
+    version('9.1.0', '5c64b77521457199db44bec82e4988ac',
+            url="http://www.paradyn.org/release9.1.0/DyninstAPI-9.1.0.tgz")
     version('8.2.1', 'abf60b7faabe7a2e4b54395757be39c7',
             url="http://www.paradyn.org/release8.2/DyninstAPI-8.2.1.tgz")
     version('8.1.2', 'bf03b33375afa66fe0efa46ce3f4b17a',
@@ -45,7 +45,7 @@ class Eigen(Package):
 
     # TODO : dependency on googlehash, superlu, adolc missing
 
-    depends_on('metis', when='+metis')
+    depends_on('metis@5:', when='+metis')
     depends_on('scotch', when='+scotch')
     depends_on('fftw', when='+fftw')
     depends_on('suite-sparse', when='+suitesparse')
var/spack/repos/builtin/packages/elk/package.py (new file, 122 lines)

@@ -0,0 +1,122 @@
from spack import *
import spack

class Elk(Package):
    '''An all-electron full-potential linearised augmented-plane wave
    (FP-LAPW) code with many advanced features.'''

    homepage = 'http://elk.sourceforge.net/'
    url = 'https://sourceforge.net/projects/elk/files/elk-3.3.17.tgz'

    version('3.3.17', 'f57f6230d14f3b3b558e5c71f62f0592')

    # Elk provides these libraries, but allows you to specify your own
    variant('blas', default=True, description='Build with custom BLAS library')
    variant('lapack', default=True, description='Build with custom LAPACK library')
    variant('fft', default=True, description='Build with custom FFT library')

    # Elk does not provide these libraries, but allows you to use them
    variant('mpi', default=True, description='Enable MPI parallelism')
    variant('openmp', default=True, description='Enable OpenMP support')
    variant('libxc', default=True, description='Link to Libxc functional library')

    depends_on('blas', when='+blas')
    depends_on('lapack', when='+lapack')
    depends_on('fftw', when='+fft')
    depends_on('mpi', when='+mpi')
    depends_on('libxc', when='+libxc')

    # Cannot be built in parallel
    parallel = False

    def configure(self, spec):
        # Dictionary of configuration options
        config = {
            'MAKE': 'make',
            'F90': join_path(spack.build_env_path, 'f90'),
            'F77': join_path(spack.build_env_path, 'f77'),
            'AR': 'ar',
            'LIB_FFT': 'fftlib.a',
            'SRC_MPI': 'mpi_stub.f90',
            'SRC_OMP': 'omp_stub.f90',
            'SRC_libxc': 'libxcifc_stub.f90',
            'SRC_FFT': 'zfftifc.f90'
        }

        # Compiler-specific flags
        flags = ''
        if self.compiler.name == 'intel':
            flags = '-O3 -ip -unroll -no-prec-div -openmp'
        elif self.compiler.name == 'gcc':
            flags = '-O3 -ffast-math -funroll-loops -fopenmp'
        elif self.compiler.name == 'pgi':
            flags = '-O3 -mp -lpthread'
        elif self.compiler.name == 'g95':
            flags = '-O3 -fno-second-underscore'
        elif self.compiler.name == 'nag':
            flags = '-O4 -kind=byte -dusty -dcfuns'
        elif self.compiler.name == 'xl':
            flags = '-O3 -qsmp=omp'
        config['F90_OPTS'] = flags
        config['F77_OPTS'] = flags

        # BLAS/LAPACK support
        blas = 'blas.a'
        lapack = 'lapack.a'
        if '+blas' in spec:
            blas = join_path(spec['blas'].prefix.lib, 'libblas.so')
        if '+lapack' in spec:
            lapack = join_path(spec['lapack'].prefix.lib, 'liblapack.so')
        config['LIB_LPK'] = ' '.join([lapack, blas])  # lapack must come before blas

        # FFT support
        if '+fft' in spec:
            config['LIB_FFT'] = join_path(spec['fftw'].prefix.lib, 'libfftw3.so')
            config['SRC_FFT'] = 'zfftifc_fftw.f90'

        # MPI support
        if '+mpi' in spec:
            config.pop('SRC_MPI')
            config['F90'] = join_path(spec['mpi'].prefix.bin, 'mpif90')
            config['F77'] = join_path(spec['mpi'].prefix.bin, 'mpif77')

        # OpenMP support
        if '+openmp' in spec:
            config.pop('SRC_OMP')

        # Libxc support
        if '+libxc' in spec:
            config['LIB_libxc'] = ' '.join([
                join_path(spec['libxc'].prefix.lib, 'libxcf90.so'),
                join_path(spec['libxc'].prefix.lib, 'libxc.so')
            ])
            config['SRC_libxc'] = ' '.join([
                'libxc_funcs.f90',
                'libxc.f90',
                'libxcifc.f90'
            ])

        # Write configuration options to include file
        with open('make.inc', 'w') as inc:
            for key in config:
                inc.write('{0} = {1}\n'.format(key, config[key]))

    def install(self, spec, prefix):
        # Elk only provides an interactive setup script
        self.configure(spec)

        make()
        make('test')

        # The Elk Makefile does not provide an install target
        mkdirp(prefix.bin)

        install('src/elk', prefix.bin)
        install('src/eos/eos', prefix.bin)
        install('src/spacegroup/spacegroup', prefix.bin)

        install_tree('examples', join_path(prefix, 'examples'))
        install_tree('species', join_path(prefix, 'species'))
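Elk has no configure script, so the package above collects all build settings in a dict and serializes it as a `make.inc` include file, one `KEY = value` assignment per line. A simplified sketch of that serialization step (keys are sorted here for deterministic output, unlike the package, which writes in dict order; the sample values are illustrative):

```python
# Sketch of Elk's make.inc generation: a dict of make variables written out
# as 'KEY = value' lines, the form GNU make's `include` directive expects.
import io

config = {
    'MAKE': 'make',
    'F90': 'mpif90',
    'LIB_LPK': 'liblapack.so libblas.so',  # lapack must come before blas
}

def write_make_inc(config, out):
    """Emit one make-style assignment per config entry."""
    for key in sorted(config):
        out.write('{0} = {1}\n'.format(key, config[key]))

buf = io.StringIO()
write_make_inc(config, buf)
inc_text = buf.getvalue()
```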
@@ -0,0 +1,38 @@
from spack import *


class EnvironmentModules(Package):
    """The Environment Modules package provides for the dynamic
    modification of a user's environment via modulefiles."""

    homepage = "https://sourceforge.net/p/modules/wiki/Home/"
    url = "http://prdownloads.sourceforge.net/modules/modules-3.2.10.tar.gz"

    version('3.2.10', '8b097fdcb90c514d7540bb55a3cb90fb')

    # Dependencies:
    depends_on('tcl')

    def install(self, spec, prefix):
        tcl_spec = spec['tcl']

        # See: https://sourceforge.net/p/modules/bugs/62/
        CPPFLAGS = ['-DUSE_INTERP_ERRORLINE']
        config_args = [
            "--without-tclx",
            "--with-tclx-ver=0.0",
            "--prefix=%s" % prefix,
            "--with-tcl=%s" % join_path(tcl_spec.prefix, 'lib'),  # It looks for tclConfig.sh
            "--with-tcl-ver=%d.%d" % (tcl_spec.version.version[0], tcl_spec.version.version[1]),
            '--disable-debug',
            '--disable-dependency-tracking',
            '--disable-silent-rules',
            '--disable-versioning',
            '--datarootdir=%s' % prefix.share,
            'CPPFLAGS=%s' % ' '.join(CPPFLAGS)
        ]

        configure(*config_args)
        make()
        make('install')
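The `--with-tcl-ver` argument above keeps only the first two components of the Tcl dependency's version. A small sketch of that flag construction, where the plain tuple stands in for spack's `Version.version` attribute (an assumption) and `tcl_ver_flag` is a hypothetical helper:

```python
# Sketch of the --with-tcl-ver computation in the EnvironmentModules package:
# the modules configure script wants only major.minor, not the full version.
def tcl_ver_flag(version_components):
    """Build the configure flag from the first two version components."""
    return "--with-tcl-ver=%d.%d" % (version_components[0], version_components[1])

flag = tcl_ver_flag((8, 6, 5))
```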
@@ -24,7 +24,7 @@ class Espresso(Package):
     depends_on('fftw~mpi', when='~mpi')
     depends_on('fftw+mpi', when='+mpi')
     depends_on('scalapack', when='+scalapack+mpi')  # TODO : + mpi needed to avoid false dependencies installation
 
     def check_variants(self, spec):
         error = 'you cannot ask for \'+{variant}\' when \'+mpi\' is not active'
         if '+scalapack' in spec and '~mpi' in spec:
@@ -33,9 +33,10 @@ def check_variants(self, spec):
             raise RuntimeError(error.format(variant='elpa'))
 
     def install(self, spec, prefix):
+        from glob import glob
         self.check_variants(spec)
 
-        options = ['-prefix=%s' % prefix]
+        options = ['-prefix=%s' % prefix.bin]
 
         if '+mpi' in spec:
             options.append('--enable-parallel')
@@ -61,5 +62,11 @@ def install(self, spec, prefix):
 
         configure(*options)
         make('all')
-        make('install')
+
+        if spec.architecture.startswith('darwin'):
+            mkdirp(prefix.bin)
+            for filename in glob("bin/*.x"):
+                install(filename, prefix.bin)
+        else:
+            make('install')
var/spack/repos/builtin/packages/gcc/darwin/gcc-4.9.patch1 (new file, 42 lines)

@@ -0,0 +1,42 @@
diff --git a/gcc/configure b/gcc/configure
index 9523773..52b0bf7 100755
--- a/gcc/configure
+++ b/gcc/configure
@@ -24884,7 +24884,7 @@ if test "${gcc_cv_as_ix86_filds+set}" = set; then :
 else
   gcc_cv_as_ix86_filds=no
   if test x$gcc_cv_as != x; then
-    $as_echo 'filds mem; fists mem' > conftest.s
+    $as_echo 'filds (%ebp); fists (%ebp)' > conftest.s
     if { ac_try='$gcc_cv_as $gcc_cv_as_flags -o conftest.o conftest.s >&5'
   { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_try\""; } >&5
   (eval $ac_try) 2>&5
@@ -24915,7 +24915,7 @@ if test "${gcc_cv_as_ix86_fildq+set}" = set; then :
 else
   gcc_cv_as_ix86_fildq=no
   if test x$gcc_cv_as != x; then
-    $as_echo 'fildq mem; fistpq mem' > conftest.s
+    $as_echo 'fildq (%ebp); fistpq (%ebp)' > conftest.s
     if { ac_try='$gcc_cv_as $gcc_cv_as_flags -o conftest.o conftest.s >&5'
   { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$ac_try\""; } >&5
   (eval $ac_try) 2>&5
diff --git a/gcc/configure.ac b/gcc/configure.ac
index 68b0ee8..bd53978 100644
--- a/gcc/configure.ac
+++ b/gcc/configure.ac
@@ -3869,13 +3869,13 @@ foo: nop
 
 gcc_GAS_CHECK_FEATURE([filds and fists mnemonics],
        gcc_cv_as_ix86_filds,,,
-       [filds mem; fists mem],,
+       [filds (%ebp); fists (%ebp)],,
        [AC_DEFINE(HAVE_AS_IX86_FILDS, 1,
          [Define if your assembler uses filds and fists mnemonics.])])
 
 gcc_GAS_CHECK_FEATURE([fildq and fistpq mnemonics],
        gcc_cv_as_ix86_fildq,,,
-       [fildq mem; fistpq mem],,
+       [fildq (%ebp); fistpq (%ebp)],,
        [AC_DEFINE(HAVE_AS_IX86_FILDQ, 1,
          [Define if your assembler uses fildq and fistq mnemonics.])])
var/spack/repos/builtin/packages/gcc/darwin/gcc-4.9.patch2 (new file, 28 lines)

@@ -0,0 +1,28 @@
From 82f81877458ea372176eabb5de36329431dce99b Mon Sep 17 00:00:00 2001
From: Iain Sandoe <iain@codesourcery.com>
Date: Sat, 21 Dec 2013 00:30:18 +0000
Subject: [PATCH] don't try to mark local symbols as no-dead-strip

---
 gcc/config/darwin.c | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/gcc/config/darwin.c b/gcc/config/darwin.c
index 40804b8..0080299 100644
--- a/gcc/config/darwin.c
+++ b/gcc/config/darwin.c
@@ -1259,6 +1259,11 @@ darwin_encode_section_info (tree decl, rtx rtl, int first ATTRIBUTE_UNUSED)
 void
 darwin_mark_decl_preserved (const char *name)
 {
+  /* Actually we shouldn't mark any local symbol this way, but for now
+     this only happens with ObjC meta-data. */
+  if (darwin_label_is_anonymous_local_objc_name (name))
+    return;
+
   fprintf (asm_out_file, "\t.no_dead_strip ");
   assemble_name (asm_out_file, name);
   fputc ('\n', asm_out_file);
--
2.2.1
@@ -26,6 +26,8 @@
 
 from contextlib import closing
 from glob import glob
+import sys
+import os
 
 class Gcc(Package):
     """The GNU Compiler Collection includes front ends for C, C++,
@@ -47,24 +49,33 @@ class Gcc(Package):
     version('4.6.4', 'b407a3d1480c11667f293bfb1f17d1a4')
     version('4.5.4', '27e459c2566b8209ab064570e1b378f7')
 
-    variant('gold', default=True, description="Build the gold linker plugin for ld-based LTO")
+    variant('binutils', default=sys.platform != 'darwin',
+            description="Build via binutils")
+    variant('gold', default=sys.platform != 'darwin',
+            description="Build the gold linker plugin for ld-based LTO")
 
     depends_on("mpfr")
     depends_on("gmp")
     depends_on("mpc", when='@4.5:')
     depends_on("isl", when='@5.0:')
-    depends_on("binutils~libiberty", when='~gold')
-    depends_on("binutils~libiberty+gold", when='+gold')
+    depends_on("binutils~libiberty", when='+binutils ~gold')
+    depends_on("binutils~libiberty+gold", when='+binutils +gold')
 
     # TODO: integrate these libraries.
     #depends_on("ppl")
     #depends_on("cloog")
+    if sys.platform == 'darwin':
+        patch('darwin/gcc-4.9.patch1', when='@4.9.3')
+        patch('darwin/gcc-4.9.patch2', when='@4.9.3')
 
     def install(self, spec, prefix):
         # libjava/configure needs a minor fix to install into spack paths.
-        filter_file(r"'@.*@'", "'@[[:alnum:]]*@'", 'libjava/configure', string=True)
+        filter_file(r"'@.*@'", "'@[[:alnum:]]*@'", 'libjava/configure',
+                    string=True)
 
         enabled_languages = set(('c', 'c++', 'fortran', 'java', 'objc'))
-        if spec.satisfies("@4.7.1:"):
+
+        if spec.satisfies("@4.7.1:") and sys.platform != 'darwin':
             enabled_languages.add('go')
 
         # Generic options to compile GCC
@@ -72,32 +83,40 @@ def install(self, spec, prefix):
                    "--libdir=%s/lib64" % prefix,
                    "--disable-multilib",
                    "--enable-languages=" + ','.join(enabled_languages),
                    "--with-mpc=%s" % spec['mpc'].prefix,
                    "--with-mpfr=%s" % spec['mpfr'].prefix,
                    "--with-gmp=%s" % spec['gmp'].prefix,
                    "--enable-lto",
-                   "--with-gnu-ld",
-                   "--with-gnu-as",
                    "--with-quad"]
         # Binutils
-        static_bootstrap_flags = "-static-libstdc++ -static-libgcc"
-        binutils_options = ["--with-sysroot=/",
-                            "--with-stage1-ldflags=%s %s" % (self.rpath_args, static_bootstrap_flags),
-                            "--with-boot-ldflags=%s %s" % (self.rpath_args, static_bootstrap_flags),
-                            "--with-ld=%s/bin/ld" % spec['binutils'].prefix,
-                            "--with-as=%s/bin/as" % spec['binutils'].prefix]
-        options.extend(binutils_options)
+        if spec.satisfies('+binutils'):
+            static_bootstrap_flags = "-static-libstdc++ -static-libgcc"
+            binutils_options = ["--with-sysroot=/",
+                                "--with-stage1-ldflags=%s %s" %
+                                (self.rpath_args, static_bootstrap_flags),
+                                "--with-boot-ldflags=%s %s" %
+                                (self.rpath_args, static_bootstrap_flags),
+                                "--with-gnu-ld",
+                                "--with-ld=%s/bin/ld" % spec['binutils'].prefix,
+                                "--with-gnu-as",
+                                "--with-as=%s/bin/as" % spec['binutils'].prefix]
+            options.extend(binutils_options)
+        # Isl
+        if 'isl' in spec:
+            isl_options = ["--with-isl=%s" % spec['isl'].prefix]
+            options.extend(isl_options)
+
+        if sys.platform == 'darwin':
+            darwin_options = ["--with-build-config=bootstrap-debug"]
+            options.extend(darwin_options)
 
         build_dir = join_path(self.stage.path, 'spack-build')
         configure = Executable(join_path(self.stage.source_path, 'configure'))
         with working_dir(build_dir, create=True):
             # Rest of install is straightforward.
             configure(*options)
-            make()
+            if sys.platform == 'darwin': make("bootstrap")
+            else: make()
             make("install")
 
         self.write_rpath_specs()
@@ -114,7 +133,8 @@ def write_rpath_specs(self):
         """Generate a spec file so the linker adds a rpath to the libs
         the compiler used to build the executable."""
         if not self.spec_dir:
-            tty.warn("Could not install specs for %s." % self.spec.format('$_$@'))
+            tty.warn("Could not install specs for %s." %
+                     self.spec.format('$_$@'))
             return
 
         gcc = Executable(join_path(self.prefix.bin, 'gcc'))
@@ -124,5 +144,6 @@ def write_rpath_specs(self):
         for line in lines:
             out.write(line + "\n")
             if line.startswith("*link:"):
-                out.write("-rpath %s/lib:%s/lib64 \\\n" % (self.prefix, self.prefix))
+                out.write("-rpath %s/lib:%s/lib64 \\\n" %
+                          (self.prefix, self.prefix))
         set_install_permissions(specs_file)
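The `write_rpath_specs` loop in the gcc package above rewrites the compiler's dumped specs: after every `*link:` directive it injects an `-rpath` continuation line pointing at the new compiler's library directories. A standalone sketch of that transformation, operating on a list of lines; the prefix path and the `add_rpath_to_specs` name are illustrative assumptions:

```python
# Sketch of the specs-file rewrite in Gcc.write_rpath_specs above: inject an
# -rpath continuation line after each '*link:' directive.
def add_rpath_to_specs(lines, prefix):
    """Return the specs lines with an -rpath inserted after '*link:'."""
    out = []
    for line in lines:
        out.append(line)
        if line.startswith("*link:"):
            # trailing backslash continues the spec string onto the next line
            out.append("-rpath %s/lib:%s/lib64 \\" % (prefix, prefix))
    return out

specs = add_rpath_to_specs(['*asm:', '*link:', '%{shared:...}'], '/opt/gcc')
```

Baking the rpath into the specs file means every binary built by this gcc finds the matching libstdc++ at runtime without the user setting `LD_LIBRARY_PATH`.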
var/spack/repos/builtin/packages/gdal/package.py (new file, 69 lines)

@@ -0,0 +1,69 @@
from spack import *

class Gdal(Package):
    """
    GDAL is a translator library for raster and vector geospatial
    data formats that is released under an X/MIT style Open Source
    license by the Open Source Geospatial Foundation. As a library,
    it presents a single raster abstract data model and vector
    abstract data model to the calling application for all supported
    formats. It also comes with a variety of useful command line
    utilities for data translation and processing
    """

    homepage = "http://www.gdal.org/"
    url = "http://download.osgeo.org/gdal/2.0.2/gdal-2.0.2.tar.gz"
    list_url = "http://download.osgeo.org/gdal/"
    list_depth = 2

    version('2.0.2', '573865f3f59ba7b4f8f4cddf223b52a5')

    extends('python')

    variant('hdf5', default=False, description='Enable HDF5 support')
    variant('hdf', default=False, description='Enable HDF4 support')
    variant('openjpeg', default=False, description='Enable JPEG2000 support')
    variant('geos', default=False, description='Enable GEOS support')
    variant('kea', default=False, description='Enable KEA support')
    variant('netcdf', default=False, description='Enable netcdf support')

    depends_on('swig')
    depends_on("hdf5", when='+hdf5')
    depends_on("hdf", when='+hdf')
    depends_on("openjpeg", when='+openjpeg')
    depends_on("geos", when='+geos')
    depends_on("kealib", when='+kea')
    depends_on("netcdf", when='+netcdf')
    depends_on("libtiff")
    depends_on("libpng")
    depends_on("zlib")
    depends_on("proj")
    depends_on("py-numpy")

    parallel = False

    def install(self, spec, prefix):
        args = []
        args.append("--prefix=%s" % prefix)
        args.append("--with-liblzma=yes")
        args.append("--with-zlib=%s" % spec['zlib'].prefix)
        args.append("--with-python=%s" % spec['python'].prefix.bin + "/python")
        args.append("--without-libtool")

        if '+geos' in spec:
            args.append('--with-geos=yes')
        if '+hdf' in spec:
            args.append('--with-hdf4=%s' % spec['hdf'].prefix)
        if '+hdf5' in spec:
            args.append('--with-hdf5=%s' % spec['hdf5'].prefix)
        if '+openjpeg' in spec:
            args.append('--with-openjpeg=%s' % spec['openjpeg'].prefix)
        if '+kea' in spec:
            args.append('--with-kea=yes')
        if '+netcdf' in spec:
            args.append('--with-netcdf=%s' % spec['netcdf'].prefix)

        configure(*args)

        make()
        make("install")
@@ -1,4 +1,5 @@
 from spack import *
+import os
 
 class Geos(Package):
     """GEOS (Geometry Engine - Open Source) is a C++ port of the Java
@@ -10,6 +11,10 @@ class Geos(Package):
     homepage = "http://trac.osgeo.org/geos/"
     url = "http://download.osgeo.org/geos/geos-3.4.2.tar.bz2"
 
+    # Version 3.5.0 supports Autotools and CMake
+    version('3.5.0', '136842690be7f504fba46b3c539438dd')
+
+    # Versions through 3.4.2 have CMake, but only Autotools is supported
     version('3.4.2', 'fc5df2d926eb7e67f988a43a92683bae')
     version('3.4.1', '4c930dec44c45c49cd71f3e0931ded7e')
     version('3.4.0', 'e41318fc76b5dc764a69d43ac6b18488')
@@ -21,11 +26,22 @@ class Geos(Package):
     version('3.3.4', '1bb9f14d57ef06ffa41cb1d67acb55a1')
     version('3.3.3', '8454e653d7ecca475153cc88fd1daa26')
 
-    extends('python')
-    depends_on('swig')
+    # # Python3 is not supported.
+    # variant('python', default=False, description='Enable Python support')
+
+    # extends('python', when='+python')
+    # depends_on('python', when='+python')
+    # depends_on('swig', when='+python')
 
     def install(self, spec, prefix):
-        configure("--prefix=%s" % prefix,
-                  "--enable-python")
+        args = ["--prefix=%s" % prefix]
+        # if '+python' in spec:
+        #     os.environ['PYTHON'] = join_path(spec['python'].prefix, 'bin',
+        #         'python' if spec['python'].version[:1][0] <= 2 else 'python3')
+        #     os.environ['SWIG'] = join_path(spec['swig'].prefix, 'bin', 'swig')
+        #
+        #     args.append("--enable-python")
+
+        configure(*args)
         make()
         make("install")
var/spack/repos/builtin/packages/gettext/package.py (new file, 30 lines)

@@ -0,0 +1,30 @@
from spack import *

class Gettext(Package):
    """GNU internationalization (i18n) and localization (l10n) library."""
    homepage = "https://www.gnu.org/software/gettext/"
    url = "http://ftpmirror.gnu.org/gettext/gettext-0.19.7.tar.xz"

    version('0.19.7', 'f81e50556da41b44c1d59ac93474dca5')

    def install(self, spec, prefix):
        options = ['--disable-dependency-tracking',
                   '--disable-silent-rules',
                   '--disable-debug',
                   '--prefix=%s' % prefix,
                   '--with-included-gettext',
                   '--with-included-glib',
                   '--with-included-libcroco',
                   '--with-included-libunistring',
                   '--with-emacs',
                   '--with-lispdir=%s/emacs/site-lisp/gettext' % prefix.share,
                   '--disable-java',
                   '--disable-csharp',
                   '--without-git',  # Don't use VCS systems to create these archives
                   '--without-cvs',
                   '--without-xz']

        configure(*options)

        make()
        make("install")
@@ -36,6 +36,8 @@ class Git(Package):
     depends_on("curl", when="+curl")
     depends_on("expat", when="+expat")
 
+    # Also depends_on gettext: apt-get install gettext (Ubuntu)
+
     # Use system perl for now.
     # depends_on("perl")
     # depends_on("pcre")
@@ -11,6 +11,7 @@ class Global(Package):
     version('6.5', 'dfec818b4f53d91721e247cf7b218078')
 
     depends_on('exuberant-ctags')
+    depends_on('ncurses')
 
     def install(self, spec, prefix):
         config_args = ['--prefix={0}'.format(prefix)]
var/spack/repos/builtin/packages/googletest/package.py (new file, 24 lines)

@@ -0,0 +1,24 @@
from spack import *

class Googletest(Package):
    """Google test framework for C++. Also called gtest."""
    homepage = "https://github.com/google/googletest"
    url = "https://github.com/google/googletest/tarball/release-1.7.0"

    version('1.7.0', '5eaf03ed925a47b37c8e1d559eb19bc4')

    depends_on("cmake")

    def install(self, spec, prefix):
        which('cmake')('.', *std_cmake_args)

        make()

        # Google Test doesn't have a make install
        # We have to do our own install here.
        install_tree('include', prefix.include)

        mkdirp(prefix.lib)
        install('./libgtest.a', '%s' % prefix.lib)
        install('./libgtest_main.a', '%s' % prefix.lib)
@@ -7,6 +7,12 @@ class Graphviz(Package):
 
     version('2.38.0', '5b6a829b2ac94efcd5fa3c223ed6d3ae')
 
+    # By default disable optional Perl language support to prevent build issues
+    # related to missing Perl packages. If spack begins support for Perl in the
+    # future, this package can be updated to depend_on('perl') and the
+    # necessary devel packages.
+    variant('perl', default=False, description='Enable if you need the optional Perl language bindings.')
+
     parallel = False
 
     depends_on("swig")
@@ -14,8 +20,10 @@ class Graphviz(Package):
     depends_on("ghostscript")
 
     def install(self, spec, prefix):
-        configure("--prefix=%s" % prefix)
+        options = ['--prefix=%s' % prefix]
+        if not '+perl' in spec:
+            options.append('--disable-perl')
 
+        configure(*options)
         make()
         make("install")
@@ -37,6 +37,7 @@ class Hdf5(Package):
     list_url = "http://www.hdfgroup.org/ftp/HDF5/releases"
     list_depth = 3

+    version('1.10.0', 'bdc935337ee8282579cd6bc4270ad199')
     version('1.8.16', 'b8ed9a36ae142317f88b0c7ef4b9c618')
     version('1.8.15', '03cccb5b33dbe975fdcd8ae9dc021f24')
     version('1.8.13', 'c03426e9e77d7766944654280b467289')

@@ -80,10 +81,16 @@ def install(self, spec, prefix):
         # sanity check in configure, so this doesn't merit a variant.
         extra_args.append("--enable-unsupported")

-        if '+debug' in spec:
-            extra_args.append('--enable-debug=all')
-        else:
-            extra_args.append('--enable-production')
+        if spec.satisfies('@1.10:'):
+            if '+debug' in spec:
+                extra_args.append('--enable-build-mode=debug')
+            else:
+                extra_args.append('--enable-build-mode=production')
+        else:
+            if '+debug' in spec:
+                extra_args.append('--enable-debug=all')
+            else:
+                extra_args.append('--enable-production')

         if '+shared' in spec:
             extra_args.append('--enable-shared')

@@ -94,10 +101,10 @@ def install(self, spec, prefix):
             extra_args.append('--enable-cxx')

         if '+fortran' in spec:
-            extra_args.extend([
-                '--enable-fortran',
-                '--enable-fortran2003'
-            ])
+            extra_args.append('--enable-fortran')
+            # '--enable-fortran2003' no longer exists as of version 1.10.0
+            if spec.satisfies('@:1.8.16'):
+                extra_args.append('--enable-fortran2003')

         if '+mpi' in spec:
             # The HDF5 configure script warns if cxx and mpi are enabled

@@ -139,5 +146,7 @@ def url_for_version(self, version):
             return "http://www.hdfgroup.org/ftp/HDF5/releases/hdf5-" + v + ".tar.gz"
         elif version < Version("1.7"):
             return "http://www.hdfgroup.org/ftp/HDF5/releases/hdf5-" + version.up_to(2) + "/hdf5-" + v + ".tar.gz"
-        else:
+        elif version < Version("1.10"):
             return "http://www.hdfgroup.org/ftp/HDF5/releases/hdf5-" + v + "/src/hdf5-" + v + ".tar.gz"
+        else:
+            return "http://www.hdfgroup.org/ftp/HDF5/releases/hdf5-" + version.up_to(2) + "/hdf5-" + v + "/src/hdf5-" + v + ".tar.gz"
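The `url_for_version` hunk above can be sketched standalone. In this sketch, versions are modeled as plain tuples and `hdf5_url` is a hypothetical helper name; the real method uses Spack's `Version` class and `version.up_to(2)`:

```python
# Standalone sketch of the version-dependent HDF5 URL scheme from the
# hunk above; names and the tuple model are illustrative assumptions.
def hdf5_url(version):
    """Map an HDF5 version tuple to its release tarball URL."""
    base = "http://www.hdfgroup.org/ftp/HDF5/releases"
    v = ".".join(str(p) for p in version)          # full version, e.g. 1.8.16
    short = ".".join(str(p) for p in version[:2])  # up_to(2), e.g. 1.8
    if version < (1, 7):
        return "%s/hdf5-%s/hdf5-%s.tar.gz" % (base, short, v)
    elif version < (1, 10):
        return "%s/hdf5-%s/src/hdf5-%s.tar.gz" % (base, v, v)
    else:
        # 1.10 and later gained an extra series directory level.
        return "%s/hdf5-%s/hdf5-%s/src/hdf5-%s.tar.gz" % (base, short, v, v)
```

The commit's change is exactly the new third branch: releases from 1.10 onward live one directory deeper than the 1.7-to-1.9 series.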
73 var/spack/repos/builtin/packages/hoomd-blue/package.py Normal file
@@ -0,0 +1,73 @@
from spack import *
import os

class HoomdBlue(Package):
    """HOOMD-blue is a general-purpose particle simulation toolkit. It scales
    from a single CPU core to thousands of GPUs.

    You define particle initial conditions and interactions in a high-level
    python script. Then tell HOOMD-blue how you want to execute the job and it
    takes care of the rest. Python job scripts give you unlimited flexibility
    to create custom initialization routines, control simulation parameters,
    and perform in situ analysis."""

    homepage = "https://codeblue.umich.edu/hoomd-blue/index.html"
    url      = "https://bitbucket.org/glotzer/hoomd-blue/get/v1.3.3.tar.bz2"

    version('1.3.3', '1469ef4531dc14b579c0acddbfe6a273')

    variant('mpi',  default=True, description='Compile with MPI enabled')
    variant('cuda', default=True, description='Compile with CUDA Toolkit')
    variant('doc',  default=True, description='Generate documentation')

    extends('python')
    depends_on('py-numpy')
    depends_on('boost+python')
    depends_on('cmake')
    depends_on('mpi',     when='+mpi')
    depends_on('cuda',    when='+cuda')
    depends_on('doxygen', when='+doc')

    def install(self, spec, prefix):

        cmake_args = [
            '-DPYTHON_EXECUTABLE=%s/python' % spec['python'].prefix.bin,
            '-DBOOST_ROOT=%s'               % spec['boost'].prefix
        ]

        # MPI support
        if '+mpi' in spec:
            os.environ['MPI_HOME'] = spec['mpi'].prefix
            cmake_args.append('-DENABLE_MPI=ON')
        else:
            cmake_args.append('-DENABLE_MPI=OFF')

        # CUDA support
        if '+cuda' in spec:
            cmake_args.append('-DENABLE_CUDA=ON')
        else:
            cmake_args.append('-DENABLE_CUDA=OFF')

        # CUDA-aware MPI library support
        #if '+cuda' in spec and '+mpi' in spec:
        #    cmake_args.append('-DENABLE_MPI_CUDA=ON')
        #else:
        #    cmake_args.append('-DENABLE_MPI_CUDA=OFF')

        # There may be a bug in the MPI-CUDA code. See:
        # https://groups.google.com/forum/#!msg/hoomd-users/2griTESmc5I/E69s_M5fDwAJ
        # This prevented "make test" from passing for me.
        cmake_args.append('-DENABLE_MPI_CUDA=OFF')

        # Documentation
        if '+doc' in spec:
            cmake_args.append('-DENABLE_DOXYGEN=ON')
        else:
            cmake_args.append('-DENABLE_DOXYGEN=OFF')

        cmake_args.extend(std_cmake_args)
        cmake('.', *cmake_args)

        make()
        make("test")
        make("install")
@@ -1,5 +1,5 @@
 from spack import *
-import os
+import os, sys

 class Hypre(Package):
     """Hypre is a library of high performance preconditioners that

@@ -12,7 +12,10 @@ class Hypre(Package):
     version('2.10.1', 'dc048c4cabb3cd549af72591474ad674')
     version('2.10.0b', '768be38793a35bb5d055905b271f5b8e')

-    variant('shared', default=True, description="Build shared library version (disables static library)")
+    # hypre does not know how to build shared libraries on Darwin
+    variant('shared', default=sys.platform!='darwin', description="Build shared library version (disables static library)")
+    # SuperluDist have conflicting headers with those in Hypre
+    variant('internal-superlu', default=True, description="Use internal Superlu routines")

     depends_on("mpi")
     depends_on("blas")

@@ -37,6 +40,12 @@ def install(self, spec, prefix):
         if '+shared' in self.spec:
             configure_args.append("--enable-shared")

+        if '~internal-superlu' in self.spec:
+            configure_args.append("--without-superlu")
+            # MLI and FEI do not build without superlu on Linux
+            configure_args.append("--without-mli")
+            configure_args.append("--without-fei")
+
         # Hypre's source is staged under ./src so we'll have to manually
         # cd into it.
         with working_dir("src"):
51 var/spack/repos/builtin/packages/ipopt/package.py Normal file
@@ -0,0 +1,51 @@
from spack import *

class Ipopt(Package):
    """Ipopt (Interior Point OPTimizer, pronounced eye-pea-Opt) is a
    software package for large-scale nonlinear optimization."""
    homepage = "https://projects.coin-or.org/Ipopt"
    url      = "http://www.coin-or.org/download/source/Ipopt/Ipopt-3.12.4.tgz"

    version('3.12.4', '12a8ecaff8dd90025ddea6c65b49cb03')
    version('3.12.3', 'c560cbfa9cbf62acf8b485823c255a1b')
    version('3.12.2', 'ec1e855257d7de09e122c446506fb00d')
    version('3.12.1', 'ceaf895ce80c77778f2cab68ba9f17f3')
    version('3.12.0', 'f7dfc3aa106a6711a85214de7595e827')

    depends_on("blas")
    depends_on("lapack")
    depends_on("pkg-config")
    depends_on("mumps+double~mpi")

    def install(self, spec, prefix):
        # Dependency directories
        blas_dir = spec['blas'].prefix
        lapack_dir = spec['lapack'].prefix
        mumps_dir = spec['mumps'].prefix

        # Add directory with fake MPI headers in sequential MUMPS
        # install to header search path
        mumps_flags = "-ldmumps -lmumps_common -lpord -lmpiseq"
        mumps_libcmd = "-L%s " % mumps_dir.lib + mumps_flags

        # By convention, spack links blas & lapack libs to libblas & liblapack
        blas_lib = "-L%s" % blas_dir.lib + " -lblas"
        lapack_lib = "-L%s" % lapack_dir.lib + " -llapack"

        configure_args = [
            "--prefix=%s" % prefix,
            "--with-mumps-incdir=%s" % mumps_dir.include,
            "--with-mumps-lib=%s" % mumps_libcmd,
            "--enable-shared",
            "--with-blas-incdir=%s" % blas_dir.include,
            "--with-blas-lib=%s" % blas_lib,
            "--with-lapack-incdir=%s" % lapack_dir.include,
            "--with-lapack-lib=%s" % lapack_lib
        ]

        configure(*configure_args)

        # IPOPT does not build correctly in parallel on OS X
        make(parallel=False)
        make("test", parallel=False)
        make("install", parallel=False)
68 var/spack/repos/builtin/packages/julia/openblas.patch Normal file
@@ -0,0 +1,68 @@
diff --git a/deps/Makefile b/deps/Makefile
index 6cb73be..bcd8520 100644
--- a/deps/Makefile
+++ b/deps/Makefile
@@ -1049,7 +1049,7 @@ OPENBLAS_BUILD_OPTS += NO_AFFINITY=1

 # Build for all architectures - required for distribution
 ifeq ($(OPENBLAS_DYNAMIC_ARCH), 1)
-OPENBLAS_BUILD_OPTS += DYNAMIC_ARCH=1
+OPENBLAS_BUILD_OPTS += DYNAMIC_ARCH=1 MAKE_NO_J=1
 endif

 # 64-bit BLAS interface
@@ -1085,6 +1085,7 @@ OPENBLAS_BUILD_OPTS += NO_AVX2=1
 endif

 $(OPENBLAS_SRC_DIR)/config.status: $(OPENBLAS_SRC_DIR)/Makefile
+	cd $(dir $@) && patch -p1 < ../openblas-make.patch
 ifeq ($(OS),WINNT)
 	cd $(dir $@) && patch -p1 < ../openblas-win64.patch
 endif
diff --git a/deps/openblas.version b/deps/openblas.version
index 7c97e1b..58b9467 100644
--- a/deps/openblas.version
+++ b/deps/openblas.version
@@ -1,2 +1,2 @@
-OPENBLAS_BRANCH=v0.2.15
-OPENBLAS_SHA1=53e849f4fcae4363a64576de00e982722c7304f9
+OPENBLAS_BRANCH=v0.2.17
+OPENBLAS_SHA1=a71e8c82f6a9f73093b631e5deab1e8da716b61f
--- a/deps/openblas-make.patch
+++ b/deps/openblas-make.patch
@@ -0,0 +1,35 @@
+diff --git a/Makefile.system b/Makefile.system
+index b89f60e..2dbdad0 100644
+--- a/Makefile.system
++++ b/Makefile.system
+@@ -139,6 +139,10 @@ NO_PARALLEL_MAKE=0
+ endif
+ GETARCH_FLAGS += -DNO_PARALLEL_MAKE=$(NO_PARALLEL_MAKE)
+
++ifdef MAKE_NO_J
++GETARCH_FLAGS += -DMAKE_NO_J=$(MAKE_NO_J)
++endif
++
+ ifdef MAKE_NB_JOBS
+ GETARCH_FLAGS += -DMAKE_NB_JOBS=$(MAKE_NB_JOBS)
+ endif
+diff --git a/getarch.c b/getarch.c
+index f9c49e6..dffad70 100644
+--- a/getarch.c
++++ b/getarch.c
+@@ -1012,6 +1012,7 @@ int main(int argc, char *argv[]){
+ #endif
+ #endif
+
++#ifndef MAKE_NO_J
+ #ifdef MAKE_NB_JOBS
+ printf("MAKE += -j %d\n", MAKE_NB_JOBS);
+ #elif NO_PARALLEL_MAKE==1
+@@ -1021,6 +1022,7 @@ int main(int argc, char *argv[]){
+ printf("MAKE += -j %d\n", get_num_cores());
+ #endif
+ #endif
++#endif
+
+ break;
+
@@ -4,43 +4,56 @@
 class Julia(Package):
     """The Julia Language: A fresh approach to technical computing"""
     homepage = "http://julialang.org"
-    url      = "http://github.com/JuliaLang/julia/releases/download/v0.4.2/julia-0.4.2.tar.gz"
+    url      = "https://github.com/JuliaLang/julia/releases/download/v0.4.3/julia-0.4.3-full.tar.gz"

-    version('0.4.3', '7b9f096798fca4bef262a64674bc2b52')
-    version('0.4.2', 'ccfeb4f4090c8b31083f5e1ccb03eb06')
+    version('master',
+            git='https://github.com/JuliaLang/julia.git', branch='master')
+    version('0.4.5', '69141ff5aa6cee7c0ec8c85a34aa49a6')
+    version('0.4.3', '8a4a59fd335b05090dd1ebefbbe5aaac')

     patch('gc.patch')
+    patch('openblas.patch', when='@0.4:0.4.5')

-    # Build-time dependencies
-    depends_on("cmake @2.8:")
+    # Build-time dependencies:
+    # depends_on("awk")
+    # depends_on("m4")
+    # depends_on("pkg-config")

-    depends_on("python @2.6:2.9")
-
-    # I think that Julia requires the dependencies above, but it builds find (on
-    # my system) without these. We should enable them as necessary.
+    # Combined build-time and run-time dependencies:
+    depends_on("binutils")
+    depends_on("cmake @2.8:")
+    depends_on("git")
+    depends_on("openssl")
+    depends_on("python @2.7:2.999")

-    # Run-time dependencies
+    # I think that Julia requires the dependencies above, but it
+    # builds fine (on my system) without these. We should enable them
+    # as necessary.

+    # Run-time dependencies:
     # depends_on("arpack")
     # depends_on("fftw +float")
     # depends_on("gmp")
     # depends_on("libgit")
     # depends_on("mpfr")
     # depends_on("openblas")
     # depends_on("pcre2")

-    # ARPACK: Requires BLAS and LAPACK; needs to use the same version as Julia.
+    # ARPACK: Requires BLAS and LAPACK; needs to use the same version
+    # as Julia.

-    # BLAS and LAPACK: Julia prefers 64-bit versions on 64-bit systems. OpenBLAS
-    # has an option for this; make it available as variant.
+    # BLAS and LAPACK: Julia prefers 64-bit versions on 64-bit
+    # systems. OpenBLAS has an option for this; make it available as
+    # variant.

-    # FFTW: Something doesn't work when using a pre-installed FFTW library; need
-    # to investigate.
+    # FFTW: Something doesn't work when using a pre-installed FFTW
+    # library; need to investigate.

-    # GMP, MPFR: Something doesn't work when using a pre-installed FFTW library;
-    # need to investigate.
+    # GMP, MPFR: Something doesn't work when using a pre-installed
+    # FFTW library; need to investigate.

-    # LLVM: Julia works only with specific versions, and might require patches.
-    # Thus we let Julia install its own LLVM.
+    # LLVM: Julia works only with specific versions, and might require
+    # patches. Thus we let Julia install its own LLVM.

     # Other possible dependencies:
     # USE_SYSTEM_OPENLIBM=0

@@ -50,11 +63,21 @@ class Julia(Package):
     # USE_SYSTEM_UTF8PROC=0
     # USE_SYSTEM_LIBGIT2=0

+    # Run-time dependencies for Julia packages:
+    depends_on("hdf5")
+    depends_on("mpi")

     def install(self, spec, prefix):
-        # Explicitly setting CC, CXX, or FC breaks building libuv, one of
-        # Julia's dependencies. This might be a Darwin-specific problem. Given
-        # how Spack sets up compilers, Julia should still use Spack's compilers,
-        # even if we don't specify them explicitly.
+        if '@master' in spec:
+            # Julia needs to know the offset from a specific commit
+            git = which('git')
+            git('fetch', '--unshallow')
+
+        # Explicitly setting CC, CXX, or FC breaks building libuv, one
+        # of Julia's dependencies. This might be a Darwin-specific
+        # problem. Given how Spack sets up compilers, Julia should
+        # still use Spack's compilers, even if we don't specify them
+        # explicitly.
         options = [#"CC=cc",
                    #"CXX=c++",
                    #"FC=fc",
35 var/spack/repos/builtin/packages/kealib/package.py Normal file
@@ -0,0 +1,35 @@
from spack import *

class Kealib(Package):
    """An HDF5 Based Raster File Format

    KEALib provides an implementation of the GDAL data model.
    The format supports raster attribute tables, image pyramids,
    meta-data and in-built statistics while also handling very
    large files and compression throughout.

    Based on the HDF5 standard, it also provides a base from which
    other formats can be derived and is a good choice for long
    term data archiving. An independent software library (libkea)
    provides complete access to the KEA image format and a GDAL
    driver allowing KEA images to be used from any GDAL supported software.

    Development work on this project has been funded by Landcare Research.
    """
    homepage = "http://kealib.org/"
    url      = "https://bitbucket.org/chchrsc/kealib/get/kealib-1.4.5.tar.gz"

    version('1.4.5', '112e9c42d980b2d2987a3c15d0833a5d')

    depends_on("hdf5")

    def install(self, spec, prefix):
        with working_dir('trunk', create=False):
            cmake_args = []
            cmake_args.append("-DCMAKE_INSTALL_PREFIX=%s" % prefix)
            cmake_args.append("-DHDF5_INCLUDE_DIR=%s" % spec['hdf5'].prefix.include)
            cmake_args.append("-DHDF5_LIB_PATH=%s" % spec['hdf5'].prefix.lib)
            cmake('.', *cmake_args)

            make()
            make("install")
@@ -2,7 +2,7 @@

 class Libdrm(Package):
     """A userspace library for accessing the DRM, direct
-    rendering manager, on Linux, BSD and other operating
+    rendering manager, on Linux, BSD and other operating
     systems that support the ioctl interface."""

     homepage = "http://dri.freedesktop.org/libdrm/"    # no real website...

@@ -11,6 +11,8 @@ class Libdrm(Package):
     version('2.4.59', '105ac7af1afcd742d402ca7b4eb168b6')
     version('2.4.33', '86e4e3debe7087d5404461e0032231c8')

+    depends_on('libpciaccess')
+
     def install(self, spec, prefix):
         configure("--prefix=%s" % prefix)
@@ -38,8 +38,6 @@ class Libelf(Package):

     provides('elf')

-    sanity_check_is_file = 'include/libelf.h'
-
     def install(self, spec, prefix):
         configure("--prefix=" + prefix,
                   "--enable-shared",
@@ -8,6 +8,11 @@ class Libpng(Package):
     version('1.6.16', '1a4ad377919ab15b54f6cb6a3ae2622d')
     version('1.6.15', '829a256f3de9307731d4f52dc071916d')
     version('1.6.14', '2101b3de1d5f348925990f9aa8405660')
+    version('1.5.26', '3ca98347a5541a2dad55cd6d07ee60a9')
+    version('1.4.19', '89bcbc4fc8b31f4a403906cf4f662330')
+    version('1.2.56', '9508fc59d10a1ffadd9aae35116c19ee')
+
+    depends_on('zlib')

     def install(self, spec, prefix):
         configure("--prefix=%s" % prefix)
18 var/spack/repos/builtin/packages/libxc/package.py Normal file
@@ -0,0 +1,18 @@
from spack import *

class Libxc(Package):
    """Libxc is a library of exchange-correlation functionals for
    density-functional theory."""

    homepage = "http://www.tddft.org/programs/octopus/wiki/index.php/Libxc"
    url      = "http://www.tddft.org/programs/octopus/down.php?file=libxc/libxc-2.2.2.tar.gz"

    version('2.2.2', 'd9f90a0d6e36df6c1312b6422280f2ec')

    def install(self, spec, prefix):
        configure('--prefix=%s' % prefix,
                  '--enable-shared')

        make()
        make("install")
@@ -14,6 +14,9 @@ class Libxcb(Package):
     depends_on("python")
     depends_on("xcb-proto")

+    # depends_on('pthread')  # Ubuntu: apt-get install libpthread-stubs0-dev
+    # depends_on('xau')      # Ubuntu: apt-get install libxau-dev
+
     def patch(self):
         filter_file('typedef struct xcb_auth_info_t {', 'typedef struct {', 'src/xcb.h')
@@ -0,0 +1,22 @@
# HG changeset patch
# User Sean Farley <sean@mcs.anl.gov>
# Date 1332269671 18000
# Tue Mar 20 13:54:31 2012 -0500
# Node ID b95c0c2e1d8bf8e3273f7d45e856f0c0127d998e
# Parent 88049269953c67c3fdcc4309bf901508a875f0dc
cmake: add gklib headers to install into include

diff -r 88049269953c -r b95c0c2e1d8b libmetis/CMakeLists.txt
Index: libmetis/CMakeLists.txt
===================================================================
--- a/libmetis/CMakeLists.txt	Tue Mar 20 13:54:29 2012 -0500
+++ b/libmetis/CMakeLists.txt	Tue Mar 20 13:54:31 2012 -0500
@@ -12,6 +12,8 @@ endif()
 if(METIS_INSTALL)
   install(TARGETS metis
     LIBRARY DESTINATION lib
     RUNTIME DESTINATION lib
     ARCHIVE DESTINATION lib)
+  install(FILES gklib_defs.h DESTINATION include)
+  install(FILES gklib_rename.h DESTINATION include)
 endif()
@@ -24,7 +24,7 @@
 ##############################################################################

 from spack import *

 import glob, sys, os

 class Metis(Package):
     """

@@ -36,7 +36,10 @@ class Metis(Package):
     homepage = 'http://glaros.dtc.umn.edu/gkhome/metis/metis/overview'
     url = "http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/metis-5.1.0.tar.gz"

-    version('5.1.0', '5465e67079419a69e0116de24fce58fe')
+    version('5.1.0', '5465e67079419a69e0116de24fce58fe',
+            url='http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/metis-5.1.0.tar.gz')
+    version('4.0.3', '5efa35de80703c1b2c4d0de080fafbcf4e0d363a21149a1ad2f96e0144841a55',
+            url='http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/OLD/metis-4.0.3.tar.gz')

     variant('shared', default=True, description='Enables the build of shared libraries')
     variant('debug', default=False, description='Builds the library in debug mode')

@@ -45,10 +48,85 @@ class Metis(Package):
     variant('idx64', default=False, description='Use int64_t as default index type')
     variant('double', default=False, description='Use double precision floating point types')

-    depends_on('cmake @2.8:')  # build-time dependency
+    depends_on('cmake @2.8:', when='@5:')  # build-time dependency
     depends_on('gdb', when='+gdb')

     patch('install_gklib_defs_rename.patch', when='@5:')

+    @when('@4:4.0.3')
+    def install(self, spec, prefix):
+
+        if '+gdb' in spec:
+            raise InstallError('gdb support not implemented in METIS 4!')
+        if '+idx64' in spec:
+            raise InstallError('idx64 option not implemented in METIS 4!')
+        if '+double' in spec:
+            raise InstallError('double option not implemented for METIS 4!')
+
+        options = ['COPTIONS=-fPIC']
+        if '+debug' in spec:
+            options.append('OPTFLAGS=-g -O0')
+        make(*options)
+
+        mkdir(prefix.bin)
+        for x in ('pmetis', 'kmetis', 'oemetis', 'onmetis', 'partnmesh',
+                  'partdmesh', 'mesh2nodal', 'mesh2dual', 'graphchk'):
+            install(x, prefix.bin)
+
+        mkdir(prefix.lib)
+        install('libmetis.a', prefix.lib)
+
+        mkdir(prefix.include)
+        for h in glob.glob(join_path('Lib', '*.h')):
+            install(h, prefix.include)
+
+        mkdir(prefix.share)
+        for f in (join_path(*p)
+                  for p in (('Programs', 'io.c'),
+                            ('Test', 'mtest.c'),
+                            ('Graphs', '4elt.graph'),
+                            ('Graphs', 'metis.mesh'),
+                            ('Graphs', 'test.mgraph'))):
+            install(f, prefix.share)
+
+        if '+shared' in spec:
+            if sys.platform == 'darwin':
+                lib_dsuffix = 'dylib'
+                load_flag = '-Wl,-all_load'
+                no_load_flag = ''
+            else:
+                lib_dsuffix = 'so'
+                load_flag = '-Wl,-whole-archive'
+                no_load_flag = '-Wl,-no-whole-archive'
+
+            os.system(spack_cc + ' -fPIC -shared ' + load_flag +
+                      ' libmetis.a ' + no_load_flag + ' -o libmetis.' +
+                      lib_dsuffix)
+            install('libmetis.' + lib_dsuffix, prefix.lib)
+
+        # Set up and run tests on installation
+        symlink(join_path(prefix.share, 'io.c'), 'io.c')
+        symlink(join_path(prefix.share, 'mtest.c'), 'mtest.c')
+        os.system(spack_cc + ' -I%s' % prefix.include + ' -c io.c')
+        os.system(spack_cc + ' -I%s' % prefix.include +
+                  ' -L%s' % prefix.lib + ' -lmetis mtest.c io.o -o mtest')
+        _4eltgraph = join_path(prefix.share, '4elt.graph')
+        test_mgraph = join_path(prefix.share, 'test.mgraph')
+        metis_mesh = join_path(prefix.share, 'metis.mesh')
+        kmetis = join_path(prefix.bin, 'kmetis')
+        os.system('./mtest ' + _4eltgraph)
+        os.system(kmetis + ' ' + _4eltgraph + ' 40')
+        os.system(join_path(prefix.bin, 'onmetis') + ' ' + _4eltgraph)
+        os.system(join_path(prefix.bin, 'pmetis') + ' ' + test_mgraph + ' 2')
+        os.system(kmetis + ' ' + test_mgraph + ' 2')
+        os.system(kmetis + ' ' + test_mgraph + ' 5')
+        os.system(join_path(prefix.bin, 'partnmesh') + ' ' + metis_mesh + ' 10')
+        os.system(join_path(prefix.bin, 'partdmesh') + ' ' + metis_mesh + ' 10')
+        os.system(join_path(prefix.bin, 'mesh2dual') + ' ' + metis_mesh)
+
+    @when('@5:')
     def install(self, spec, prefix):

         options = []

@@ -77,7 +155,36 @@ def install(self, spec, prefix):
         if '+double' in spec:
             filter_file('REALTYPEWIDTH 32', 'REALTYPEWIDTH 64', metis_header)

+        # Make clang 7.3 happy.
+        # Prevents "ld: section __DATA/__thread_bss extends beyond end of file"
+        # See upstream LLVM issue https://llvm.org/bugs/show_bug.cgi?id=27059
+        # Adopted from https://github.com/Homebrew/homebrew-science/blob/master/metis.rb
+        if spec.satisfies('%clang@7.3.0'):
+            filter_file('#define MAX_JBUFS 128', '#define MAX_JBUFS 24',
+                        join_path(source_directory, 'GKlib', 'error.c'))
+
         with working_dir(build_directory, create=True):
             cmake(source_directory, *options)
             make()
             make("install")
+
+            # now run some tests:
+            for f in ["4elt", "copter2", "mdual"]:
+                graph = join_path(source_directory, 'graphs', '%s.graph' % f)
+                Executable(join_path(prefix.bin, 'graphchk'))(graph)
+                Executable(join_path(prefix.bin, 'gpmetis'))(graph, '2')
+                Executable(join_path(prefix.bin, 'ndmetis'))(graph)
+
+            graph = join_path(source_directory, 'graphs', 'test.mgraph')
+            Executable(join_path(prefix.bin, 'gpmetis'))(graph, '2')
+            graph = join_path(source_directory, 'graphs', 'metis.mesh')
+            Executable(join_path(prefix.bin, 'mpmetis'))(graph, '2')
+
+        # install GKlib headers, which will be needed for ParMETIS
+        GKlib_dist = join_path(prefix.include, 'GKlib')
+        mkdirp(GKlib_dist)
+        fs = glob.glob(join_path(source_directory, 'GKlib', "*.h"))
+        for f in fs:
+            install(f, GKlib_dist)
+
+        # The shared library is not installed correctly on Darwin; correct this
+        if (sys.platform == 'darwin') and ('+shared' in spec):
+            fix_darwin_install_name(prefix.lib)
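The `@when('@4:4.0.3')` / `@when('@5:')` pair above selects between two `install` implementations by version range. A toy version of that dispatch mechanism can be sketched as follows; the registry, `when` decorator, and `dispatch` function here are illustrative stand-ins, not Spack's real multimethod machinery:

```python
# Minimal sketch of version-conditioned method dispatch in the spirit of
# Spack's @when decorator; predicates over version tuples are an assumption
# made for this example.
_installers = []

def when(predicate):
    """Register the decorated function under a version predicate."""
    def register(fn):
        _installers.append((predicate, fn))
        return fn
    return register

@when(lambda v: v < (5,))
def install(version):
    return "metis4 build path"

@when(lambda v: v >= (5,))
def install(version):  # intentional redefinition; dispatch() picks the match
    return "metis5 cmake build path"

def dispatch(version):
    """Call the first registered installer whose predicate matches."""
    for predicate, fn in _installers:
        if predicate(version):
            return fn(version)
```

The second `def install` shadows the first name at module level, but both remain reachable through the registry, which is the essential trick behind this style of decorator.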
|
@ -47,12 +47,12 @@ class Mpich(Package):
|
|||
provides('mpi@:3.0', when='@3:')
|
||||
provides('mpi@:1.3', when='@1:')
|
||||
|
||||
def setup_dependent_environment(self, env, dependent_spec):
|
||||
env.set('MPICH_CC', spack_cc)
|
||||
env.set('MPICH_CXX', spack_cxx)
|
||||
env.set('MPICH_F77', spack_f77)
|
||||
env.set('MPICH_F90', spack_f90)
|
||||
env.set('MPICH_FC', spack_fc)
|
||||
def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
|
||||
spack_env.set('MPICH_CC', spack_cc)
|
||||
spack_env.set('MPICH_CXX', spack_cxx)
|
||||
spack_env.set('MPICH_F77', spack_f77)
|
||||
spack_env.set('MPICH_F90', spack_fc)
|
||||
spack_env.set('MPICH_FC', spack_fc)
|
||||
|
||||
def setup_dependent_package(self, module, dep_spec):
|
||||
"""For dependencies, make mpicc's use spack wrapper."""
|
||||
|
@ -78,6 +78,9 @@ def install(self, spec, prefix):
|
|||
if not self.compiler.fc:
|
||||
config_args.append("--disable-fc")
|
||||
|
||||
if not self.compiler.fc and not self.compiler.f77:
|
||||
config_args.append("--disable-fortran")
|
||||
|
||||
configure(*config_args)
|
||||
make()
|
||||
make("install")
|
||||
|
|
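The MPICH hunk above switches from a single `env` argument to separate build-time and run-time environment objects. The pattern of recording variables on such an object can be sketched as follows; `EnvModifications` is an illustrative stand-in for Spack's real class, and the wrapper values are made-up examples:

```python
# Toy sketch of the environment-modification object pattern from the
# MPICH hunk above; class name and values are assumptions for illustration.
class EnvModifications(object):
    def __init__(self):
        self.vars = {}

    def set(self, name, value):
        # Record the variable instead of mutating os.environ directly,
        # so the caller decides when (and in which process) to apply it.
        self.vars[name] = value

def setup_dependent_environment(spack_env, cc='cc-wrapper', fc='fc-wrapper'):
    """Point MPICH's compiler-selection variables at the compiler wrappers."""
    spack_env.set('MPICH_CC', cc)
    spack_env.set('MPICH_FC', fc)

spack_env = EnvModifications()
setup_dependent_environment(spack_env)
```

Recording modifications on an object rather than setting them immediately is what lets the same hook serve both the build environment and the generated module files.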
154 var/spack/repos/builtin/packages/mrnet/krell-5.0.1.patch Normal file
@@ -0,0 +1,154 @@
--- mrnet-3093918/include/mrnet/Types.h	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/include/mrnet/Types.h	2016-03-16 12:29:33.986132302 -0700
@@ -23,7 +23,7 @@
 #ifndef MRNET_VERSION_MAJOR
 # define MRNET_VERSION_MAJOR 5
 # define MRNET_VERSION_MINOR 0
-# define MRNET_VERSION_REV 0
+# define MRNET_VERSION_REV 1
 #endif

 namespace MRN
--- mrnet-3093918/include/mrnet_lightweight/Types.h	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/include/mrnet_lightweight/Types.h	2016-03-16 12:29:33.987132302 -0700
@@ -30,7 +30,7 @@
 #define MRNET_VERSION_MAJOR 5
 #define MRNET_VERSION_MINOR 0
-#define MRNET_VERSION_REV 0
+#define MRNET_VERSION_REV 1
 #endif
 void get_Version(int* major,
                  int* minor,
--- mrnet-3093918/src/lightweight/SerialGraph.c	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/src/lightweight/SerialGraph.c	2016-03-16 12:29:33.995132302 -0700
@@ -59,7 +59,7 @@

     mrn_dbg_func_begin();

-    sprintf(hoststr, "[%s:%hu:%u:", ihostname, iport, irank);
+    sprintf(hoststr, "[%s:%05hu:%u:", ihostname, iport, irank);
     mrn_dbg(5, mrn_printf(FLF, stderr, "looking for SubTreeRoot: '%s'\n", hoststr));

     byte_array = sg->byte_array;
@@ -110,7 +110,7 @@

     mrn_dbg_func_begin();

-    len = (size_t) sprintf(hoststr, "[%s:%hu:%u:0]", ihostname, iport, irank);
+    len = (size_t) sprintf(hoststr, "[%s:%05hu:%u:0]", ihostname, iport, irank);
     mrn_dbg(5, mrn_printf(FLF, stderr, "adding sub tree leaf: %s\n", hoststr));

     len += strlen(sg->byte_array) + 1;
@@ -139,7 +139,7 @@

     mrn_dbg_func_begin();

-    len = (size_t) sprintf(hoststr, "[%s:%hu:%u:1", ihostname, iport, irank);
+    len = (size_t) sprintf(hoststr, "[%s:%05hu:%u:1", ihostname, iport, irank);
     mrn_dbg(5, mrn_printf(FLF, stderr, "adding sub tree root: %s\n", hoststr));

     len += strlen(sg->byte_array) + 1;
@@ -360,8 +360,8 @@
     char old_hoststr[256];
     char new_hoststr[256];

-    sprintf(old_hoststr, "[%s:%hu:%u:", hostname, UnknownPort, irank);
-    sprintf(new_hoststr, "[%s:%hu:%u:", hostname, port, irank);
+    sprintf(old_hoststr, "[%s:%05hu:%u:", hostname, UnknownPort, irank);
+    sprintf(new_hoststr, "[%s:%05hu:%u:", hostname, port, irank);

     old_byte_array = sg->byte_array;
     new_byte_array = (char*) malloc( strlen(old_byte_array) + 10 );
--- mrnet-3093918/xplat/src/lightweight/SocketUtils.c	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/xplat/src/lightweight/SocketUtils.c	2016-03-16 12:29:34.006132303 -0700
@@ -15,7 +15,7 @@
 #else
 const XPlat_Socket InvalidSocket = INVALID_SOCKET;
 #endif
-const XPlat_Port InvalidPort = (XPlat_Port)-1;
+const XPlat_Port InvalidPort = (XPlat_Port)0;

 static bool_t SetTcpNoDelay( XPlat_Socket sock )
 {
--- mrnet-3093918/conf/configure.in	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/conf/configure.in	2016-03-16 12:45:54.573196781 -0700
@@ -107,6 +107,18 @@
 AC_SUBST(PURIFY)


+AC_ARG_WITH(expat,
+            [AS_HELP_STRING([--with-expat=PATH],
+                            [Absolute path to installation of EXPAT libraries (note: specify the path to the directory containing "include" and "lib" sub-directories)])],
+            [EXPAT_DIR="${withval}"],
+            [EXPAT_DIR=""])
+
+if test "x$EXPAT_DIR" = "x" ; then
+    EXPAT_LIB=""
+else
+    EXPAT_LIB="-L$EXPAT_DIR/lib"
+fi
+
 dnl === Checks for header files.
 AC_CHECK_HEADERS([assert.h errno.h fcntl.h limits.h netdb.h signal.h stddef.h stdlib.h stdio.h string.h unistd.h arpa/inet.h netinet/in.h sys/ioctl.h sys/socket.h sys/sockio.h sys/time.h])
 AC_HEADER_STDBOOL
@@ -432,7 +444,7 @@
     CRAYXT_ATH_LIBS_SO="$CRAYXT_ATH_LIBS -lalps"
     CRAYXT_ATH_LIBS="$CRAYXT_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc -Wl,-Bdynamic"
     CRAYXE_ATH_LIBS_SO="$CRAYXE_ATH_LIBS -lalps"
-    CRAYXE_ATH_LIBS="$CRAYXE_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc-epi -lexpat -Wl,-Bdynamic"
+    CRAYXE_ATH_LIBS="$CRAYXE_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc-epi $EXPAT_LIB -lexpat -Wl,-Bdynamic"

     AC_CHECK_LIB( [alps], [alps_launch_tool_helper],
                   [HAVE_ATH_LIBS="yes"; EXTRA_LIBS="$CRAYXT_ATH_LIBS $EXTRA_LIBS"; EXTRA_LIBS_SO="$CRAYXT_ATH_LIBS_SO $EXTRA_LIBS_SO"],
--- mrnet-3093918/configure	2015-12-10 09:32:24.000000000 -0800
+++ mrnet_top_of_tree/configure	2016-03-16 13:47:20.386439143 -0700
@@ -742,6 +742,7 @@
 enable_debug
 enable_ltwt_threadsafe
 with_purify
+with_expat
 '
 ac_precious_vars='build_alias
 host_alias
@@ -1399,6 +1400,9 @@
                           containing "include" and "lib" sub-directories)
   --with-launchmon=PATH   Absolute path to installation of LaunchMON
   --with-purify           Use purify for memory debugging
+  --with-expat=PATH       Absolute path to installation of EXPAT libraries
+                          (note: specify the path to the directory containing
+                          "include" and "lib" sub-directories)

 Some influential environment variables:
   CC          C compiler command
@@ -3541,6 +3545,21 @@



+# Check whether --with-expat was given.
+if test "${with_expat+set}" = set; then :
+  withval=$with_expat; EXPAT_DIR="${withval}"
+else
+  EXPAT_DIR=""
+fi
+
+
+if test "x$EXPAT_DIR" = "x" ; then
+    EXPAT_LIB=""
+else
+    EXPAT_LIB="-L$EXPAT_DIR/lib"
+fi
+
+
 ac_ext=cpp
 ac_cpp='$CXXCPP $CPPFLAGS'
 ac_compile='$CXX -c $CXXFLAGS $CPPFLAGS conftest.$ac_ext >&5'
@@ -5473,7 +5492,7 @@
     CRAYXT_ATH_LIBS_SO="$CRAYXT_ATH_LIBS -lalps"
     CRAYXT_ATH_LIBS="$CRAYXT_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc -Wl,-Bdynamic"
     CRAYXE_ATH_LIBS_SO="$CRAYXE_ATH_LIBS -lalps"
-    CRAYXE_ATH_LIBS="$CRAYXE_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc-epi -lexpat -Wl,-Bdynamic"
+    CRAYXE_ATH_LIBS="$CRAYXE_ATH_LIBS -Wl,-Bstatic -lalps -lxmlrpc-epi $EXPAT_LIB -lexpat -Wl,-Bdynamic"

     { $as_echo "$as_me:${as_lineno-$LINENO}: checking for alps_launch_tool_helper in -lalps" >&5
 $as_echo_n "checking for alps_launch_tool_helper in -lalps... " >&6; }
@@ -3,11 +3,17 @@
 class Mrnet(Package):
     """The MRNet Multi-Cast Reduction Network."""
     homepage = "http://paradyn.org/mrnet"
-    url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_4.0.0.tar.gz"
+    url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_5.0.1.tar.gz"
+    list_url = "http://ftp.cs.wisc.edu/paradyn/mrnet"

-    version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')
-    version('4.1.0', '5a248298b395b329e2371bf25366115c')
+    version('5.0.1', '17f65738cf1b9f9b95647ff85f69ecdd')
+    version('4.1.0', '5a248298b395b329e2371bf25366115c')
+    version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')

+    # Add a patch that brings mrnet-5.0.1 up to date with the current development tree
+    # The development tree contains fixes needed for the krell based tools
+    variant('krellpatch', default=False, description="Build MRNet with krell openspeedshop based patch.")
+    patch('krell-5.0.1.patch', when='@5.0.1+krellpatch')

     variant('lwthreads', default=False, description="Also build the MRNet LW threadsafe libraries")
     parallel = False
@@ -8,12 +8,9 @@ IORDERINGSF = $(ISCOTCH)
 IORDERINGSC = $(IMETIS) $(IPORD) $(ISCOTCH)

 PLAT    =
-LIBEXT  = .a
 OUTC    = -o
 OUTF    = -o
 RM      = /bin/rm -f
-AR      = ar vr
-RANLIB  = ranlib

 INCSEQ  = -I$(topdir)/libseq
 LIBSEQ  = -L$(topdir)/libseq -lmpiseq
@@ -1,6 +1,5 @@
 from spack import *
-import os
-
+import os, sys, glob

 class Mumps(Package):
     """MUMPS: a MUltifrontal Massively Parallel sparse direct Solver"""
@@ -19,11 +18,12 @@ class Mumps(Package):
     variant('float', default=True, description='Activate the compilation of smumps')
     variant('complex', default=True, description='Activate the compilation of cmumps and/or zmumps')
     variant('idx64', default=False, description='Use int64_t/integer*8 as default index type')
+    variant('shared', default=True, description='Build shared libraries')


     depends_on('scotch + esmumps', when='~ptscotch+scotch')
     depends_on('scotch + esmumps + mpi', when='+ptscotch')
-    depends_on('metis', when='+metis')
+    depends_on('metis@5:', when='+metis')
     depends_on('parmetis', when="+parmetis")
     depends_on('blas')
     depends_on('lapack')
@@ -70,6 +70,9 @@ def write_makefile_inc(self):

         makefile_conf.append("ORDERINGSF = %s" % (' '.join(orderings)))

+        # when building shared libs need -fPIC, otherwise
+        # /usr/bin/ld: graph.o: relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC
+        fpic = '-fPIC' if '+shared' in self.spec else ''
         # TODO: test this part, it needs a full blas, scalapack and
         # partitionning environment with 64bit integers
         if '+idx64' in self.spec:
@@ -77,14 +80,14 @@ def write_makefile_inc(self):
             # the fortran compilation flags most probably are
             # working only for intel and gnu compilers this is
             # perhaps something the compiler should provide
-                ['OPTF = -O -DALLOW_NON_INIT %s' % '-fdefault-integer-8' if self.compiler.name == "gcc" else '-i8',
-                 'OPTL = -O ',
-                 'OPTC = -O -DINTSIZE64'])
+                ['OPTF = %s -O -DALLOW_NON_INIT %s' % (fpic,'-fdefault-integer-8' if self.compiler.name == "gcc" else '-i8'),
+                 'OPTL = %s -O ' % fpic,
+                 'OPTC = %s -O -DINTSIZE64' % fpic])
         else:
             makefile_conf.extend(
-                ['OPTF = -O -DALLOW_NON_INIT',
-                 'OPTL = -O ',
-                 'OPTC = -O '])
+                ['OPTF = %s -O -DALLOW_NON_INIT' % fpic,
+                 'OPTL = %s -O ' % fpic,
+                 'OPTC = %s -O ' % fpic])
@@ -105,6 +108,27 @@ def write_makefile_inc(self):
         # compiler possible values are -DAdd_, -DAdd__ and/or -DUPPER
         makefile_conf.append("CDEFS = -DAdd_")

+        if '+shared' in self.spec:
+            if sys.platform == 'darwin':
+                # Building dylibs with mpif90 causes segfaults on 10.8 and 10.10. Use gfortran. (Homebrew)
+                makefile_conf.extend([
+                    'LIBEXT=.dylib',
+                    'AR=%s -dynamiclib -Wl,-install_name -Wl,%s/$(notdir $@) -undefined dynamic_lookup -o ' % (os.environ['FC'],prefix.lib),
+                    'RANLIB=echo'
+                ])
+            else:
+                makefile_conf.extend([
+                    'LIBEXT=.so',
+                    'AR=$(FL) -shared -Wl,-soname -Wl,%s/$(notdir $@) -o' % prefix.lib,
+                    'RANLIB=echo'
+                ])
+        else:
+            makefile_conf.extend([
+                'LIBEXT  = .a',
+                'AR = ar vr',
+                'RANLIB = ranlib'
+            ])

         makefile_inc_template = join_path(os.path.dirname(self.module.__file__),
                                           'Makefile.inc')
@@ -121,7 +145,7 @@ def write_makefile_inc(self):
     def install(self, spec, prefix):
         make_libs = []

-        # the coice to compile ?examples is to have kind of a sanity
+        # the choice to compile ?examples is to have kind of a sanity
         # check on the libraries generated.
         if '+float' in spec:
             make_libs.append('sexamples')
@@ -135,9 +159,27 @@ def install(self, spec, prefix):

         self.write_makefile_inc()

-        make(*make_libs)
+        # Build fails in parallel
+        make(*make_libs, parallel=False)

         install_tree('lib', prefix.lib)
         install_tree('include', prefix.include)
-        if '~mpi' in spec:
-            install('libseq/libmpiseq.a', prefix.lib)

+        if '~mpi' in spec:
+            lib_dsuffix = '.dylib' if sys.platform == 'darwin' else '.so'
+            lib_suffix = lib_dsuffix if '+shared' in spec else '.a'
+            install('libseq/libmpiseq%s' % lib_suffix, prefix.lib)
+            for f in glob.glob(join_path('libseq','*.h')):
+                install(f, prefix.include)
+
+        # FIXME: extend the tests to mpirun -np 2 (or alike) when build with MPI
+        # FIXME: use something like numdiff to compare blessed output with the current
+        with working_dir('examples'):
+            if '+float' in spec:
+                os.system('./ssimpletest < input_simpletest_real')
+                if '+complex' in spec:
+                    os.system('./csimpletest < input_simpletest_real')
+            if '+double' in spec:
+                os.system('./dsimpletest < input_simpletest_real')
+                if '+complex' in spec:
+                    os.system('./zsimpletest < input_simpletest_cmplx')
var/spack/repos/builtin/packages/muparser/package.py (new file, 18 lines)
@@ -0,0 +1,18 @@
from spack import *

class Muparser(Package):
    """C++ math expression parser library."""
    homepage = "http://muparser.beltoforion.de/"
    url = "https://github.com/beltoforion/muparser/archive/v2.2.5.tar.gz"

    version('2.2.5', '02dae671aa5ad955fdcbcd3fee313fb7')

    def install(self, spec, prefix):
        options = ['--disable-debug',
                   '--disable-dependency-tracking',
                   '--prefix=%s' % prefix]

        configure(*options)

        make(parallel=False)
        make("install")
@@ -140,6 +140,13 @@ def set_network_type(self, spec, configure_args):

         configure_args.extend(network_options)

+    def setup_dependent_environment(self, spack_env, run_env, extension_spec):
+        spack_env.set('MPICH_CC', spack_cc)
+        spack_env.set('MPICH_CXX', spack_cxx)
+        spack_env.set('MPICH_F77', spack_f77)
+        spack_env.set('MPICH_F90', spack_fc)
+        spack_env.set('MPICH_FC', spack_fc)
+
     def install(self, spec, prefix):
         # we'll set different configure flags depending on our environment
         configure_args = [
@@ -8,11 +8,10 @@ class Ncurses(Package):
     """

     homepage = "http://invisible-island.net/ncurses/ncurses.html"
+    url      = "http://ftp.gnu.org/pub/gnu/ncurses/ncurses-6.0.tar.gz"

-    version('5.9', '8cb9c412e5f2d96bc6f459aa8c6282a1',
-            url='http://ftp.gnu.org/pub/gnu/ncurses/ncurses-5.9.tar.gz')
-    version('6.0', 'ee13d052e1ead260d7c28071f46eefb1',
-            url='http://ftp.gnu.org/pub/gnu/ncurses/ncurses-6.0.tar.gz')
+    version('6.0', 'ee13d052e1ead260d7c28071f46eefb1')
+    version('5.9', '8cb9c412e5f2d96bc6f459aa8c6282a1')

     patch('patch_gcc_5.txt', when='%gcc@5.0:')
var/spack/repos/builtin/packages/netcdf-cxx/package.py (new file, 19 lines)
@@ -0,0 +1,19 @@
from spack import *

class NetcdfCxx(Package):
    """Deprecated C++ compatibility bindings for NetCDF.
    These do NOT read or write NetCDF-4 files, and are no longer
    maintained by Unidata. Developers should migrate to current
    NetCDF C++ bindings, in Spack package netcdf-cxx4."""

    homepage = "http://www.unidata.ucar.edu/software/netcdf"
    url = "http://www.unidata.ucar.edu/downloads/netcdf/ftp/netcdf-cxx-4.2.tar.gz"

    version('4.2', 'd32b20c00f144ae6565d9e98d9f6204c')

    depends_on('netcdf')

    def install(self, spec, prefix):
        configure('--prefix=%s' % prefix)
        make()
        make("install")
@@ -43,6 +43,13 @@ def install(self, spec, prefix):
             "--enable-dap"
         ]

+        # Make sure Netcdf links against Spack's curl
+        # Otherwise it may pick up system's curl, which could lead to link errors:
+        # /usr/lib/x86_64-linux-gnu/libcurl.so: undefined reference to `SSL_CTX_use_certificate_chain_file@OPENSSL_1.0.0'
+        LIBS.append("-lcurl")
+        CPPFLAGS.append("-I%s" % spec['curl'].prefix.include)
+        LDFLAGS.append ("-L%s" % spec['curl'].prefix.lib)
+
         if '+mpi' in spec:
             config_args.append('--enable-parallel4')
@@ -1,46 +0,0 @@
from spack import *
import os


class NetlibBlas(Package):
    """Netlib reference BLAS"""
    homepage = "http://www.netlib.org/lapack/"
    url = "http://www.netlib.org/lapack/lapack-3.5.0.tgz"

    version('3.5.0', 'b1d3e3e425b2e44a06760ff173104bdf')

    variant('fpic', default=False, description="Build with -fpic compiler option")

    # virtual dependency
    provides('blas')

    # Doesn't always build correctly in parallel
    parallel = False

    def patch(self):
        os.symlink('make.inc.example', 'make.inc')

        mf = FileFilter('make.inc')
        mf.filter('^FORTRAN.*', 'FORTRAN = f90')
        mf.filter('^LOADER.*', 'LOADER = f90')
        mf.filter('^CC =.*', 'CC = cc')

        if '+fpic' in self.spec:
            mf.filter('^OPTS.*=.*', 'OPTS = -O2 -frecursive -fpic')
            mf.filter('^CFLAGS =.*', 'CFLAGS = -O3 -fpic')


    def install(self, spec, prefix):
        make('blaslib')

        # Tests that blas builds correctly
        make('blas_testing')

        # No install provided
        mkdirp(prefix.lib)
        install('librefblas.a', prefix.lib)

        # Blas virtual package should provide blas.a and libblas.a
        with working_dir(prefix.lib):
            symlink('librefblas.a', 'blas.a')
            symlink('librefblas.a', 'libblas.a')
@@ -1,16 +1,15 @@
 from spack import *


 class NetlibLapack(Package):
     """
-    LAPACK version 3.X is a comprehensive FORTRAN library that does
-    linear algebra operations including matrix inversions, least
-    squared solutions to linear sets of equations, eigenvector
-    analysis, singular value decomposition, etc. It is a very
-    comprehensive and reputable package that has found extensive
-    use in the scientific community.
+    LAPACK version 3.X is a comprehensive FORTRAN library that does linear algebra operations including matrix
+    inversions, least squared solutions to linear sets of equations, eigenvector analysis, singular value
+    decomposition, etc. It is a very comprehensive and reputable package that has found extensive use in the
+    scientific community.
     """
     homepage = "http://www.netlib.org/lapack/"
-    url = "http://www.netlib.org/lapack/lapack-3.5.0.tgz"
+    url = "http://www.netlib.org/lapack/lapack-3.5.0.tgz"

     version('3.6.0', 'f2f6c67134e851fe189bb3ca1fbb5101')
     version('3.5.0', 'b1d3e3e425b2e44a06760ff173104bdf')
@@ -19,41 +18,67 @@ class NetlibLapack(Package):
     version('3.4.0', '02d5706ec03ba885fc246e5fa10d8c70')
     version('3.3.1', 'd0d533ec9a5b74933c2a1e84eedc58b4')

-    variant('shared', default=False, description="Build shared library version")
-    variant('fpic', default=False, description="Build with -fpic compiler option")
+    variant('debug', default=False, description='Activates the Debug build type')
+    variant('shared', default=True, description="Build shared library version")
+    variant('external-blas', default=False, description='Build lapack with an external blas')
+
+    variant('lapacke', default=True, description='Activates the build of the LAPACKE C interface')

     # virtual dependency
+    provides('blas', when='~external-blas')
     provides('lapack')

-    # blas is a virtual dependency.
-    depends_on('blas')
+    depends_on('cmake')
+    depends_on('blas', when='+external-blas')

     # Doesn't always build correctly in parallel
     parallel = False

-    @when('^netlib-blas')
-    def get_blas_libs(self):
-        blas = self.spec['netlib-blas']
-        return [join_path(blas.prefix.lib, 'blas.a')]
+    def patch(self):
+        # Fix cblas CMakeLists.txt -- has wrong case for subdirectory name.
+        if self.spec.satisfies('@3.6.0:'):
+            filter_file('${CMAKE_CURRENT_SOURCE_DIR}/CMAKE/',
+                        '${CMAKE_CURRENT_SOURCE_DIR}/cmake/', 'CBLAS/CMakeLists.txt', string=True)
+
+    def install_one(self, spec, prefix, shared):
+        cmake_args = ['-DBUILD_SHARED_LIBS:BOOL=%s' % ('ON' if shared else 'OFF'),
+                      '-DCMAKE_BUILD_TYPE:STRING=%s' % ('Debug' if '+debug' in spec else 'Release'),
+                      '-DLAPACKE:BOOL=%s' % ('ON' if '+lapacke' in spec else 'OFF')]
+        if spec.satisfies('@3.6.0:'):
+            cmake_args.extend(['-DCBLAS=ON'])  # always build CBLAS
+
+        if '+external-blas' in spec:
+            # TODO : the mechanism to specify the library should be more general,
+            # TODO : but this allows to have an hook to an external blas
+            cmake_args.extend([
+                '-DUSE_OPTIMIZED_BLAS:BOOL=ON',
+                '-DBLAS_LIBRARIES:PATH=%s' % join_path(spec['blas'].prefix.lib, 'libblas.a')
+            ])
+
+        cmake_args.extend(std_cmake_args)
+
+        build_dir = 'spack-build' + ('-shared' if shared else '-static')
+        with working_dir(build_dir, create=True):
+            cmake('..', *cmake_args)
+            make()
+            make("install")

-    @when('^atlas')
-    def get_blas_libs(self):
-        blas = self.spec['atlas']
-        return [join_path(blas.prefix.lib, l)
-                for l in ('libf77blas.a', 'libatlas.a')]

     def install(self, spec, prefix):
-        blas_libs = ";".join(self.get_blas_libs())
-        cmake_args = [".", '-DBLAS_LIBRARIES=' + blas_libs]
+        # Always build static libraries.
+        self.install_one(spec, prefix, False)

-        if '+shared' in spec:
-            cmake_args.append('-DBUILD_SHARED_LIBS=ON')
-        if '+fpic' in spec:
-            cmake_args.append('-DCMAKE_POSITION_INDEPENDENT_CODE=ON')
+        # Build shared libraries if requested.
+        if '+shared' in spec:
+            self.install_one(spec, prefix, True)

-        cmake_args += std_cmake_args
-
-        cmake(*cmake_args)
-        make()
-        make("install")
+    def setup_dependent_package(self, module, dspec):
+        # This is WIP for a prototype interface for virtual packages.
+        # We can update this as more builds start depending on BLAS/LAPACK.
+        libdir = find_library_path('libblas.a', self.prefix.lib64, self.prefix.lib)
+
+        self.spec.blas_static_lib = join_path(libdir, 'libblas.a')
+        self.spec.lapack_static_lib = join_path(libdir, 'liblapack.a')
+
+        if '+shared' in self.spec:
+            self.spec.blas_shared_lib = join_path(libdir, 'libblas.%s' % dso_suffix)
+            self.spec.lapack_shared_lib = join_path(libdir, 'liblapack.%s' % dso_suffix)
@@ -18,6 +18,7 @@ class NetlibScalapack(Package):

     provides('scalapack')

+    depends_on('cmake')
     depends_on('mpi')
     depends_on('lapack')

@@ -41,6 +42,11 @@ def install(self, spec, prefix):
         make()
         make("install")

+        # The shared libraries are not installed correctly on Darwin; correct this
+        if (sys.platform == 'darwin') and ('+shared' in spec):
+            fix_darwin_install_name(prefix.lib)
+

     def setup_dependent_package(self, module, dependent_spec):
         spec = self.spec
         lib_dsuffix = '.dylib' if sys.platform == 'darwin' else '.so'
@@ -16,7 +16,7 @@ def install(self, spec, prefix):

         cp = which('cp')

-        bindir = os.path.join(prefix, 'bin')
+        bindir = os.path.join(prefix, 'bin/')
         mkdir(bindir)
-        cp('-a', '-t', bindir, 'ninja')
-        cp('-ra', 'misc', prefix)
+        cp('-a', 'ninja', bindir)
+        cp('-a', 'misc', prefix)
var/spack/repos/builtin/packages/numdiff/package.py (new file, 44 lines)
@@ -0,0 +1,44 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
import sys

class Numdiff(Package):
    """Numdiff is a little program that can be used to compare putatively
    similar files line by line and field by field, ignoring small numeric
    differences or/and different numeric formats."""

    homepage = 'https://www.nongnu.org/numdiff'
    url = 'http://nongnu.askapache.com/numdiff/numdiff-5.8.1.tar.gz'

    version('5.8.1', 'a295eb391f6cb1578209fc6b4f9d994e')

    depends_on('gettext', sys.platform=='darwin')

    def install(self, spec, prefix):
        options = ['--prefix=%s' % prefix]
        configure(*options)
        make()
        make('install')
482
var/spack/repos/builtin/packages/oce/null.patch
Normal file
482
var/spack/repos/builtin/packages/oce/null.patch
Normal file
|
@ -0,0 +1,482 @@
|
|||
From 61cb965b9ffeca419005bc15e635e67589c421dd Mon Sep 17 00:00:00 2001
|
||||
From: Martin Siggel <martin.siggel@dlr.de>
|
||||
Date: Thu, 28 Jan 2016 19:05:00 +0100
|
||||
Subject: [PATCH] Workaround clang optimizations for null references
|
||||
|
||||
OCCT/OCE includes some evil code that uses NULL references,
|
||||
which are normally not possible. Clang removes code in
|
||||
branches like if(&myNullRef==NULL) as it assumes this can
|
||||
never be true. This fix was inspired from the mantis issue
|
||||
http://tracker.dev.opencascade.org/view.php?id=26042. This
|
||||
code will be fixed in OCCT 7, but we might require the fix
|
||||
for earlier releases as well.
|
||||
|
||||
Fixes issue #576
|
||||
---
|
||||
inc/PLib.hxx | 2 +-
|
||||
src/BSplCLib/BSplCLib.cxx | 16 ++++++-------
|
||||
src/BSplCLib/BSplCLib_2.cxx | 6 ++---
|
||||
src/BSplCLib/BSplCLib_CurveComputation.gxx | 26 ++++++++++-----------
|
||||
src/BSplSLib/BSplSLib.cxx | 36 +++++++++++++++---------------
|
||||
src/BSplSLib/BSplSLib_BzSyntaxes.cxx | 2 +-
|
||||
src/PLib/PLib.cxx | 10 ++++-----
|
||||
7 files changed, 49 insertions(+), 49 deletions(-)
|
||||
|
||||
diff --git a/inc/PLib.hxx b/inc/PLib.hxx
|
||||
index 7513234..52b1f84 100644
|
||||
--- a/inc/PLib.hxx
|
||||
+++ b/inc/PLib.hxx
|
||||
@@ -343,6 +343,6 @@ friend class PLib_DoubleJacobiPolynomial;
|
||||
|
||||
|
||||
|
||||
-
|
||||
+#define IS_NULL_REF(ref) ((reinterpret_cast<size_t>(&ref) & 0xFFFFFF) == 0)
|
||||
|
||||
#endif // _PLib_HeaderFile
|
||||
diff --git a/src/BSplCLib/BSplCLib.cxx b/src/BSplCLib/BSplCLib.cxx
|
||||
index 683e4ab..2a2d9ea 100644
|
||||
--- a/src/BSplCLib/BSplCLib.cxx
|
||||
+++ b/src/BSplCLib/BSplCLib.cxx
|
||||
@@ -298,7 +298,7 @@ void BSplCLib::LocateParameter
|
||||
Standard_Real& NewU)
|
||||
{
|
||||
Standard_Integer first,last;
|
||||
- if (&Mults) {
|
||||
+ if (!IS_NULL_REF(Mults)) {
|
||||
if (Periodic) {
|
||||
first = Knots.Lower();
|
||||
last = Knots.Upper();
|
||||
@@ -1434,7 +1434,7 @@ void BSplCLib::BuildKnots(const Standard_Integer Degree,
|
||||
const Standard_Real * pkn = &Knots(KLower);
|
||||
pkn -= KLower;
|
||||
Standard_Real *knot = &LK;
|
||||
- if (&Mults == NULL) {
|
||||
+ if (IS_NULL_REF(Mults)) {
|
||||
switch (Degree) {
|
||||
case 1 : {
|
||||
Standard_Integer j = Index ;
|
||||
@@ -1672,7 +1672,7 @@ Standard_Boolean BSplCLib::PrepareInsertKnots
|
||||
const Standard_Real Tolerance,
|
||||
const Standard_Boolean Add)
|
||||
{
|
||||
- Standard_Boolean addflat = &AddMults == NULL;
|
||||
+ Standard_Boolean addflat = IS_NULL_REF(AddMults);
|
||||
|
||||
Standard_Integer first,last;
|
||||
if (Periodic) {
|
||||
@@ -1856,7 +1856,7 @@ void BSplCLib::InsertKnots
|
||||
const Standard_Real Tolerance,
|
||||
const Standard_Boolean Add)
|
||||
{
|
||||
- Standard_Boolean addflat = &AddMults == NULL;
|
||||
+ Standard_Boolean addflat = IS_NULL_REF(AddMults);
|
||||
|
||||
Standard_Integer i,k,mult,firstmult;
|
||||
Standard_Integer index,kn,curnk,curk;
|
||||
@@ -3902,7 +3902,7 @@ void BSplCLib::Resolution( Standard_Real& Poles,
|
||||
num_poles = FlatKnots.Length() - Deg1;
|
||||
switch (ArrayDimension) {
|
||||
case 2 : {
|
||||
- if (&Weights != NULL) {
|
||||
+ if (!IS_NULL_REF(Weights)) {
|
||||
const Standard_Real * WG = &Weights(Weights.Lower());
|
||||
min_weights = WG[0];
|
||||
|
||||
@@ -3970,7 +3970,7 @@ void BSplCLib::Resolution( Standard_Real& Poles,
|
||||
break;
|
||||
}
|
||||
case 3 : {
|
||||
- if (&Weights != NULL) {
|
||||
+ if (!IS_NULL_REF(Weights)) {
|
||||
const Standard_Real * WG = &Weights(Weights.Lower());
|
||||
min_weights = WG[0];
|
||||
|
||||
@@ -4047,7 +4047,7 @@ void BSplCLib::Resolution( Standard_Real& Poles,
|
||||
break;
|
||||
}
|
||||
case 4 : {
|
||||
- if (&Weights != NULL) {
|
||||
+ if (!IS_NULL_REF(Weights)) {
|
||||
const Standard_Real * WG = &Weights(Weights.Lower());
|
||||
min_weights = WG[0];
|
||||
|
||||
@@ -4134,7 +4134,7 @@ void BSplCLib::Resolution( Standard_Real& Poles,
|
||||
}
|
||||
default : {
|
||||
Standard_Integer kk;
|
||||
- if (&Weights != NULL) {
|
||||
+ if (!IS_NULL_REF(Weights)) {
|
||||
const Standard_Real * WG = &Weights(Weights.Lower());
|
||||
min_weights = WG[0];
|
||||
|
||||
diff --git a/src/BSplCLib/BSplCLib_2.cxx b/src/BSplCLib/BSplCLib_2.cxx
|
||||
index 35c4639..653b7cd 100644
|
||||
--- a/src/BSplCLib/BSplCLib_2.cxx
|
||||
+++ b/src/BSplCLib/BSplCLib_2.cxx
|
||||
@@ -70,7 +70,7 @@ void BSplCLib::BuildEval(const Standard_Integer Degree,
|
||||
Standard_Integer i;
|
||||
Standard_Integer ip = PLower + Index - 1;
|
||||
Standard_Real w, *pole = &LP;
|
||||
- if (&Weights == NULL) {
|
||||
+ if (IS_NULL_REF(Weights)) {
|
||||
|
||||
for (i = 0; i <= Degree; i++) {
|
||||
ip++;
|
||||
@@ -115,13 +115,13 @@ static void PrepareEval
|
||||
|
||||
// make the knots
|
||||
BSplCLib::BuildKnots(Degree,index,Periodic,Knots,Mults,*dc.knots);
|
||||
- if (&Mults == NULL)
|
||||
+ if (IS_NULL_REF(Mults))
|
||||
index -= Knots.Lower() + Degree;
|
||||
else
|
||||
index = BSplCLib::PoleIndex(Degree,index,Periodic,Mults);
|
||||
|
||||
// check truly rational
|
||||
- rational = (&Weights != NULL);
|
||||
+ rational = (!IS_NULL_REF(Weights));
|
||||
if (rational) {
|
||||
Standard_Integer WLower = Weights.Lower() + index;
|
||||
rational = BSplCLib::IsRational(Weights, WLower, WLower + Degree);
|
||||
diff --git a/src/BSplCLib/BSplCLib_CurveComputation.gxx b/src/BSplCLib/BSplCLib_CurveComputation.gxx
|
||||
index e71b4e0..9d42643 100644
|
||||
--- a/src/BSplCLib/BSplCLib_CurveComputation.gxx
|
||||
+++ b/src/BSplCLib/BSplCLib_CurveComputation.gxx
|
||||
@@ -92,7 +92,7 @@ Standard_Boolean BSplCLib::RemoveKnot
|
||||
TColStd_Array1OfInteger& NewMults,
|
||||
const Standard_Real Tolerance)
|
||||
{
|
||||
- Standard_Boolean rational = &Weights != NULL;
|
||||
+ Standard_Boolean rational = !IS_NULL_REF(Weights);
|
||||
Standard_Integer dim;
|
||||
dim = Dimension_gen;
|
||||
if (rational) dim++;
|
||||
@@ -133,7 +133,7 @@ void BSplCLib::InsertKnots
|
||||
const Standard_Real Epsilon,
|
||||
const Standard_Boolean Add)
|
||||
{
|
||||
- Standard_Boolean rational = &Weights != NULL;
|
||||
+ Standard_Boolean rational = !IS_NULL_REF(Weights);
|
||||
Standard_Integer dim;
|
||||
dim = Dimension_gen;
|
||||
if (rational) dim++;
|
||||
@@ -222,7 +222,7 @@ void BSplCLib::IncreaseDegree
|
||||
TColStd_Array1OfReal& NewKnots,
|
||||
TColStd_Array1OfInteger& NewMults)
|
||||
{
|
||||
- Standard_Boolean rational = &Weights != NULL;
|
||||
+ Standard_Boolean rational = !IS_NULL_REF(Weights);
|
||||
Standard_Integer dim;
|
||||
dim = Dimension_gen;
|
||||
if (rational) dim++;
|
||||
@@ -256,7 +256,7 @@ void BSplCLib::Unperiodize
|
||||
Array1OfPoints& NewPoles,
|
||||
TColStd_Array1OfReal& NewWeights)
|
||||
{
|
||||
- Standard_Boolean rational = &Weights != NULL;
|
||||
+ Standard_Boolean rational = !IS_NULL_REF(Weights);
|
||||
Standard_Integer dim;
|
||||
dim = Dimension_gen;
|
||||
if (rational) dim++;
|
||||
@@ -292,7 +292,7 @@ void BSplCLib::Trimming(const Standard_Integer Degree,
|
||||
Array1OfPoints& NewPoles,
|
||||
TColStd_Array1OfReal& NewWeights)
|
||||
{
|
||||
- Standard_Boolean rational = &Weights != NULL;
|
||||
+ Standard_Boolean rational = !IS_NULL_REF(Weights);
|
||||
Standard_Integer dim;
|
||||
dim = Dimension_gen;
|
||||
if (rational) dim++;
|
||||
@@ -339,7 +339,7 @@ void BSplCLib::BuildEval(const Standard_Integer Degree,
|
||||
Standard_Integer PUpper = Poles.Upper();
|
||||
Standard_Integer i;
|
||||
Standard_Integer ip = PLower + Index - 1;
|
||||
- if (&Weights == NULL) {
|
||||
+ if (IS_NULL_REF(Weights)) {
|
||||
for (i = 0; i <= Degree; i++) {
|
||||
ip++;
|
||||
if (ip > PUpper) ip = PLower;
|
||||
@@ -384,13 +384,13 @@ static void PrepareEval
|
||||
|
||||
// make the knots
|
||||
BSplCLib::BuildKnots(Degree,index,Periodic,Knots,Mults,*dc.knots);
|
||||
- if (&Mults == NULL)
|
||||
+ if (IS_NULL_REF(Mults))
|
||||
index -= Knots.Lower() + Degree;
|
||||
else
|
||||
index = BSplCLib::PoleIndex(Degree,index,Periodic,Mults);
|
||||
|
||||
// check truly rational
|
||||
- rational = (&Weights != NULL);
|
||||
+ rational = (!IS_NULL_REF(Weights));
|
||||
if (rational) {
|
||||
Standard_Integer WLower = Weights.Lower() + index;
|
||||
rational = BSplCLib::IsRational(Weights, WLower, WLower + Degree);
|
||||
@@ -741,7 +741,7 @@ void BSplCLib::CacheD0(const Standard_Real Parameter,
|
||||
Degree * Dimension_gen,
|
||||
PArray[0],
|
||||
myPoint[0]) ;
|
||||
- if (&WeightsArray != NULL) {
|
||||
+ if (!IS_NULL_REF(WeightsArray)) {
|
||||
Standard_Real *
|
||||
WArray = (Standard_Real *) &WeightsArray(WeightsArray.Lower()) ;
|
||||
PLib::NoDerivativeEvalPolynomial(NewParameter,
|
||||
@@ -798,7 +798,7 @@ void BSplCLib::CacheD1(const Standard_Real Parameter,
|
||||
|
||||
ModifyCoords (LocalPDerivatives + Dimension_gen, /= SpanLenght);
|
||||
|
||||
- if (&WeightsArray != NULL) {
|
||||
+ if (!IS_NULL_REF(WeightsArray)) {
|
||||
Standard_Real *
|
||||
WArray = (Standard_Real *) &WeightsArray(WeightsArray.Lower()) ;
|
||||
PLib::EvalPolynomial(NewParameter,
|
||||
@@ -878,7 +878,7 @@ void BSplCLib::CacheD2(const Standard_Real Parameter,
|
||||
Index += Dimension_gen;
|
||||
}
|
||||
|
||||
- if (&WeightsArray != NULL) {
|
||||
+ if (!IS_NULL_REF(WeightsArray)) {
|
||||
Standard_Real *
|
||||
WArray = (Standard_Real *) &WeightsArray(WeightsArray.Lower()) ;
|
||||
|
||||
@@ -971,7 +971,7 @@ void BSplCLib::CacheD3(const Standard_Real Parameter,
|
||||
Index += Dimension_gen;
|
||||
}
|
||||
|
||||
- if (&WeightsArray != NULL) {
|
||||
+ if (!IS_NULL_REF(WeightsArray)) {
|
||||
Standard_Real *
|
||||
WArray = (Standard_Real *) &WeightsArray(WeightsArray.Lower()) ;
|
||||
|
||||
@@ -1081,7 +1081,7 @@ void BSplCLib::BuildCache
|
||||
LocalValue *= SpanDomain / (Standard_Real) ii ;
|
||||
}
|
||||
|
||||
- if (&Weights != NULL) {
|
||||
+ if (!IS_NULL_REF(Weights)) {
|
||||
for (ii = 1 ; ii <= Degree + 1 ; ii++)
|
||||
CacheWeights(ii) = 0.0e0 ;
|
||||
CacheWeights(1) = 1.0e0 ;
|
||||
diff --git a/src/BSplSLib/BSplSLib.cxx b/src/BSplSLib/BSplSLib.cxx
index 5ad633c..07040d5 100644
--- a/src/BSplSLib/BSplSLib.cxx
+++ b/src/BSplSLib/BSplSLib.cxx
@@ -309,12 +309,12 @@ static Standard_Boolean PrepareEval (const Standard_Real U,
       BSplCLib::BuildKnots(UDegree,uindex,UPer,UKnots,UMults,*dc.knots1);
       BSplCLib::BuildKnots(VDegree,vindex,VPer,VKnots,VMults,*dc.knots2);

-      if (&UMults == NULL)
+      if (IS_NULL_REF(UMults))
        uindex -= UKLower + UDegree;
       else
        uindex = BSplCLib::PoleIndex(UDegree,uindex,UPer,UMults);

-      if (&VMults == NULL)
+      if (IS_NULL_REF(VMults))
        vindex -= VKLower + VDegree;
       else
        vindex = BSplCLib::PoleIndex(VDegree,vindex,VPer,VMults);
@@ -460,12 +460,12 @@ static Standard_Boolean PrepareEval (const Standard_Real U,
       BSplCLib::BuildKnots(UDegree,uindex,UPer,UKnots,UMults,*dc.knots2);
       BSplCLib::BuildKnots(VDegree,vindex,VPer,VKnots,VMults,*dc.knots1);

-      if (&UMults == NULL)
+      if (IS_NULL_REF(UMults))
        uindex -= UKLower + UDegree;
       else
        uindex = BSplCLib::PoleIndex(UDegree,uindex,UPer,UMults);

-      if (&VMults == NULL)
+      if (IS_NULL_REF(VMults))
        vindex -= VKLower + VDegree;
       else
        vindex = BSplCLib::PoleIndex(VDegree,vindex,VPer,VMults);
@@ -1299,7 +1299,7 @@ void BSplSLib::Iso(const Standard_Real Param,
 {
   Standard_Integer index = 0;
   Standard_Real u = Param;
-  Standard_Boolean rational = &Weights != NULL;
+  Standard_Boolean rational = !IS_NULL_REF(Weights);
   Standard_Integer dim = rational ? 4 : 3;

   // compute local knots
@@ -1307,7 +1307,7 @@ void BSplSLib::Iso(const Standard_Real Param,
   NCollection_LocalArray<Standard_Real> locknots1 (2*Degree);
   BSplCLib::LocateParameter(Degree,Knots,Mults,u,Periodic,index,u);
   BSplCLib::BuildKnots(Degree,index,Periodic,Knots,Mults,*locknots1);
-  if (&Mults == NULL)
+  if (IS_NULL_REF(Mults))
     index -= Knots.Lower() + Degree;
   else
     index = BSplCLib::PoleIndex(Degree,index,Periodic,Mults);
@@ -1381,7 +1381,7 @@ void BSplSLib::Iso(const Standard_Real Param,
   }

   // if the input is not rational but weights are wanted
-  if (!rational && (&CWeights != NULL)) {
+  if (!rational && (!IS_NULL_REF(CWeights))) {

     for (i = CWeights.Lower(); i <= CWeights.Upper(); i++)
       CWeights(i) = 1.;
@@ -1741,7 +1741,7 @@ void BSplSLib::InsertKnots(const Standard_Boolean UDirection,
                            const Standard_Real Epsilon,
                            const Standard_Boolean Add )
 {
-  Standard_Boolean rational = &Weights != NULL;
+  Standard_Boolean rational = !IS_NULL_REF(Weights);
   Standard_Integer dim = 3;
   if (rational) dim++;

@@ -1787,7 +1787,7 @@ Standard_Boolean BSplSLib::RemoveKnot
                            TColStd_Array1OfInteger& NewMults,
                            const Standard_Real Tolerance)
 {
-  Standard_Boolean rational = &Weights != NULL;
+  Standard_Boolean rational = !IS_NULL_REF(Weights);
   Standard_Integer dim = 3;
   if (rational) dim++;

@@ -1834,7 +1834,7 @@ void BSplSLib::IncreaseDegree
                            TColStd_Array1OfReal& NewKnots,
                            TColStd_Array1OfInteger& NewMults)
 {
-  Standard_Boolean rational = &Weights != NULL;
+  Standard_Boolean rational = !IS_NULL_REF(Weights);
   Standard_Integer dim = 3;
   if (rational) dim++;

@@ -1876,7 +1876,7 @@ void BSplSLib::Unperiodize
                            TColgp_Array2OfPnt& NewPoles,
                            TColStd_Array2OfReal& NewWeights)
 {
-  Standard_Boolean rational = &Weights != NULL;
+  Standard_Boolean rational = !IS_NULL_REF(Weights);
   Standard_Integer dim = 3;
   if (rational) dim++;

@@ -1929,7 +1929,7 @@ void BSplSLib::BuildCache
   Standard_Boolean rational,rational_u,rational_v,flag_u_or_v;
   Standard_Integer kk,d1,d1p1,d2,d2p1,ii,jj,iii,jjj,Index;
   Standard_Real u1,min_degree_domain,max_degree_domain,f,factor[2],u2;
-  if (&Weights != NULL)
+  if (!IS_NULL_REF(Weights))
     rational_u = rational_v = Standard_True;
   else
     rational_u = rational_v = Standard_False;
@@ -2025,7 +2025,7 @@ void BSplSLib::BuildCache
     }
     factor[0] *= max_degree_domain / (Standard_Real) (iii) ;
   }
-  if (&Weights != NULL) {
+  if (!IS_NULL_REF(Weights)) {
     //
     // means that PrepareEval did found out that the surface was
     // locally polynomial but since the surface is constructed
@@ -2110,7 +2110,7 @@ void BSplSLib::CacheD0(const Standard_Real UParameter,
                        (min_degree << 1) + min_degree,
                        locpoles[0],
                        myPoint[0]) ;
-  if (&WeightsArray != NULL) {
+  if (!IS_NULL_REF(WeightsArray)) {
    dimension = min_degree + 1 ;
    Standard_Real *
      WArray = (Standard_Real *)
@@ -2190,7 +2190,7 @@ void BSplSLib::CacheD1(const Standard_Real UParameter,
  // the coefficients
  //
  //
-  if (&WeightsArray != NULL) {
+  if (!IS_NULL_REF(WeightsArray)) {

    local_poles_array [0][0][0] = 0.0e0 ;
    local_poles_array [0][0][1] = 0.0e0 ;
@@ -2275,7 +2275,7 @@ void BSplSLib::CacheD1(const Standard_Real UParameter,
                    locpoles[dimension],
                    local_poles_array[1][0][0]) ;

-  if (&WeightsArray != NULL) {
+  if (!IS_NULL_REF(WeightsArray)) {
    dimension = min_degree + 1 ;
    Standard_Real *
      WArray = (Standard_Real *)
@@ -2435,7 +2435,7 @@ void BSplSLib::CacheD2(const Standard_Real UParameter,
  // the coefficients
  //
  //
-  if (&WeightsArray != NULL) {
+  if (!IS_NULL_REF(WeightsArray)) {

    local_poles_and_weights_array[0][0][0] = 0.0e0 ;
    local_poles_and_weights_array[0][0][1] = 0.0e0 ;
@@ -2564,7 +2564,7 @@ void BSplSLib::CacheD2(const Standard_Real UParameter,
                    locpoles[dimension + dimension],
                    local_poles_array[2][0][0]) ;

-  if (&WeightsArray != NULL) {
+  if (!IS_NULL_REF(WeightsArray)) {
    dimension = min_degree + 1 ;
    Standard_Real *
      WArray = (Standard_Real *)
diff --git a/src/BSplSLib/BSplSLib_BzSyntaxes.cxx b/src/BSplSLib/BSplSLib_BzSyntaxes.cxx
index 0faf6b6..f2c0f74 100644
--- a/src/BSplSLib/BSplSLib_BzSyntaxes.cxx
+++ b/src/BSplSLib/BSplSLib_BzSyntaxes.cxx
@@ -68,7 +68,7 @@ void BSplSLib::PolesCoefficients (const TColgp_Array2OfPnt& Poles,
                                   biduflatknots,bidvflatknots,
                                   Poles,Weights,
                                   CPoles,CWeights);
-  if (&Weights == NULL) {
+  if (IS_NULL_REF(Weights)) {

     for (ii = 1; ii <= uclas; ii++) {

diff --git a/src/PLib/PLib.cxx b/src/PLib/PLib.cxx
index 23fa302..7ee231f 100644
--- a/src/PLib/PLib.cxx
+++ b/src/PLib/PLib.cxx
@@ -2427,7 +2427,7 @@ void PLib::CoefficientsPoles (const Standard_Integer dim,
                               TColStd_Array1OfReal& Poles,
                               TColStd_Array1OfReal& Weights)
 {
-  Standard_Boolean rat = &WCoefs != NULL;
+  Standard_Boolean rat = !IS_NULL_REF(WCoefs);
   Standard_Integer loc = Coefs.Lower();
   Standard_Integer lop = Poles.Lower();
   Standard_Integer lowc=0;
@@ -2550,7 +2550,7 @@ void PLib::Trimming(const Standard_Real U1,
   Standard_Integer indc, indw=0;
   Standard_Integer upc = Coefs.Upper() - dim + 1, upw=0;
   Standard_Integer len = Coefs.Length()/dim;
-  Standard_Boolean rat = &WCoefs != NULL;
+  Standard_Boolean rat = !IS_NULL_REF(WCoefs);

   if (rat) {
     if(len != WCoefs.Length())
@@ -2607,7 +2607,7 @@ void PLib::CoefficientsPoles (const TColgp_Array2OfPnt& Coefs,
                               TColgp_Array2OfPnt& Poles,
                               TColStd_Array2OfReal& Weights)
 {
-  Standard_Boolean rat = (&WCoefs != NULL);
+  Standard_Boolean rat = (!IS_NULL_REF(WCoefs));
   Standard_Integer LowerRow = Poles.LowerRow();
   Standard_Integer UpperRow = Poles.UpperRow();
   Standard_Integer LowerCol = Poles.LowerCol();
@@ -2701,7 +2701,7 @@ void PLib::UTrimming(const Standard_Real U1,
                      TColgp_Array2OfPnt& Coeffs,
                      TColStd_Array2OfReal& WCoeffs)
 {
-  Standard_Boolean rat = &WCoeffs != NULL;
+  Standard_Boolean rat = !IS_NULL_REF(WCoeffs);
   Standard_Integer lr = Coeffs.LowerRow();
   Standard_Integer ur = Coeffs.UpperRow();
   Standard_Integer lc = Coeffs.LowerCol();
@@ -2735,7 +2735,7 @@ void PLib::VTrimming(const Standard_Real V1,
                      TColgp_Array2OfPnt& Coeffs,
                      TColStd_Array2OfReal& WCoeffs)
 {
-  Standard_Boolean rat = &WCoeffs != NULL;
+  Standard_Boolean rat = !IS_NULL_REF(WCoeffs);
   Standard_Integer lr = Coeffs.LowerRow();
   Standard_Integer ur = Coeffs.UpperRow();
   Standard_Integer lc = Coeffs.LowerCol();
var/spack/repos/builtin/packages/oce/package.py (new file, 67 lines)
@@ -0,0 +1,67 @@
from spack import *
import platform, sys

class Oce(Package):
    """
    Open CASCADE Community Edition:
    patches/improvements/experiments contributed by users over the official Open CASCADE library.
    """
    homepage = "https://github.com/tpaviot/oce"
    url      = "https://github.com/tpaviot/oce/archive/OCE-0.17.tar.gz"

    version('0.17.1', '36c67b87093c675698b483454258af91')
    version('0.17',   'f1a89395c4b0d199bea3db62b85f818d')
    version('0.16.1', '4d591b240c9293e879f50d86a0cb2bb3')
    version('0.16',   '7a4b4df5a104d75a537e25e7dd387eca')

    variant('tbb', default=True, description='Build with Intel Threading Building Blocks')

    depends_on('cmake@2.8:')
    depends_on('tbb', when='+tbb')

    # There is a bug in OCE which appears with Clang (version?) or GCC 6.0
    # and has to do with compiler optimization, see
    # https://github.com/tpaviot/oce/issues/576
    # http://tracker.dev.opencascade.org/view.php?id=26042
    # https://github.com/tpaviot/oce/issues/605
    # https://github.com/tpaviot/oce/commit/61cb965b9ffeca419005bc15e635e67589c421dd.patch
    patch('null.patch', when='@0.16:0.17.1')

    def install(self, spec, prefix):
        options = []
        options.extend(std_cmake_args)
        options.extend([
            '-DOCE_INSTALL_PREFIX=%s' % prefix,
            '-DOCE_BUILD_SHARED_LIB:BOOL=ON',
            '-DCMAKE_BUILD_TYPE:STRING=Release',
            '-DOCE_DATAEXCHANGE:BOOL=ON',
            '-DOCE_DISABLE_X11:BOOL=ON',
            '-DOCE_DRAW:BOOL=OFF',
            '-DOCE_MODEL:BOOL=ON',
            '-DOCE_MULTITHREAD_LIBRARY:STRING=%s' % ('TBB' if '+tbb' in spec else 'NONE'),
            '-DOCE_OCAF:BOOL=ON',
            '-DOCE_USE_TCL_TEST_FRAMEWORK:BOOL=OFF',
            '-DOCE_VISUALISATION:BOOL=OFF',
            '-DOCE_WITH_FREEIMAGE:BOOL=OFF',
            '-DOCE_WITH_GL2PS:BOOL=OFF',
            '-DOCE_WITH_OPENCL:BOOL=OFF'
        ])

        if platform.system() == 'Darwin':
            options.extend([
                '-DOCE_OSX_USE_COCOA:BOOL=ON',
            ])

        cmake('.', *options)

        make("install/strip")

        # OCE tests build is broken at least on Darwin.
        # Unit tests are linked against libTKernel.10.dylib instead of /full/path/libTKernel.10.dylib
        # see https://github.com/tpaviot/oce/issues/612
        # make("test")

        # The shared libraries are not installed correctly on Darwin; correct this
        if (sys.platform == 'darwin'):
            fix_darwin_install_name(prefix.lib)
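The `'+tbb'` variant in the recipe above selects the OCE multithreading backend by computing a single CMake define from the spec. A minimal sketch of that selection logic — the plain set of variant strings below is a hypothetical stand-in for Spack's `'+tbb' in spec` membership test, which in a real package operates on a `Spec` object:

```python
def multithread_option(spec_variants):
    # Pick the OCE multithreading backend from the enabled-variant set
    # (stand-in for Spack's spec); TBB when '+tbb' is enabled, else NONE.
    backend = 'TBB' if '+tbb' in spec_variants else 'NONE'
    return '-DOCE_MULTITHREAD_LIBRARY:STRING=%s' % backend

print(multithread_option({'+tbb'}))  # -DOCE_MULTITHREAD_LIBRARY:STRING=TBB
print(multithread_option(set()))     # -DOCE_MULTITHREAD_LIBRARY:STRING=NONE
```

The same conditional-expression pattern appears throughout Spack recipes wherever one variant maps to one configure or CMake flag.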
var/spack/repos/builtin/packages/openblas/package.py
@@ -1,30 +1,78 @@
 from spack import *
 import sys
+import os

 class Openblas(Package):
     """OpenBLAS: An optimized BLAS library"""
     homepage = "http://www.openblas.net"
     url      = "http://github.com/xianyi/OpenBLAS/archive/v0.2.15.tar.gz"

+    version('0.2.17', '664a12807f2a2a7cda4781e3ab2ae0e1')
+    version('0.2.16', 'fef46ab92463bdbb1479dcec594ef6dc')
     version('0.2.15', 'b1190f3d3471685f17cfd1ec1d252ac9')

+    variant('shared', default=True, description="Build shared libraries as well as static libs.")
+    variant('openmp', default=True, description="Enable OpenMP support.")
+
     # virtual dependency
     provides('blas')
     provides('lapack')

-    def install(self, spec, prefix):
-        make('libs', 'netlib', 'shared', 'CC=cc', 'FC=f77')
-        make('install', "PREFIX='%s'" % prefix)
-
-        lib_dsuffix = 'dylib' if sys.platform == 'darwin' else 'so'
+    def install(self, spec, prefix):
+        # Openblas is picky about compilers. Configure fails with
+        # FC=/abs/path/to/f77, whereas FC=f77 works fine.
+        # To circumvent this, provide basename only:
+        make_defs = ['CC=%s' % os.path.basename(spack_cc),
+                     'FC=%s' % os.path.basename(spack_f77)]
+
+        make_targets = ['libs', 'netlib']
+
+        # Build shared if variant is set.
+        if '+shared' in spec:
+            make_targets += ['shared']
+        else:
+            make_defs += ['NO_SHARED=1']
+
+        # fix missing _dggsvd_ and _sggsvd_
+        if spec.satisfies('@0.2.16'):
+            make_defs += ['BUILD_LAPACK_DEPRECATED=1']
+
+        # Add support for OpenMP
+        # Note: Make sure your compiler supports OpenMP
+        if '+openmp' in spec:
+            make_defs += ['USE_OPENMP=1']
+
+        make_args = make_defs + make_targets
+        make(*make_args)
+
+        make("tests", *make_defs)
+
+        # no quotes around prefix (spack doesn't use a shell)
+        make('install', "PREFIX=%s" % prefix, *make_defs)

         # Blas virtual package should provide blas.a and libblas.a
         with working_dir(prefix.lib):
             symlink('libopenblas.a', 'blas.a')
             symlink('libopenblas.a', 'libblas.a')
-            symlink('libopenblas.%s' % lib_dsuffix, 'libblas.%s' % lib_dsuffix)
+            if '+shared' in spec:
+                symlink('libopenblas.%s' % dso_suffix, 'libblas.%s' % dso_suffix)

         # Lapack virtual package should provide liblapack.a
         with working_dir(prefix.lib):
             symlink('libopenblas.a', 'liblapack.a')
-            symlink('libopenblas.%s' % lib_dsuffix, 'liblapack.%s' % lib_dsuffix)
+            if '+shared' in spec:
+                symlink('libopenblas.%s' % dso_suffix, 'liblapack.%s' % dso_suffix)
+
+    def setup_dependent_package(self, module, dspec):
+        # This is WIP for a prototype interface for virtual packages.
+        # We can update this as more builds start depending on BLAS/LAPACK.
+        libdir = find_library_path('libopenblas.a', self.prefix.lib64, self.prefix.lib)
+
+        self.spec.blas_static_lib   = join_path(libdir, 'libopenblas.a')
+        self.spec.lapack_static_lib = self.spec.blas_static_lib
+
+        if '+shared' in self.spec:
+            self.spec.blas_shared_lib   = join_path(libdir, 'libopenblas.%s' % dso_suffix)
+            self.spec.lapack_shared_lib = self.spec.blas_shared_lib
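The basename workaround in the OpenBLAS recipe can be seen in isolation. The wrapper paths below are hypothetical placeholders for Spack's `spack_cc` / `spack_f77` globals; only `os.path.basename` is doing real work, stripping the absolute path that OpenBLAS's configure rejects:

```python
import os

# Hypothetical absolute paths to Spack's compiler wrappers; OpenBLAS's build
# fails with FC=/abs/path/to/f77, so only the basename is passed to make.
spack_cc = '/opt/spack/lib/spack/env/cc'
spack_f77 = '/opt/spack/lib/spack/env/f77'

make_defs = ['CC=%s' % os.path.basename(spack_cc),
             'FC=%s' % os.path.basename(spack_f77)]
print(make_defs)  # ['CC=cc', 'FC=f77']
```

This relies on the wrapper directory being on `PATH` at build time, so the bare names still resolve to the same wrappers.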
var/spack/repos/builtin/packages/openjpeg/package.py (new file, 26 lines)
@@ -0,0 +1,26 @@
from spack import *

class Openjpeg(Package):
    """
    OpenJPEG is an open-source JPEG 2000 codec written in C language.
    It has been developed in order to promote the use of JPEG 2000, a
    still-image compression standard from the Joint Photographic
    Experts Group (JPEG).
    Since April 2015, it is officially recognized by ISO/IEC and
    ITU-T as a JPEG 2000 Reference Software.
    """
    homepage = "https://github.com/uclouvain/openjpeg"
    url      = "https://github.com/uclouvain/openjpeg/archive/version.2.1.tar.gz"

    version('2.1',   '3e1c451c087f8462955426da38aa3b3d')
    version('2.0.1', '105876ed43ff7dbb2f90b41b5a43cfa5')
    version('2.0',   'cdf266530fee8af87454f15feb619609')
    version('1.5.2', '545f98923430369a6b046ef3632ef95c')
    version('1.5.1', 'd774e4b5a0db5f0f171c4fc0aabfa14e')

    def install(self, spec, prefix):
        cmake('.', *std_cmake_args)

        make()
        make("install")
var/spack/repos/builtin/packages/p4est/package.py (new file, 34 lines)
@@ -0,0 +1,34 @@
from spack import *

class P4est(Package):
    """Dynamic management of a collection (a forest) of adaptive octrees in parallel"""
    homepage = "http://www.p4est.org"
    url      = "http://p4est.github.io/release/p4est-1.1.tar.gz"

    version('1.1', '37ba7f4410958cfb38a2140339dbf64f')

    # disable by default to make it work on frontend of clusters
    variant('tests', default=False, description='Run small tests')

    depends_on('mpi')

    def install(self, spec, prefix):
        options = ['--enable-mpi',
                   '--enable-shared',
                   '--disable-vtk-binary',
                   '--without-blas',
                   'CPPFLAGS=-DSC_LOG_PRIORITY=SC_LP_ESSENTIAL',
                   'CFLAGS=-O2',
                   'CC=%s'  % join_path(self.spec['mpi'].prefix.bin, 'mpicc'),  # TODO: use ENV variables or MPI class wrappers
                   'CXX=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpic++'),
                   'FC=%s'  % join_path(self.spec['mpi'].prefix.bin, 'mpif90'),
                   'F77=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif77'),
                   ]

        configure('--prefix=%s' % prefix, *options)

        make()
        if '+tests' in self.spec:
            make("check")

        make("install")
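The p4est recipe hard-codes the MPI compiler wrappers by joining the MPI installation's `bin` directory with each wrapper name; Spack's `join_path` behaves like `os.path.join`. A sketch with a hypothetical MPI prefix (the real value comes from `self.spec['mpi'].prefix.bin`):

```python
import os

mpi_bin = '/opt/mpi/bin'  # hypothetical stand-in for spec['mpi'].prefix.bin
# Assemble VAR=/path/to/wrapper strings for configure, one per compiler.
options = ['%s=%s' % (var, os.path.join(mpi_bin, wrapper))
           for var, wrapper in [('CC', 'mpicc'), ('CXX', 'mpic++'),
                                ('FC', 'mpif90'), ('F77', 'mpif77')]]
print(options[0])  # CC=/opt/mpi/bin/mpicc
```

As the in-recipe TODO notes, retrieving wrappers through environment variables or an MPI provider interface would be more robust than assuming these wrapper names exist under the prefix.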
@@ -8,6 +8,7 @@ class ParallelNetcdf(Package):
     homepage = "https://trac.mcs.anl.gov/projects/parallel-netcdf"
     url      = "http://cucis.ece.northwestern.edu/projects/PnetCDF/Release/parallel-netcdf-1.6.1.tar.gz"

+    version('1.7.0', '267eab7b6f9dc78c4d0e6def2def3aea4bc7c9f0')
     version('1.6.1', '62a094eb952f9d1e15f07d56e535052604f1ac34')

     depends_on("m4")
@@ -27,13 +27,14 @@ class Paraview(Package):

     depends_on('bzip2')
     depends_on('freetype')
-    depends_on('hdf5+mpi', when='+mpi')
-    depends_on('hdf5~mpi', when='~mpi')
+    #depends_on('hdf5+mpi', when='+mpi')
+    #depends_on('hdf5~mpi', when='~mpi')
     depends_on('jpeg')
     depends_on('libpng')
     depends_on('libtiff')
     depends_on('libxml2')
-    depends_on('netcdf')
+    #depends_on('netcdf')
+    #depends_on('netcdf-cxx')
+    #depends_on('protobuf') # version mismatches?
+    #depends_on('sqlite') # external version not supported
     depends_on('zlib')

@@ -75,13 +76,13 @@ def nfeature_to_bool(feature):
         cmake('..',
               '-DCMAKE_INSTALL_PREFIX:PATH=%s' % prefix,
               '-DBUILD_TESTING:BOOL=OFF',
-              '-DVTK_USER_SYSTEM_FREETYPE:BOOL=ON',
-              '-DVTK_USER_SYSTEM_HDF5:BOOL=ON',
-              '-DVTK_USER_SYSTEM_JPEG:BOOL=ON',
-              '-DVTK_USER_SYSTEM_LIBXML2:BOOL=ON',
-              '-DVTK_USER_SYSTEM_NETCDF:BOOL=ON',
-              '-DVTK_USER_SYSTEM_TIFF:BOOL=ON',
-              '-DVTK_USER_SYSTEM_ZLIB:BOOL=ON',
+              '-DVTK_USE_SYSTEM_FREETYPE:BOOL=ON',
+              '-DVTK_USE_SYSTEM_HDF5:BOOL=OFF',
+              '-DVTK_USE_SYSTEM_JPEG:BOOL=ON',
+              '-DVTK_USE_SYSTEM_LIBXML2:BOOL=ON',
+              '-DVTK_USE_SYSTEM_NETCDF:BOOL=OFF',
+              '-DVTK_USE_SYSTEM_TIFF:BOOL=ON',
+              '-DVTK_USE_SYSTEM_ZLIB:BOOL=ON',
               *feature_args)
         make()
         make('install')
var/spack/repos/builtin/packages/parmetis/enable_external_metis.patch
@@ -1,13 +1,71 @@
 diff --git a/CMakeLists.txt b/CMakeLists.txt
-index ca945dd..1bf94e9 100644
+index ca945dd..aff8b5f 100644
 --- a/CMakeLists.txt
 +++ b/CMakeLists.txt
 @@ -23,7 +23,7 @@ else()
  set(ParMETIS_LIBRARY_TYPE STATIC)
  endif()
 
 -include(${GKLIB_PATH}/GKlibSystem.cmake)
 +include_directories(${GKLIB_PATH})
 
  # List of paths that the compiler will search for header files.
  # i.e., the -I equivalent
+@@ -33,7 +33,7 @@ include_directories(${GKLIB_PATH})
+ include_directories(${METIS_PATH}/include)
+
+ # List of directories that cmake will look for CMakeLists.txt
+-add_subdirectory(${METIS_PATH}/libmetis ${CMAKE_BINARY_DIR}/libmetis)
++#add_subdirectory(${METIS_PATH}/libmetis ${CMAKE_BINARY_DIR}/libmetis)
++find_library(METIS_LIBRARY metis PATHS ${METIS_PATH}/lib)
+ add_subdirectory(include)
+ add_subdirectory(libparmetis)
+ add_subdirectory(programs)
+diff --git a/libparmetis/CMakeLists.txt b/libparmetis/CMakeLists.txt
+index 9cfc8a7..e0c4de7 100644
+--- a/libparmetis/CMakeLists.txt
++++ b/libparmetis/CMakeLists.txt
+@@ -5,7 +5,10 @@ file(GLOB parmetis_sources *.c)
+ # Create libparmetis
+ add_library(parmetis ${ParMETIS_LIBRARY_TYPE} ${parmetis_sources})
+ # Link with metis and MPI libraries.
+-target_link_libraries(parmetis metis ${MPI_LIBRARIES})
++target_link_libraries(parmetis ${METIS_LIBRARY} ${MPI_LIBRARIES})
++if(UNIX)
++  target_link_libraries(parmetis m)
++endif()
+ set_target_properties(parmetis PROPERTIES LINK_FLAGS "${MPI_LINK_FLAGS}")
+
+ install(TARGETS parmetis
+diff --git a/libparmetis/parmetislib.h b/libparmetis/parmetislib.h
+index c1daeeb..07511f6 100644
+--- a/libparmetis/parmetislib.h
++++ b/libparmetis/parmetislib.h
+@@ -20,13 +20,12 @@
+
+ #include <parmetis.h>
+
+-#include "../metis/libmetis/gklib_defs.h"
++#include <gklib_defs.h>
+
+-#include <mpi.h>
++#include <mpi.h>
+
+ #include <rename.h>
+ #include <defs.h>
+ #include <struct.h>
+ #include <macros.h>
+ #include <proto.h>
+-
+diff --git a/programs/parmetisbin.h b/programs/parmetisbin.h
+index e26cd2d..d156480 100644
+--- a/programs/parmetisbin.h
++++ b/programs/parmetisbin.h
+@@ -19,7 +19,7 @@
+ #include <GKlib.h>
+ #include <parmetis.h>
+
+-#include "../metis/libmetis/gklib_defs.h"
++#include <gklib_defs.h>
+ #include "../libparmetis/rename.h"
+ #include "../libparmetis/defs.h"
+ #include "../libparmetis/struct.h"
var/spack/repos/builtin/packages/parmetis/package.py
@@ -24,7 +24,7 @@
 ##############################################################################

 from spack import *
+import sys

 class Parmetis(Package):
     """
@@ -44,7 +44,7 @@ class Parmetis(Package):
     depends_on('mpi')

     patch('enable_external_metis.patch')
-    depends_on('metis')
+    depends_on('metis@5:')

     # bug fixes from PETSc developers
     # https://bitbucket.org/petsc/pkg-parmetis/commits/1c1a9fd0f408dc4d42c57f5c3ee6ace411eb222b/raw/
@@ -64,7 +64,7 @@ def install(self, spec, prefix):

         # FIXME : Once a contract is defined, MPI compilers should be retrieved indirectly via spec['mpi'] in case
         # FIXME : they use a non-standard name
-        options.extend(['-DGKLIB_PATH:PATH={metis_source}/GKlib'.format(metis_source=metis_source), # still need headers from METIS source, and they are not installed with METIS. shame...
+        options.extend(['-DGKLIB_PATH:PATH={metis_source}/GKlib'.format(metis_source=spec['metis'].prefix.include),
                         '-DMETIS_PATH:PATH={metis_source}'.format(metis_source=spec['metis'].prefix),
                         '-DCMAKE_C_COMPILER:STRING=mpicc',
                         '-DCMAKE_CXX_COMPILER:STRING=mpicxx'])
@@ -83,3 +83,7 @@ def install(self, spec, prefix):
         cmake(source_directory, *options)
         make()
         make("install")
+
+        # The shared library is not installed correctly on Darwin; correct this
+        if (sys.platform == 'darwin') and ('+shared' in spec):
+            fix_darwin_install_name(prefix.lib)
var/spack/repos/builtin/packages/petsc/package.py
@@ -17,14 +17,18 @@ class Petsc(Package):
     version('3.5.1', 'a557e029711ebf425544e117ffa44d8f')
     version('3.4.4', '7edbc68aa6d8d6a3295dd5f6c2f6979d')

-    variant('shared', default=True, description='Enables the build of shared libraries')
-    variant('mpi', default=True, description='Activates MPI support')
-    variant('double', default=True, description='Switches between single and double precision')
+    variant('shared', default=True, description='Enables the build of shared libraries')
+    variant('mpi', default=True, description='Activates MPI support')
+    variant('double', default=True, description='Switches between single and double precision')
+    variant('complex', default=False, description='Build with complex numbers')
+    variant('debug', default=False, description='Compile in debug mode')

-    variant('metis', default=True, description='Activates support for metis and parmetis')
-    variant('hdf5', default=True, description='Activates support for HDF5 (only parallel)')
-    variant('boost', default=True, description='Activates support for Boost')
-    variant('hypre', default=True, description='Activates support for Hypre')
+    variant('metis', default=True, description='Activates support for metis and parmetis')
+    variant('hdf5', default=True, description='Activates support for HDF5 (only parallel)')
+    variant('boost', default=True, description='Activates support for Boost')
+    variant('hypre', default=True, description='Activates support for Hypre (only parallel)')
+    variant('mumps', default=True, description='Activates support for MUMPS (only parallel)')
+    variant('superlu-dist', default=True, description='Activates support for SuperluDist (only parallel)')

     # Virtual dependencies
     depends_on('blas')
@@ -36,11 +40,17 @@ class Petsc(Package):

     # Other dependencies
     depends_on('boost', when='+boost')
-    depends_on('metis', when='+metis')
+    depends_on('metis@5:', when='+metis')

     depends_on('hdf5+mpi', when='+hdf5+mpi')
     depends_on('parmetis', when='+metis+mpi')
-    depends_on('hypre', when='+hypre+mpi')
+    # Hypre does not support complex numbers.
+    # Also PETSc prefer to build it without internal superlu, likely due to conflict in headers
+    # see https://bitbucket.org/petsc/petsc/src/90564b43f6b05485163c147b464b5d6d28cde3ef/config/BuildSystem/config/packages/hypre.py
+    depends_on('hypre~internal-superlu', when='+hypre+mpi~complex')
+    depends_on('superlu-dist', when='+superlu-dist+mpi')
+    depends_on('mumps+mpi', when='+mumps+mpi')
+    depends_on('scalapack', when='+mumps+mpi')

     def mpi_dependent_options(self):
         if '~mpi' in self.spec:
@@ -55,38 +65,30 @@ def mpi_dependent_options(self):
         # If mpi is disabled (~mpi), it's an error to have any of these enabled.
         # This generates a list of any such errors.
         errors = [error_message_fmt.format(library=x)
-                  for x in ('hdf5', 'hypre', 'parmetis')
+                  for x in ('hdf5', 'hypre', 'parmetis', 'mumps', 'superlu-dist')
                   if ('+'+x) in self.spec]
         if errors:
             errors = ['incompatible variants given'] + errors
             raise RuntimeError('\n'.join(errors))
         else:
-            if self.compiler.name == "clang":
-                compiler_opts = [
-                    '--with-mpi=1',
-                    '--with-cc=%s -Qunused-arguments' % join_path(self.spec['mpi'].prefix.bin, 'mpicc'),  # Avoid confusing PETSc config by clang: warning: argument unused during compilation
-                    '--with-cxx=%s -Qunused-arguments' % join_path(self.spec['mpi'].prefix.bin, 'mpic++'),
-                    '--with-fc=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif90'),
-                    '--with-f77=%s' % join_path(self.spec['mpi'].prefix.bin, 'mpif77'),
-                ]
-            else:
-                compiler_opts = [
-                    '--with-mpi=1',
-                    '--with-mpi-dir=%s' % self.spec['mpi'].prefix,
-                ]
+            compiler_opts = [
+                '--with-mpi=1',
+                '--with-mpi-dir=%s' % self.spec['mpi'].prefix,
+            ]
         return compiler_opts

     def install(self, spec, prefix):
-        options = ['--with-debugging=0',
-                   '--with-ssl=0']
+        options = ['--with-ssl=0']
         options.extend(self.mpi_dependent_options())
+        options.extend([
+            '--with-precision=%s' % ('double' if '+double' in spec else 'single'),
+            '--with-scalar-type=%s' % ('complex' if '+complex' in spec else 'real'),
+            '--with-shared-libraries=%s' % ('1' if '+shared' in spec else '0'),
+            '--with-debugging=%s' % ('1' if '+debug' in spec else '0'),
+            '--with-blas-lapack-dir=%s' % spec['lapack'].prefix
+        ])
         # Activates library support if needed
-        for library in ('metis', 'boost', 'hdf5', 'hypre', 'parmetis'):
+        for library in ('metis', 'boost', 'hdf5', 'hypre', 'parmetis', 'mumps', 'scalapack'):
             options.append(
                 '--with-{library}={value}'.format(library=library, value=('1' if library in spec else '0'))
             )
@@ -94,9 +96,24 @@ def install(self, spec, prefix):
             options.append(
                 '--with-{library}-dir={path}'.format(library=library, path=spec[library].prefix)
             )
+        # PETSc does not pick up SuperluDist from the dir as they look for superlu_dist_4.1.a
+        if 'superlu-dist' in spec:
+            options.extend([
+                '--with-superlu_dist-include=%s' % spec['superlu-dist'].prefix.include,
+                '--with-superlu_dist-lib=%s' % join_path(spec['superlu-dist'].prefix.lib, 'libsuperlu_dist.a'),
+                '--with-superlu_dist=1'
+            ])
+        else:
+            options.append(
+                '--with-superlu_dist=0'
+            )

         configure('--prefix=%s' % prefix, *options)

         # PETSc has its own way of doing parallel make.
         make('MAKE_NP=%s' % make_jobs, parallel=False)
         make("install")

+    def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
+        # set up PETSC_DIR for everyone using PETSc package
+        spack_env.set('PETSC_DIR', self.prefix)
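The PETSc recipe's library loop turns each optional dependency into a `--with-<library>=0/1` configure flag. A sketch of that mapping — the set below is a hypothetical stand-in for Spack's `library in spec` membership test on a concretized spec:

```python
# Stand-in for the concretized spec: names of dependencies that are present.
spec = {'metis', 'hdf5', 'mumps', 'scalapack'}

options = []
for library in ('metis', 'boost', 'hdf5', 'hypre', 'parmetis', 'mumps', 'scalapack'):
    # Emit --with-<library>=1 when present in the spec, =0 otherwise.
    options.append('--with-{library}={value}'.format(
        library=library, value=('1' if library in spec else '0')))

print(options[:2])  # ['--with-metis=1', '--with-boost=0']
```

Keeping the flag generation in one loop means adding a new optional dependency (as this diff does for mumps and scalapack) only requires extending the tuple.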
@@ -10,7 +10,12 @@ class PkgConfig(Package):
     parallel = False

     def install(self, spec, prefix):
-        configure("--prefix=%s" %prefix, "--enable-shared")
+        configure("--prefix=%s" %prefix,
+                  "--enable-shared",
+                  "--with-internal-glib")  # There's a bootstrapping problem here;
+                                           # glib uses pkg-config as well, so
+                                           # break the cycle by using the internal
+                                           # glib.

         make()
         make("install")
Some files were not shown because too many files have changed in this diff.