+ This change fixes a problem that manifests when trilinos is built against an
MKL installation defined as an external package. In this scenario, the MKL
libraries are found one directory deeper than when Spack itself provides MKL.
The extra directory is a platform name such as 'intel64'.
+ The changes in this PR were recommended by contributor @davydden. I
implemented and tested them with intel@16.0.3, and they fix the issue I
reported. I did not attempt building trilinos against other BLAS
implementations.
+ Fixes #1923
* mfem: add tarball extension
Add tarball extension as a result of a feature added in PR#1926, which
fixes earlier issues in this PR (PR#1202). Prior to adding this feature,
Spack would not autodetect the extension of the tarball downloaded from
the redirected, shortened Google URL, requiring a messy hack. This hack
worked for mfem version 3.1, but led to errors when adding mfem version
3.2 because the files downloaded from Google did not contain the package
name, version number, or extension. Adding the extension enables Spack
to rename the tarball downloaded from Google to a sensible name that is
compatible with its filename parsing algorithms so that Spack "does the
right thing" (detects that the file is a GZipped tarball, decompresses
it, runs GNU Make) in fetching and staging the package.
* mfem: add linkage to KLU & BTF
Add linkage to the KLU & BTF solvers, which are now enabled in MFEM for
versions 3.2 and later.
* mfem: Add superlu-dist variant
Add linkage to SuperLU_DIST, which is a new linear solver interface for
MFEM versions 3.2 and later.
* mfem: add netcdf variant for cubit mesh support
Add NetCDF variant for MFEM versions 3.2 and later; installing the
NetCDF interfaces enables CUBIT mesh support.
+ Cray compile wrappers are MPI wrappers.
+ Packages that need to be compiled with MPI compile wrappers normally use
'mpicc', 'mpic++' and 'mpif90' provided by the MPI vendor. However, when using
cray-mpich as the MPI vendor, the compile wrappers 'CC', 'cc' and 'ftn' must
be used.
+ In this scenario, the mpich package is hijacked by specifying cray-mpich as an
external package under the 'mpich:' section of packages.yaml. For example:
packages:
  mpich:
    modules:
      mpich@7.4.2%intel@16.0.3 arch=cray-CNL-haswell: cray-mpich/7.4.2
    buildable: False
  all:
    providers:
      mpi: [mpich]
+ This change allows packages like parmetis to be built using the Cray compile
wrappers. For example: 'spack install parmetis%intel@16.0.3 ^mpich@7.4.2 os=CNL'
+ This commit relies on the existence of the environment variable CRAYPE_VERSION
to determine if the current machine is running a Cray environment. This check is
insufficient, but I'm not sure how to improve this logic.
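For reference, a minimal sketch of the check described above (the function name is illustrative; the real logic lives in Spack's platform detection):
```
import os

def running_on_cray():
    # CRAYPE_VERSION is exported by the Cray Programming Environment modules;
    # as noted above, its presence is a necessary but not sufficient signal.
    return 'CRAYPE_VERSION' in os.environ
```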
+ Fixes #1827
* fix blas-lapack in scipy and numpy
* py-numpy: do not set rpath on macOS
* py-scipy: do not set Blas/Lapack. This appears to be picked up from py-numpy
* py-numpy: only skip writing rpath= on Sierra
* py-numpy: add a link to build notes
* Updated nettle to have m4 as an immediate dependency to match new PATH
construction logic which only includes immediate dependencies.
* Update package.py
* Added a check for ppc64le because configure can't guess the build type for RHEL on ppc64le
Expand/clarify description of dependency types
Refactored getting the arch so that it's less verbose
* Fixed flake8 issues
* everytrace: New package
Everytrace ensures that a stack trace is obtained every time a program exits, for whatever reason.
* everytrace: Change CMake to build dependency
* Renamed to everytrace-example, flake8 and copyright issues.
* flake8
* Missing type=build
* Warn user if flake8 can't find setuptools
* Add missing dependencies of flake8
* Updates to py-autopep8, make packages activateable
* Check for presence of setuptools for Sphinx too
* Fix bug in order of commands
* Adding the latest version of fenics and making trilinos unambiguous about hdf5
* forcing fenics to ignore hdf5 cxx
* Adding deptypes and correcting the hdf5 patch
* flake8 corrections
* cleaning up some useless code
* Provide new versions of llvm.
+ Provide file list and md5 hashes for 3.8.1 and 3.9.0.
+ Clean up indentation for the 'releases' data structure to improve
consistency.
* Adding a block of code to the 'resources' structure for cfe.
* Merge cfe and clang resources into single entity.
* Added hadoop, spark, and variant spark+hadoop
* Docstrings, dependency types, urls, copyright
* Flake8 fixes, link dependency for hadoop
* Build type for spark, env problem setting JAVA_HOME
* New package: libquo
libquo is a high-level, easy to use programming interface tailored specifically
for MPI/MPI+X codes that may benefit from evolving process binding policies
during their execution. QUO allows for arbitrary process binding policies to
be enacted and reverted during the execution of an MPI/MPI+X application as
different computational phases are entered and exited, respectively.
https://github.com/losalamos/libquo
* Remove use of 'which' and fix style non-conformance.
* Add libint package
* Add Intel optimization flags recommended by CP2K
* Add new version and Intel compiler optimization flags for libxc
* Add older version of libint
* Libint depends on GMP C++ library
Pro tips from @adamjstewart:
* line too long in package description
* name it grib-api instead of grib_api
* depend on netcdf without reference to unnecessary constraints
* py-pil: Does not build with Python3.
* Set py-pillow to be the default pil provider
* Update package.py
* Change to comments requested by adamjstewart
* Remove version constraint from extends(), avoid a Spack bug.
* dealii: add missing python dependency
* boost: fix a bug which broke it on macOS with clang+gfortran
Boost was using the gcc compiler instead of clang++, which led to
cryptic 'Undefined symbols' linker errors for boost::python::objects::function_object()
when building other packages against boost+python (a sketch of the
user-config.jam fix follows the boost items below).
* boost: add exceptions for intel
* boost: use spack_cxx
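A hedged sketch of how the compiler gets forced (boost reads its compiler from user-config.jam; the toolset choice below and the Spack-injected `spack_cxx` global are assumptions about the package internals):
```
# Writing the Spack C++ wrapper into user-config.jam keeps boost+python from
# silently falling back to the system gcc when Boost.Build links libraries.
boost_toolset = 'clang' if spec.satisfies('%clang') else 'gcc'   # illustrative
with open('user-config.jam', 'w') as f:
    f.write('using {0} : : {1} ;\n'.format(boost_toolset, spack_cxx))
```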
* Turned <provider>_libs into an iterable
Modifications :
- added class LibraryList + unit tests
- added convenience functions `find_libraries` and `dedupe`
- modified non-Intel blas/lapack providers
- modified packages using blas_shared_libs and similar functions (see the sketch after this list)
* atlas : added pthread variant
* intel packages : added lapack_libs and blas_libs
* find_library_path : removed unused function
* PR review : fixed last issues
* LibraryList : added test on __add__ return type
* LibraryList : added __radd__ fixed unit tests
fix : failing unit tests due to missing `self`
* cp2k and dependencies : fixed blas-lapack related statements in package.py
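A minimal sketch of what a provider property looks like with the new machinery (class and library names are illustrative; `find_libraries` keyword names may differ slightly between Spack versions):
```
from spack import *

class Netlib(Package):      # illustrative BLAS/LAPACK provider
    """Sketch of a provider exposing its libraries as a LibraryList."""

    @property
    def lapack_libs(self):
        # A LibraryList can be iterated, concatenated, or queried for
        # .directories and .ld_flags by dependent packages.
        return find_libraries(['liblapack', 'libblas'],
                              root=self.prefix.lib, shared=True)
```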
Python will look to link with libncursesw in preference to libncurses. Since
ncurses in spack is built without suffixes there is no libncursesw and
python will link to the system libncursesw for _curses.so and
_curses_panel.so, as well as libpanelw for _curses_panel.so.
This PR introduces a patch that simply removes the check for ncursesw in
setup.py and therefore sets `curses_library` to `ncurses`.
* Added missing libtiff dependency
* added dependency on fontconfig
* Added version 2.2.3
* use autotools rather than cmake
The cmake build was not producing a complete install.
* There was no versioning of the installed libraries.
* gdlib-config was missing
* pkgconfig directory was missing
These problems do not happen when built with autotools.
yt is a python package for analyzing and visualizing volumetric, multi-resolution data from astrophysical simulations, radio telescopes, and a burgeoning interdisciplinary community.
* Fixed a few small bugs in the 'git-lfs' install script.
* Fixed a bug in the '(go|go-bootstrap)@1.4.2+test' install scripts.
* Fixing a minor style issue in the 'git-lfs' install script.
I encountered an HPC system where PETSc's configure stage does not find a valid `cpp` (C preprocessor). Explicitly pointing to Spack's `cpp` wrapper resolves the problem.
* Fixed a bug that was causing post-install METIS tests to fail.
* Improved the patching procedure used in the 'metis' install script.
* Enabled patch skipping for the 'metis' and 'parmetis' packages.
* Fixed some minor style issues in the 'parmetis' package.
* Improved the 'metis' test fix and added 'run_tests' support.
* Refactored and reorganized the 'zoltan' install script.
* Fixed a few bugs with the '+mpi' and '+fortran' variants of 'zoltan'.
* Reintroduced and improved the '+shared' variant for the 'zoltan' package.
* Improved compatibility with different MPI providers for 'zoltan'.
Includes :
- treatment of a generic hierarchy (i.e. lapack + mpi + compiler)
- possibility to specify which compilers are to be considered Core
- correct treatment of the 'family' directive
- unit tests for most new features
- add new version 2.10.3
- explicitly require hdf5 variant ~mpi, since we don't know how to build with mpicc
- explicitly disable glew since it may not be installed
* Added py-proj package
* Added bug-fix patched version to the package.
* Removed dependency of py-proj on proj.
* py-proj: Added missing dependency
* py-proj: Removed versions from forked repos, now that necessary bug fixes have been merged into the main repo.
* Update package.py
Added a blank line that Travis wanted.
* 1. Added copyright
2. Used setup_py
3. Added type='build' for dependencies.
* Qthreads: Switch back to using tarball for download
* Don't require autotools any more
* Re-enable autotools
* Remove autotools again
* Use .tar.bz tarball; remove outdated code
* New package h5hut -- High-Performance I/O Library for Particle-based Simulations
* Set up MPI compilers
* Add version 1.8.12 to HDF5
* Correct Sphinx error
+ Starting with version 2.0, OpenMPI no longer provides C++ bindings by default
(libmpi_cxx.so). Add a configure option to instruct the build to also build
and install libmpi_cxx.so.
+ This MPI feature is needed by at least one spack package (moab).
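A hedged sketch of how the option might be wired to a variant (`--enable-mpi-cxx` is OpenMPI's own configure switch; the variant name and the `configure_args` idiom are illustrative):
```
variant('cxx', default=False,
        description='Build the MPI C++ bindings (libmpi_cxx)')

def configure_args(self):
    args = []
    if '+cxx' in self.spec:
        # OpenMPI >= 2.0 omits the C++ bindings unless explicitly asked for
        args.append('--enable-mpi-cxx')
    return args
```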
* add pango dependency
* add new package ghostscript-fonts & add to ImageMagick as dependency
also tell ImageMagick's configure where the font dir is!
* refactor to fix flake8
* add homepage to ghostscript-fonts
* use install_tree
* remove unneeded import
lzo's download server does not present a valid certificate, so downloads via https fail. Spack's MD5 checksum still ensures a safe download.
Closes #1675.
@adamjstewart
```
think you'll find that if you try running something like:
spack spec libsplash ^hdf5@1.8.15
It will complain that libsplash does not depend on hdf5.
This is a bug in Spack's dependency resolution. A workaround
for this is to tell it to always depend on hdf5.
```
@davydden
```
to expand on @adamjstewart comment, spack will make a union
of dependencies,
i.e. hdf5@1.8.6: + hdf5+mpi = hdf5:1.8.6:+mpi, that's why it works.
```
thank you for the hint!
Adds a package for
[PNGwriter](https://github.com/pngwriter/pngwriter/),
a simple high-level C++ png API used in scientific projects.
```
PNGwriter is a very easy to use open source graphics library that
uses PNG as its output format. The interface has been designed to be
as simple and intuitive as possible. It supports plotting and reading
pixels in the RGB (red, green, blue), HSV (hue, saturation,
value/brightness) and CMYK (cyan, magenta, yellow, black) colour
spaces, basic shapes, scaling, bilinear interpolation, full TrueType
antialiased and rotated text support, bezier curves, opening existing
PNG images and more.
```
PNGwriter is a dependency for [PIConGPU](http://picongpu.hzdr.de),
an open-source, many-core, fully-relativistic particle-in-cell code
and further software developed at
[Helmholtz-Zentrum Dresden - Rossendorf](https://www.hzdr.de).
Adds a package for
[libSplash](https://github.com/ComputationalRadiationPhysics/libSplash),
a high-level library around serial and parallel HDF5 for regular
grids and particle data sets.
```
libSplash aims at developing a HDF5-based I/O library for HPC
simulations. It is created as an easy-to-use frontend for the
standard HDF5 library with support for MPI processes in a cluster
environment. While the standard HDF5 library provides detailed
low-level control, libSplash simplifies tasks commonly found in
large-scale HPC simulations, such as iterative computations
and MPI distributed processes.
```
libSplash is a dependency for [PIConGPU](http://picongpu.hzdr.de),
an open-source, many-core, fully-relativistic particle-in-cell
code and further software developed at
[Helmholtz-Zentrum Dresden - Rossendorf](https://www.hzdr.de).
libSplash builds in two versions, one without MPI writing
domain-decomposed posix-style HDF5 files per process and one
(default) with MPI and MPI-I/O ("parallel HDF5") support
aggregating into a single file per MPI communicator.
libSplash is used in conjunction with
[openPMD](http://openPMD.org), see also
[github.com/openPMD/](https://github.com/openPMD/).
This PR updates the ADIOS package.
**Changes:**
- add latest stable release `1.10.0`
- add previous versions (hashes)
- add default license header
- add build options (shamelessly taken from HDF5 package)
- add validation for existing FC (as in HDF5) and make it optional
- handle mxml dependency correctly (not required in 1.10.0+)
- add `CFLAGS=-fPIC` to build shared (python) libs in ADIOS' lib
- remove `-DMPICH_IGNORE_CXX_SEEK` since it is normally not required
- remove `MPICC/CXX/FC` since `--with-mpi` alone works fine (see the configure sketch after this list)
- add transforms:
- `zlib`: useful (optional) default
- `szip`: optional (compile often broken)
- add transports that are not as performant as the `.bp` format:
- `hdf5`: non-default
- `netcdf`: non-default, closes #1610
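A hedged configure sketch pulling the list above together (the `--with-*` option names are ADIOS's own; the variant names and exact invocation are illustrative):
```
# Inside install(): assemble the configure line from the options listed above.
extra_args = ['CFLAGS=-fPIC',     # needed for the shared python libs
              '--with-mpi']       # no need to export MPICC/MPICXX/MPIFC
if '+zlib' in spec:
    extra_args.append('--with-zlib={0}'.format(spec['zlib'].prefix))
if '+hdf5' in spec:
    extra_args.append('--with-hdf5={0}'.format(spec['hdf5'].prefix))
configure('--prefix={0}'.format(prefix), *extra_args)
```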
* opencoarrays: new package
* opencoarrays: remove tests from install due to (unimportant) failures in some configurations
* opencoarrays: fix flake8 errors
- use two empty lines before `class`
- change version numbering scheme for packages, use `url_for_version` to make things work
- specify dependency types
- add comment about temporarily moved download location
- update two packages to newer versions
py-cffi's .so was being built without the rpath being set. distutils
looks at LDSHARED to decide which compiler to use to build the final .so.
Since it was not set, distutils fell back to the system-provided
compiler. Setting it forces the shared library to be compiled with the
Spack compiler (meaning that the rpath is set correctly).
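A minimal sketch of the fix, assuming the `spack_cc` global that Spack injects into package modules (the exact hook used by the real package may differ):
```
def setup_environment(self, spack_env, run_env):
    # distutils consults LDSHARED when it links the extension .so; pointing it
    # at the Spack compiler wrapper gets the rpath baked in at link time.
    spack_env.set('LDSHARED', '{0} -shared'.format(spack_cc))
```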
Patch doesn't work with @when unless you specify a patch for every
version. When running `spack patch` for a version without a patch,
spack thinks that a patch exists, tries to apply it, but it doesn't
exist. Spack gets very confused.
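For reference, version-scoped patches are normally expressed with the `when=` argument of the `patch` directive rather than `@when` on a method (the file name and version range below are purely illustrative):
```
# Only the matching versions get the patch; other versions get none at all.
patch('fix-build.patch', when='@:1.0')   # hypothetical patch file and range
```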
The Lmod author changed the source so that it uses the tclsh (and shared
libraries) discovered at configure time. He did it differently than I
did in this patch, but his changes solve our problem too, so...
This commit changes the git package to depend_on('perl'). The system
perl is not always sufficient to install git (e.g. a CentOS7 system with
the development tools group installed has perl but not the
ExtUtils::MakeMaker package that git needs) and one can't always update
the system's perl.
This PR depends_on PR #1339, which adds a perl package to spack.
Update the samtools package to support v1.3.1, which
- now uses configure script; and
- now depends on external htslib package.
The dependency on mpc seems to have been bogus: it's never linked in,
nor is it mentioned in the source tree. I *do* have a version in
/usr/lib64, but ldd does not show it being linked in either....
By depending on 'ncurses' I can do away with the need for the patch.
- It's not really a circular dependency -- git is a run dependency of gettext
- We can revert this change when Spack is smart enough to make git a run
dependency and build it.
+ Always depend on the gettext package. This simplifies the logic and I no
longer need to 'import sys'
+ Only apply the patch for the older version of glib.
Add py-meep package and dependencies
Merging to add the gettext support, will submit a separate issue for the LD_LIBRARY_PATH issue with MPI and py-meep
Before I learned that I was stumbling over a real bug (#1308), I thought
I needed to arrange for the fetcher to skip the unpack step.
This commit removes the now-unneeded `def unpack`.
Use the resource machinery to fetch/cache/unpack/... the App::cpanminus
tarball.
- this hardcodes the version; I can't figure out how to use a variant to
hold/set the value and access it in the resource().
- change up the install to use the `with working_dir()` idiom.
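A hedged sketch of the resource directive (the URL and checksum are placeholders, and the destination/placement names are illustrative):
```
# Let Spack fetch/cache/unpack App::cpanminus alongside the perl sources.
resource(
    name='cpanm',
    url='http://search.cpan.org/CPAN/authors/id/M/MI/MIYAGAWA/App-cpanminus-1.7042.tar.gz',
    md5='0123456789abcdef0123456789abcdef',   # placeholder, not a real checksum
    destination='cpanm',
    placement='cpanm',
)
```
Inside install(), the unpacked resource can then be entered with `with working_dir(...)` as mentioned above.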
Make running perl's tests conditional, one must now specify the
`--run-tests` flag to the `spack install` command in order to run the
tests.
On one system (8 core, 16GB Digital Ocean Droplet), installing without
tests takes 3 minutes, with tests takes 16 minutes.
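A hedged sketch of the conditional (the Configure invocation is simplified; `self.run_tests` is the flag that `spack install --run-tests` turns on):
```
def install(self, spec, prefix):
    configure = Executable('./Configure')   # perl ships a capital-C Configure
    configure('-des', '-Dprefix={0}'.format(prefix))
    make()
    if self.run_tests:
        # only pay for the (long) test suite when it was asked for
        make('test')
    make('install')
```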
Rather than hard-coding the version of `cpanm` that's [optionally]
installed into the core, make it a variant with a default value of
'1.7042'.
Also discovered that `prefix + 'bin'` is the same as `prefix.bin`, so
improved that bit of code.
Add perl package, based on [work by
justintoo](https://github.com/LLNL/spack/pull/105). He had too many
things pulled into that pull request, this just adds a perl package.
Support the current releases on the past three minor branches.
Run perl's tests before installing.
Install cpanm into the core (makes building on top of this perl *much*
simpler). Controlled by a variant.
I cargo culted that from my *nextflow* package. I [thought I] needed it
to work around Spack trying to use tar to unpack something that was
neither a tarball nor unpackable.
This package works fine without it. In retrospect, the error that I was
seeing in the *nextflow* package was probably this problem #1308.
Add a package for [ack](http://beyondgrep.com/install/). Simply install
the fatpacked script.
It uses '#!/usr/bin/env perl' and is very much not choosy about which
perl it gets. For now just trust that there's one available; perhaps
someday we can/should uncomment the depends_on('perl').
Follows the methodology I used in nextflow. Has the same
uninstall/install problem that nextflow has, there is an issue in
progress for that: https://github.com/LLNL/spack/issues/1308.
Tested on CentOS7.
r-jsonlite 0.0.21 -> 1.0
r-mime 0.4 -> 0.5
rcpp 0.12.5 -> 0.12.6
CRAN is funny. The older versions of these packages are still available
in package specific directories but the current version is not there, so
I don't see any way to make the older versions work.
This is my first cut at a package to support nextflow. It's also my
first package. It works, but has issues. I'm going to submit a pull
request and get some coaching on how to deal with it.
One issue in particular: if I install, then uninstall, then try to install
again (which uses the cached copy of the "distribution file"), it
explodes.
WIP: I started trying to build gtkplus@3.20, but this package has many more
dependencies than v2 and it requires newer versions of existing packages. This
commit provides updates for 5 packages that are required by GTK+3. This is not
the complete set of changes required for GTK+3.
atk - move default version from 2.14 -> 2.20.
glib - move default version from 2.42 -> 2.49
- v2.49 requires pcre+utf as a new dependency.
pcre - if variant +utf is selected, add '--enable-unicode-properties' to the
configure options.
libepoxy - new package to spack
- manages OpenGL function pointers.
pango - move default version from 1.36 -> 1.40
Add a package for [tree](http://mama.indstate.edu/users/ice/tree/).
It has a Makefile that hardcodes a prefix and some CFLAGS. Used
filter_file (sketched below) to:
- set the make variable *prefix* to `prefix`; and
- comment out their CFLAGS, just use ours....
It installs, runs on CentOS7, and uninstalls cleanly.
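A hedged sketch of the filter_file calls (the exact Makefile variable spellings are from memory and may not match the tree Makefile verbatim):
```
def install(self, spec, prefix):
    # point the hard-coded install prefix at Spack's prefix ...
    filter_file(r'^prefix = .*', 'prefix = {0}'.format(prefix), 'Makefile')
    # ... and comment out the shipped CFLAGS so the wrapper-provided ones win
    filter_file(r'^CFLAGS', '#CFLAGS', 'Makefile')
    make()
    make('install')
```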
With the addition of dependency types and with `py-setuptools` set as
type='build' there are more packages that need to have `py-setuptools`
added as a dependency.
This PR adds that dependency for the following packages (sketched below):
- py-h5py
- py-networkx
- py-pytables
- py-scikit-image
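Each of the listed packages gains a directive along these lines (the standard Spack syntax for a build-only dependency):
```
depends_on('py-setuptools', type='build')
```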
This PR adds the `nolink` dependency type to r- package dependencies.
This is needed due to the new dependency types in Spack. A couple of
packages were updated with new versions as well.
This commit introduces a mechanism to ensure that R package dependencies
are built with the Spack compiler wrapper. A copy of Makeconf is made
before `filter_compilers` is called. This is then pointed to by the
R_MAKEVARS_SITE environment variable set up in
`setup_dependent_environment`. With this the normal compilers are used
outside of spack and the spack wrapper compilers are used inside of
spack.
This commit also standardizes on the `join_path` call. It also sets the
commented build command to reflect what is actually used with the newer
string formatting.
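A hedged sketch of the mechanism (the name and location of the copied Makeconf are assumptions; the environment variable and hook are the ones named above):
```
def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
    # The copy was taken before filter_compilers ran, so it still refers to
    # the Spack compiler wrappers; pointing R at it during dependent builds
    # means R packages compile with the wrappers, while the filtered original
    # keeps the normal compilers for use outside of Spack.
    spack_env.set('R_MAKEVARS_SITE',
                  join_path(self.prefix, 'rlib', 'R', 'etc', 'Makeconf.spack'))
```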
A while ago I was asked to convert packages to all lowercase. That was
done but some dependencies did not get converted in the specification.
This commit fixes that as well as a couple of urls that need to be made
explicit and a missing dependency on jdk.
- OpenSSL no longer checks remote versions on the openssl site.
- Spack is used on systems that aren't connected to the internet, and
this check is probably in the wrong place and affects too many
commands. We can work on figuring out a better, more configurable
place to put a check like this.
Lmod's configure script goes to the trouble of finding
tclsh. This change uses that info to rewrite the #! lines
in the tcl scripts so that they call the tclsh that the
configure script discovered.
It needs to massage the existing shebang lines into something
that the sed statement in the makefile can manipulate and
it needs to add the path_to_tclsh info into the set of sed
statements.
Checked with versions 6.4.1 and 6.3.7 (the checksum for 6.0.1 is
incorrect, a fix for another time).
The lmod package needs a tclsh. Up until now it just assumed
that one was available on the system.
This change adds a depends_on('tcl') to the lmod package.
The tcl package installs a tclsh script with an embedded version
number (e.g. tclsh8.6) but the lmod configuration looks for tclsh.
This change extends the tcl package to symlink tclshX.Y to tclsh in
the tcl package bin directory.
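A hedged sketch of the symlink step in the tcl package (the build-from-'unix' detail and placing the link at the end of install() are assumptions):
```
import os

def install(self, spec, prefix):
    with working_dir('unix'):
        configure('--prefix={0}'.format(prefix))
        make()
        make('install')
    # lmod's configure looks for plain `tclsh`, but tcl installs e.g. tclsh8.6
    with working_dir(prefix.bin):
        os.symlink('tclsh{0}'.format(self.version.up_to(2)), 'tclsh')
```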
- Redid the code for setting the itac symlink for the cluster edition.
- Removed the *PATH variables for MPI to avoid a conflict with other MPI
environment modules.
- Added missing test for `+all` when checking variants.
Set up the environment for the Intel compilers and tools. This commit
does the following:
- Unset variables that were incorrect from the auto guess prefix
inspections.
- Add a RemovePath environment_modifications_formats for dotkit.
- Set the module environment variables appropriate for the different
variants.
- Change the component logic so that the '+all' variant works. It was
getting split by letter and leaving COMPONENTS empty.
- Added a variant checking function.
- Added NONRPM_DB_DIR to the silent.cfg so that the product database
goes to the installation directory.
- With the product database in prefix the code to remove the product
database file from the home directory is no longer needed and was
removed.
- Reformat the 'tools' variant description.
There are probably more variables needed for the '+tools' for the
'professional' product version but I do not have access to that.
* upstream/develop: (126 commits)
Fix indent/flake8 error.
openexr : Add new package
Set environment variables
Added gnu packages datamash, parallel, and screen
added package as argument to setup_platform_environment
ilmbase : Add new IlmBase package
Documented linker default
fixed flake errors
removed commented-out code
Set default link type to dynamic on cray. Includes hooks for platform-based environment changes
fixed flake errors
fixed flake errors
Improved cray_xc detection bug fix
Improved cray_xc detection
remove FIXMEs
Ensure that pre-4.4.1 NetCDF doesn't use HDF5 1.10
Re-ignore licenses directory
Add "default" configuration scope.
Draft CDO
Make frontend OS on Cray machines a proper linux distro.
...