* Fix tau installation issue: setup_environment() is
called before the install phase, when 'Makefile.*' doesn't
exist yet (causing a 'list index out of range' error).
* Added detailed comment suggested by @alalazo
* November 1 seems to have brought a new texlive release, updating the
digest to match.
Also switching the url from their automagic mirror to an explicit
site to avoid inconsistencies during their updates.
It seems like only yesterday (#2073) that I updated this....
* Add comment to url warning about mirror updates
Add a comment to the download info warning to use a
specific site rather than the mirror, to avoid wobbles
during their asynchronous updates.
* Fix typo ('to no' -> 'do not')
On macOS, brew installs /usr/local/bin/python, but the Python prefix is not /usr/local/bin.
Use the python command's sys.exec_prefix to get the correct directory, instead of the ad hoc
self.prefix used previously.
This was a bear to debug; it's been driving me nuts since I started using spack.
Since spack passes PYTHONHOME down to package builds in the environment,
it was passing a PYTHONHOME of /usr/local/bin to the PETSc build, which uses Python, so
the PETSc Python ./configure errored immediately with
ImportError: No module named site
since python could find no python modules. Todd Gamblin pointed out that my first try at fixing
this was wrong, since it assumed the spack python was the same python used to run spack. Elizabeth Fischer
suggested how to make it work with python3 as well.
Funded-by: IDEAS
Project: IDEAS/xSDK
Time: 7 hours
Thanks-to: Todd Gamblin, Elizabeth Fischer
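For illustration, a minimal sketch of the idea (the helper name is hypothetical, not the actual Spack change): ask the installed interpreter itself where it lives, rather than trusting `self.prefix`.
```
import subprocess

def real_python_home(python_exe):
    # Hypothetical helper: query the interpreter for its true exec prefix,
    # which is what PYTHONHOME should point at for dependent builds.
    out = subprocess.check_output(
        [python_exe, '-c', 'import sys; print(sys.exec_prefix)'])
    return out.decode().strip()
```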
* Update the krell institute products to use the latest features of spack for building on cluster platforms.
* Address travis error messages and resubmit the pull request.
* Update the contents of openspeedshop package.py so it passes the flake8 tests.
* Fix flake8 whitespace error in the mrnet package.py file.
* Add updates based on spack reviewer feedback.
* More fixes based on comments from reviewers. Switch from using extend to using append; remove additional setting of PATH and LD_LIBRARY_PATH that should not be required due to RPATH.
* More review related changes. Update MPIOption.append lines and take out xercesc references.
* Create a base options function for common openspeedshop base cmake options to reduce redundancies.
* Add a libxml2+python depends_on to work around issues with the libxml2 package file.
Add a package that installs the pre-built maven distribution.
I've given up, for now, on building maven from source. That process
stumbled on two points before I gave up:
1. It downloaded several hundred .{pom,jar} files and I despaired of
figuring out some way of mirroring and checksumming them; and
2. It exploded complaining about too many unacceptable license files,
which seems odd in its own source tree.
Perhaps someone with more Java fu than I admit to can figure it out.
In the meantime, this is useful.
* libsodium: add latest versions, fix old versions
older versions of libsodium were added to an "old" subdirectory
* zeromq: add 4.1.4
prerequisite for the latest develop version of flux
Pkg-config depends on glib, which depends on pkg-config. As a result,
pkg-config used to build its own internal copy of glib. However, this fails on Mac.
Building pkg-config with the internal glib is now a variant, turned on
by default, and is required in order to build glib.
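A rough sketch of how such a variant can look in a package file; the variant and flag names are assumptions, not necessarily what the pkg-config package uses:
```
from spack import *


class PkgConfig(Package):
    """Illustrative excerpt only; variant and flag names are assumptions."""

    variant('internal_glib', default=True,
            description='Build with the bundled copy of glib')

    def install(self, spec, prefix):
        # configure/make are provided by Spack during the build
        args = ['--prefix={0}'.format(prefix)]
        if '+internal_glib' in spec:
            # breaks the pkg-config <-> glib bootstrap cycle
            args.append('--with-internal-glib')
        configure(*args)
        make()
        make('install')
```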
* Tells boost explicitly about libraries and headers
Ideally, bjam would determine the libraries and headers from the
executable. But it doesn't. This rigs up a best guess for the Python libraries
and headers (see the sketch after these items).
* Move glob import to top of file
* variable name change: alllibs --> all_libs
* Use dso suffix rather than hard-coded string
* Use only MAJOR.MINOR when setting up python in bjam
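A standalone sketch of the "best guess" wiring (argument names are hypothetical): it writes the `using python` line that bjam reads from user-config.jam, passing only MAJOR.MINOR for the version.
```
def write_user_config(jam_path, python_exe, version, include_dir, lib_dir):
    # bjam wants only MAJOR.MINOR, e.g. '2.7' rather than '2.7.12'
    short_version = '.'.join(version.split('.')[:2])
    with open(jam_path, 'w') as f:
        f.write('using python : {0} : {1} : {2} : {3} ;\n'.format(
            short_version, python_exe, include_dir, lib_dir))
```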
* mumps: Add support for the Intel compiler and ensure both lapack and blas libraries are passed to the examples
Likely it was not discovered before that the examples require both lapack and blas libraries, because it
was tested with OpenBLAS, which is one large library containing everything.
Funded-by: IDEAS
Project: IDEAS/xSDK
Time: .3 hours
* flake8 fix.
* Created the initial version of the 'OpenSceneGraph' package.
* Added 'zlib' as a dependency and linked it during the build step.
* Fixed a few minor PEP8 style violations in the 'OpenSceneGraph' package.
* Added cmake as a build dependency and improved the build procedure.
* Made a few important argument updates to improve package compatibility.
* Fixed up a few remaining style issues in the 'openscenegraph' package.
* Added a description for the 'openscenegraph' package.
* Fixed a bug that was causing some 'openscenegraph@3.2.3%gcc' installs to fail.
* Fixed a number of small issues with the 'openscenegraph' package.
* Removed a number of superfluous flags from the 'openscenegraph' install.
* Add new version property to handle joined version numbers
* Add unit test for new joined property
* Add documentation on version.up_to() and version.joined
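A short usage sketch of the two properties (expected values shown as comments):
```
from spack.version import Version

v = Version('8.2.1')
v.up_to(2)   # Version('8.2')  -- the first two components only
v.joined     # Version('821')  -- new property: version with separators removed
```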
Things that accessed the cdd package, such as a `spack info cdd` run,
tripped over a buglet in the *cdd* package, causing them to exit with
something like this:
```
Caused by:
TypeError: 'str' object is not callable
File "/rss/spack/lib/spack/spack/repository.py", line 584, in get
self._instances[key] = package_class(copy)
File "/rss/spack/lib/spack/spack/package.py", line 398, in __init__
f = fs.for_package_version(self, self.version)
File "/rss/spack/lib/spack/spack/fetch_strategy.py", line 852, in for_package_version
attrs['url'] = pkg.url_for_version(version)
File "/rss/spack/var/spack/repos/builtin/packages/cdd/package.py", line 40, in url_for_version
str(version.dotted()).replace('.', ''))
```
@tgamblin pointed out that `dotted` is a property, not a function call,
and that the parentheses are therefore inappropriate.
This deletes the parentheses. `spack info cdd` now works for me.
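The corrected method then looks roughly like this; the URL below is a placeholder, not the package's real download location:
```
def url_for_version(self, version):
    # `dotted` is a property, so no parentheses
    return 'https://example.com/cddlib-{0}.tar.gz'.format(
        str(version.dotted).replace('.', ''))
```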
Add package for htop, an interactive text-mode process viewer for
Unix systems. Think top, with pretty colors and dynamic bar graphs.
More info [here](https://github.com/hishamhm/htop).
* Made CGAL's optional dependencies actually optional.
* cgal: Added note explaining that the CORE library is not the same as core CGAL functionality.
* Bug fix and flake8
* flake8
* build_environment: allow compilers to set up an environment
* clang: mock up a toolchain directory for xcode
Some projects ignore CC and CXX flags and instead use xcode to find the
toolchain. Clang on Apple should set up the environment properly.
Arguably, every compiler could do this on Apple, but let's see how this
works out just for AppleClang for now.
The Documentation directory is ~1.7G and the excluded platforms add up
to about 7G. Ignoring swift saves another 500M. The resulting Xcode.app
copy is in the 2G range.
* compiler: set member variables early
This is required so that later methods can query things such as the
version of the compiler.
* compiler: support finding the real path of the compiler
On Apple, the /usr/bin compilers are actually wrapping tools themselves
which query xcrun for the currently selected Xcode installation. Pierce
this veil and get the real, full path to the underlying compilers
instead.
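The underlying query can be illustrated with a small standalone sketch (not the actual Spack implementation):
```
import subprocess

def real_compiler_path(tool='clang'):
    # /usr/bin/clang is only a shim; xcrun reports the real binary inside
    # the currently selected Xcode toolchain.
    return subprocess.check_output(['xcrun', '-f', tool]).decode().strip()
```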
* icu4c: install with rpath
On macOS, icu installs with a library ID of the library name. Enabling
rpath makes its ID its full installed path which lets Qt5 link against
it successfully.
* qt: no -no-gtkstyle flag on Qt5 on macOS
* First version of Abinit package
* Ignore *.swp files
* Add libxc, etsf_io packages
* AtomPaw package
* Make Abinit depend on mpi@2: and external version of libxc, netcdf, hdf5, etsf_io
* etsf_io: install Fortran modules in prefix.include
* Remove etsf_io from abinit requirements
* Add libxc2.2.1 (required by Abinit and atompaw)
* Cleanup
* Run make check
* Cleanup
* Use ld_flags instead of hard-coded libs, fix pep8, add copyright
* Put scalapack before lapackblas
* Added support for the 'maxdims' and 'maxvars' flags for 'NetCDF'.
* Added the '+mpi' variant and improved dependencies for 'exodusii'.
Improved the 'exodusii' package so that it's less reliant on patches.
* Added better type checking to variant values in the 'netcdf' package.
* Corrected the required CMake version for the 'exodusii' package.
* Fixed the dependencies of the '+mpi' variant of the 'exodusii' package.
* Updates to Mesa and other Xorg packages
* Add packages for all Xorg Protocol extensions
* Add packages for first half of Xorg libraries
* Add packages for remaining Xorg libraries
* Add packages for all Xorg utilities
* Add packages for Xorg documentation tools
* Add build deps to Xorg protocol headers
* Add packages for XCB
* Add build deps to Xorg libraries
* Add build deps to Xorg utilities
* Add packages for Xorg fonts and font-related utilities
* Change font deptype from build to default
I wasn't sure which deptype was appropriate at first since none of
the packages are actually linked together. I initially chose the
build deptype for this reason. However, the font packages don't
install into their own prefix. They install into font-config. If
font-config is a build dependency, that means you can uninstall it
without uninstalling the font packages, which wouldn't make sense
since they install into font-config. So I switched them back to
the default deptype.
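For reference, the difference is only the `type` argument to `depends_on`; a hedged sketch with an illustrative package name:
```
# inside a package class body (package name is illustrative):
depends_on('font-util', type='build')   # build-only: uninstallable afterwards
depends_on('font-util')                 # default deptype: build and link
```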
* Minor formatting changes to ncview
* Add half-way done xorg-server package
* Add packages for Xorg test suites, not yet tested!
* Add packages for Xorg data
* Add first quarter of Xorg apps
* Add more packages for Xorg apps
* Add dependencies to mesa
* Remove comments from mesa package
* Flake8
* Add more packages for Xorg apps
* Add more packages for Xorg apps
* Add more packages for Xorg apps
* Add more packages for Xorg apps
* Add more packages for Xorg apps
* Add package for Sublime Text
* Add packages for remaining Xorg apps
* Revisit testing packages, add missing dependencies
* Add dependencies, clean up FIXMEs
+ This change fixes a problem that manifests when trilinos is built against a
MKL installation defined as an external package. In this scenario, the MKL
libraries are found one directory deeper than for the case where spack
provides MKL. The extra directory is a platform name like 'intel64'.
+ The changes in this PR were recommended by contributor @davydden. I
implemented and tested with intel@16.0.3. These changes fix the issue I
reported. I did not attempt building trilinos against other BLAS
implementations.
+ fixes #1923
* mfem: add tarball extension
Add tarball extension as a result of a feature added in PR#1926, which
fixes earlier issues in this PR (PR#1202). Prior to adding this feature,
Spack would not autodetect the extension of the tarball downloaded from
the redirected, shortened Google URL, requiring a messy hack. This hack
worked for mfem version 3.1, but led to errors when adding mfem version
3.2 because the files downloaded from Google did not contain the package
name, version number, or extension. Adding the extension enables Spack
to rename the tarball downloaded from Google to a sensible name that is
compatible with its filename parsing algorithms so that Spack "does the
right thing" (detects that the file is a GZipped tarball, decompresses
it, runs GNU Make) in fetching and staging the package.
* mfem: add linkage to KLU & BTF
Add linkage to the KLU & BTF solvers, which are now enabled in MFEM for
versions 3.2 and later.
* mfem: Add superlu-dist variant
Add linkage to SuperLU_DIST, which is a new linear solver interface for
MFEM versions 3.2 and later.
* mfem: add netcdf variant for cubit mesh support
Add NetCDF variant for MFEM versions 3.2 and later; installing the
NetCDF interfaces enables CUBIT mesh support.
+ Cray compile wrappers are MPI wrappers.
+ Packages that need to be compiled with MPI compile wrappers normally use
'mpicc', 'mpic++' and 'mpif90' provided by the MPI vendor. However, when using
cray-mpich as the MPI vendor, the compile wrappers 'CC', 'cc' and 'ftn' must
be used.
+ In this scenario, the mpich package is hijacked by specifying cray-mpich as an
external package under the 'mpich:' section of packages.yaml. For example:
packages:
  mpich:
    modules:
      mpich@7.4.2%intel@16.0.3 arch=cray-CNL-haswell: cray-mpich/7.4.2
    buildable: False
  all:
    providers:
      mpi: [mpich]
+ This change allows packages like parmetis to be built using the Cray compile
wrappers. For example: 'spack install parmetis%intel@16.0.3 ^mpich@7.4.2 os=CNL'
+ This commit relies on the existence of the environment variable CRAYPE_VERSION
to determine if the current machine is running a Cray environment. This check is
insufficient, but I'm not sure how to improve this logic.
+ Fixes #1827
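The environment check mentioned above amounts to roughly the following (illustrative sketch; as noted, the heuristic is admittedly insufficient):
```
import os

def on_cray():
    # Heuristic only: assume a Cray programming environment when the
    # CRAYPE_VERSION variable is set.
    return 'CRAYPE_VERSION' in os.environ
```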
* fix blas-lapack in scipy and numpy
* py-numpy: do not set rpath on macOS
* py-scipy: do not set Blas/Lapack. This appears to be picked up from py-numpy
* py-numpy: don't write rpath= (on Sierra only)
* py-numpy: add a link to build notes
* Updated nettle to have m4 as an immediate dependency to match new PATH
construction logic which only includes immediate dependencies.
* Update package.py
* Added check for ppc64le because configure can't guess the build type for RHEL on ppc64le
Expand/clarify description of dependency types
Refactored getting the arch so that it's less verbose
* Fixed flake8 issues
* everytrace: New package
Everytrace ensures that a stack trace is obtained every time a program exits, for whatever reason.
* everytrace: Change CMake to build dependency
* Renamed to everytrace-example, flake8 and copyright issues.
* flake8
* Missing type=build
* Warn user if flake8 can't find setuptools
* Add missing dependencies of flake8
* Updates to py-autopep8, make packages activateable
* Check for presence of setuptools for Sphinx too
* Fix bug in order of commands
* Adding the latest version of fenics and making trilinos unambiguous on hdf5
* forcing fenics to ignore hdf5 cxx
* Adding deptypes and correcting the hdf5 patch
* flake8 corrections
* cleaning some useless code
* Provide new versions of llvm.
+ Provide file list and md5 hashes for 3.8.1 and 3.9.0.
+ Clean up indentation for the 'releases' data structure to improve
consistency.
* Adding a block of code to the 'resources' structure for cfe.
* Merge cfe and clang resources into single entity.
* Added hadoop, spark, and variant spark+hadoop
* Docstrings, dependency types, urls, copyright
* Flake8 fixes, link dependency for hadoop
* Build type for spark, env problem setting JAVA_HOME
* New package: libquo
libquo is a high-level, easy to use programming interface tailored specifically
for MPI/MPI+X codes that may benefit from evolving process binding policies
during their execution. QUO allows for arbitrary process binding policies to
be enacted and reverted during the execution of an MPI/MPI+X application as
different computational phases are entered and exited, respectively.
https://github.com/losalamos/libquo
* Remove use of 'which' and fix style non-conformance.
* Add libint package
* Add Intel optimization flags recommended by CP2K
* Add new version and Intel compiler optimization flags for libxc
* Add older version of libint
* Libint depends on GMP C++ library
Pro tips from @adamjstewart:
* line too long in package description
* name it grib-api instead of grib_api
* depend on netcdf without reference to unnecessary constraints
* py-pil: Does not build with Python3.
* Set py-pillow to be the default pil provider
* Update package.py
* Changes to address comments from @adamjstewart
* Remove version constraint from extends(), avoid a Spack bug.
* dealii: add missing python dependency
* boost: fix a bug which broke it on macOS with clang+gfortran
Boost was using the gcc compiler instead of clang++, which led to
cryptic 'Undefined symbols' linking errors for boost::python::objects::function_object()
when building other packages against boost+python.
* boost: add exceptions for intel
* boost: use spack_cxx
* Turned <provider>_libs into an iterable
Modifications (see the sketch after this list):
- added class LibraryList + unit tests
- added convenience functions `find_libraries` and `dedupe`
- modified non-Intel blas/lapack providers
- modified packages using blas_shared_libs and similar functions
* atlas : added pthread variant
* intel packages : added lapack_libs and blas_libs
* find_library_path : removed unused function
* PR review : fixed last issues
* LibraryList : added test on __add__ return type
* LibraryList : added __radd__ fixed unit tests
fix : failing unit tests due to missing `self`
* cp2k and dependencies : fixed blas-lapack related statements in package.py
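A hedged sketch of how a provider can use the new interface; the exact `find_libraries` signature and import location have varied across Spack versions, so treat the keyword names as assumptions:
```
from spack import *


class Openblas(Package):
    """Illustrative excerpt only."""

    @property
    def blas_libs(self):
        # returns a LibraryList; callers can ask it for ld_flags, names, etc.
        # (keyword names here are assumptions)
        return find_libraries(['libopenblas'], root=self.prefix,
                              shared=True, recursive=True)
```
Consumers can then build their linker arguments from the returned LibraryList (for example via its ld_flags) instead of hard-coding library names.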
Python will look to link with libncursesw in preference to libncurses. Since
ncurses in spack is built without suffixes there is no libncursesw and
python will link to the system libncursesw for _curses.so and
_curses_panel.so, as well as libpanelw for _curses_panel.so.
This PR introduces a patch that simply removes the check for ncursesw in
setup.py and therefore sets `curses_library` to `ncurses`.
* Added missing libtiff dependency
* added dependency on fontconfig
* Added version 2.2.3
* use autotools rather than cmake
The cmake build was not producing a complete install.
* There was no versioning of the installed libraries.
* gdlib-config was missing
* pkgconfig directory was missing
These problems do not happen when built with autotools.
yt is a python package for analyzing and visualizing volumetric, multi-resolution data from astrophysical simulations, radio telescopes, and a burgeoning interdisciplinary community.
* Fixed a few small bugs in the 'git-lfs' install script.
* Fixed a bug in the '(go|go-bootstrap)@1.4.2+test' install scripts.
* Fixing a minor style issue in the 'git-lfs' install script.
I encountered an HPC system where PETSc's configure stage does not find a valid `cpp` (C preprocessor). Explicitly pointing to Spack's `cpp` wrapper resolves the problem.
* Fixed a bug that was causing post-install METIS tests to fail.
* Improved the patching procedure used in the 'metis' install script.
* Enabled patch skipping for the 'metis' and 'parmetis' packages.
* Fixed some minor style issues in the 'parmetis' package.
* Improved the 'metis' test fix and added 'run_tests' support.
* Refactored and reorganized the 'zoltan' install script.
* Fixed a few bugs with the '+mpi' and '+fortran' variants of 'zoltan'.
* Reintroduced and improved the '+shared' variant for the 'zoltan' package.
* Improved compatibility with different MPI providers for 'zoltan'.
Includes :
- treatment of a generic hierarchy (i.e. lapack + mpi + compiler)
- possibility to specify which compilers are to be considered Core
- correct treatment of the 'family' directive
- unit tests for most new features
- add new version 2.10.3
- explicitly require hdf5 variant ~mpi, since we don't know how to build with mpicc
- explicitly disable glew since it may not be installed
* Added py-proj package
* Added bug-fix patched version to the package.
* Removed dependency of py-proj on proj.
* py-proj: Added missing dependency
* py-proj: Removed versions from forked repos, now that necessary bug fixes have been merged into the main repo.
* Update package.py
Added a blank line that Travis wanted.
* 1. Added copyright
2. Used setup_py
3. Added type='build' for dependencies.
* Qthreads: Switch back to using tarball for download
* Don't require autotools any more
* Re-enable autotools
* Remove autotools again
* Use .tar.bz tarball; remove outdated code
* New package h5hut -- High-Performance I/O Library for Particle-based Simulations
* Set up MPI compilers
* Add version 1.8.12 to HDF5
* Correct Sphinx error
+ Starting with version 2.0, OpenMPI no longer provides C++ bindings by default
(libmpi_cxx.so). Add a configure option to instruct the build to also build
and install libmpi_cxx.so.
+ This MPI feature is needed by at least one spack package (moab).
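In package terms this is just a variant that toggles a configure flag; a rough sketch (the variant name is assumed, `--enable-mpi-cxx` is OpenMPI's usual switch):
```
# inside the openmpi package class (variant name assumed):
variant('cxx', default=False,
        description='Build and install the MPI C++ bindings (libmpi_cxx)')

# ...and when assembling the configure arguments:
if '+cxx' in spec:
    config_args.append('--enable-mpi-cxx')
```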
* add pango dependency
* add new package ghostscript-fonts & add to ImageMagick as dependency
also tell ImageMagick's configure where the font dir is!
* refactor to fix flake8
* add homepage to ghostscript-fonts
* use install_tree
* remove unneeded import
lzo's download server does not present a valid certificate, so downloads via https fail. Spack's MD5 checksum still ensures a safe download.
Closes #1675.
@adamjstewart
```
think you'll find that if you try running something like:
spack spec libsplash ^hdf5@1.8.15
It will complain that libsplash does not depend on hdf5.
This is a bug in Spack's dependency resolution. A workaround
for this is to tell it to always depend on hdf5.
```
@davydden
```
to expand on @adamjstewart comment, spack will make a union
of dependencies,
i.e. hdf5@1.8.6: + hdf5+mpi = hdf5@1.8.6:+mpi, that's why it works.
```
thank you for the hint!
Adds a package for
[PNGwriter](https://github.com/pngwriter/pngwriter/),
a simple high-level C++ png API used in scientific projects.
```
PNGwriter is a very easy to use open source graphics library that
uses PNG as its output format. The interface has been designed to be
as simple and intuitive as possible. It supports plotting and reading
pixels in the RGB (red, green, blue), HSV (hue, saturation,
value/brightness) and CMYK (cyan, magenta, yellow, black) colour
spaces, basic shapes, scaling, bilinear interpolation, full TrueType
antialiased and rotated text support, bezier curves, opening existing
PNG images and more.
```
PNGwriter is a dependency for [PIConGPU](http://picongpu.hzdr.de),
an open-source, many-core, fully-relativistic particle-in-cell code
and further software developed at
[Helmholtz-Zentrum Dresden - Rossendorf](https://www.hzdr.de).
Adds a package for
[libSplash](https://github.com/ComputationalRadiationPhysics/libSplash),
a high-level library around serial and parallel HDF5 for regular
grids and particle data sets.
```
libSplash aims at developing a HDF5-based I/O library for HPC
simulations. It is created as an easy-to-use frontend for the
standard HDF5 library with support for MPI processes in a cluster
environment. While the standard HDF5 library provides detailed
low-level control, libSplash simplifies tasks commonly found in
large-scale HPC simulations, such as iterative computations
and MPI distributed processes.
```
libSplash is a dependency for [PIConGPU](http://picongpu.hzdr.de),
an open-source, many-core, fully-relativistic particle-in-cell
code and further software developed at
[Helmholtz-Zentrum Dresden - Rossendorf](https://www.hzdr.de).
libSplash builds in two versions: one without MPI, writing
domain-decomposed posix-style HDF5 files per process, and one
(default) with MPI and MPI-I/O ("parallel HDF5") support,
aggregating into a single file per MPI communicator.
libSplash is used in conjunction with
[openPMD](http://openPMD.org), see also
[github.com/openPMD/](https://github.com/openPMD/).
This PR updates the ADIOS package.
**Changes:**
- add latest stable release `1.10.0`
- add previous versions (hashes)
- add default license header
- add build options (shamelessly taken from HDF5 package)
- add validation for existing FC (as in HDF5) and make optional
- handle mxml dependency correctly (not required in 1.10.0+)
- add `CFLAGS=-fPIC` to build shared (python) libs in ADIOS' lib
- remove `-DMPICH_IGNORE_CXX_SEEK` since it is normally not required
- remove `MPICC/CXX/FC` since `--with-mpi` alone works well
- add transforms:
- `zlib`: useful (optional) default
- `szip`: optional (compile often broken)
- add transports that are not as performant as the `.bp` format:
- `hdf5`: non-default
- `netcdf`: non-default, closes #1610
* opencoarrays: new package
* opencoarrays: remove tests from install due to (unimportant) failures in some configurations
* opencoarrays: fix flake8 errors
- use two empty lines before `class`
- change version numbering scheme for packages, use `url_for_version` to make things work
- specify dependency types
- add comment about temporarily moved download location
- update two packages to newer versions
py-cffi's .so was being built without the rpath being set. distutils
looks at LDSHARED to decide which compiler to use to build the final .so.
Since it was not set, distutils fell back to the system-provided
compiler. Setting it forces the shared library to be compiled with the
spack compiler (meaning that the rpath is set correctly).
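A hedged sketch of the kind of change described; `spack_cc` is the compiler-wrapper path Spack injects into package modules during a build, and the exact plumbing in the real package may differ:
```
# inside the py-cffi package class:
def setup_environment(self, spack_env, run_env):
    # distutils uses LDSHARED as the link command for extension modules;
    # pointing it at Spack's compiler wrapper gets the rpath baked in.
    spack_env.set('LDSHARED', '{0} -shared'.format(spack_cc))
```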
Patch doesn't work with @when unless you specify a patch for every
version. When running `spack patch` for a version without a patch,
spack thinks that a patch exists, tries to apply it, but it doesn't
exist. Spack gets very confused.
The Lmod author changed the src so that it uses the tclsh (and shared
libraries) discovered at configure time. He did it differently than I
did in this patch, but his changes solve our problem too, so...
This commit changes the git package to depend_on('perl'). The system
perl is not always sufficient to install git (e.g. a CentOS7 system with
the development tools group installed has perl but not the
ExtUtils::MakeMaker package that git needs) and one can't always update
the system's perl.
This PR depends_on PR #1339, which adds a perl package to spack.
Update the samtools package to support v1.3.1, which
- now uses configure script; and
- now depends on external htslib package.
The dependency on mpc seems to have been bogus: it's never linked in,
nor is it mentioned in the source tree. I *do* have a version in
/usr/lib64, but ldd does not show it being linked in either....
By depending on 'ncurses' I can do away with the need for the patch.
- It's not really a circular dependency -- git is a run dependency of gettext
- We can revert this change when Spack is smart enough to make git a run
dependency and build it.
+ Always depend on the gettext package. This simplifies the logic and I no
longer need to 'import sys'
+ Only apply the patch for the older version of glib.
Add py-meep package and dependencies
Merging to add the gettext support, will submit a separate issue for the LD_LIBRARY_PATH issue with MPI and py-meep
Before I learned that I was stumbling over a real bug (#1308), I thought
I needed to arrange for the fetcher to skip the unpack step.
This commit removes the no-longer-needed `def unpack`.
Use the resource machinery to fetch/cache/unpack/... the App::cpanminus
tarball.
- this hardcodes the version, I can't figure out how to use a variant to
hold/set the value and access it in the resource().
- change up the install to use the `with working_dir()` meme.
Make running perl's tests conditional; one must now specify the
`--run-tests` flag to the `spack install` command in order to run the
tests.
On one system (8 core, 16GB Digital Ocean Droplet), installing without
tests takes 3 minutes, with tests takes 16 minutes.
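The conditional test run maps onto Spack's `--run-tests` support roughly like this (sketch only; the Configure arguments shown are illustrative, not perl's full set):
```
# inside the perl package class:
def install(self, spec, prefix):
    configure = Executable('./Configure')
    configure('-des', '-Dprefix={0}'.format(prefix))
    make()
    if self.run_tests:   # only when `spack install --run-tests` is given
        make('test')
    make('install')
```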
Rather than hard-coding the version of `cpanm` that's [optionally]
installed into the core, make it a variant with a default value of
'1.7042'.
Also discovered that `prefix + 'bin'` is the same as `prefix.bin`, so
embetter that bit of code.
Add perl package, based on [work by
justintoo](https://github.com/LLNL/spack/pull/105). He had too many
things pulled into that pull request, this just adds a perl package.
Support the current releases on the past three minor branches.
Run perl's tests before installing.
Install cpanm into the core (makes building on top of this perl *much*
simpler). Controlled by a variant.
I cargo culted that from my *nextflow* package. I [thought I] needed it
to work around Spack trying to use tar to unpack something that was
neither a tar ball nor unpackable.
This package works fine without it. In retrospect, the error that I was
seeing in the *nextflow* package was probably this problem #1308.
Add a package for [ack](http://beyondgrep.com/install/). Simply install
the fatpacked script.
It uses '#!/usr/bin/env perl' and is very much not choosy about what
perl it needs. For now just trust that there's one available, perhaps
someday we can/should uncomment the depends_on('perl').
Follows the methodology I used in nextflow. Has the same
uninstall/install problem that nextflow has, there is an issue in
progress for that: https://github.com/LLNL/spack/issues/1308.
Tested on CentOS7.
r-jsonlite 0.0.21 -> 1.0
r-mime 0.4 -> 0.5
rcpp 0.12.5 -> 0.12.6
CRAN is funny. The older versions of these packages are still available
in package specific directories but the current version is not there, so
I don't see any way to make the older versions work.
This is my first cut at a package to support nextflow. It's also my
first package. It works, but has issues. I'm going to submit a pull
request and get some coaching on how to deal with it.
One issue in particular: if I install, then uninstall, then try to install
again (which uses the cached copy of the "distribution file"), it
explodes.
WIP: I started trying to build gtkplus@3.20, but this package has many more
dependencies than v2 and it requires newer versions of existing packages. This
commit provides updates for 5 packages that are required by GTK+3. This is not
the complete set of changes required for GTK+3.
atk - move default version from 2.14 -> 2.20.
glib - move default version from 2.42 -> 2.49
- v2.49 requires pcre+utf as a new dependency.
pcre - if variant +utf is selected, add '--enable-unicode-properties' to the
configure options.
libepoxy - new package to spack
- manages OpenGL function pointers.
pango - move default version from 1.36 -> 1.40
Add a package for [tree](http://mama.indstate.edu/users/ice/tree/).
It has a Makefile that hardcodes a prefix and some CFLAGS. Used
filter_file to:
- set the make variable *prefix* to `prefix`; and
- comment out their CFLAGS, just use ours....
It installs, runs on CentOS7, and uninstalls cleanly.
With the addition of dependency types and with `py-setuptools` set as
type='build' there are more packages that need to have `py-setuptools`
added as a dependency.
This PR adds that dependency for the following packages:
- py-h5py
- py-networkx
- py-pytables
- py-scikit-image
This PR adds the `nolink` dependency type to r- package dependencies.
This is needed due to the new dependency types in Spack. A couple of
packages were updated with new versions as well.
This commit introduces a mechanism (sketched below) to ensure that R package dependencies
are built with the Spack compiler wrapper. A copy of Makeconf is made
before `filter_compilers` is called. This is then pointed to by the
R_MAKEVARS_SITE environment variable set up in
`setup_dependent_environment`. With this the normal compilers are used
outside of spack and the spack wrapper compilers are used inside of
spack.
This commit also standardizes on the `join_path` call. It also sets the
commented build command to reflect what is actually used with the newer
string formatting.
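A minimal sketch of the R_MAKEVARS_SITE mechanism described above (the file location and name are assumptions):
```
# inside the R package class:
def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
    # Point R at the saved copy of Makeconf that still references the
    # Spack compiler wrappers; the path below is illustrative.
    makeconf = join_path(self.prefix, 'rlib', 'R', 'etc', 'Makeconf.spack')
    spack_env.set('R_MAKEVARS_SITE', makeconf)
```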
A while ago I was asked to convert packages to all lowercase. That was
done but some dependencies did not get converted in the specification.
This commit fixes that as well as a couple of urls that need to be made
explicit and a missing dependency on jdk.
- OpenSSL no longer checks remote versions on the openssl site.
- Spack is used on systems that aren't connected to the internet, and
this check is probably in the wrong place and affects too many
commands. We can work on figuring out a better, more configurable
place to put a check like this.
Lmod's configure script goes to the trouble of finding
tclsh. This change uses that info to rewrite the #! lines
in the tcl scripts so that they call the tclsh that the
configure script discovered.
It needs to massage the existing shebang lines into something
that the sed statement in the makefile can manipulate and
it needs to add the path_to_tclsh info into the set of sed
statements.
Checked with versions 6.4.1 and 6.3.7 (the checksum for 6.0.1 is
incorrect, a fix for another time).
The lmod package needs a tclsh. Up until now it just assumed
that one was available on the system.
This change adds a depends_on('tcl') to the lmod package.
The tcl package installs a tclsh script with an embedded version
number (e.g. tclsh8.6) but the lmod configuration looks for tclsh.
This change extends the tcl package to symlink tclshX.Y to tclsh in
the tcl package bin directory.
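A sketch of the symlink step (the hook name is an assumption; the same thing could be done at the end of `install()`):
```
import os

# inside the tcl package class (the `run_after` hook name is an assumption):
@run_after('install')
def symlink_tclsh(self):
    # tcl installs e.g. bin/tclsh8.6, but lmod's configure looks for `tclsh`
    with working_dir(self.prefix.bin):
        os.symlink('tclsh{0}'.format(self.version.up_to(2)), 'tclsh')
```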
- Redid the code for setting the itac symlink for the cluster edition.
- Removed the *PATH variables for MPI to avoid a conflict with other MPI
environment modules.
- Added missing test for `+all` when checking variants.
Set up the environment for the Intel compilers and tools. This commit
does the following:
- Unset variables that were incorrect from the auto guess prefix
inspections.
- Add a RemovePath environment_modifications_formats for dotkit.
- Set the module environment variables appropriate for the different
variants.
- Change the component logic so that the '+all' variant works. It was
getting split by letter and leaving COMPONENTS empty.
- Added a variant checking function.
- Added NONRPM_DB_DIR to the silent.cfg so that the product database
goes to the installation directory.
- With the product database in prefix the code to remove the product
database file from the home directory is no longer needed and was
removed.
- Reformat the 'tools' variant description.
There are probably more variables needed for the '+tools' for the
'professional' product version but I do not have access to that.
* upstream/develop: (126 commits)
Fix indent/flake8 error.
openexr : Add new package
Set environment variables
Added gnu packages datamash, parallel, and screen
added package as argument to setup_platform_environment
ilmbase : Add new IlmBase package
Documented linker default
fixed flake errors
removed commented-out code
Set default link type to dynamic on cray. Includes hooks for platform-based environment changes
fixed flake errors
fixed flake errors
Improved cray_xc detection bug fix
Improved cray_xc detection
remove FIXMEs
Ensure that pre-4.4.1 NetCDF doesn't use HDF5 1.10
Re-ignore licenses directory
Add "default" configuration scope.
Draft CDO
Make frontend OS on Cray machines a proper linux distro.
...