* Add bowtie2@2.3.0, fix dependencies and sbangs
Add support for bowtie2@2.3.0
- digest
- a patch for 2.3.0 that parallels the existing one. Truth be
told, it builds (for me) without this, but I'm assuming the patches
are there for a reason.
- tune up dependencies
- need tbb
- don't need readline or zlib
Several things were installed with sbang lines that use `/usr/bin/env` to
find perl or python. Fix the dependencies and clean up the sbang lines.
* Fix python exe name, avoid path banging
I'd cut and pasted the python bit from the perl bit and missed one
reference to perl.
While I'm there, use the cleaner `spec['perl'].command` instead of
banging together the path from its bits.
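A minimal sketch of the pattern (the script name below is illustrative, not the actual bowtie2 file list): rewrite an installed script's interpreter line with the dependency's own interpreter instead of a hand-built path.
```python
# Sketch only: fix up an installed script's interpreter line.
filter_file('^#!/usr/bin/env perl',
            '#!{0}'.format(spec['perl'].command),
            join_path(prefix.bin, 'bowtie2-inspect'))
```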
* Fix up the "when" constraints on the dependencies
Get the edge cases right.
- 2.2.5 doesn't need tbb, 2.3.[01] do.
- 2.3.1 needs readline and zlib.
The Go team released 1.9.2, which includes fixes for some things
that 1.9.1 broke:
> ... include fixes to the compiler, linker, runtime, documentation, go command, and the crypto/x509, database/sql, log, and net/smtp packages. They include a fix to a bug introduced in Go 1.9.1 and Go 1.8.4 that broke "go get" of non-Git repositories under certain conditions.
* Exodus: skip the -G "Unix Makefiles" part
The problem is that spack passes -G "Unix Makefiles" to cmake, which normally
works. In the Exodus package, however, the arguments go through a bash wrapper
script first. Inside the script, an unquoted $@ loses the fact that
"Unix Makefiles" is a single argument and effectively passes -G Unix Makefiles
(without quotes) to cmake, so cmake only sees -G Unix and fails. This is a
known bash pitfall with no simple fix. As a workaround, this patch
skips the first two arguments, i.e., -G and "Unix Makefiles", which makes the
build work.
Fixes #5895.
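Roughly what the workaround looks like in package terms; this is a sketch under assumptions (the wrapper script name and the install flow are illustrative, not the exact Exodus recipe):
```python
def install(self, spec, prefix):
    # std_cmake_args starts with ['-G', 'Unix Makefiles']; the bash
    # wrapper re-splits "$@", so skip those two and pass the rest.
    cmake_wrapper = Executable('./cmake-exodus')  # illustrative name
    cmake_wrapper(*std_cmake_args[2:])
    make()
    make('install')
```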
* Port exodusii to cmake
The cmake options were taken from the cmake-exodus bash script and ported to
spack directly.
* Use variant forwarding to forward the 'mpi'
Now instead of

    spack install exodusii~mpi^netcdf~mpi^hdf5~mpi

one can just use

    spack install exodusii~mpi
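A sketch of what the variant forwarding amounts to in the recipe (abbreviated; the exact dependency list may differ):
```python
variant('mpi', default=True, description='Enable MPI support')

# Forward the mpi setting to the dependencies that also carry it.
depends_on('netcdf+mpi', when='+mpi')
depends_on('netcdf~mpi', when='~mpi')
depends_on('hdf5+mpi', when='+mpi')
depends_on('hdf5~mpi', when='~mpi')
```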
* sw4lite: fix build errors and add variants
* sw4lite: change linking against blas and lapack
* change order of blas and lapack
* satisfy flake8 requirements
* Update package.py
* Add the custom paraview lib directory structure to the library paths in the paraview module file.
* Fixing flake8 issues.
* Checking if lib64 exists for paraview module file generation, else use lib (see the sketch below).
* Fixing more flake8 problems I introduced.
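A hedged sketch of the module-file change; the versioned subdirectory name and the environment variable are assumptions for illustration, not the exact paraview recipe:
```python
import os

def setup_environment(self, spack_env, run_env):
    # paraview installs its libraries under a versioned subdirectory,
    # e.g. lib64/paraview-X.Y, so add that to the module's library path.
    lib_dir = (self.prefix.lib64 if os.path.isdir(self.prefix.lib64)
               else self.prefix.lib)
    run_env.prepend_path('LD_LIBRARY_PATH',
                         join_path(lib_dir, 'paraview-{0}'.format(
                             self.spec.version.up_to(2))))
```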
Since LLVM 3.9, Clang can be configured to use libc++ by default via the
CLANG_DEFAULT_CXX_STDLIB CMake variable, without having to
specify the -stdlib=libc++ option on the clang++ command line.
This commit makes clang++ use libc++ by default for LLVM 3.9 and later if the
libcxx variant is on.
Fixes #5942.
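In package terms the change boils down to something like this (a sketch, assuming the usual cmake_args hook in the llvm recipe):
```python
def cmake_args(self):
    args = []
    # Since LLVM 3.9, clang++ can default to libc++ at build time instead
    # of needing -stdlib=libc++ on every command line.
    if '+libcxx' in self.spec and self.spec.satisfies('@3.9:'):
        args.append('-DCLANG_DEFAULT_CXX_STDLIB=libc++')
    return args
```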
Chasing a performance regression led me to this change: moving away from the default optimization level gives a significant performance win. The sweet spot for zlib is apparently `-O2`; both `-Ofast` and `-O3` are slightly worse (about 3% slower than `-O2`) in my testing.
Happy to share my methodology with people so we can benchmark on a wider variety of systems.
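One way this could be wired up, as a sketch only (pinning the flag via the build environment is my assumption, not necessarily how the zlib recipe does it):
```python
def setup_environment(self, spack_env, run_env):
    # Sketch: pin zlib to -O2, the apparent sweet spot in benchmarking.
    spack_env.set('CFLAGS', '-O2')
```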
* Add package for scalpel@0.5.3
Scalpel's a bit of a mess: it expects its users to just unpack the
tarball, build it in the resulting directory, and install that
directory onto their PATH. My install step recapitulates this into
prefix.bin. The alternative was rewiring their scripts (perl), which
use `FindBin` and expect things to be located in the same dir as the
script itself.
Sigh.
Lightly tested on CentOS 7.
* Flake8 cleanup
* Additional flake8 cleanup
* Added a procedure to edit the sbangs of the parallel perl scripts.
* Specify the types of perl dependency
Adding ", type=('build', 'run')" to the dependency declaration to clarify when and how perl is required
* flake8 cleanup
The problem was that the configure script was not using spack's compiler
wrappers. We now pass the proper compiler wrapper using the CC argument
explicitly.
Fixes #5892.
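A sketch of the kind of fix described, assuming an autotools-style recipe (`spack_cc` is the compiler wrapper Spack exposes to the build):
```python
def configure_args(self):
    # Hand the Spack compiler wrapper to configure explicitly.
    return ['CC={0}'.format(spack_cc)]
```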
* r-a4: Add r dependency and update url.
* r-a4base: Add r dependency and update url.
* r-a4classif: update url and add r dependency.
* r-a4core: Update url and add r dependency.
* r-a4preproc: Update url and add r dependency.
* r-a4reporting: Update url and add r dependency.
* r-abaenrichment: Update url and add r dependency.
* r-absseq: Update url and add r dependency.
* r-acgh: Update url and add r dependency.
* r-acme: Update url and add r dependency.
* r-adsplit: Update url and add r dependency.
* r-affxparser: Update url and add r dependency.
* r-affycomp: Update url and add r dependency.
* r-affycompatible: Update url and add r dependency.
* r-affycontam: Update url and add r dependency.
* r-annaffy: Update url and add r dependency.
* r-annotate: Update url and add r dependency.
* r-annotationdbi: Update url and add r dependency.
* r-genefilter: Update url and add r dependency.
* r-mlinterfaces: Update url and add r dependency.
* r-limma: Update url and add r dependency.
* r-multtest: Update url and add r dependency.
* r-a4classif: Correct format.
* r-affycomp: Correct error.
- When you don't use wildcards, flake8 will find places where you used an
undefined name.
- This commit has all the bugfixes resulting from this static check.
Jansson builds only a static library by default, which is probably
not what most users want. Add the CMake args required to build a shared
library and enable them via a 'shared' variant that defaults to true.
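A sketch of the variant wiring; JANSSON_BUILD_SHARED_LIBS is, to the best of my knowledge, the option Jansson's CMakeLists exposes, but treat the exact flag name as an assumption:
```python
variant('shared', default=True,
        description='Build a shared library instead of a static one')

def cmake_args(self):
    return ['-DJANSSON_BUILD_SHARED_LIBS={0}'.format(
        'ON' if '+shared' in self.spec else 'OFF')]
```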
* new package: hpgmg
* removed build and changed extend() to append()
* changes based on comments
* pep8 compliant
* addressed rest of comments
* trigger checks
* changed from fe_fv to two boolean variants
* fixed compilation issues
* cleared up ambiguities in solver variants
* removed +mpi condition
* changes based on review
#5776 cleaned up the way the current working directory is
managed (less magic state).
bcl2fastq is packaged like a Russian doll: rather than an archive file
that contains the source, there's a zip file that contains a tar.gz
file that contains the source. The package definition has a bit of
extra code that unpacks the inner tarball.
That extra bit of code now needs to explicitly arrange to be in the
correct directory before it does its work.
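A sketch of what that extra code has to do now (the helper name and glob pattern are illustrative, not the exact package code):
```python
import glob

def unpack_inner_tarball(self):
    # Be explicit about where we are; the stage no longer leaves us
    # sitting in the right directory by magic.
    with working_dir(self.stage.source_path):
        inner = glob.glob('bcl2fastq2-*.tar.gz')[0]
        tar = which('tar')
        tar('-xzf', inner)
```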
* py-mpi4py: Add develop version and dependencies
- Add cython dependency for develop version
- Add explicit python dependency
* py-mpi4py: Specify 2.0.1 instead of develop for conditional dep
Perl installs a couple of config files that need to be munged so that
they don't refer to the spack compiler. These files are installed
read-only. Behind the scenes 'filter_file' moves its file to a safe
place, and tries to create a working file that is both O_WRONLY and
has the perms of the original file. On an NFSv4 filesystem, the
combination of 'r--r--r--' and O_WRONLY throws a permissions error.
This commit adds a simple context manager that temporarily makes the
files writable.
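A sketch of the idea only; the real helper in the perl package may differ in name and details:
```python
import contextlib
import os
import stat


@contextlib.contextmanager
def make_writable(*filenames):
    """Temporarily add the owner-write bit so filter_file can do its work."""
    saved = {f: os.stat(f).st_mode for f in filenames}
    try:
        for f, mode in saved.items():
            os.chmod(f, mode | stat.S_IWUSR)
        yield
    finally:
        for f, mode in saved.items():
            os.chmod(f, mode)
```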
* Add a new +clanglibcpp option for Boost
Currently, compiling Boost with clang uses libstdc++. This patch adds an optional flag to use clang's bundled libc++ instead (see the sketch below).
* Linting
Fix long lines and white space errors
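A hedged sketch of the new option; the b2 option helper and the exact flag strings are approximations, not the verbatim Boost recipe:
```python
variant('clanglibcpp', default=False,
        description='Compile with clang libc++ instead of libstdc++')

def determine_b2_options(self, spec, options):
    # ... existing option handling elided ...
    if '+clanglibcpp' in spec:
        options.extend(['toolset=clang',
                        'cxxflags=-stdlib=libc++',
                        'linkflags=-stdlib=libc++'])
    return options
```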
- Tests use a session-scoped mock stage directory so as not to interfere
with the real install.
- Every test is forced to clean up after itself with an additional check.
We now automatically assert that no new files have been added to
`spack.stage_path` during each test.
- This means that tests that fail installs now need to clean up their
stages, but in all other cases the check is useful.
1.64 had serialization issues (make_array and others) when built with
+mpi+python. It appears that those issues are fixed in 1.65.1,
so we can remove the preferred tag from 1.63.
* initial update of sundials package
* fix bugs in initial sundials update
* add xsdk cmake setup, fix generic math option, add cuda/raja Makefiles to install fixes
* Fix lapack install bug, add new conflicts, clean up formatting
* Address pull request comments and make formatting style consistent
Remove the blas variant as blas is only needed when used by an external
linear solver. Set the related CMake blas variables as needed depending
on the enabled external linear solvers.
Add minimum required CMake version.
Additional conflicts and dependencies for external libraries based
on mpi, indextype, and precision.
Fix SuperLU_MT logic to check which threading type SuperLU_MT was
configured with.
Add maintainers.
Change Sundials solver options to use an array of values.
Consistently use % for formatting.
* change triple-single quotes to single quotes
* Change indextype option to a single int64 option
* Add dmlc/mxnet packages.
* Build mxnet+cuda+opencv with GCC-4.8.5 and GCC-5.4.0.
* Build mxnet version 0.10.0, 0.10.0.post1 and 0.10.0.post2.
* Add component version constraints for mxnet 0.10.0.x.
* Go through flake8.
* Replace commit hash with commit date as package version.
* Go through Travis-CI.
* Update submodule version for 0.10.0.post2.
* Add openmp variant for dmlc-core and mxnet.
* Refine variant handling.
* Fix filter_file for dmlc-core.
* Cut long strings into multiple lines due to PEP8 requirements.
* Fix for PEP8.
* Add CUB_INCLUDE.
* Add py-mxnet: Python binding for MXNet.
* Remove distutils.dir_util.
* Add the profiler variant for mxnet.
* Add a shared variant for nnvm.
* Set USE_OPENMP to OFF by default.
* Fix flake8 errors.
* Fix flake8 issues.
* flake8 issues again.
closes #5506
The application of patches to upstream executables has been reworked
according to the suggestions of the main developer in #5506. In
particular we are not maintaining a dictionary that maps plumed
versions to the versions of patchable executables, and we are using a
non-interactive command to patch applications.
All the comments on substituting plumed at run-time do not apply here,
since we use RPATH and we want to maintain a 1:1 relationship between
the DAG hash and the plumed library used.
* Add package for aspell and ass't dictionaries
Add a package definition for aspell.
Add a handful of dictionaries to convince myself that the support for
a bunch of dictionaries works.
* Flake8 cleanup
* Use six's version of urlparse
`urlparse` is not python3 friendly. This works around it (stolen from
`.../cmd/md5.py`).
* Fix incorrect trimming regexp
* Clean up dictionary build
- more parsimonious use of `which` (`make()` has already been made)
- use `sh` instead of `bash`
* Use a helper method to generate info for variants
I figured out my issues with static methods. I *think* that this
is pythonic.
* Convert aspell to an extendable package
Convert aspell to be extendable and rework the dictionaries to be
extensions.
As it stands, there's a great deal of cut and paste in the
dictionaries; I'll abstract that out next.
The {de,}activate methods copy a great deal of code out of
package.py. Perhaps there's a better way....
* Create AspellDictPackage and use it for the dictionaries
Reduce the repeated code, pull it into a base class.
I'm confused about why 'from spack import *' wasn't more useful in the
base class.
* Oops, -de & -es should be AspellDictPackages too
* Typo: pakcage -> package
* Address some commentary
* Update copyright dates, 2016->2017
This is a simple package that drops their shell wrapper into
prefix.bin and their jar files into prefix.lib.
The approach comes from the picard package.
* Updating ag to the latest version
* Pretty by request
* Restore url to previous value
There *is* an https version available, but I've also been told
not to just update the url when adding a new version. I'm following
the latter advice and trusting security to the digest.
* Adding maven v3.5.0
Updating the package file to include a later version of maven while still signifying a preference for the older one.
* removing specific preference flag
* ncl: Fix temp directory
Currently, ncl is configured using a transient temp directory. This
leads to warnings such as this when executing ncl later on:
warning:"/tmp/ncl_ncar_xxxxxx" tmp dir does not exist or is not writable:
NCL functionality may be limited -- check TMPDIR environment variable
As this also breaks some functionality, use the system temp directory
instead (typically /tmp).
* ncl: Depend on esmf
esmf is required for some ncl scripts (such as ESMF_regridding.ncl).
* Add link dependency on xproto to xau
The libxcb build was failing like so:
```
1 error found in build log:
[ ... ]
131 checking whether to build developer documentation... yes
132 checking for doxygen... /usr/bin/doxygen
133 checking for dot... /usr/bin/dot
134 checking for CHECK... no
135 checking for XCBPROTO... yes
136 checking for NEEDED... no
>> 137 configure: error: Package requirements (pthread-stubs xau >= 0.99.2) were not met:
138
139 Package 'xproto', required by 'xau', not found
140
141 Consider adjusting the PKG_CONFIG_PATH environment variable if you
142 installed software in a non-standard prefix.
143
```
This adds a link dependency on xproto, which allows the libxcb build to
succeed.
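In the libxau recipe the change is essentially the following (dependency list abbreviated):
```python
# xproto must be a link dependency so its pkg-config metadata is visible
# to packages that link against libXau, such as libxcb.
depends_on('xproto', type=('build', 'link'))
```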
* Change more build deps to build, link
These were also necessary for emacs+X to build.
* Fix flake8 complaint
* edits to address issues where spack concretization attempts to set properties on already-installed specs
* most added checks only need to check if the spec is concrete; they don't also need to check if the package is installed
* add test to ensure that patches are not applied to an installed spec
* add test to ensure that an error is detected when a dependent requests a dependency constraint which conflicts with a requested installed dependency
* Add package for multitail@6.4.2
Lightly tested on CentOS 7.
* Respond to feedback/comments
Use `install_targets` to specify PREFIX= and DESTDIR= instead of
hacking away at the Makefile. Expand commentary about "Why?".
Use `headers.include_flags` and `libs.ld_flags` to avoid explicitly
setting `-I` and `-L` when hacking away at the Makefile.
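A sketch of the install_targets approach (values illustrative; the point is to drive `make install` rather than patch the Makefile):
```python
@property
def install_targets(self):
    return ['install', 'PREFIX={0}'.format(self.prefix)]
```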
* Added exasp2 spackage
Added spackage for exasp2 proxy app
* Fixed MPI in ExaSP2
Explicitly disabled MPI when not enabled.
Set the MPI variant's default as per Spack standards
* Generalized BML Passing for ExaSP2
* Modified to follow spack rules on blas
Fortunately I was able to modify the exasp2 build system to support the spack
model for blas and lapack requirements. No guarantee is made for support
of anything other than the originally supported libraries.
* Fixed flake8 error
Explicitly set the zlib path for libpng's configure.
fixes:
```
[ ... ]
92 checking for memset... yes
93 checking for pow... no
94 checking for pow in -lm... yes
95 checking for clock_gettime... yes
96 checking for zlibVersion in -lz... no
97 checking for z_zlibVersion in -lz... no
>> 98 configure: error: zlib not installed
```
This is a partial fix for #5564.
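A sketch of the fix, assuming an autotools-style recipe: point configure at Spack's zlib explicitly rather than trusting the ambient environment.
```python
def configure_args(self):
    zlib = self.spec['zlib'].prefix
    return ['CPPFLAGS=-I{0}'.format(zlib.include),
            'LDFLAGS=-L{0}'.format(zlib.lib)]
```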
This package used to trust that `configure` would discover `gmp` from
its environment.
It's safer to tell it where to find `gmp` explicitly.
This commit does that by adding a configure_args() that passes a
`--with-gmp=...` argument to configure.
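Essentially, as a sketch:
```python
def configure_args(self):
    return ['--with-gmp={0}'.format(self.spec['gmp'].prefix)]
```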
* Added support for BML+mpi variant
Added support for the BML+mpi variant. Currently restricted to the master
(develop) branch pending release of the next bml tag.
* Update package.py
Removing redundant statement
* Update package.py
Added explicit disabling of MPI when not requested
- A package can depend on a special patched version of its dependencies.
- The `Spec` YAML (and therefore the hash) now includes the sha256 of
the patch, so patching a dependency changes that dependency's hash.
- The special patched version will be built separately from a "vanilla"
version of the same package.
- This allows packages to maintain patches on their dependencies
without affecting either the dependency package or its dependents.
This could previously be accomplished with special variants, but
having to add variants means the hash of the dependency changes
frequently when it really doesn't need to. This commit allows the
hash to change *just* for dependencies that need patches.
- Patching dependencies shouldn't be the common case, but some packages
(qmcpack, hpctoolkit, openspeedshop) do this kind of thing and it
makes the code structure mirror maintenance responsibilities.
- Note that this commit means that adding or changing a patch on a
package will change its hash. This is probably what *should* happen,
but we haven't done it so far.
- Only applies to `patch()` directives; `package.py` files (and their
`patch()` functions) are not hashed, but we'd like to do that in the
future.
- The interface looks like this: `depends_on()` can optionally take a
patch directive or a list of them:
      depends_on(<spec>,
                 patches=patch(..., when=<cond>),
                 when=<cond>)
      # or
      depends_on(<spec>,
                 patches=[patch(..., when=<cond>),
                          patch(..., when=<cond>)],
                 when=<cond>)
- Previously, the `patch()` directive only took an `md5` parameter. Now
it only takes a `sha256` parameter. We restrict this because we want
to be consistent about which hash is used in the `Spec`.
- A side effect of hashing patches is that *compressed* patches fetched
from URLs now need *two* checksums: one for the downloaded archive and
one for the content of the patch itself. Patches fetched uncompressed
only need a checksum for the patch. Rationale:
- we include the content of the *patch* in the spec hash, as that is
the checksum we can do consistently for patches included in Spack's
source and patches fetched remotely, both compressed and
uncompressed.
- we *still* need the checksum of the downloaded archive, because we want
to verify the download *before* handing it off to tar, unzip, or
another decompressor. Not doing so is a security risk and leaves
users exposed to any arbitrary code execution vulnerabilities in
compression tools.
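For illustration, a hypothetical compressed patch fetched from a URL would carry both checksums (the URL and placeholder values are made up, and the archive checksum parameter name is my assumption):
```python
patch('https://example.com/fix-build.patch.gz',
      sha256='<sha256 of the uncompressed patch content>',
      archive_sha256='<sha256 of the downloaded .gz archive>')
```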
* Add '--test=all' and '--test=root' options to test either the root and all of its dependencies, or just the root.
* add a test dependency type that is only used when --test is enabled.
* test dependencies are not added to the spec, but they are provided in the test environment.
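A minimal example of the new dependency type (the package name is illustrative):
```python
# Only used when --test is requested; not added to the spec itself.
depends_on('googletest', type='test')
```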
+ Count, or compute differences of, physical lines of source code in the
given files (may be archives such as compressed tarballs or zip files)
and/or recursively below the given directories.