According to my nightly CI/CD tests, x.org is another large provider
of software in common build chains that is often down.
Added a hand-selected set of mirrors that are well synchronized.
Tested with `util-macros`, which has a fairly recent patch release.
Other packages will follow in a separate PR.
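In a Spack recipe, additional download locations can be listed in the package's `urls` attribute, giving Spack alternative locations to try. A sketch for `util-macros` with illustrative mirror hostnames and a placeholder checksum (not the exact mirrors added in this PR):

```python
class UtilMacros(AutotoolsPackage):
    """X.Org util-macros (illustrative sketch)."""

    homepage = "https://www.x.org/wiki/"

    # Alternative download locations; a flaky primary host no longer
    # blocks the fetch. Mirror hostnames here are examples only.
    urls = [
        "https://www.x.org/archive/individual/util/util-macros-1.19.1.tar.bz2",
        "https://mirror.example.org/pub/xorg/individual/util/util-macros-1.19.1.tar.bz2",
    ]

    version("1.19.1", sha256="0000000000000000")  # placeholder checksum
```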
* MAINT: Charliecloud OSX error
* raise an appropriate error when attempting to build
Charliecloud on macOS, since the build would otherwise fail
with a more confusing link-check failure at the configure stage (see the sketch below)
* Update var/spack/repos/builtin/packages/charliecloud/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* MAINT: PR 16049 revision
* remove an unused import
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
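One idiomatic way to raise such an error in a Spack recipe is a `conflicts` directive keyed on the platform; this is a sketch of the general pattern, not necessarily the exact mechanism or message used in this PR:

```python
class Charliecloud(AutotoolsPackage):
    """Sketch: fail early on macOS instead of at the configure link check."""

    conflicts(
        "platform=darwin",
        msg="Charliecloud relies on Linux user namespaces and does not build on macOS",
    )
```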
* pfft: fix to handle 'precision' variant in fftw
pfft had been checking for +double, etc. in the fftw spec, but those
variants are no longer present (replaced by the multi-valued variant `precision`).
* pfft: fix to handle 'precision' variant in fftw
pfft had been checking for +double, etc. in the fftw spec, but those
variants are no longer present (replaced by the multi-valued variant `precision`).
(Amended to use more idiomatic checks, as suggested by @alalazo.)
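A minimal sketch of the new style of check against fftw's multi-valued `precision` variant; the configure flags shown are illustrative, not pfft's actual options:

```python
def configure_args(self):
    spec = self.spec
    args = []
    # fftw's boolean +double/+float variants were replaced by the
    # multi-valued 'precision' variant, so query its values instead.
    if spec["fftw"].satisfies("precision=double"):
        args.append("--enable-double")  # illustrative flag
    if spec["fftw"].satisfies("precision=float"):
        args.append("--enable-single")  # illustrative flag
    return args
```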
sourceware.org is often quite overrun and times out or results in
certificate errors.
Since libffi, bzip2, elfutils, etc. are quite fundamental in
build chains, let's add some official mirrors.
libffi, bzip2, elfutils, lvm2, valgrind: add mirrors
* Patch Mathematica
The Mathematica installer moves all files and directories from the installation directory into a backup directory. The problem is that it also moves .spack to this backup location and never moves it back.
My patch copies .spack to /tmp and then restores it right before the install call exits.
* Make lint happy
* Use Spack native copy()
As suggested in peer-review let's:
- Copy .spack to stage directory so I don't have to use random
- Use Spack native copy() to do these operations
* Use join_path to create paths
As per peer-review suggestion:
- Use join_path to create paths
- Use copy_tree since we're copying a directory that could have sub-directories
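A sketch of the approach described above: stash `.spack` in the stage directory, run the installer, then restore it. The installer invocation itself is elided, and the helper names come from Spack's `llnl.util.filesystem`:

```python
from llnl.util.filesystem import copy_tree, join_path

def install(self, spec, prefix):
    # The installer moves everything in prefix (including .spack) into a
    # backup directory, so stash .spack in the stage first.
    backup = join_path(self.stage.path, ".spack.backup")
    copy_tree(join_path(prefix, ".spack"), backup)

    # ... run the Mathematica installer into prefix here ...

    # Restore the metadata directory before returning from install().
    copy_tree(backup, join_path(prefix, ".spack"))
```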
* Update package.py to include py-notebook 6.0.3 and sha
* Update package.py
* [py-notebook] updated py-tornado version requirements
* [py-notebook] reworked and reordered for readability
* [py-notebook] updated version requirement for py-jupyter-client
* [py-notebook] updated version requirements for py-jupyter-core
Co-authored-by: ehdeec <ehdeec@rit.edu>
* new package: BART
This PR adds the BART (Berkeley Advanced Reconstruction Toolbox)
package.
Despite the presence of CMake files, this package builds with a
Makefile; it looks like the project is moving away from CMake. The patch
for MKL has been committed upstream, so it should only be necessary for this
version of BART. The Makefile patch is meant for working with Spack and
would not be useful upstream. The bart scripts are still set up to invoke
the subcommands as individual binaries. This PR patches those scripts
to use the single binary with built-in subcommands, and assumes that
Spack provides the TOOLBOX environment variable and sets PATH.
* Update var/spack/repos/builtin/packages/bart/package.py
Yes, '==' makes more sense for a single string.
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* The Python dependencies are run-time only.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
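In Spack, run-time-only Python dependencies are declared with `type="run"`; a minimal sketch (the exact dependency list of the BART recipe is not reproduced here):

```python
# Needed only by the helper scripts at run time, not to build the C code.
depends_on("python", type="run")
depends_on("py-numpy", type="run")  # example dependency, not a verified list
```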
* New patch release SLEPc 3.13.1
* Update var/spack/repos/builtin/packages/slepc/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new package: py-youtube-dl + fixes for dependencies
This PR adds the py-youtube-dl program. In addition, there are a couple
of dependency packages that needed to be updated.
* ffmpeg
This is needed by py-youtube-dl. However, the Spack ffmpeg recipe did
not expose many options; in particular, it lacked a dependency on openssl for
working with the https protocol.
- Added updated version.
- Added variants for the different licensing options.
- Added "meta" variants for X and drawtext. These turn on/off several
options.
- Set variants and dependencies for many options. The defaults are based
on the configuration settings in ffmpeg.
- Set dependencies that were missing or that will likely get pulled in
from the system.
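A hedged sketch of how such a "meta" variant can be modeled in a recipe, with one variant fanning out into several configure switches. The grouping below (drawtext pulling in freetype and fontconfig) is illustrative rather than a description of the merged ffmpeg package:

```python
variant("drawtext", default=False, description="Enable the drawtext filter and its dependencies")

depends_on("freetype", when="+drawtext")
depends_on("fontconfig", when="+drawtext")

def configure_args(self):
    args = []
    if "+drawtext" in self.spec:
        # One "meta" variant fans out into several configure options.
        args.extend(["--enable-libfreetype", "--enable-libfontconfig"])
    else:
        args.extend(["--disable-libfreetype", "--disable-libfontconfig"])
    return args
```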
* libxml2
The ffmpeg+libxml2 variant initially failed to build. The issue is that
libxml2 sets the headers property to
include_dir = self.spec.prefix.include.libxml2
The ffmpeg configure looks for prefix.include and fills in the rest.
This could probably be patched in ffmpeg but the headers property in the
libxml2 recipe is not consistent with the environment module or the
pkgconfig file, both of which set the headers path to prefix.include.
This PR sets the libxml2 headers property to
include_dir = self.spec.prefix.include
A spot check of a few libxml2 dependents did not reveal any problems
with this change.
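The change boils down to pointing the `headers` property at `prefix.include` instead of the `libxml2` subdirectory, so that it matches the module file and pkg-config entries. A sketch of what that property might look like, using helpers from `llnl.util.filesystem`; the final recipe may differ in detail:

```python
from llnl.util.filesystem import find_all_headers

@property
def headers(self):
    # Report prefix.include (matching the module file and pkg-config
    # file) rather than prefix.include.libxml2.
    include_dir = self.spec.prefix.include
    hl = find_all_headers(include_dir)
    hl.directories = include_dir
    return hl
```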
* Comment out libxml2 dependency in ffmpeg
The header property issue of the spack libxml2 package will need to be
resolved in another PR before libxml2 can be enabled in ffmpeg.
* new package: openmm
* dependency adjustments
* 1. modify dependencies
2. openmm dynamically compiles CUDA kernels at runtime;
attempt to set up an environment that will work.
* Update var/spack/repos/builtin/packages/openmm/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This adds a boolean 'libtirpc' variant to the hdf package.
The default is false, which reproduces the previous behavior (relying
either on system xdr headers/library, or on hdf's built-in xdr
library/headers, which are 32-bit only).
If true, a dependency is added on 'libtirpc', and LIBS and
CPPFLAGS are updated in the configure command to find the libtirpc
library and xdr.h header.
This is needed because RHEL8 removed xdr.h from the glib-headers
package (and presumably other distros have done or will do the same),
which broke the previous behavior of using system xdr headers.
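A sketch of the wiring described above, assuming an Autotools-style `configure_args`; the exact flags in the merged recipe may differ:

```python
variant("libtirpc", default=False, description="Use libtirpc for the xdr library and headers")

depends_on("libtirpc", when="+libtirpc")

def configure_args(self):
    args = []
    if "+libtirpc" in self.spec:
        tirpc_prefix = self.spec["libtirpc"].prefix
        # libtirpc installs xdr.h under include/tirpc.
        args.append("CPPFLAGS=-I{0}".format(tirpc_prefix.include.tirpc))
        args.append("LIBS=-ltirpc")
    return args
```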
* opencv: assorted fixes
1. depends on blas when +lapack
2. set cuda nvcc flags for cuda_arch
3. let cuda/contrib builds work
4. depends on hdf5 when cuda/contrib
5. depends on ant when +java
6. allow protobuf version to be different
7. let opencv recompile its protoc files.
* ant is a build-time dependency
* register +cuda~contrib as impossible.
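Registering `+cuda~contrib` as impossible maps naturally onto a Spack `conflicts` directive; a minimal sketch (the message text is mine):

```python
# The CUDA modules live in opencv_contrib, so +cuda without +contrib
# cannot produce a working build.
conflicts("+cuda", when="~contrib", msg="+cuda requires +contrib")
```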
The old API is found in version 0.3.0, which uses a different release
name, so the url function was updated to properly find the older
releases. This also removes the boost constraint on the 0.3.0 version,
which does not need it.
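Handling a differently named old release is typically done with a `url_for_version` override; a hedged sketch with placeholder URLs rather than the project's real download paths (`Version` is part of Spack's package API):

```python
def url_for_version(self, version):
    # Hypothetical layout: releases >= 0.4.0 use one naming scheme,
    # while the 0.3.0 release used a different tag/name.
    if version >= Version("0.4.0"):
        return "https://example.org/releases/pkg-{0}.tar.gz".format(version)
    return "https://example.org/old-releases/pkg-v{0}.tar.gz".format(version)
```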
* meson: Add 0.54.0
This change also improves the rpath patch we are using. Instead of never
removing the rpath, we now only keep it if running within Spack to
guarantee consistent behavior of Meson.
* meson: Make ninja a hard dependency
* Update doxygen package
* Add new version releases 1.8.16 and 1.8.17
* Add mscgen as an optional dependency.
* Update doxygen package.py
Fix typo in mscgen dependency specification
* Remove whitespace for flake8
* primer3: move to github, add 2.5.0, fix 2.3.7
- The Primer3 project moved to GitHub.
- update the URL
- compared the 2.3.7 tarballs from Sourceforge and GitHub; no
significant differences (e.g. the Sourceforge tarball contained a
couple of "tmp" files).
- updated the signature for the 2.3.7 tarball.
- @2.3.7 doesn't build with gcc@8.4.0: there's a dubious pointer/int
comparison that causes an error. It was fixed upstream in newer
versions; apply a simple patch to this version so that it remains
buildable with newer compilers. See:
- https://github.com/primer3-org/primer3/issues/2
- https://github.com/primer3-org/primer3/issues/3
- Add info for @2.5.0, which builds cleanly.
* Flake8 cleanup
* [gtk-doc] created template
* [gtk-doc] using custom url to standardize on dotted version
* [gtk-doc] added description and homepage
* [gtk-doc] added dependencies and added pdf variant
* [gtk-doc] commented out pdf variant
* [gtk-doc] cleaned up leftover fixmes
* [gtk-doc] flake8
* [gtk-doc] re-added url
* [gtk-doc] python packages are build and run dependencies
* py-torch: Fix v1.4.0 by adding v1.4.1
version/tag hack so that 1.4.0 is still workable.
see pytorch/pytorch#35149
* delete/disable fbgemm support in 1.4.0
* moved conflict
Newer versions of glib require Meson, so this PR adds support for that
using a hybrid approach: glib@5.28: is built with Meson, while older
versions still use Autotools.
The most recent change to the openjdk package set expand=False for all versions
of the package. This means only the unexpanded archive will be installed, which is not correct.
* Update dependencies and support variant for Fortran Intermediate Representation.
* Add Cmake flags that toggle Fortran Intermediate Representation on/off. Exclude Flang tests for now.
* f18+fir variant needs next release of llvm or master.
* Only build tests if you pass --test to spack install
* New package: py-tensorboard
* some basic dependencies based on requirements.txt
remove the older version that doesn't build
* requested changes
* add additional dependencies
* more dependency changes
* py-onnx: only use py-typing if python < 3.5
avoids a 'Callable has no attribute __abc_registry' error. See
onnx/onnx#2199 and python/typing#573
* add version 1.6.0 while we're here.
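The Python-version condition above translates directly into a `when=` constraint on the dependency; a minimal sketch:

```python
# typing is in the standard library from Python 3.5 onward; the backport
# package breaks newer Pythons (see onnx/onnx#2199 and python/typing#573).
depends_on("py-typing", when="^python@:3.4", type=("build", "run"))
```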
Add new version 4.5.0 and update the quarantine variable list to also
include LD_PRELOAD in addition to LD_LIBRARY_PATH (to avoid side effects
when the former depends on the latter).
- Add info for version 0.68.3
- Add a variant to the package that enables the "extended" features. It
works with both of the existing versions (not sure how far back it
goes, but confirmed that it's valid for 0.53).
Tested on OS X with go@1.14.1.
* Add ACTS v0.21
* Update var/spack/repos/builtin/packages/acts-core/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* slepc: updates for @devel
- +arpack now works with int64
- +blopex add conflict with int64
- switch to using --with-arpack-lib with current slepc
- use updated blopex with current slepc
* slepc: conflict with blopex should be in all versions
* slepc: add new downloads
* slepc: add whitespace around operator
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
* New package(s): py-pydeps and py-stdlib-list
* requested changes
* Update var/spack/repos/builtin/packages/py-stdlib-list/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* cryptsetup: restrict the version of automake if @2.2.1
fixes #15706
* Update var/spack/repos/builtin/packages/cryptsetup/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fixed building coreutils on Darwin
* Bump nano version to 4.9
* Coreutils: Add program prefix g so we don't conflict with Apple utilities
* Fix indentation
* Make format more spack like
* Removed unnecessary changes
* Merge branch 'develop' of github.com:DiegoMagdaleno/spack into develop
Fix linking libgit2 on Darwin
* Revert "Merge pull request #3 from spack/develop"
This reverts commit 58dbbdb82bfa4aa689758971eb423ae2b9d2c293, reversing
changes made to dd7a413f4870fc2778f801c882a50cfade18fe97.
* Revert "Revert "Merge pull request #3 from spack/develop""
This reverts commit f956aa7b135c39ea0eb4ba6b9329714067098a7c.
* Revert "Merge branch 'develop' of github.com:DiegoMagdaleno/spack into develop"
This reverts commit 50321f798674ec30d5c62fe696b4f8bccef6ab3d.
* update versions of the Intel packages daal, ipp, mkl-dnn, mkl, mpi, parallel-studio, pin, and tbb; make the url parameter consistent and always use single quotes.
* Fixes a typo in one of the sha256 checksums.
* Adds version entries for new versions of Intel packages.
Co-authored-by: Robert Mijakovic <robert.mijakovic@lrz.de>
* Patching unqlite to be able to build a shared library
* Correcting a whitespace for flake8 to pass
* added comment about PR on unqlite
* extra commit to force github to merge
* Update flit package to v2.1.0 and add dependencies
* flit: comment out bash dependency
The host system should have bash available and compiling bash through
spack failed for me. I'm not sure if binutils and coreutils should
be listed as dependencies as well.
* Add new version of py-pyelftools
* py-pyelftools: add py-setuptools as a build dependency
* Address review comments
HDF5 1.12 broke backward compatibility, so we're preferring version 1.10
for now. Packages that need the new API should specify:
depends_on("hdf5@1.12:")
to be explicit. We can eventually change the preference, but at the
moment most libraries have not updated to use the new HDF5.
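In Spack terms, "preferring version 1.10" is expressed with `preferred=True` on a version directive in the hdf5 recipe, while packages that need the new API constrain it explicitly; a sketch with placeholder checksums:

```python
# In hdf5/package.py: keep concretizing to 1.10.x by default.
version("1.10.6", sha256="0000000000000000", preferred=True)  # placeholder checksum
version("1.12.0", sha256="0000000000000000")                  # placeholder checksum

# In a dependent that requires the new 1.12 API:
depends_on("hdf5@1.12:")
```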
* Added v3 of Laghos
Added v3 of Laghos as per
https://github.com/CEED/Laghos/blob/v3.0/README.md
* Update var/spack/repos/builtin/packages/laghos/package.py
Changed develop->master as per PR
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Made Metis Dependency Explicit
Added explicit metis dependency
* Folded @develop Laghos deps into @3.0:
Theoretically there will be a difference between develop and 3.0: in the
future, but currently there is not.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add myself as a maintainer
* This was a regression that occurred in a previous PR. Flang has been excised from LLVM for now, until f18 is merged upstream.
* Libraries are only needed when a GPU backend is present.
* Add version 18-08-9-1
* Add variant to allow setting the sysconfdir: See below
About sysconfdir:
slurm has a server and a client.
To use the correct communication channel, the client needs
to be able to read the correct config. This config is in
PREFIX/etc.
Let's assume one has the server part installed as a system
package. This generally is a good idea, so that the server
gets started during boot. This means that the config is
in /etc/slurm.
If one now wants to use the client part (library!) via
spack, one has a problem: spack's slurm looks in
SPACK-PACKAGE-PREFIX/etc for the config.
There needs to be a way to let the spack installed package
use the system's config.
So add a variant to override the path during build:
sysconfdir=/etc/slurm.
This is much like what happened in #15307 for munge.
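A sketch of the variant described above, mirroring the munge pattern referenced in #15307; the 'PREFIX/etc' sentinel default and the configure wiring are assumptions about the implementation:

```python
variant(
    "sysconfdir",
    default="PREFIX/etc",
    values=any,
    description="Set the system configuration path, e.g. /etc/slurm",
)

def configure_args(self):
    args = []
    sysconfdir = self.spec.variants["sysconfdir"].value
    if sysconfdir != "PREFIX/etc":
        # Let a Spack-built client read the system's slurm.conf.
        args.append("--sysconfdir={0}".format(sysconfdir))
    return args
```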
* geant4: new version 10.6 plus simplifications
Add the new 10.6.0 release, migrating the source download to Geant4's
public release repo on CERN GitLab. Change the versioning scheme to a
clearer, standard semantic scheme.
Update geant4-data and the g4XXX data packages with new versions. Migrate
geant4-data to a BundlePackage over the g4XXX packages, installing links
to each under a single directory under share for geant4-data. Ensure
each g4XXX package exports the environment variable pointing to its
location as expected by Geant4.
Remove "data" variant from Geant4 package and always use geant4-data.
Simplify cxxstd variant transport to dependencies.
* g4<DATA>: Use self to resolve correct prefix
* geant4, data: Fix flake8 errors
* g4photonevaporation: flake8 fix
* geant4: vecgeom version depends_on
Geant4 major.minor versions have specific dependencies on vecgeom
versions. Add missing vecgeom version for geant4 10.5, and match
version requirements for vecgeom in geant4 depends_on.
* geant4: c++17 patch specific for 10.4.3
* geant4: simplify geant4-data setup
* geant4: Use new define_from_variant function
* geant4: fix flake8 errors
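`define_from_variant` is CMakePackage's helper for turning a variant directly into a `-D` cache entry, which is the kind of simplification referred to above; the option names in this sketch are illustrative, not a list of Geant4's real CMake options:

```python
def cmake_args(self):
    args = [
        # Boolean variant -> -DGEANT4_USE_OPENGL_X11=ON/OFF (illustrative name)
        self.define_from_variant("GEANT4_USE_OPENGL_X11", "opengl"),
        # Multi-valued/string variant -> -DCMAKE_CXX_STANDARD=<value>
        self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
    ]
    return args
```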
* helics: add new package
* Remove FIXME boilerplate
* Use an open @master: version range for the git dependency and remove the mpi fix branch version
* Add blank line after spack import
* py-onnx: depends on cmake >= 3.1
* Update var/spack/repos/builtin/packages/py-onnx/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Added option to disable shared Lua library
Added an option to disable generation of the shared object library for lua, to
avoid build issues on static-only platforms
* Fixed Flake8 Issue with Lua Spackage
Fixed indentation issue with lua spackage
* Add initial attempt at intel-mpi-benchmarks package
* Add more checksummed versions
* Changes to how makefile is handled
* First working install version. Needs tuning to support building specific benchmarks
* Add variant for building specific benchmarks rather than all of them
* Minor syntax change
* New package: gdrcopy
Provides the userspace libraries for gdrcopy.
* Update var/spack/repos/builtin/packages/gdrcopy/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* XIOS: add new versions
The patch has been removed because it was not applied to any previously
existing versions, and it actually breaks the new versions added by this
PR.
* Sort versions from newest to oldest
This allows the llvm build to support:
* clang cuda
* libomptarget for:
* current host
* cuda
* bitcode compilation of libomptarget device runtime for inlining by
bootstrapping libomptarget
* split DWARF information support as an option for debug builds (if you need a
debug build, for the love of all that's good in the universe use this flag)
* adds the dependencies necessary for shared library builds and for libomp and
libomptarget to build correctly
* new version of z3 that is sufficient to build recent llvm
The actual change is much smaller than the diff; that is because the file has been formatted with black. I realize this kind of sucks right now, but I'm hoping it will make future updates here less painful.
The gcc package.py includes patches for a sanitizer-related bug that appear
to have already been fixed in gcc 8.4.0, and applying them there caused `spack install` to fail.
This PR excludes the patches for gcc >= 8.4.0 and < 9.0.0.
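Excluding the fixed releases amounts to tightening the `when=` constraint on the patch directive; a hedged sketch where the patch file name and the exact version ranges are placeholders, not the real ones in gcc/package.py:

```python
# Placeholder name and ranges: apply the sanitizer patch only where the
# bug is still present, skipping 8.4.0 <= gcc < 9.0.0.
patch("sanitizer-fix.patch", when="@7.1.0:8.3,9.0.0:9.1")
```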
* py-statsmodels: update to 0.10.2 and fix dependencies
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update URL for new py2neo versions
* Use pypi for py-py2neo
* Add version 4.3.0
* Update py2neo dependencies
* Apply suggestions from code review
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* NETCDF: Remove maxdims maxvars variant
I'm not sure of the correct protocol for this, so I decided to take a stab at it; hopefully it works, or I'll be told the correct way...
The `maxdims` and `maxvars` variants for the NetCDF package were, to the best of my knowledge, only ever used for the Exodus library in the SEACAS package. In versions of NetCDF prior to 4.4.0, Exodus required that the `NC_MAX_DIMS` and `NC_MAX_VARS` be increased over the default values. This requirement was removed in 4.4.0 and later.
I do not know of any way to make a variant depend on the version, and since the `maxdims` and `maxvars` variants are integer values rather than booleans, every build of NetCDF carries these variants. Typically `maxdims=1024 maxvars=8192`, and the build patches the `netcdf.h` include file every time even though it is (almost) never needed.
The SEACAS package has a NetCDF version requirement of >4.6.2, so it no longer specifies the `maxdims` or `maxvars` variants, and I could find no other package in Spack that uses them either, so removal should not break anything *in* Spack. However, there is no guarantee that some other external package doesn't use the variants, so I'm not sure of the correct way to remove them.
For this PR, I simply removed the variants. If there is a way to specify use of the variant tied to a specific version, I couldn't find it anywhere...
* Address review comment
Removed `is_integral` and `import numbers`, since `is_integral` was the only place the import was used.
* Add blank line for flake8
* magma now extends CudaPackage class, taking care of the gcc conflicts
* enforce +cuda; thus cuda is a dependency via the CudaPackage class
* add conflict
* use cuda_arch to set GPU_TARGET build option
* get rid of unnecessary constraint
* flake8
* impose cuda version dependency found empirically
* add variant description
* add conflict
Co-authored-by: Sinan81 <Sinan81@github>
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
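The cuda_arch handling in the magma changes above can be sketched roughly as follows for a CMake-based build; the `GPU_TARGET` name comes from the commit message, while joining multiple architectures with spaces is my assumption:

```python
def cmake_args(self):
    args = []
    cuda_arch = self.spec.variants["cuda_arch"].value
    if cuda_arch and cuda_arch[0] != "none":
        # e.g. cuda_arch=70 -> GPU_TARGET=sm_70
        targets = " ".join("sm_{0}".format(arch) for arch in cuda_arch)
        args.append("-DGPU_TARGET={0}".format(targets))
    return args
```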
* build python bindings within qscintilla package via extend_path trick
* add todo
* reflect new setup also in py-pyqt4 package
* get rid of qscintilla dependency
* also tweak qgis for the new setup
* generalize the building of python bindings
* generalize building of python bindings to all qt versions
* add qsci_api variant
* add qsci_api variant for pyqt4 package as well; add comment
* pyqt dependency should build with +qsci_api variant enabled
* fix bugs
* improve style
* reflect recent changes
* flake8
* improve style
* more flake8
* more flake8
Co-authored-by: Sinan81 <sbulut@3vgeomatics.com>
* Add some comments explaining the choice of flag_handler.
* Fix QMCPACK install method.
* Add support for ppconvert. This requires a custom build method.
* Fix QMCPACK setup_run_environment. Nexus should be properly supported now.
* Cleaner way to check for intel-mkl in spec.
* Remove build method and use build_targets property instead.
* Additional fixes for the install method, effectively restoring the original install method.
* Add the missing backslash to fix directory names.
* Update var/spack/repos/builtin/packages/qmcpack/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qmcpack/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qmcpack/package.py
Co-Authored-By: Adam J. Stewart <ajstewart426@gmail.com>
* Omit these conflicts on mkl variants for now; they will hopefully be supported by the new concretizer in a couple of months.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Intel moved the repository around.
GitHub changes the prefix inside the tarball according to the
repository name, so all sha256 checksums have changed!
I verified that the tarball contents for 2019.4 did not change
except for the prefix.
* TensorFlow: Clean up and simplify the installation; make sure the headers are
installed so that horovod can find them successfully. Fix the 2.0.* builds.
* Backport of 837c8b6b upstream
"Remove contrib cloud bigtable and storage ops/kernels."
Allows 2.0.* releases to build with '--config=nogcp'
* comment regarding tensorflow issue #31187
Co-authored-by: Andrew W Elble <aweits@skl-a-00.rc.rit.edu>