* py-prompt-toolkit: Add newer version required for newer
  py-ipython versions
* py-ipykernel:
* Add newer py-ipykernel version
* Depend on py-setuptools so the build does not attempt to create
  a Python egg
* Update dependency for newer py-ipykernel versions
* py-jupyter-console: Remove py-prompt-toolkit dependency since it is
picked up in py-ipython
* py-ipython:
* Add missing py-backcall dependency
* Adjust py-prompt-toolkit dependencies for newer versions of
ipython
* py-jupyter-notebook: Require newer version of py-ipykernel since
jupyter is broken with previous versions
* add cxxstd variant
* add CMake constraints based on platform/version
* add older versions
* update boost dependency version constraints (which are closely
tied to mysql version) and update boost cxxstd choice to be
the same as the cxxstd chosen for mysql
* add client-only support (including a patch for 5.5.x)
* record the mysql package as a provider of the mysql-client virtual
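A minimal sketch of how these declarations might look in the mysql
package.py; the version values, defaults, and exact spelling are
assumptions, not copied from the actual recipe:

```python
variant('cxxstd', default='11', values=('98', '11', '14', '17'),
        multi=False, description='C++ standard used for the build')
variant('client_only', default=False,
        description='Build and install only the client')

provides('mysql-client')

# Keep boost's C++ standard in sync with the one chosen for mysql.
for std in ('98', '11', '14', '17'):
    depends_on('boost cxxstd={0}'.format(std),
               when='cxxstd={0}'.format(std))
```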
Improve management of the Fiber library and C++ standard support:
* Remove Fiber from list of libraries to build
* Improve variant management for Fiber; add variants for Context and
Coroutine libraries.
* Add known conflict with C++17 for boost < 1.63.0
* Remove C++ standard "default" option, which left the choice of
C++ standard to the compiler used to build boost
* record conflicts with compiler versions that don't provide the
  required C++ standard support
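As an illustration (the version and compiler bounds below are
assumptions, not the exact ones recorded in the recipe), the Spack
directives for this kind of constraint look like:

```python
variant('fiber', default=False, description='Build the Boost.Fiber library')
variant('context', default=False, description='Build the Boost.Context library')
variant('coroutine', default=False, description='Build the Boost.Coroutine library')

# C++17 builds are known to fail with boost older than 1.63.0.
conflicts('cxxstd=17', when='@:1.62.99')

# Reject compilers that cannot provide the requested C++ standard.
conflicts('cxxstd=17', when='%gcc@:5.99')
```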
* add doxygen (build) dependency
* add note that range-v3 is header-only as of 0.3.6 and update
package description
We add new variants to handle readline vs libedit, a client-only
build and install, and bindings to TCL, Python and Perl. We also add
new versions and the ability to detect remote versions not otherwise
dealt with.
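A sketch of how these variants typically map onto configure
arguments inside the package class; the flag names follow common
autoconf conventions and are assumptions, not the recipe's exact
options:

```python
def configure_args(self):
    # with_or_without() turns '+perl' / '~perl' into
    # --with-perl / --without-perl.
    args = []
    args.extend(self.with_or_without('perl'))
    args.extend(self.with_or_without('python'))
    args.extend(self.with_or_without('tcl'))
    if '~readline' in self.spec:
        # fall back to libedit when readline is not requested
        args.append('--with-libedit-preferred')
    return args
```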
This avoids using a system-installed CUDA package. In the future a
variant can be added to allow using Spack-installed CUDA, but for
now CUDA support is always disabled.
* abinit: Fix building with hdf5/netcdf.
* gromacs: Fix attempt to build with cuda support when 'cuda=False'
If for some reason there's a CUDA toolkit installed by other means
(i.e. not by Spack), CMake will still try to build with CUDA support,
even though 'cuda=False' is the default for the spec.
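A sketch of the guard; GMX_GPU is GROMACS' own CMake switch, but the
exact handling in the recipe may differ:

```python
def cmake_args(self):
    args = []
    if '+cuda' in self.spec:
        args.append('-DGMX_GPU=ON')
    else:
        # Explicitly disable GPU support so CMake does not pick up a
        # system-installed CUDA toolkit when cuda=False.
        args.append('-DGMX_GPU=OFF')
    return args
```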
* Revert "abinit: Fix building with hdf5/netcdf."
This reverts commit e16f725e37b91193fe519b1821446c76ab551928.
This should not be here.
* Draco: add variants
+ This package has many optional build dependencies that were not registered in
older versions of this recipe. I've added (and tested) this more complete
list of optional dependencies: parmetis, superlu-dist, qt.
* fix style issues
* hpcviewer: new package
Add binary package for hpcviewer and hpctraceviewer for the Rice
hpctoolkit on Linux x86_64, ppc64 and ppc64le.
* ibm-java: add property 'home' so that spec['java'].home will work.
* Flake
* More flake.
* Check that the (version, machine type) pair exists in the sha
  dictionaries before using it, so that 'spack info' doesn't crash on
  unsupported configurations.
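A sketch of the guarded lookup and the 'home' property; the version
string and checksums below are placeholders, not real values:

```python
import platform

# Checksums keyed first by version, then by machine type.
_checksums = {
    '8.0.5.30': {
        'x86_64':  '0123...abcd',   # placeholder
        'ppc64le': '4567...ef01',   # placeholder
    },
}

machine = platform.machine()
for ver, by_machine in _checksums.items():
    # Only register versions that have a checksum for this machine
    # type, so 'spack info' does not raise KeyError on unsupported
    # configurations.
    if machine in by_machine:
        version(ver, sha256=by_machine[machine])

@property
def home(self):
    # Lets dependents use spec['java'].home to locate the JDK.
    return self.prefix
```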
* mariadb-c-client is a new package using the distinct, LGPL,
MySQL-compatible client library from mariadb.com. It provides the
virtual package mariadb-client
* mariadb is recorded as a provider of the mariadb-client virtual
* The mysql-client virtual package is also added, and mariadb-c-client
is recorded as a provider for it
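A short sketch of the provider declarations described above:

```python
# in mariadb-c-client/package.py
provides('mariadb-client')
provides('mysql-client')

# in mariadb/package.py
provides('mariadb-client')
```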
* Throw InstallError if more than one GPU architecture is passed to
  cuda_arch. The previous cuda_arch check was not actually working
  because the comparison with the 'none' string was made against the
  cuda_arch list instead of against its first entry.
* Removing redundant cuda_arch statement.
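A minimal sketch of the corrected check; the surrounding cmake_args
handling and the flag name are assumptions, not copied from the
recipe:

```python
cuda_arch = spec.variants['cuda_arch'].value
if len(cuda_arch) > 1:
    raise InstallError('Only one GPU architecture may be specified via cuda_arch')
if cuda_arch[0] != 'none':
    # Compare against the first entry of the list, not the list itself.
    args.append('-DCUDA_ARCH={0}'.format(cuda_arch[0]))  # flag name is a placeholder
```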
* New package: py-mysql-connector
* Fixed docstring
* 1. Determined that py-setuptools was not needed at all, so removed.
  2. Added py-protobuf. The docs seem to imply that only the C protobuf
     library is required; however, the Python setup.py says otherwise,
     and some Python code references protobuf as well. I don't know why
     this worked for me, but it looks like including py-protobuf is the
     right thing to do.
* Applied solution detailed in:
https://github.com/mysql/mysql-connector-python/pull/9
Removing this patch makes `error: option --single-version-externally-managed not recognized` reappear.
* Clean up / reorder lines
* flake8
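For reference, the upstream fix can be carried as a patch in the
recipe; the patch filename below is illustrative (assuming the PR
diff is saved alongside the package), and the dependency type is an
assumption:

```python
# Patch from https://github.com/mysql/mysql-connector-python/pull/9,
# saved locally; removing it brings back the
# "--single-version-externally-managed not recognized" error.
patch('single-version-externally-managed.patch')

depends_on('py-protobuf', type=('build', 'run'))
```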
qt currently falls back to bundled versions of sqlite, harfbuzz, pcre,
double-conversion and xcb. This adds the appropriate dependencies and
configure arguments. A new variant adds multibyte support to pcre and
pcre2, which is required by qt.
Additionally, newer versions of gcc (starting with @8.3.0) cause build
failures. This adds a patch to fix the problem.
The changes have been tested with all versions of qt currently available
in Spack. 5.2 and 5.3 do not build for reasons that seem to be unrelated
to these changes, though.
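A sketch of the added dependencies and configure arguments; the
-system-* switches are qt's own configure options, but the version
conditions and the pcre variant name here are assumptions:

```python
depends_on('sqlite')
depends_on('harfbuzz')
depends_on('double-conversion')
depends_on('pcre2+multibyte', when='@5.9:')

# Tell qt to use the Spack-provided libraries instead of its bundled copies.
config_args = [
    '-system-sqlite',
    '-system-harfbuzz',
    '-system-doubleconversion',
    '-system-pcre',
]

# In the pcre/pcre2 recipes, a new variant enables the wide-character
# libraries that qt links against:
variant('multibyte', default=True,
        description='Enable 16-bit and 32-bit character support')
```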
* Add binary package of the IBM Java SDK for big and little-endian
powerpc (power7, 8 and 9). The jdk and openjdk packages only install
on x86_64.
* Add ibm-java as a java provider
* The jdk and openjdk packages only install on x86_64. Add conflicts
for ppc64 and ppc64le to jdk and openjdk.
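A sketch of the provider and conflict declarations; the exact target
syntax and java version used in the recipes may differ:

```python
# in ibm-java/package.py
provides('java@8')

# in jdk/package.py and openjdk/package.py
conflicts('target=ppc64')
conflicts('target=ppc64le')
```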
shmemrun and oshrun do not exist in OpenMPI v4.0.0
(ref: https://www.open-mpi.org/doc/v4.0/)
The Spack OpenMPI package install was failing because it tried to
remove them. This guards the removal of several scripts when using
the Slurm scheduler, to handle the case where they don't exist.
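A sketch of the guarded removal; the exact list of wrapper scripts
handled by the recipe may differ:

```python
import os

# With Slurm as the scheduler, the launcher wrappers are removed so
# users go through srun; guard each removal since some wrappers
# (e.g. shmemrun, oshrun) no longer exist in OpenMPI 4.0.0.
for script in ('mpirun', 'mpiexec', 'shmemrun', 'oshrun'):
    script_path = os.path.join(self.prefix.bin, script)
    if os.path.exists(script_path):
        os.remove(script_path)
```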
It seems that this is actually a glibc problem and while 2.6.4 builds
without the patch on newer versions of gcc (@8:), it still sometimes
segfaults (as observed during the doxygen build).
* Add 'fiber' as a default library for boost
* Add autoconf/automake etc. dependencies to libseccomp package
* New package: brotli
* New package: editline
* Add brotli, editline, boost dependencies to Nix
Remove 2.6.3 as preferred version (but keep it available for
building). The latest version (currently 2.6.4) is now preferred
(according to Spack's defaults).
* Update dependencies for py-flake8 when building version 3.7.7
* Add FIXME comment for an example dependency constraint which causes
concretization to hang
* Add py-entrypoints version 0.3
* Add py-pycodestyle version 2.5.0
* Add libuv version 1.10.0
* CMake versions before 3.12.0 do not build with libuv version
1.25.0, so a constraint is added to build earlier versions of
CMake with libuv version 1.10.x
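A sketch of the constraint in the CMake recipe; the version bounds
are illustrative:

```python
# CMake older than 3.12.0 does not build against libuv 1.25.x, so pin
# it to the 1.10 series there; newer CMake can use any recent libuv.
depends_on('libuv@1.10.0:1.10.999', when='@:3.11.999')
depends_on('libuv@1.10.0:', when='@3.12.0:')
```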
Update CPATH in setup_environment for Eigen, so that the
Spack-generated module for Eigen will help builds outside of Spack
use the appropriate include prefix for Eigen headers
(<install_prefix>/include/eigen3/ rather than <install_prefix>/include/)
Note that this only updates the run-time environment, rather than the
build-time environment, so Spack builds depending on Eigen that use
pkgconfig will not be confused by the presence of the Eigen include
directory in CPATH.
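A sketch using the setup_environment hook of that Spack era:

```python
def setup_environment(self, spack_env, run_env):
    # Only the run-time environment is touched; the build-time CPATH
    # is left alone so pkg-config-driven Spack builds are not confused.
    run_env.prepend_path('CPATH',
                         join_path(self.prefix.include, 'eigen3'))
```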
* Replace kim-api package with kim-api-v2, which has different
versions and removes the 'cmake_args' method
* Add openkim-models-v2 as an extension package