* [add] py-metomi-rose: new recipe, required by py-cylc-rose
* py-metomi-rose: remove version constraint on python
---------
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
* Allow branching out of the "generic build" unification set
For cases like the one in https://github.com/spack/spack/pull/39661
we need to relax rules on unification sets.
The issue is that, right now, nodes in the "generic build" unification
set are unified together with their build dependencies. This was done
out of caution to avoid the risk of circular dependencies, which would
ultimately cause a very slow solve.
For build tools like Cython, however, the build dependency is masked
by a long chain of "build, run" dependencies that belong in the
"generic build" unification space.
To allow splitting in cases like this, we relax the rule disallowing
branching out of the "generic build" unification set.
* Fix issue with pure build virtual dependencies
Pure build virtual dependencies were not accounted for properly in the
list of possible virtuals. This caused some facts connecting virtuals
to the corresponding providers not to be emitted, which in the end
led to unsatisfiable problems.
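As a hedged illustration (the package name, URL, and checksum below are hypothetical), this is the shape of a recipe whose only use of a virtual is at build time; such "pure build" virtuals are exactly the case the fix accounts for:

```python
# Hypothetical recipe: the "mpi" virtual is needed only while building,
# never at link or run time, so it is a pure build virtual dependency.
from spack.package import *


class ExampleTool(Package):
    """Sketch of a package with a build-only virtual dependency."""

    homepage = "https://example.com/example-tool"
    url = "https://example.com/example-tool-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    # The provider of "mpi" is only required as a build dependency.
    depends_on("mpi", type="build")
```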
* Fixed a few issues in packages
py-gevent: restore dependency on py-cython@3
jsoncpp: fix typo in build dependency
ecp-data-vis-sdk: update spack.yaml and cmake recipe
py-statsmodels: add v0.13.5
* Make dependency on "blt" of type "build"
We run pip with `--no-build-isolation` because we don't want pip to
install build dependencies.
As a consequence, when pip runs hooks, it runs hooks of *any* package it
can find in `sys.path`.
For Spack-built Python this includes user site packages -- there
shouldn't be any system site packages. So in this case it suffices to
set the environment variable PYTHONNOUSERSITE=1.
For external Python, more needs to be done, because there is no
environment variable that disables both system and user site packages;
setting the `python -S` flag doesn't work either, because pip runs
subprocesses that don't inherit it (and there is no API to know whether
-S was passed).
So, for external Python, an empty venv is created before invoking pip
in Spack's build environment, which ensures that pip can no longer see
anything but the standard library and `PYTHONPATH`.
The downside is that pip generates shebangs that point to the python
executable from the venv, so for external Python an extra post-install
step is necessary to fix up the shebangs.
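A minimal sketch of that approach (illustrative names and paths, not Spack's actual implementation): create a bare venv, run pip from it with `PYTHONPATH` pointing at the Spack-controlled packages, then rewrite the shebangs afterwards.

```python
# Sketch only: build with an empty venv so pip sees just the standard
# library plus PYTHONPATH, then rewrite shebangs that point at the venv.
import os
import subprocess
import venv


def pip_install_isolated(build_dir: str, pythonpath: str) -> str:
    venv_dir = os.path.join(build_dir, "pip-venv")
    venv.EnvBuilder(with_pip=False).create(venv_dir)  # empty venv, no site packages
    venv_python = os.path.join(venv_dir, "bin", "python")
    env = dict(os.environ, PYTHONPATH=pythonpath)  # pip itself comes from PYTHONPATH
    subprocess.run(
        [venv_python, "-m", "pip", "install", "--no-build-isolation", "."],
        cwd=build_dir,
        env=env,
        check=True,
    )
    return venv_python


def fix_shebangs(bin_dir: str, venv_python: str, real_python: str) -> None:
    # Post-install step: scripts generated by pip point at the venv's
    # interpreter, so point them back at the external python.
    old, new = b"#!" + venv_python.encode(), b"#!" + real_python.encode()
    for name in os.listdir(bin_dir):
        path = os.path.join(bin_dir, name)
        with open(path, "rb") as f:
            content = f.read()
        if content.startswith(old):
            with open(path, "wb") as f:
                f.write(new + content[len(old):])
```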
* Add new package awscli-v2 and its missing dependency awscrt
* Remove boilerplate comments from awscli-v2 and awscrt packages
* Fix typos in var/spack/repos/builtin/packages/awscli-v2/package.py
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Address reviewer comments
* Remove py-pip version dependency from var/spack/repos/builtin/packages/awscli-v2/package.py
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-isort: needs setuptools build dep before v5
Detected in #40224.
In the past, system setuptools could be picked up when using an external
python, so py-isort@4 would install fine. With the linked PR, pip can
only consider packages that Spack controls from PYTHONPATH, so the issue
of missing py-setuptools showed up.
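In recipe terms, the fix boils down to a conditional build dependency, roughly (a sketch, not necessarily the exact final package.py):

```python
# py-isort needs setuptools to build only before v5.
depends_on("py-setuptools", when="@:4", type="build")
```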
* py-importlib-metadata: fix lower bounds on python
* review
* py-isort: unconditionally add optional setuptools dep to prevent picking up the user package at runtime
* style
* drop optional py-setuptools run dep
* e4s amd64 gcc ci stack: sync with e4s-23.08
* e4s amd64 oneapi ci stack: sync with e4s-23.08
* e4s ppc64 gcc ci stack: sync with e4s-23.08
* add new ci stack: e4s amd64 gcc w/ external rocm
* add new ci stack: e4s arm gcc ci
* updates
* py-scipy: -fvisibility issue is resolved in 2023.1.0: #39464
* paraview oneapi fails
* comment out pkgs that fail to build on power
* fix arm stack name
* fix cabana +cuda specification
* comment out failing specs
* visit fails to build on arm
* comment out slepc arm builds due to make issue
* comment out failing dealii arm builds
* Fix python versioning issue for py-biom-format
* Update deps according to feedback
* Remove version requirement for py-cython
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Only depend on py-six for versions >=2.1.10
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Mark py-future as not a dependency for 2.1.15
* There we are. Everything anyone could ever want
* Missed cython version change
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-dipy: Update version to support python@3.10
py-dipy only adds support for python@3.10 in py-dipy@1.5.0
See #40228
* py-dipy: fix formatting issues
* py-dipy: another formatting fix
* py-dipy: Use depends_on instead of conflicts
* py-dipy: formatting fix
* py-dipy: Update for @1.7.0
Added new minimum version requirements for py-cython, py-numpy,
py-scipy, and py-h5py, as suggested by @manuelakuhn
(py-nibabel minimum version unchanged for @1.7.0)
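A hedged sketch of the resulting constraint (only the python bound is spelled out here; the other lower bounds follow the same `when=` pattern with the values from the review):

```python
# Versions of py-dipy before 1.5.0 do not support Python 3.10.
depends_on("python@:3.9", when="@:1.4")
# @1.7.0 additionally raises the lower bounds on py-cython, py-numpy,
# py-scipy, and py-h5py, each expressed with a when="@1.7.0:" constraint.
```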
* [add] py-graphql-relay: new package
* py-graphql-relay: Update package.py
Remove leftovers from version 3.2.0:
* archive name in pypi
* dependencies
* [fix] py-graphql-relay: remove constraint on python version; add dependency on py-rx because of a ModuleNotFoundError during spack test
* [fix] py-graphql-relay: remove py-rx dependency; py-graphql-core: add dependencies for version 2.3.2
* py-graphql-core: Update from review; set build backend: py-poetry for version 3 and py-setuptools for version 2
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Two changes in this PR:
1. Register absolute paths in tarballs, which makes it easier
   to use them as container image layers, or rootfs in general, outside
   of Spack. Spack already supports this on develop.
2. Assemble the tarfile entries "by hand", which has a few advantages:
   1. Avoid reading `/etc/passwd`, `/etc/groups`, and `/etc/nsswitch.conf`,
      which `tar.add(dir)` does _for each file it adds_
   2. Reduce the number of stat calls per file added by a factor of two,
      compared to `tar.add`, which should help with slow, shared filesystems
      where these calls are expensive
   3. Create normalized `TarInfo` entries from the start, instead of letting
      Python create them and patching them after the fact
   4. Don't recurse into subdirs before processing files, to avoid
      keeping nested directories open. (This changes the tar entry
      order slightly; it's like sorting by `(not is_dir, name)`.)
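A minimal sketch of the "by hand" approach (not Spack's actual implementation): build `TarInfo` entries directly instead of calling `tar.add(dir)`, normalize ownership and mtime up front, and handle all entries of a directory before descending into its subdirectories.

```python
import os
import stat
import tarfile


def normalized_tarinfo(name: str, st: os.stat_result) -> tarfile.TarInfo:
    # Build the entry ourselves so uid/gid/mtime are already normalized;
    # no /etc/passwd or /etc/groups lookups are needed.
    info = tarfile.TarInfo(name)
    info.mode = stat.S_IMODE(st.st_mode)
    info.uid = info.gid = 0
    info.uname = info.gname = ""
    info.mtime = 0
    return info


def add_tree(tar: tarfile.TarFile, root: str, prefix: str) -> None:
    # One lstat per entry; all entries at this level are written first,
    # then we recurse, so nested directories are not kept open.
    with os.scandir(root) as entries:
        dirs = []
        for entry in sorted(entries, key=lambda e: e.name):
            st = entry.stat(follow_symlinks=False)
            name = os.path.join(prefix, entry.name)
            info = normalized_tarinfo(name, st)
            if entry.is_symlink():
                info.type = tarfile.SYMTYPE
                info.linkname = os.readlink(entry.path)
                tar.addfile(info)
            elif entry.is_dir(follow_symlinks=False):
                info.type = tarfile.DIRTYPE
                tar.addfile(info)
                dirs.append(entry.name)
            else:
                info.size = st.st_size
                with open(entry.path, "rb") as f:
                    tar.addfile(info, f)
        for d in dirs:
            add_tree(tar, os.path.join(root, d), os.path.join(prefix, d))
```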