The checksum exception was not detailed enough and was not re-raised when using the cache only, resulting in unhelpful error messages.
Now it reports the file path, the expected hash, the computed hash, and a summary of the downloaded file.
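A minimal sketch of the intended failure mode, in plain Python (illustrative only; the function and message layout are not Spack's actual code):

```python
import hashlib
import os


def verify_checksum(path, expected_sha256):
    """Fail with enough context to debug a bad download or a stale cache entry."""
    with open(path, "rb") as f:
        computed = hashlib.sha256(f.read()).hexdigest()

    if computed != expected_sha256:
        raise ValueError(
            f"sha256 checksum failed for {path}\n"
            f"  expected: {expected_sha256}\n"
            f"  computed: {computed}\n"
            f"  file size: {os.path.getsize(path)} bytes"
        )
```

The important part is that this error now propagates (is re-raised) on the cache-only path instead of being swallowed.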
Batch scripts in general will not function without carriage-return (CRLF) line
endings on Windows. We rely on these scripts to support cmd, so we
should not allow them to be converted to LF.
Note: Windows 11 supports LF line endings due to the use of Windows
Terminal. Once support for Windows 10 is dropped, this change can be
reverted.
When running many concurrent `spack install` processes that need to write
to the database, Spack regularly times out. This is because writing to the
database after another process has written to it requires deserializing the
database, mutating it in memory, and serializing it again, which takes some
time. On top of that, I believe there is a 1-second retry interval when a
write lock cannot be obtained, so I think this means only about 3 processes
can really write to the database at the same time before timing out.
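A simplified illustration, in plain Python, of the read-modify-write cycle described above (made-up lock object, timings, and file format; not Spack's actual database code):

```python
import json
import time


def write_with_lock(lock, db_path, mutate, timeout=3.0, retry_interval=1.0):
    """Every writer re-reads and re-serializes the whole database while holding
    the write lock, so writers effectively run one at a time; with a ~1 s retry
    interval, only a few waiting processes make it in before the timeout."""
    deadline = time.monotonic() + timeout
    while not lock.acquire(blocking=False):  # assumed lock API (e.g. threading.Lock)
        if time.monotonic() >= deadline:
            raise TimeoutError(f"could not obtain a write lock on {db_path}")
        time.sleep(retry_interval)  # coarse retry interval
    try:
        with open(db_path) as f:          # deserialize the whole DB
            data = json.load(f)
        mutate(data)                      # mutate it in memory
        with open(db_path, "w") as f:     # serialize it again
            json.dump(data, f)
    finally:
        lock.release()
```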
* Style: black 23, skip magic trailing commas
* isort should use the same line length as black
* Fix unused import
* Update version of black used in CI
* Update new packages
* Update new packages
* Update package.py
Initial new stuff
* Update package.py
* Update package.py
* Update package.py
* fix targets
* non-llvm backends
* oops
* fix style
* Somehow that was not caught?
* style
* Last fix
make capitalization consistent with Halide, not LLVM
* py-cmake-format: new version, new variants
* Update var/spack/repos/builtin/packages/py-cmake-format/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cufflinks: add new version 0.17.3
* Update var/spack/repos/builtin/packages/py-cufflinks/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Specs that did not contribute any files to an env view caused a problem
where `zip(specs, files grouped by prefix)` got out of sync, so the wrong
merge map was passed to a package's `add_files_to_view`. In particular,
this *sometimes* caused bin/python to end up as a symlink instead of a
copy.
One such example is kokkos + kokkos-nvcc-wrapper, as the latter package
only provides the file bin/nvcc_wrapper, which is also added to the view by
kokkos, causing kokkos-nvcc-wrapper to contribute 0 files.
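A toy reproduction in plain Python (made-up paths, not the view-writing code itself) of how positional zipping loses the spec-to-files association once one spec contributes nothing:

```python
from itertools import groupby

# Three specs, but kokkos-nvcc-wrapper contributes no files of its own.
specs = ["kokkos", "kokkos-nvcc-wrapper", "python"]
files = [
    ("/prefix/kokkos", "bin/nvcc_wrapper"),
    ("/prefix/python", "bin/python"),
]

# Grouping files by prefix yields only two groups, so zip() silently pairs
# kokkos-nvcc-wrapper with python's files -- the wrong merge map.
groups = [list(g) for _, g in groupby(files, key=lambda t: t[0])]
for spec, group in zip(specs, groups):
    print(spec, "->", group)
# kokkos -> [('/prefix/kokkos', 'bin/nvcc_wrapper')]
# kokkos-nvcc-wrapper -> [('/prefix/python', 'bin/python')]   # out of sync
```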
The test feels a bit contrived, but it captures the problem: pkg a is
added first and has 0 files to contribute, pkg b adds a single file, and
we check that pkg b receives a merge map (and a does not).
* pfunit: add v4.6.3
* pfunit: use CMakePackage methods to define arguments (see the sketch after this list)
* pfunit: deprecate v3.X, make a variant conditional
* pfunit: simplify setting up environment variables
Reading the docs, it seems only v3 needs F90_VENDOR to be set
* pfunit: fix option names
The names set before were unused
* pfunit: shared libraries seem not to be supported
See https://github.com/Goddard-Fortran-Ecosystem/pFUnit/issues/308#issuecomment-874725759
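For the "use CMakePackage methods to define arguments" bullet above, a sketch of the general pattern; the variant and CMake option names are placeholders, not necessarily pFUnit's actual ones:

```python
from spack.package import *


class Pfunit(CMakePackage):
    # ... versions, dependencies, etc. elided ...
    variant("mpi", default=False, description="Enable MPI support")
    variant("openmp", default=False, description="Enable OpenMP support")

    def cmake_args(self):
        # define_from_variant() spells each -D option exactly once and keeps it
        # tied to the variant, which avoids silently unused option names.
        return [
            self.define_from_variant("ENABLE_MPI", "mpi"),        # placeholder option name
            self.define_from_variant("ENABLE_OPENMP", "openmp"),  # placeholder option name
        ]
```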
* Add py-mlflow and its dependencies
* mlflow: fix syntax error in package.py
* py-mlflow: cleanup
Process review remarks, add missing dependencies, add skinny variant
* Apply suggestions from code review
* Fix flake8 issues
* More formatting fixes
* Fix py-waitress dependency version
* py-mlflow: platform-specific dependency
* Update var/spack/repos/builtin/packages/py-mlflow/package.py
* Update var/spack/repos/builtin/packages/py-mlflow/package.py
* Process review remarks
* Fix typo in dependency version
* py-shap: fix dependencies
* py-arrow: fix dependencies
* py-slicer: remove py-setuptools explicit version
* py-pyarrow: dataset variant and pass options through environment
It appears there are some issues when using `pip install` instead of
`python setup.py`; setting the options in `setup_build_environment` should
fix that (see the sketch after this entry).
* py-pyarrow: review remark
* Decouple setup_build_environment from install_options
* py-pyarrow: style
* Bump licenses to 2023
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Matthias Wolf <matthias.wolf@epfl.ch>
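For the "pass options through environment" change above, a hedged example of the pattern in a Spack Python package; the exact environment variable name py-pyarrow's build reads is an assumption here:

```python
from spack.package import *


class PyPyarrow(PythonPackage):
    # ... versions, dependencies, etc. elided ...
    variant("dataset", default=False, description="Build the Dataset subsystem")

    def setup_build_environment(self, env):
        # `pip install` builds the wheel in a separate process, so options that
        # used to go on the `python setup.py` command line are exported as
        # environment variables instead.
        if self.spec.satisfies("+dataset"):
            env.set("PYARROW_WITH_DATASET", "1")  # assumed variable name
```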
`spack gc` removes build deps of explicitly installed specs, but somehow
if you take one of the specs that `spack gc` would remove and feed it
to `spack uninstall /<hash>` by hash, it complains about all the
dependents that still rely on it.
This resolves the inconsistency by only following run/link-type
dependencies in `spack uninstall`.
That way you can finally do `spack uninstall cmake` without having to
remove all packages built with cmake.
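A toy illustration in plain Python (plain data structures, not Spack's API) of counting only link/run dependents when deciding whether a spec may be uninstalled:

```python
# Dependency edges pointing at cmake: every consumer only needs it at build time.
dependents_of_cmake = [
    ("hdf5", ("build",)),
    ("llvm", ("build",)),
]


def blocking_dependents(edges, followed=("link", "run")):
    """Only link/run dependents block an uninstall; build-only dependents
    (i.e. everything merely built *with* cmake) are ignored."""
    return [name for name, types in edges if any(t in followed for t in types)]


print(blocking_dependents(dependents_of_cmake))  # [] -> cmake can be uninstalled
```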
Default package requirements might contain variants that are not defined
in each package (for example, a global requirement such as `+shared` when
not every package defines a `shared` variant), so we shouldn't verify them
when emitting facts for the ASP solver.
Account for the requirement group when enforcing requirements
packages:all: don't emit facts for requirement conditions
that can't apply to the current spec
* Update package.py
Several libraries need to be present at run time so that the code can be run in parallel.
I have added them as dependencies and to LD_LIBRARY_PATH. Orca ships as a binary, so the libraries cannot be added as an RPATH at compilation time.
Also, orca 5.0.3 was compiled against 4.1.1, not 4.1.2.
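A hedged sketch of the pattern described above, as it would look in the package recipe (the dependency name is a placeholder, not Orca's actual dependency list):

```python
from spack.package import *


class Orca(Package):
    # ... versions, other dependencies, etc. elided ...
    # Needed at run time so the binary can run in parallel; placeholder name.
    depends_on("some-runtime-lib", type="run")

    def setup_run_environment(self, env):
        # Orca ships as a prebuilt binary, so RPATHs cannot be added at build
        # time; point LD_LIBRARY_PATH at the runtime libraries instead.
        env.prepend_path("LD_LIBRARY_PATH", self.spec["some-runtime-lib"].prefix.lib)
```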