Consolidate Spack's internal filepath logic to a select
few places and refactor to consistent internal usage of
os.path utilities. Creates a prefix, and a series of utilities
in the path utility module, that facilitate handling paths
in a platform-agnostic manner.
Convert Windows paths to posix paths internally
Prefer posixpath.join instead of os.path.join
Updated util/ directory to account for Windows integration
Co-authored-by: Stephen Crowell <stephen.crowell@khq.kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
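A minimal sketch of the kind of helper this adds (the function name is hypothetical, not necessarily what the path utility module calls it):
```python
import posixpath


def convert_to_posix_path(path):
    # Hypothetical helper: normalize Windows separators so internal joins
    # can consistently use posixpath.join on every platform.
    return path.replace("\\", "/")


# Example: join platform-agnostically after conversion
root = convert_to_posix_path("C:\\Users\\spack\\stage")
print(posixpath.join(root, "spack-src"))  # C:/Users/spack/stage/spack-src
```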
Module template format for Windows (#23041)
* Incorporate new search location
* Add external user option
* proper doc string
* Explicit commands in getting started
* raise during chgrp on Win
recover installer changes
Notate admin privileges
Windows phase install hooks
Find external python and install ninja (#23496)
Allow `spack external find` to detect Windows Python and allow `spack install ninja` on Windows
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
Fixup common tests
* Remove requirement for Python 2.6
* Skip new failing test
Windows: Update url util to handle Windows paths (#27959)
* update url util to handle windows paths
* Update tests to handle fixed url handling
* canonicalize path only when the path type matches the host platform
* Skip some url tests on Windows
Co-authored-by: Omar Padron <omar.padron@kitware.com>
Use threading.TIMEOUT_MAX when available (#24246)
This value was introduced in Python 3.2. Specifying a timeout greater than
this value will raise an OverflowError.
Co-authored-by: Lou Lawrence <lou.lawrence@kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
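A hedged sketch of the clamp this implies (helper name hypothetical):
```python
import threading


def clamp_timeout(timeout):
    # threading.TIMEOUT_MAX exists only on Python >= 3.2; passing anything
    # larger to Lock.acquire() raises OverflowError, so clamp when available.
    return min(timeout, getattr(threading, "TIMEOUT_MAX", timeout))
```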
Add compiler hint to the root spec for Windows
Reporters on Windows (#26038)
Reporters use Jinja2 as the templating engine, and Jinja2 indexes
templates by Unix separators, even on Windows, so search using Unix paths
on all systems.
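A small sketch of what that lookup amounts to (helper name illustrative):
```python
import os


def as_template_name(path):
    # Jinja2 looks templates up with '/' separators even on Windows, so
    # convert any OS-specific separators before asking the environment.
    return path.replace(os.sep, "/")


# e.g. env.get_template(as_template_name(os.path.join("subdir", "template.xml")))
```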
Support patching on win via git (#25871)
Handle GRP on Windows
CMake - Windows Bootstrap (#25825)
Remove hardcoded cmake compiler (#26410)
Revert breaking cmake changes
Ensure no autotools on Windows
Perl on Windows (#26612)
Python source build windows (#26313)
Reconfigure sysconf for Windows
Python2.6 compatibility
Fixup new sbang tests for Windows
Ruby support (#28287)
Add NASM support (#28319)
Add mock Ninja package for testing
* Style fixes
* Use Python's zipfile, if available
The compression libs are optional in Python. Rely on Python's zipfile as a
first attempt, then fall back to `unzip`.
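A hedged sketch of that fallback (function name hypothetical):
```python
import subprocess
import zipfile


def extract_zip(archive, dest):
    # Try Python's zipfile first; the optional compression modules may be
    # missing from the interpreter, so fall back to a system `unzip`.
    try:
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
    except (zipfile.BadZipFile, RuntimeError, NotImplementedError):
        subprocess.check_call(["unzip", "-q", archive, "-d", dest])
```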
MSVC's internal CMake and Ninja now detected by spack external find and added to packages.yaml
Saving progress on packaging zlib for Windows
Fixing the shared CMake flag
* Loading Intel's ifx Fortran compiler into MSVC; if there are multiple
versions of MSVC installed and detected, ifx will only be placed into
the first block written in compilers.yaml. The version number of ifx can
be detected using MSVC's version flag (instead of /QV) by using
ignore_version_errors. This commit also provides support for detection
of Intel compilers in their own compiler block by adding ifx.exe to the
fc/f77_name blocks inside intel.py
* Giving CMake a Fortran compiler argument
* Adding patch file for removing duplicated mangling header for versions 3.9.1 and older; static and shared now successfully building on Windows
* Have netlib-lapack depend on ninja@1.10
Co-authored-by: John R. Cary <cary@txcorp.com>
Co-authored-by: Jared Popelar <jpopelar@txcorp.com>
Making a default config.yaml for Windows
Small path length for build_stage
Provide more prerequisite details, mention default config.yaml
Killing an unnecessary setvars call
Replacing some lost changes, proofreading, updating windows-supported package list
Co-authored-by: John Parent <john.parent@kitware.com>
* Add 'make-installer' command for Windows
* Add '--bat' arg to env activate, env deactivate and unload commands
* An equivalent script to setup-env on Linux: spack_cmd.bat. This script
has a wrapper to evaluate cd, load/unload, and env activate/deactivate. (#21734)
* Add spacktivate and config editor (#22049)
* spack_cmd: will find python and spack on its own. It preferentially
tries to use python on your PATH (#22414)
* Ignore Windows python installer if found (#23134)
* Bundle git in windows installer (#23597)
* Add Windows section to Getting Started document
(#23131), (#23295), (#24240)
Co-authored-by: Stephen Crowell <stephen.crowell@kitware.com>
Co-authored-by: lou.lawrence@kitware.com <lou.lawrence@kitware.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
Co-authored-by: Jared Popelar <jpopelar@txcorp.com>
Co-authored-by: Ben Cowan <benc@txcorp.com>
Update Installer CI
Co-authored-by: John Parent <john.parent@kitware.com>
* Made the vcvars batch script location a member variable of the msvc compiler subclass, initialized from the compiler executable path.
* Added a setup_custom_environment() method to the msvc subclass that sources the vcvars script, dumps the environment, and copies the relevant environment variables to the Spack environment.
* Added class variables to the Windows OS and MSVC compiler subclasses to enable finding the compiler executables and determining their versions.
* Fixed path and uid issues.
* Added needed import statement; kluged .exe extension.
* Got package to build. Some manual intervention necessary, including sourcing the MSVC setup script and setting certain configuration parameters.
* Removed CMake executable suffix hack.
To provide Windows-compatible functionality, spack code should use
llnl.util.symlink instead of os.symlink. On non-Windows platforms
and on Windows where supported, os.symlink will still be used.
Use junctions when symlinks aren't supported on Windows (#22583)
Support islink for junctions (#24182)
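A rough sketch of the wrapper's shape (not the exact llnl.util.symlink implementation; the fallback shown shells out to `mklink /J`, which only covers directories):
```python
import os
import subprocess
import sys


def symlink(real_path, link_path):
    # Prefer a real symlink; on Windows installs without symlink privileges,
    # fall back to a directory junction.
    try:
        os.symlink(real_path, link_path)
    except OSError:
        if sys.platform != "win32":
            raise
        subprocess.check_call(["cmd", "/C", "mklink", "/J", link_path, real_path])
```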
Windows: Update llnl/util/filesystem
* Use '/' as path separator on Windows.
* Recognize that Windows paths start with '<Letter>:/' instead of '/'
Co-authored-by: lou.lawrence@kitware.com <lou.lawrence@kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
os.rename() fails on Windows if file already exists.
Create getuid utility function (#21736)
On Windows, replace os.getuid with ctypes.windll.shell32.IsUserAnAdmin().
Tests: Use getuid util function
Co-authored-by: lou.lawrence@kitware.com <lou.lawrence@kitware.com>
Co-authored-by: Betsy McPhail <betsy.mcphail@kitware.com>
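A hedged sketch of the two utilities described above (os.replace shown for the rename case is an assumption about the fix, not a quote of it):
```python
import os
import sys


def getuid():
    # Windows has no uid; report admin status instead, via
    # ctypes.windll.shell32.IsUserAnAdmin() as noted above.
    if sys.platform == "win32":
        import ctypes

        return ctypes.windll.shell32.IsUserAnAdmin()
    return os.getuid()


def rename(src, dst):
    # os.rename() fails on Windows when dst exists; os.replace() (Python 3.3+)
    # overwrites on both platforms.
    os.replace(src, dst)
```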
1. Forwarding sys.stdin (e.g., via input_multiprocess_fd)
gives an error on Windows. Skipping for now.
3. subprocess_context needs to serialize for Windows, like it does
for Mac.
Co-authored-by: lou.lawrence@kitware.com <lou.lawrence@kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
* Snapshot of some MSVC infrastructure added during experiments a while ago. Rebasing from spack/develop.
* Added platform and OS definitions for Windows.
* Updated Windows platform file to conform to new archspec use.
* Added Windows as a platform; introduced some debugging code.
* Added type annotations.
* Fixed copyright.
* Removed print statements.
* Ensure `spack arch` returns correctly on Windows (#21428)
* Correctly identify windows as 'windows-Windows10-AMD64'
Re-work the checks and comparisons around commit versions. When no
commit version is involved the overhead is now in the noise; when one
is involved, the overhead is now constant rather than linear.
fixes #29446
The new setup_*_environment functions have been falling back
to calling the old functions and warn the user since #11115.
This commit removes the fallback behavior and any use of:
- setup_environment
- setup_dependent_environment
in the codebase
Change the internal representation of `Spec` to allow for multiple dependencies or
dependents stemming from the same package. This change makes it possible to represent
cases that are frequent in cross-compiled environments or when bootstrapping compilers.
Modifications:
- [x] Substitute `DependencyMap` with `_EdgeMap`. The main differences are that the
latter does not support direct item assignment and can be modified only through its
API. It also provides a `select_by` method to query items.
- [x] Reworked a few public APIs of `Spec` to get list of dependencies or related edges.
- [x] Added unit tests to prevent regression on #11983 and prove the synthetic construction
of specs with multiple deps from the same package.
Since #22845 went in first, this PR reuses that format and thus it should not change hashes.
The same package may be present multiple times in the list of dependencies with different
associated specs (each with its own hash).
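A toy illustration of the data structure's shape (this is not Spack's actual `_EdgeMap`; the attribute and method details are invented):
```python
import collections


class ToyEdgeMap:
    # Edges are grouped by package name, so several edges to/from the same
    # package can coexist; items are added and queried only through the API.
    def __init__(self):
        self._edges = collections.defaultdict(list)

    def add(self, edge):
        self._edges[edge.spec_name].append(edge)

    def select_by(self, name=None, deptype=None):
        edges = [e for group in self._edges.values() for e in group]
        if name is not None:
            edges = [e for e in edges if e.spec_name == name]
        if deptype is not None:
            edges = [e for e in edges if deptype in e.deptypes]
        return edges
```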
* environment.py: allow link:run
Some users want minimal views, excluding run-type dependencies, since
those types of dependencies are covered by rpaths and the symlinked
libraries in the view aren't used anyway.
With this change, an environment like this:
```
spack:
specs: ['py-flake8']
view:
default:
root: view
link: run
```
includes python packages and python, but no link type deps of python.
Speeds up comparison on `Version` by ~2.5x; for example, timings before and after the change:
```python
In [1]: v = spack.version.Version('1.0.0'); w = spack.version.Version('1.0.2')
In [2]: %timeit v < w
1.47 µs ± 5.59 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
535 ns ± 1.75 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each)
```
fixes #29203
This PR fixes a subtle bug we have when importing
Spack packages as Python modules that can lead to
multiple module objects being created for the same
package.
It also fixes all the places in unit tests that
relied on the old bug to obtain a new, "clean"
state of the package class.
This commit reverts the GCS fetch strategy to before commit:
d759612523
The previous commit added some S3 syntax to handle connections, but
added it to the GCS fetch strategy in a way that prevents GCS from
working anymore.
* rocmcc compiler: initial commit based on aocc and clang
Co-authored-by: luker <luke.roskop@hpe.com>
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
The status displayed in the terminal title could be wrong when doing
distributed builds. For instance, doing `spack install glib` in two
different terminals could lead to the current package being reported as
`40/29` due to the way Spack handles retrying locks.
Work around this by keeping track of the package IDs that were already
encountered to avoid counting packages twice.
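A minimal sketch of that bookkeeping (class and method names hypothetical):
```python
class ProgressCounter:
    # Count each package id at most once, even when an install is retried
    # after a lock timeout, so the terminal title never reads e.g. 40/29.
    def __init__(self, total):
        self.total = total
        self._seen = set()

    def note(self, pkg_id):
        self._seen.add(pkg_id)
        return "{0}/{1}".format(len(self._seen), self.total)
```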
See https://github.com/spack/spack/pull/28468/files#r809156986
If we exit before generating the:
error("Dependencies must have compatible OS's with their dependents").
...
facts, we'll output a problem that is effectively
different from the one solved by clingo.
* cmd/checksum: prefer url matching url_from_version
This is a minimal change toward getting the right archive from places
like GitHub. The heuristic is:
* if an archive url exists, take its version
* generate a url from the package with pkg.url_from_version
* if they match
* stop considering other URLs for this version
* otherwise, continue replacing the url for the version
I doubt this will always work, but it should address a variety of
versions of this bug. A good test right now is `spack checksum gh`,
which checksums macos binaries without this, and the correct source
packages with it.
fixes #15985
related to #14129
related to #13940
* add heuristics to help create as well
Since create can't rely on an existing package, this commit adds another
pair of heuristics:
1. if the current version is a specifically listed archive, don't
replace it
2. if the current url matches the result of applying
`spack.url.substitute_version(a, ver)` for any a in archive_urls,
prefer it and don't replace it
fixes #13940
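A hedged sketch of both heuristics together (function and parameter names are hypothetical):
```python
def keep_existing_url(version, existing_url, listed_archive_urls, substitute_version):
    # 1. a specifically listed archive url is never replaced;
    # 2. a url that matches what substituting this version into a known
    #    archive url would produce is preferred and kept.
    if existing_url in listed_archive_urls:
        return True
    return any(
        substitute_version(a, version) == existing_url for a in listed_archive_urls
    )
```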
* clean up style and a lingering debug import
* ok flake8, you got me
* document reference_package argument
* Update lib/spack/spack/util/web.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* try to appease sphinx
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
You can see what is in the bootstrap store with `spack find -b` and clean it with `spack
clean -b`, but you can't do much else with it, and if there are bootstrap issues they can be hard to
debug.
We already have `spack --mock`, which allows you to swap in the mock packages from the command
line. This PR introduces `spack -b` / `spack --bootstrap`, which runs all of spack with
`ensure_bootstrap_configuration()` set. This means that you can run `spack -b find`, `spack -b
install`, `spack -b spec`, etc. to see what *would* happen with bootstrap configuration, to remove
specific bootstrap packages, etc. This will hopefully make developers' lives easier as they deal
with bootstrap packages.
This PR also uses a `nullcontext` context manager. `nullcontext` has been implemented in several
other places in Spack, and this PR consolidates them to `llnl.util.lang`, with a note that we can
delete the function if we ever require a new enough Python.
- [x] introduce `spack --bootstrap` option
- [x] consolidated all `nullcontext` usages to `llnl.util.lang`
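Consolidated, the helper is just a stand-in for `contextlib.nullcontext` (Python 3.7+); a minimal sketch:
```python
import contextlib


@contextlib.contextmanager
def nullcontext(enter_result=None):
    # Behaves like contextlib.nullcontext on newer Pythons; delete this
    # helper once Spack requires a new enough Python.
    yield enter_result
```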
Some "concrete" versions on the command line, e.g. `qt@5` are really
meant to satisfy some actual concrete version from a package. We should
only assume the user is introducing a new, unknown version on the CLI
if we, well, don't know of any version that satisfies the user's
request. So, if we know about `5.11.1` and `5.11.3` and they ask for
`5.11.2`, we'd ask the solver to consider `5.11.2` as a solution. If
they just ask for `5`, though, `5.11.1` or `5.11.3` are fine solutions,
as they satisfy `@5`, so use them.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
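A toy sketch of that rule, with dotted-prefix matching standing in for real version-satisfaction logic (all names hypothetical):
```python
def toy_satisfies(known, requested):
    # "5.11.1" satisfies "5" or "5.11" by dotted-prefix match.
    return known.split(".")[: len(requested.split("."))] == requested.split(".")


def is_new_version(requested, known_versions):
    # Only treat a CLI version as genuinely new when nothing known satisfies it.
    return not any(toy_satisfies(v, requested) for v in known_versions)


print(is_new_version("5", ["5.11.1", "5.11.3"]))       # False -> known versions suffice
print(is_new_version("5.11.2", ["5.11.1", "5.11.3"]))  # True  -> offer 5.11.2 to the solver
```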
See https://github.com/spack/spack/issues/25353#issuecomment-1041868116
This commit changes the default behavior of
```
$ spack external find
```
from searching all the possible packages Spack knows about to
search only for the ones tagged as being a "build-tool".
It also introduces a `--all` option to restore the old behavior.
Prefer `sw_vers` to `platform.mac_ver`. In an anaconda3 installation, for example, the latter reports 10.16 on Monterey -- I think this is affected by how and where the Python instance was built.
Use MACOSX_DEPLOYMENT_TARGET if present to override the operating system choice.
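A hedged sketch of that preference order (function name hypothetical):
```python
import os
import platform
import subprocess


def macos_version():
    # Allow MACOSX_DEPLOYMENT_TARGET to override, prefer `sw_vers`, and only
    # then fall back to platform.mac_ver(), which can report the version the
    # Python interpreter was built for (e.g. 10.16) rather than the host's.
    env_override = os.environ.get("MACOSX_DEPLOYMENT_TARGET")
    if env_override:
        return env_override
    try:
        return subprocess.check_output(
            ["sw_vers", "-productVersion"], universal_newlines=True
        ).strip()
    except (OSError, subprocess.CalledProcessError):
        return platform.mac_ver()[0]
```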
It will be useful for metrics gathering and possibly debugging to
have this environment variable available in the runner pods that
do the actual rebuilds.
Since Spack does not install external packages, this commit skips them by
default when running stand-alone tests. The assumption is that such packages
have likely undergone an acceptance test process.
However, the tests can be run against installed externals using
```
% spack test run --externals ...
```
fixes #28260
Since we iterate over different variants from many packages, the variant
values may have types which are not comparable, which causes errors
at runtime. This is not a real issue though, since we don't need the facts
to be ordered. Thus, to avoid needless sorting, the sorted function has
been removed, and a comment has been added to tip off any developer who
might need to inspect these clauses for debugging that sorting can be
added back on the first two items only.
It's difficult to add a test for this, since the error depends on
whether Python's sorting algorithm ever needs to compare the third
value of a tuple being ordered.
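A two-line illustration of the failure mode (the clause values are made up):
```python
# Sorting only fails when Python actually has to compare the mismatched
# third elements -- here a bool against a str.
clauses = [("variant_value", "pkg", True), ("variant_value", "pkg", "shared")]
sorted(clauses)  # raises TypeError because bool and str cannot be ordered
```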
* extensions: allow multiple "extends" directives
This will allow multiple extends directives in a package as long as only one of
them is selected as a dependency in the concrete spec.
* document the option to have multiple extends
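A hedged sketch of what such a package could look like (package and variants invented for illustration):
```python
from spack import *


class Example(Package):
    """Illustrative only: two extends directives, of which at most one can
    be selected as a dependency in any concrete spec."""

    variant("python", default=True, description="Build Python bindings")
    variant("lua", default=False, description="Build Lua bindings")

    extends("python", when="+python")
    extends("lua", when="+lua")
```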
Reuse previously was a very invasive change that required parameters to be added to all
the methods that called `concretize()` on a `Spec` object. With the addition of
concretizer configuration, we can use the config system to simplify this argument
passing and keep the code cleaner.
We decided that concretizer config options should be read at `Solver` instantiation
time, and if config changes between instantiation of a particular solver and
`solve()` invocation, the `Solver` should use the settings from `__init__()`.
- [x] remove `reuse` keyword argument from most concretize functions
- [x] refactor usages to use `spack.config.override("concretizer:reuse", True)`
- [x] rework argument passing in `Solver` so that parameters are set from config
at instantiation time
`--reuse` was previously handled individually by each command that
needed it. We are growing more concretization options, and they'll
need their own section for commands that support them.
Now there are two concretization options:
* `--reuse`: Attempt to reuse packages from installs and buildcaches.
* `--fresh`: Opposite of reuse -- traditional spack install.
To handle these, this PR adds a `ConfigSetAction` for `argparse`, so
that you can write argparse code like this:
```
subgroup.add_argument(
'--reuse', action=ConfigSetAction, dest="concretizer:reuse",
const=True, default=None,
help='reuse installed dependencies/buildcaches when possible'
)
```
With this, you don't need to add logic to pull the argument out and
handle it; the `ConfigSetAction` just does it for you. This can probably
be used to clean up some other commands later, as well.
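A rough sketch of how such an action can work (an approximation, not the exact `ConfigSetAction` implementation):
```python
import argparse

import spack.config


class ConfigSetAction(argparse.Action):
    def __init__(self, option_strings, dest, const=None, default=None, **kwargs):
        # The flag takes no value on the command line; `const` is what gets
        # written into config when the flag is present.
        super(ConfigSetAction, self).__init__(
            option_strings, dest, nargs=0, const=const, default=default, **kwargs
        )

    def __call__(self, parser, namespace, values, option_string=None):
        # `dest` is a config path like "concretizer:reuse"; write the value
        # straight into Spack's command-line config scope.
        spack.config.set(self.dest, self.const, scope="command_line")
```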
Code that was previously passing `reuse=True` around everywhere has
been refactored to use config, and config is set from the CLI using
a new `add_concretizer_args()` function in `spack.cmd.common.arguments`.
- [x] Add `ConfigSetAction` to simplify concretizer config on the CLI
- [x] Refactor code so that it does not pass `reuse=True` to every function.
- [x] Refactor commands to use `add_concretizer_args()` and to pass
concretizer config using the config system.
Config scopes were different for `config` and `mutable_config`,
and `mutable_config` did not have a command line scope.
- [x] Fix by consolidating the creation logic for the two fixtures.
The concretizer is going to grow to have many more configuration options,
and we really need some structured config for that.
* We have the `config:concretizer` option that chooses the solver,
but extending that is awkward (we'd need to replace a string with
a `dict`) and the solver choice will be deprecated eventually.
* We have the `concretization` option in environments, but it's
not a top-level config section -- it's just for environments,
and it also only admits a string right now.
To avoid overlapping with either of these and to allow the most
extensibility in the future, this adds a new `concretizer` config
section that can be used in and outside of environments. There
is only one option right now: `reuse`. This can expand to include
other options later.
Likely, we will soon deprecate `config:concretizer` and warn when
the user doesn't use `clingo`, and we will eventually (sometime later)
move the `together` / `separately` options from `concretization` into
the top-level `concretizer` section.
This commit just adds the new section and schema. Fully wiring it
up is TBD.
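Schematically, the new section looks like this (`reuse` is the only option so far):
```yaml
# concretizer.yaml, or a `concretizer:` section inside an environment
concretizer:
  reuse: true
```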
The solver has a lot of configuration associated with it. Rather
than adding arguments to everything, we should encapsulate that
in a class. This is the start of that work; it replaces `solve()`
and its kwargs with a class and properties.
* Add 'stable' to the list of infinity version names.
Rename libunwind 1.5-head to 1.5-stable.
* Add stable to the infinite version list in packaging_guide.rst.
* core: Make platform environment an instance method, not a class method
In preparation for accessing data constructed in __init__.
* macos: set consistent macosx deployment target
This should silence numerous warnings from mixed gcc/macos toolchains.
* perl: prevent too-new deployment target version
```
*** Unexpected MACOSX_DEPLOYMENT_TARGET=11
***
*** Please either set it to a valid macOS version number (e.g., 10.15) or to empty.
```
* Stylin'
* Add deployment target overrides to failing autoconf packages
* Move configure workaround to base autoconf package
This reverts commit 3c119eaf8b4fb37c943d503beacf5ad2aa513d4c.
* Stylin'
* macos: add utility functions for SDK
These aren't yet used but should probably be added to spack debug
report.
* Remove node_target_satisfies/3 in favor of target_satisfies/2
When emitting input facts we don't need to couple target with
packages, but we can emit fewer facts independently and let
the grounder combine them.
* Remove compiler_version_satisfies/4 in favor of compiler_version_satisfies/3
When emitting input facts we don't need to couple compilers with
packages, but we can emit fewer facts independently and let
the grounder combine them.
* Introduce heuristic in the ASP-program
With heuristics we can drive clingo to make better
initial guesses, which leads to fewer choices and
conflicts in the overall solve
* Fix reindex with uninstalled deps
When a prefix of a dep is removed, and the db is reindexed, it is added
through the dependent, but until now it incorrectly listed the spec as
'installed'.
There was also some questionable behavior in the db when the same spec
was added multiple times: it would always be marked installed.
* Always reserve path
* Only add installed spec's prefixes to install prefixes set
* Improve warning, and ensure ensure only ensures
* test: reindex with every file system remnant removed except for the old index; it should give a database with nothing installed, including records with installed==False, external==False, ref_count==0, explicit=True, and these should be removable from the database
* stacks: add regression tests for matrix expansion
* Use constrain semantics to construct spec lists for stacks
* Fix semantics for constraining an anonymous spec. Add tests
* Add sticky variants
* Add unit tests for sticky variants
* Add documentation for sticky variants
* Revert "Revert 19736 because conflicts are avoided by clingo by default (#26721)"
This reverts commit 33ef7d57c1.
* Add stickiness to "allow-unsupported-compiler"
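A hedged sketch of the directive, shown in an invented package:
```python
from spack import *


class Example(Package):
    # Sticky: the concretizer never flips this default to dodge a conflict;
    # only an explicit request (e.g. +allow-unsupported-compiler) changes it.
    variant(
        "allow-unsupported-compiler",
        default=False,
        sticky=True,
        description="Allow a host compiler the package does not officially support",
    )
```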
`spack license update-copyright-year` was updating license headers but not the MIT
license file. Make it do that and add a test.
Also simplify the way we bump the latest copyright year so that we only need to
update it in one place.
* Use pip to bootstrap pip
* Bootstrap wheel from source
* Update PythonPackage to install using pip
* Update several packages
* Add wheel as base class dep
* Build phase no longer exists
* Add py-poetry package, fix py-flit-core bootstrapping
* Fix isort build
* Clean up many more packages
* Remove unused import
* Fix unit tests
* Don't directly run setup.py
* Typo fix
* Remove unused imports
* Fix issues caught by CI
* Remove custom setup.py file handling
* Use PythonPackage for installing wheels
* Remove custom phases in PythonPackages
* Remove <phase>_args methods
* Remove unused import
* Fix various packages
* Try to test Python packages directly in CI
* Actually run the pipeline
* Fix more packages
* Fix mappings, fix packages
* Fix dep version
* Work around bug in concretizer
* Various concretization fixes
* Fix gitlab yaml, packages
* Fix typo in gitlab yaml
* Skip more packages that fail to concretize
* Fix? jupyter ecosystem concretization issues
* Solve Jupyter concretization issues
* Prevent duplicate entries in PYTHONPATH
* Skip fenics-dolfinx
* Build fewer Python packages
* Fix missing npm dep
* Specify image
* More package fixes
* Add backends for every from-source package
* Fix version arg
* Remove GitLab CI stuff, add py-installer package
* Remove test deps, re-add install_options
* Function declaration syntax fix
* More build fixes
* Update spack create template
* Update PythonPackage documentation
* Fix documentation build
* Fix unit tests
* Remove pip flag added only in newer pip
* flux: add explicit dependency on jsonschema
* Update packages that have been added since this was branched off of develop
* Move Python 2 deprecation to a separate PR
* py-neurolab: add build dep on py-setuptools
* Use wheels for pip/wheel
* Allow use of pre-installed pip for external Python
* pip -> python -m pip
* Use python -m pip for all packages
* Fix py-wrapt
* Add both platlib and purelib to PYTHONPATH
* py-pyyaml: setuptools is needed for all versions
* py-pyyaml: link flags aren't needed
* Appease spack audit packages
* Some build backend is required for all versions, distutils -> setuptools
* Correctly handle different setup.py filename
* Use wheels for py-tomli to avoid circular dep on py-flit-core
* Fix busco installation procedure
* Clarify things in spack create template
* Test other Python build backends
* Undo changes to busco
* Various fixes
* Don't test other backends
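A hedged sketch of the kind of pip invocation packages move to (the flags shown are typical, not necessarily the exact set `PythonPackage` uses):
```python
import subprocess
import sys


def pip_install(source_dir, prefix):
    # Install a source tree into a prefix with `python -m pip`, instead of
    # running setup.py directly.
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--no-deps", "--no-build-isolation",
        "--prefix", prefix,
        source_dir,
    ])
```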
When `spack compiler list` is run without being restricted to a
particular scope, and no compilers are found, say that none are
available, and hint that the user should run `spack compiler find` to
auto-detect compilers.
* Improve docs
* Check if stdin is a tty
* add a test