command by having it consume exactly 1 positional argument (i.e. by removing
"nargs=argparse.REMAINDER") but this does not work when configuring dependencies
of a top-level package (which show up as additional positional args). Instead,
there is now an explicit check to ensure there is only 1 top-level package.
argument). The default path is [package id].xml in the CWD where test-install
is called from.
2. Fixed a bug with package.build_log_path (which was added in this branch).
3. keep_stage for package.do_install is now set. This allows uninstalling and
reinstalling packages without re-downloading them.
Also, even though I calculated which installs are new (e.g. vs. packages that
have already been installed by a previous command) I forgot to make use of that
in create_test_output (so I was always generating test output even if a package
had been installed before running the test-install command).
Note to avoid confusion: the 'handled' variable (removed in this commit) did not
serve the same purpose as 'newInstalls': it was originally required because the
recursive approach would visit the same dependency twice if more than one
package depended on it.
If multiple packages are specified and a prior one fails, it will prevent any
of the others from succeeding (and generating test output), even if they don't
share dependencies.
The original intent was to generate output as if each package were a unit test,
but I noticed that I was only generating test output for top-level packages.
Still need to add output formatting (in a commonly parseable format like JUnit
or TAP). May want to adjust how the build log is accessed in case of a build
failure.
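For reference, minimal JUnit-style XML could be produced with the standard
library. This is only an illustrative sketch; the element layout and the
assumed `results` structure are mine, not what test-install currently writes:

    import xml.etree.ElementTree as ET

    def write_junit(results, path):
        # `results` is assumed to be a list of (package_name, error_text)
        # pairs, with error_text=None for packages that installed cleanly.
        suite = ET.Element('testsuite', name='test-install',
                           tests=str(len(results)))
        for pkg_name, error_text in results:
            case = ET.SubElement(suite, 'testcase', name=pkg_name)
            if error_text is not None:
                failure = ET.SubElement(case, 'failure',
                                        message='install failed')
                failure.text = error_text
        ET.ElementTree(suite).write(path)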
Certain remote protocols don't support the `--depth` option. Since this can't
be checked by URL type or in any sane way locally, this version attempts to
clone git repositories with the --depth option, and if that fails attempts the
clone again without it.
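The fallback amounts to something like the following (a minimal sketch, not the
actual fetch strategy code; the function name and the cleanup step are mine):

    import shutil
    import subprocess

    def clone_with_depth_fallback(url, dest):
        try:
            # Try a shallow clone first.
            subprocess.check_call(['git', 'clone', '--depth', '1', url, dest])
        except subprocess.CalledProcessError:
            # Some remote protocols reject --depth; clean up any partial
            # clone and retry without it.
            shutil.rmtree(dest, ignore_errors=True)
            subprocess.check_call(['git', 'clone', url, dest])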
Ensures all tags are ready before checkout, using `--branch` if possible and
an extra pull if that is not available. Also adds `--depth 1` to create
shallow clones if the git version is sufficient.
Fixes #64.
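Roughly, the behavior described above looks like this (a sketch under
assumptions; `new_git` stands in for whatever version check decides that
`--branch`/`--depth` are safe to use):

    import subprocess

    def clone_at_tag(url, dest, tag, new_git):
        if new_git:
            # Newer git: shallow-clone directly at the tag.
            subprocess.check_call(
                ['git', 'clone', '--depth', '1', '--branch', tag, url, dest])
        else:
            # Older git: full clone, then make sure all tags are present
            # before checking the tag out.
            subprocess.check_call(['git', 'clone', url, dest])
            subprocess.check_call(['git', 'pull', '--tags'], cwd=dest)
            subprocess.check_call(['git', 'checkout', tag], cwd=dest)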
It is currently less painful to pull the source from github, compile it into a
gem, then install the gem, than it is to download a gem and install it. This
still lacks an activation mechanism, but `spack use tmuxinator` is functional.
- This can result in the user being prompted to download an unsafe
version.
- Avoids overly strict errors when something *could* be satisfiable
but we don't know about the version.
- The following now work differently:
    spec['mpi']
    spec['blas']
This can return a spec for openmpi, mpich, mvapich, etc., EVEN if
the spec is already concretized. This means that in a package that
`depends_on('mpi')`, you can do `spec['mpi']` to see what it was
concretized to. This should simplify MPI and BLAS packages (see the lookup
sketch after this list).
    'mpi' in spec
    'blas' in spec
Previously, if the spec had been concretized, these would be `False`
because there was not a dependency in the DAG with either of these
names. These will now be `True` even if the spec has been
concretized. So, e.g., this will print "YES":
    s = Spec('callpath ^mpich')
    if 'mpi' in s:
        print "YES"
- Similarly, this will be True:
    Spec('mpich').satisfies('mpi')
- Because of the way virtual dependencies are currently implemented,
the above required some fiddling around with `package.py` so that it
would never call `Spec.__contains__` (and result in endless
recursion).
- This should be fixed by allowing virtual dependencies to have their
own package class.
- This would allow a quicker check for vdeps, without a call to
`all_packages`.
- For the time being, `package.py` shouldn't call `__contains__`.
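The lookup sketched below is only an illustration of the behavior described
above, not Spack's actual code: `provides()` is a hypothetical helper standing
in for the providers index, and the DAG walk is assumed to be available as
`spec.traverse()`.

    def find_dep(spec, name):
        # Walk the spec's DAG and match either the literal package name or
        # a package that provides the virtual name (e.g. mpich provides mpi).
        for dep in spec.traverse():
            if dep.name == name or provides(dep.name, name):
                return dep
        return None

Both `spec['mpi']` and `'mpi' in spec` can route through a lookup like this,
which is why they behave the same before and after concretization.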
- Expanding archives like MAGMA 1.6.2 creates extra hidden files that
confuse Spack's staging mechanism.
- Added a special case to ignore hidden files when checking whether
the tarball exploded.
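In other words, the staging check now filters out hidden entries before
deciding whether the archive expanded into a single top-level directory. A
rough sketch (names assumed, not the real staging code):

    import os

    def expanded_into_single_dir(stage_path):
        # Ignore hidden files (like the ones MAGMA 1.6.2's tarball leaves
        # behind) when looking at what the archive expanded into.
        visible = [f for f in os.listdir(stage_path) if not f.startswith('.')]
        return (len(visible) == 1 and
                os.path.isdir(os.path.join(stage_path, visible[0])))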