Patches are hashed with specs, and can be associated with dependencies.

- A package can depend on a special patched version of its dependencies.

  - The `Spec` YAML (and therefore the spec hash) now includes the
    sha256 of each patch, so a patched spec hashes differently from an
    unpatched one.

  - The special patched version will be built separately from a "vanilla"
    version of the same package.

  - This allows packages to maintain patches on their dependencies
    without affecting either the dependency package or its dependents.
    This could previously be accomplished with special variants, but
    having to add variants means the hash of the dependency changes
    frequently when it really doesn't need to.  This commit allows the
    hash to change *just* for dependencies that need patches.
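
  - The effect can be sketched with a toy model (an illustration only,
    not Spack's actual spec-hashing code; `node_hash` and its JSON
    layout are hypothetical):

    ```python
    # Hypothetical sketch: patch sha256s become part of the hashed node
    # only when patches are present, so only patched dependencies get a
    # new hash; everything else hashes exactly as before.
    import hashlib
    import json

    def node_hash(name, version, patch_sha256s=()):
        node = {'name': name, 'version': version}
        if patch_sha256s:
            node['patches'] = sorted(patch_sha256s)
        data = json.dumps(node, sort_keys=True).encode('utf-8')
        return hashlib.sha256(data).hexdigest()

    vanilla = node_hash('zlib', '1.2.11')
    patched = node_hash('zlib', '1.2.11', patch_sha256s=['ab' * 32])

    assert vanilla != patched                      # patched dep hashes differently
    assert vanilla == node_hash('zlib', '1.2.11')  # unpatched specs unaffected
    ```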

  - Patching dependencies shouldn't be the common case, but some packages
    (qmcpack, hpctoolkit, openspeedshop) do this kind of thing and it
    makes the code structure mirror maintenance responsibilities.

- Note that this commit means that adding or changing a patch on a
  package will change its hash.  This is probably what *should* happen,
  but we haven't done it so far.

  - Only applies to `patch()` directives; `package.py` files (and their
    `patch()` functions) are not hashed, but we'd like to do that in the
    future.

- The interface looks like this: `depends_on()` can optionally take a
  patch directive or a list of them:

     depends_on(<spec>,
                patches=patch(..., when=<cond>),
                when=<cond>)
     # or
     depends_on(<spec>,
                patches=[patch(..., when=<cond>),
                         patch(..., when=<cond>)],
                when=<cond>)

- Previously, the `patch()` directive only took an `md5` parameter.  Now
  it only takes a `sha256` parameter.  We restrict this because we want
  to be consistent about which hash is used in the `Spec`.

- A side effect of hashing patches is that *compressed* patches fetched
  from URLs now need *two* checksums: one for the downloaded archive and
  one for the content of the patch itself.  Patches fetched uncompressed
  only need a checksum for the patch.  Rationale:

  - we include the sha256 of the *patch content* in the spec hash, since
    that is the one checksum we can compute consistently for patches
    shipped in Spack's source and for patches fetched remotely, both
    compressed and uncompressed.

  - we *still* need the checksum of the downloaded archive, because we want
    to verify the download *before* handing it off to tar, unzip, or
    another decompressor.  Not doing so is a security risk and leaves
    users exposed to any arbitrary code execution vulnerabilities in
    compression tools.
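
  The two-checksum rule can be demonstrated with the standard library (a
  sketch; the variable names mirror the `sha256`/`archive_sha256`
  arguments, but this is not Spack code):

      # Sketch: a compressed URL patch needs two checksums. The archive
      # checksum verifies the download *before* it reaches a decompressor;
      # the patch checksum identifies the content and goes in the spec hash.
      import gzip
      import hashlib

      patch_content = b'--- a/foo.c\n+++ b/foo.c\n@@ -1 +1 @@\n-old\n+new\n'
      archive = gzip.compress(patch_content)

      # verified before handing the file to gunzip/tar/unzip
      archive_sha256 = hashlib.sha256(archive).hexdigest()

      # identical for compressed, uncompressed, and in-repo patches,
      # so this is the checksum included in the Spec
      sha256 = hashlib.sha256(patch_content).hexdigest()

      assert hashlib.sha256(gzip.decompress(archive)).hexdigest() == sha256
      assert sha256 != archive_sha256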
This commit is contained in:
Todd Gamblin 2017-09-23 15:25:33 -07:00
parent 14c141a410
commit 4f8c7d57eb
26 changed files with 752 additions and 195 deletions

@@ -26,6 +26,9 @@
 """
 from six import string_types
 
+import spack
+
 #: The types of dependency relationships that Spack understands.
 all_deptypes = ('build', 'link', 'run', 'test')
@@ -78,13 +81,52 @@ class Dependency(object):
     e.g. whether it is required for building the package, whether it
     needs to be linked to, or whether it is needed at runtime so that
     Spack can call commands from it.
+
+    A package can also depend on another package with *patches*. This is
+    for cases where the maintainers of one package also maintain special
+    patches for their dependencies. If one package depends on another
+    with patches, a special version of that dependency with patches
+    applied will be built for use by the dependent package. The patches
+    are included in the new version's spec hash to differentiate it from
+    unpatched versions of the same package, so that unpatched versions of
+    the dependency package can coexist with the patched version.
     """
-    def __init__(self, spec, type=default_deptype):
+    def __init__(self, pkg, spec, type=default_deptype):
         """Create a new Dependency.
 
         Args:
+            pkg (type): Package that has this dependency
             spec (Spec): Spec indicating dependency requirements
             type (sequence): strings describing dependency relationship
         """
-        self.spec = spec
-        self.type = set(type)
+        assert isinstance(spec, spack.spec.Spec)
+
+        self.pkg = pkg
+        self.spec = spec.copy()
+
+        # This dict maps condition specs to lists of Patch objects, just
+        # as the patches dict on packages does.
+        self.patches = {}
+
+        if type is None:
+            self.type = set(default_deptype)
+        else:
+            self.type = set(type)
+
+    @property
+    def name(self):
+        """Get the name of the dependency package."""
+        return self.spec.name
+
+    def merge(self, other):
+        """Merge constraints, deptypes, and patches of other into self."""
+        self.spec.constrain(other.spec)
+        self.type |= other.type
+
+        # concatenate patch lists, or just copy them in
+        for cond, p in other.patches.items():
+            if cond in self.patches:
+                self.patches[cond].extend(other.patches[cond])
+            else:
+                self.patches[cond] = other.patches[cond]

@@ -94,30 +94,26 @@ def __new__(mcs, name, bases, attr_dict):
             try:
                 directive_from_base = base._directives_to_be_executed
                 attr_dict['_directives_to_be_executed'].extend(
-                    directive_from_base
-                )
+                    directive_from_base)
             except AttributeError:
                 # The base class didn't have the required attribute.
                 # Continue searching
                 pass
 
         # De-duplicates directives from base classes
         attr_dict['_directives_to_be_executed'] = [
             x for x in llnl.util.lang.dedupe(
-                attr_dict['_directives_to_be_executed']
-            )
-        ]
+                attr_dict['_directives_to_be_executed'])]
 
         # Move things to be executed from module scope (where they
         # are collected first) to class scope
         if DirectiveMetaMixin._directives_to_be_executed:
             attr_dict['_directives_to_be_executed'].extend(
-                DirectiveMetaMixin._directives_to_be_executed
-            )
+                DirectiveMetaMixin._directives_to_be_executed)
             DirectiveMetaMixin._directives_to_be_executed = []
 
         return super(DirectiveMetaMixin, mcs).__new__(
-            mcs, name, bases, attr_dict
-        )
+            mcs, name, bases, attr_dict)
 
     def __init__(cls, name, bases, attr_dict):
         # The class is being created: if it is a package we must ensure
@@ -128,14 +124,20 @@ def __init__(cls, name, bases, attr_dict):
             # from llnl.util.lang.get_calling_module_name
             pkg_name = module.__name__.split('.')[-1]
             setattr(cls, 'name', pkg_name)
 
         # Ensure the presence of the dictionaries associated
         # with the directives
         for d in DirectiveMetaMixin._directive_names:
             setattr(cls, d, {})
 
-        # Lazy execution of directives
+        # Lazily execute directives
         for directive in cls._directives_to_be_executed:
             directive(cls)
 
+        # Ignore any directives executed *within* top-level
+        # directives by clearing out the queue they're appended to
+        DirectiveMetaMixin._directives_to_be_executed = []
+
         super(DirectiveMetaMixin, cls).__init__(name, bases, attr_dict)
 
     @staticmethod
@@ -194,15 +196,44 @@ def _decorator(decorated_function):
         @functools.wraps(decorated_function)
         def _wrapper(*args, **kwargs):
+            # If any of the arguments are executors returned by a
+            # directive passed as an argument, don't execute them
+            # lazily. Instead, let the called directive handle them.
+            # This allows nested directive calls in packages. The
+            # caller can return the directive if it should be queued.
+            def remove_directives(arg):
+                directives = DirectiveMetaMixin._directives_to_be_executed
+                if isinstance(arg, (list, tuple)):
+                    # Descend into args that are lists or tuples
+                    for a in arg:
+                        remove_directives(a)
+                else:
+                    # Remove directives args from the exec queue
+                    remove = next(
+                        (d for d in directives if d is arg), None)
+                    if remove is not None:
+                        directives.remove(remove)
+
+            # Nasty, but it's the best way I can think of to avoid
+            # side effects if directive results are passed as args
+            remove_directives(args)
+            remove_directives(kwargs.values())
+
             # A directive returns either something that is callable on a
             # package or a sequence of them
-            values = decorated_function(*args, **kwargs)
+            result = decorated_function(*args, **kwargs)
 
             # ...so if it is not a sequence make it so
+            values = result
             if not isinstance(values, collections.Sequence):
                 values = (values, )
             DirectiveMetaMixin._directives_to_be_executed.extend(values)
 
+            # wrapped function returns same result as original so
+            # that we can nest directives
+            return result
         return _wrapper
     return _decorator
@@ -229,7 +260,7 @@ def _execute_version(pkg):
     return _execute_version
 
 
-def _depends_on(pkg, spec, when=None, type=default_deptype):
+def _depends_on(pkg, spec, when=None, type=default_deptype, patches=None):
     # If when is False do nothing
     if when is False:
         return
@@ -245,13 +276,36 @@ def _depends_on(pkg, spec, when=None, type=default_deptype):
     type = canonical_deptype(type)
     conditions = pkg.dependencies.setdefault(dep_spec.name, {})
 
+    # call this patches here for clarity -- we want patch to be a list,
+    # but the caller doesn't have to make it one.
+    if patches and dep_spec.virtual:
+        raise DependencyPatchError("Cannot patch a virtual dependency.")
+
+    # ensure patches is a list
+    if patches is None:
+        patches = []
+    elif not isinstance(patches, (list, tuple)):
+        patches = [patches]
+
+    # auto-call patch() directive on any strings in patch list
+    patches = [patch(p) if isinstance(p, string_types)
+               else p for p in patches]
+    assert all(callable(p) for p in patches)
+
+    # this is where we actually add the dependency to this package
     if when_spec not in conditions:
-        conditions[when_spec] = Dependency(dep_spec, type)
+        dependency = Dependency(pkg, dep_spec, type=type)
+        conditions[when_spec] = dependency
     else:
         dependency = conditions[when_spec]
         dependency.spec.constrain(dep_spec, deps=False)
         dependency.type |= set(type)
 
+    # apply patches to the dependency
+    for execute_patch in patches:
+        execute_patch(dependency)
 
 @directive('conflicts')
 def conflicts(conflict_spec, when=None, msg=None):
@@ -285,13 +339,24 @@ def _execute_conflicts(pkg):
 @directive(('dependencies'))
-def depends_on(spec, when=None, type=default_deptype):
+def depends_on(spec, when=None, type=default_deptype, patches=None):
     """Creates a dict of deps with specs defining when they apply.
+
+    Args:
+        spec (Spec or str): the package and constraints depended on
+        when (Spec or str): when the dependent satisfies this, it has
+            the dependency represented by ``spec``
+        type (str or tuple of str): str or tuple of legal Spack deptypes
+        patches (obj or list): single result of ``patch()`` directive, a
+            ``str`` to be passed to ``patch``, or a list of these
+
     This directive is to be used inside a Package definition to declare
     that the package requires other packages to be built first.
-    @see The section "Dependency specs" in the Spack Packaging Guide."""
+    @see The section "Dependency specs" in the Spack Packaging Guide.
+    """
     def _execute_depends_on(pkg):
-        _depends_on(pkg, spec, when=when, type=type)
+        _depends_on(pkg, spec, when=when, type=type, patches=patches)
     return _execute_depends_on
@@ -356,20 +421,23 @@ def patch(url_or_filename, level=1, when=None, **kwargs):
         level (int): patch level (as in the patch shell command)
         when (Spec): optional anonymous spec that specifies when to apply
             the patch
-        **kwargs: the following list of keywords is supported
-            - md5 (str): md5 sum of the patch (used to verify the file
-              if it comes from a url)
+
+    Keyword Args:
+        sha256 (str): sha256 sum of the patch, used to verify the patch
+            (only required for URL patches)
+        archive_sha256 (str): sha256 sum of the *archive*, if the patch
+            is compressed (only required for compressed URL patches)
     """
-    def _execute_patch(pkg):
-        constraint = pkg.name if when is None else when
-        when_spec = parse_anonymous_spec(constraint, pkg.name)
+    def _execute_patch(pkg_or_dep):
+        constraint = pkg_or_dep.name if when is None else when
+        when_spec = parse_anonymous_spec(constraint, pkg_or_dep.name)
 
         # if this spec is identical to some other, then append this
         # patch to the existing list.
-        cur_patches = pkg.patches.setdefault(when_spec, [])
-        cur_patches.append(Patch.create(pkg, url_or_filename, level, **kwargs))
+        cur_patches = pkg_or_dep.patches.setdefault(when_spec, [])
+        cur_patches.append(
+            Patch.create(pkg_or_dep, url_or_filename, level, **kwargs))
 
     return _execute_patch
@@ -381,8 +449,7 @@ def variant(
         description='',
         values=None,
         multi=False,
-        validator=None
-):
+        validator=None):
     """Define a variant for the package. Packager can specify a default
     value as well as a text description.
@@ -484,3 +551,7 @@ class DirectiveError(spack.error.SpackError):
 class CircularReferenceError(DirectiveError):
     """This is raised when something depends on itself."""
+
+
+class DependencyPatchError(DirectiveError):
+    """Raised for errors with patching dependencies."""

@@ -542,9 +542,12 @@ class SomePackage(Package):
     #: Defaults to the empty string.
     license_url = ''
 
-    # Verbosity level, preserved across installs.
+    #: Verbosity level, preserved across installs.
     _verbose = None
 
+    #: index of patches by sha256 sum, built lazily
+    _patches_by_hash = None
+
     #: List of strings which contains GitHub usernames of package maintainers.
     #: Do not include @ here in order not to unnecessarily ping the users.
     maintainers = []
@@ -642,7 +645,7 @@ def possible_dependencies(self, transitive=True, visited=None):
     @property
     def package_dir(self):
         """Return the directory where the package.py file lives."""
-        return os.path.dirname(self.module.__file__)
+        return os.path.abspath(os.path.dirname(self.module.__file__))
 
     @property
     def global_license_dir(self):
@@ -990,9 +993,42 @@ def do_stage(self, mirror_only=False):
         self.stage.expand_archive()
         self.stage.chdir_to_source()
 
+    @classmethod
+    def lookup_patch(cls, sha256):
+        """Look up a patch associated with this package by its sha256 sum.
+
+        Args:
+            sha256 (str): sha256 sum of the patch to look up
+
+        Returns:
+            (Patch): ``Patch`` object with the given hash, or ``None`` if
+                not found.
+
+        To do the lookup, we build an index lazily. This allows us to
+        avoid computing a sha256 for *every* patch and on every package
+        load. With lazy hashing, we only compute hashes on lookup, which
+        usually happens at build time.
+        """
+        if cls._patches_by_hash is None:
+            cls._patches_by_hash = {}
+
+            # Add patches from the class
+            for cond, patch_list in cls.patches.items():
+                for patch in patch_list:
+                    cls._patches_by_hash[patch.sha256] = patch
+
+            # and patches on dependencies
+            for name, conditions in cls.dependencies.items():
+                for cond, dependency in conditions.items():
+                    for pcond, patch_list in dependency.patches.items():
+                        for patch in patch_list:
+                            cls._patches_by_hash[patch.sha256] = patch
+
+        return cls._patches_by_hash.get(sha256, None)
+
     def do_patch(self):
-        """Calls do_stage(), then applied patches to the expanded tarball if they
-        haven't been applied already."""
+        """Applies patches if they haven't been applied already."""
         if not self.spec.concrete:
             raise ValueError("Can only patch concrete packages.")
@@ -1002,8 +1038,11 @@ def do_patch(self):
         # Package can add its own patch function.
         has_patch_fun = hasattr(self, 'patch') and callable(self.patch)
 
+        # Get the patches from the spec (this is a shortcut for the MV-variant)
+        patches = self.spec.patches
+
         # If there are no patches, note it.
-        if not self.patches and not has_patch_fun:
+        if not patches and not has_patch_fun:
             tty.msg("No patches needed for %s" % self.name)
             return
@@ -1032,9 +1071,7 @@ def do_patch(self):
         # Apply all the patches for specs that match this one
         patched = False
-        for spec, patch_list in self.patches.items():
-            if self.spec.satisfies(spec):
-                for patch in patch_list:
+        for patch in patches:
             try:
                 patch.apply(self.stage)
                 tty.msg('Applied patch %s' % patch.path_or_url)
@@ -1054,9 +1091,10 @@ def do_patch(self):
                 # We are running a multimethod without a default case.
                 # If there's no default it means we don't need to patch.
                 if not patched:
-                    # if we didn't apply a patch, AND the patch function
-                    # didn't apply, say no patches are needed.
-                    # Otherwise, we already said we applied each patch.
+                    # if we didn't apply a patch from a patch()
+                    # directive, AND the patch function didn't apply, say
+                    # no patches are needed. Otherwise, we already
+                    # printed a message for each patch.
                     tty.msg("No patches needed for %s" % self.name)
         except:
             tty.msg("patch() function failed for %s" % self.name)

@@ -25,13 +25,16 @@
 import os
 import os.path
 import inspect
+import hashlib
 
 import spack
 import spack.error
 import spack.fetch_strategy as fs
 import spack.stage
+from spack.util.crypto import checksum, Checker
 
 from llnl.util.filesystem import working_dir
 from spack.util.executable import which
+from spack.util.compression import allowed_archive
 
 
 def absolute_path_for_package(pkg):
@@ -39,8 +42,10 @@ def absolute_path_for_package(pkg):
     the recipe for the package passed as argument.
 
     Args:
-        pkg: a valid package object
+        pkg: a valid package object, or a Dependency object.
     """
+    if isinstance(pkg, spack.dependency.Dependency):
+        pkg = pkg.pkg
     m = inspect.getmodule(pkg)
     return os.path.abspath(m.__file__)
@@ -51,7 +56,7 @@ class Patch(object):
     """
     @staticmethod
-    def create(pkg, path_or_url, level, **kwargs):
+    def create(pkg, path_or_url, level=1, **kwargs):
         """
         Factory method that creates an instance of some class derived from
         Patch
@@ -59,18 +64,18 @@ def create(pkg, path_or_url, level, **kwargs):
         Args:
             pkg: package that needs to be patched
             path_or_url: path or url where the patch is found
-            level: patch level
+            level: patch level (default 1)
 
         Returns:
             instance of some Patch class
         """
         # Check if we are dealing with a URL
         if '://' in path_or_url:
-            return UrlPatch(pkg, path_or_url, level, **kwargs)
+            return UrlPatch(path_or_url, level, **kwargs)
 
         # Assume patches are stored in the repository
         return FilePatch(pkg, path_or_url, level)
 
-    def __init__(self, pkg, path_or_url, level):
+    def __init__(self, path_or_url, level):
         # Check on level (must be an integer > 0)
         if not isinstance(level, int) or not level >= 0:
             raise ValueError("Patch level needs to be a non-negative integer.")
@@ -100,20 +105,39 @@ def apply(self, stage):
 class FilePatch(Patch):
     """Describes a patch that is retrieved from a file in the repository"""
     def __init__(self, pkg, path_or_url, level):
-        super(FilePatch, self).__init__(pkg, path_or_url, level)
+        super(FilePatch, self).__init__(path_or_url, level)
 
         pkg_dir = os.path.dirname(absolute_path_for_package(pkg))
         self.path = os.path.join(pkg_dir, path_or_url)
         if not os.path.isfile(self.path):
-            raise NoSuchPatchFileError(pkg.name, self.path)
+            raise NoSuchPatchError(
+                "No such patch for package %s: %s" % (pkg.name, self.path))
+        self._sha256 = None
+
+    @property
+    def sha256(self):
+        if self._sha256 is None:
+            self._sha256 = checksum(hashlib.sha256, self.path)
+        return self._sha256
 
 
 class UrlPatch(Patch):
     """Describes a patch that is retrieved from a URL"""
-    def __init__(self, pkg, path_or_url, level, **kwargs):
-        super(UrlPatch, self).__init__(pkg, path_or_url, level)
+    def __init__(self, path_or_url, level, **kwargs):
+        super(UrlPatch, self).__init__(path_or_url, level)
         self.url = path_or_url
-        self.md5 = kwargs.get('md5')
+
+        self.archive_sha256 = None
+        if allowed_archive(self.url):
+            if 'archive_sha256' not in kwargs:
+                raise PatchDirectiveError(
+                    "Compressed patches require 'archive_sha256' "
+                    "and patch 'sha256' attributes: %s" % self.url)
+            self.archive_sha256 = kwargs.get('archive_sha256')
+
+        if 'sha256' not in kwargs:
+            raise PatchDirectiveError("URL patches require a sha256 checksum")
+        self.sha256 = kwargs.get('sha256')
 
     def apply(self, stage):
         """Retrieve the patch in a temporary stage, computes
@@ -122,7 +146,12 @@ def apply(self, stage):
         Args:
             stage: stage for the package that needs to be patched
         """
-        fetcher = fs.URLFetchStrategy(self.url, digest=self.md5)
+        # use archive digest for compressed archives
+        fetch_digest = self.sha256
+        if self.archive_sha256:
+            fetch_digest = self.archive_sha256
+
+        fetcher = fs.URLFetchStrategy(self.url, digest=fetch_digest)
         mirror = os.path.join(
             os.path.dirname(stage.mirror_path),
             os.path.basename(self.url))
@@ -132,20 +161,40 @@ def apply(self, stage):
             patch_stage.check()
             patch_stage.cache_local()
 
-            if spack.util.compression.allowed_archive(self.url):
+            root = patch_stage.path
+            if self.archive_sha256:
                 patch_stage.expand_archive()
+                root = patch_stage.source_path
 
-            self.path = os.path.abspath(
-                os.listdir(patch_stage.path).pop())
+            files = os.listdir(root)
+            if not files:
+                if self.archive_sha256:
+                    raise NoSuchPatchError(
+                        "Archive was empty: %s" % self.url)
+                else:
+                    raise NoSuchPatchError(
+                        "Patch failed to download: %s" % self.url)
+
+            self.path = os.path.join(root, files.pop())
+
+            if not os.path.isfile(self.path):
+                raise NoSuchPatchError(
+                    "Archive %s contains no patch file!" % self.url)
+
+            # for a compressed archive, we need to check the patch sha256
+            # again, and the patch is in a subdirectory, not at the root
+            if self.archive_sha256:
+                checker = Checker(self.sha256)
+                if not checker.check(self.path):
+                    raise fs.ChecksumError(
+                        "sha256 checksum failed for %s" % self.path,
+                        "Expected %s but got %s" % (self.sha256, checker.sum))
 
         super(UrlPatch, self).apply(stage)
 
 
-class NoSuchPatchFileError(spack.error.SpackError):
-    """Raised when user specifies a patch file that doesn't exist."""
-    def __init__(self, package, path):
-        super(NoSuchPatchFileError, self).__init__(
-            "No such patch file for package %s: %s" % (package, path))
-        self.package = package
-        self.path = path
+class NoSuchPatchError(spack.error.SpackError):
+    """Raised when a patch file doesn't exist."""
+
+
+class PatchDirectiveError(spack.error.SpackError):
+    """Raised when the wrong arguments are supplied to the patch directive."""

@ -1488,8 +1488,7 @@ def from_node_dict(node):
spec.compiler_flags[name] = value spec.compiler_flags[name] = value
else: else:
spec.variants[name] = MultiValuedVariant.from_node_dict( spec.variants[name] = MultiValuedVariant.from_node_dict(
name, value name, value)
)
elif 'variants' in node: elif 'variants' in node:
for name, value in node['variants'].items(): for name, value in node['variants'].items():
spec.variants[name] = MultiValuedVariant.from_node_dict( spec.variants[name] = MultiValuedVariant.from_node_dict(
@ -1806,6 +1805,43 @@ def concretize(self):
if s.namespace is None: if s.namespace is None:
s.namespace = spack.repo.repo_for_pkg(s.name).namespace s.namespace = spack.repo.repo_for_pkg(s.name).namespace
# Add any patches from the package to the spec.
patches = []
for cond, patch_list in s.package_class.patches.items():
if s.satisfies(cond):
for patch in patch_list:
patches.append(patch.sha256)
if patches:
# Special-case: keeps variant values unique but ordered.
s.variants['patches'] = MultiValuedVariant('patches', ())
mvar = s.variants['patches']
mvar._value = mvar._original_value = tuple(dedupe(patches))
# Apply patches required on dependencies by depends_on(..., patch=...)
for dspec in self.traverse_edges(deptype=all,
cover='edges', root=False):
pkg_deps = dspec.parent.package_class.dependencies
if dspec.spec.name not in pkg_deps:
continue
patches = []
for cond, dependency in pkg_deps[dspec.spec.name].items():
if dspec.parent.satisfies(cond):
for pcond, patch_list in dependency.patches.items():
if dspec.spec.satisfies(pcond):
for patch in patch_list:
patches.append(patch.sha256)
if patches:
# note that we use a special multi-valued variant and
# keep the patches ordered.
if 'patches' not in dspec.spec.variants:
mvar = MultiValuedVariant('patches', ())
dspec.spec.variants['patches'] = mvar
else:
mvar = dspec.spec.variants['patches']
mvar._value = mvar._original_value = tuple(
dedupe(list(mvar._value) + patches))
for s in self.traverse(): for s in self.traverse():
if s.external_module: if s.external_module:
compiler = spack.compilers.compiler_for_spec( compiler = spack.compilers.compiler_for_spec(
@ -1908,7 +1944,7 @@ def _evaluate_dependency_conditions(self, name):
name (str): name of dependency to evaluate conditions on. name (str): name of dependency to evaluate conditions on.
Returns: Returns:
(tuple): tuple of ``Spec`` and tuple of ``deptypes``. (Dependency): new Dependency object combining all constraints.
If the package depends on <name> in the current spec If the package depends on <name> in the current spec
configuration, return the constrained dependency and configuration, return the constrained dependency and
@ -1922,21 +1958,19 @@ def _evaluate_dependency_conditions(self, name):
substitute_abstract_variants(self) substitute_abstract_variants(self)
# evaluate when specs to figure out constraints on the dependency. # evaluate when specs to figure out constraints on the dependency.
dep, deptypes = None, None dep = None
for when_spec, dependency in conditions.items(): for when_spec, dependency in conditions.items():
if self.satisfies(when_spec, strict=True): if self.satisfies(when_spec, strict=True):
if dep is None: if dep is None:
dep = Spec(name) dep = Dependency(self.name, Spec(name), type=())
deptypes = set()
try: try:
dep.constrain(dependency.spec) dep.merge(dependency)
deptypes |= dependency.type
except UnsatisfiableSpecError as e: except UnsatisfiableSpecError as e:
e.message = ("Conflicting conditional dependencies on" e.message = ("Conflicting conditional dependencies on"
"package %s for spec %s" % (self.name, self)) "package %s for spec %s" % (self.name, self))
raise e raise e
return dep, deptypes return dep
def _find_provider(self, vdep, provider_index): def _find_provider(self, vdep, provider_index):
"""Find provider for a virtual spec in the provider index. """Find provider for a virtual spec in the provider index.
@ -1971,15 +2005,26 @@ def _find_provider(self, vdep, provider_index):
elif required: elif required:
raise UnsatisfiableProviderSpecError(required[0], vdep) raise UnsatisfiableProviderSpecError(required[0], vdep)
-    def _merge_dependency(self, dep, deptypes, visited, spec_deps,
-                          provider_index):
-        """Merge the dependency into this spec.
+    def _merge_dependency(
+            self, dependency, visited, spec_deps, provider_index):
+        """Merge dependency information from a Package into this Spec.

-        Caller should assume that this routine can owns the dep parameter
-        (i.e. it needs to be a copy of any internal structures like
-        dependencies on Package class objects).
+        Args:
+            dependency (Dependency): dependency metadata from a package;
+                this is typically the result of merging *all* matching
+                dependency constraints from the package.
+            visited (set): set of dependency nodes already visited by
+                ``normalize()``.
+            spec_deps (dict): ``dict`` of all dependencies from the spec
+                being normalized.
+            provider_index (dict): ``provider_index`` of virtual dep
+                providers in the ``Spec`` as normalized so far.

-        This is the core of normalize(). There are some basic steps:
+        NOTE: Caller should assume that this routine owns the
+        ``dependency`` parameter, i.e., it needs to be a copy of any
+        internal structures.
+
+        This is the core of ``normalize()``. There are some basic steps:

         * If dep is virtual, evaluate whether it corresponds to an
           existing concrete dependency, and merge if so.

@@ -1994,6 +2039,7 @@ def _merge_dependency(self, dep, deptypes, visited, spec_deps,
         """
         changed = False
+        dep = dependency.spec
         # If it's a virtual dependency, try to find an existing
         # provider in the spec, and merge that.

@@ -2045,11 +2091,11 @@ def _merge_dependency(self, dep, deptypes, visited, spec_deps,
                 raise

         # Add merged spec to my deps and recurse
-        dependency = spec_deps[dep.name]
+        spec_dependency = spec_deps[dep.name]
         if dep.name not in self._dependencies:
-            self._add_dependency(dependency, deptypes)
+            self._add_dependency(spec_dependency, dependency.type)
-        changed |= dependency._normalize_helper(
+        changed |= spec_dependency._normalize_helper(
             visited, spec_deps, provider_index)
         return changed
@@ -2074,12 +2120,12 @@ def _normalize_helper(self, visited, spec_deps, provider_index):
             changed = False
             for dep_name in pkg.dependencies:
                 # Do we depend on dep_name? If so pkg_dep is not None.
-                dep, deptypes = self._evaluate_dependency_conditions(dep_name)
+                dep = self._evaluate_dependency_conditions(dep_name)

                 # If dep is a needed dependency, merge it.
                 if dep and (spack.package_testing.check(self.name) or
-                            set(deptypes) - set(['test'])):
+                            set(dep.type) - set(['test'])):
                     changed |= self._merge_dependency(
-                        dep, deptypes, visited, spec_deps, provider_index)
+                        dep, visited, spec_deps, provider_index)
             any_change |= changed

         return any_change
@@ -2463,6 +2509,35 @@ def virtual_dependencies(self):
         """Return list of any virtual deps in this spec."""
         return [spec for spec in self.traverse() if spec.virtual]

+    @property
+    def patches(self):
+        """Return patch objects for any patch sha256 sums on this Spec.
+
+        This is for use after concretization to iterate over any patches
+        associated with this spec.
+
+        TODO: this only checks in the package; it doesn't resurrect old
+        patches from install directories, but it probably should.
+        """
+        if 'patches' not in self.variants:
+            return []
+
+        patches = []
+        for sha256 in self.variants['patches'].value:
+            patch = self.package.lookup_patch(sha256)
+            if patch:
+                patches.append(patch)
+                continue
+
+            # if not found in this package, check immediate dependents
+            # for dependency patches
+            for dep in self._dependents:
+                patch = dep.parent.package.lookup_patch(sha256)
+                if patch:
+                    patches.append(patch)
+
+        return patches
     def _dup(self, other, deps=True, cleardeps=True, caches=None):
         """Copy the spec other into self. This is an overwriting
         copy. It does not copy any dependents (parents), but by default

@@ -2549,7 +2624,8 @@ def _dup(self, other, deps=True, cleardeps=True, caches=None):
     def _dup_deps(self, other, deptypes, caches):
         new_specs = {self.name: self}
-        for dspec in other.traverse_edges(cover='edges', root=False):
+        for dspec in other.traverse_edges(cover='edges',
+                                          root=False):
             if (dspec.deptypes and
                     not any(d in deptypes for d in dspec.deptypes)):
                 continue
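The signature changes above replace the separate `(dep, deptypes)` pair with a single `Dependency` object that carries the spec constraint, the deptypes, and now the patches together, and that is merged across matching conditional `depends_on()` calls. A toy sketch of that merging idea (the `Dep` class and its fields are hypothetical stand-ins, not Spack's actual `Dependency` API):

```python
class Dep(object):
    """Toy stand-in for the Dependency object introduced in this commit."""

    def __init__(self, spec_name, types, patches=()):
        self.spec_name = spec_name
        self.type = set(types)        # deptypes, e.g. {'build', 'link'}
        self.patches = list(patches)  # patches the *dependent* applies

    def merge(self, other):
        # union deptypes and accumulate patches, roughly what
        # _evaluate_dependency_conditions does across matching conditions
        self.type |= other.type
        self.patches += other.patches


d = Dep('libelf', {'build'}, ['foo.patch'])
d.merge(Dep('libelf', {'link'}, ['bar.patch']))
print(sorted(d.type), d.patches)
```

The point of the refactor is that callers now pass this one object around instead of threading `deptypes` through every helper.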


@@ -35,7 +35,8 @@
 def test_immediate_dependents(builtin_mock):
     out = dependents('libelf')
     actual = set(re.split(r'\s+', out.strip()))
-    assert actual == set(['dyninst', 'libdwarf'])
+    assert actual == set(['dyninst', 'libdwarf',
+                          'patch-a-dependency', 'patch-several-dependencies'])


 def test_transitive_dependents(builtin_mock):

@@ -43,7 +44,8 @@ def test_transitive_dependents(builtin_mock):
     actual = set(re.split(r'\s+', out.strip()))
     assert actual == set(
         ['callpath', 'dyninst', 'libdwarf', 'mpileaks', 'multivalue_variant',
-         'singlevalue-variant-dependent'])
+         'singlevalue-variant-dependent',
+         'patch-a-dependency', 'patch-several-dependencies'])


 def test_immediate_installed_dependents(builtin_mock, database):


@@ -572,7 +572,7 @@ def __init__(self, name, dependencies, dependency_types, conditions=None,
         assert len(dependencies) == len(dependency_types)
         for dep, dtype in zip(dependencies, dependency_types):
-            d = Dependency(Spec(dep.name), type=dtype)
+            d = Dependency(self, Spec(dep.name), type=dtype)
             if not conditions or dep.name not in conditions:
                 self.dependencies[dep.name] = {Spec(name): d}
             else:

@@ -587,12 +587,15 @@ def __init__(self, name, dependencies, dependency_types, conditions=None,
         self.variants = {}
         self.provided = {}
         self.conflicts = {}
+        self.patches = {}


 class MockPackageMultiRepo(object):
     def __init__(self, packages):
         self.spec_to_pkg = dict((x.name, x) for x in packages)
+        self.spec_to_pkg.update(
+            dict(('mockrepo.' + x.name, x) for x in packages))

     def get(self, spec):
         if not isinstance(spec, spack.spec.Spec):


@@ -1 +0,0 @@
-BAR


@@ -0,0 +1,7 @@
+--- a/foo.txt 2017-09-25 21:24:33.000000000 -0700
++++ b/foo.txt 2017-09-25 14:31:17.000000000 -0700
+@@ -1,2 +1,3 @@
++zeroth line
+ first line
+-second line
++third line
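The test data file added above is itself an ordinary unified diff: it turns a two-line `foo.txt` into three lines. The same hunk body can be regenerated from the before/after contents with the standard library, which is a quick way to convince yourself the fixture is well-formed (the `a/foo.txt` and `b/foo.txt` labels are just conventional names):

```python
import difflib

before = ['first line\n', 'second line\n']
after = ['zeroth line\n', 'first line\n', 'third line\n']

# regenerate the hunk of the foo.txt test patch from its before/after state
diff = ''.join(difflib.unified_diff(before, after, 'a/foo.txt', 'b/foo.txt'))
print(diff)
```

The output contains the same `@@ -1,2 +1,3 @@` hunk with `+zeroth line`, a context `first line`, `-second line`, and `+third line`.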


@@ -22,63 +22,171 @@
 # License along with this program; if not, write to the Free Software
 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
 import os
+import os.path
-import pytest
 import sys
+import filecmp
+import pytest
+
+from llnl.util.filesystem import working_dir, mkdirp

 import spack
 import spack.util.compression
-import spack.stage
+from spack.stage import Stage
+from spack.spec import Spec


-@pytest.fixture()
-def mock_apply(monkeypatch):
-    """Monkeypatches ``Patch.apply`` to test only the additional behavior of
-    derived classes.
-    """
-    m = sys.modules['spack.patch']
-
-    def check_expand(self, *args, **kwargs):
-        # Check tarball expansion
-        if spack.util.compression.allowed_archive(self.url):
-            file = os.path.join(self.path, 'foo.txt')
-            assert os.path.exists(file)
-
-        # Check tarball fetching
-        dirname = os.path.dirname(self.path)
-        basename = os.path.basename(self.url)
-        tarball = os.path.join(dirname, basename)
-        assert os.path.exists(tarball)
-
-    monkeypatch.setattr(m.Patch, 'apply', check_expand)
-
-
 @pytest.fixture()
 def mock_stage(tmpdir, monkeypatch):
-    monkeypatch.setattr(spack, 'stage_path', str(tmpdir))
-
-    class MockStage(object):
-        def __init__(self):
-            self.mirror_path = str(tmpdir)
-
-    return MockStage()
+    # don't disrupt the spack install directory with tests.
+    mock_path = str(tmpdir)
+    monkeypatch.setattr(spack, 'stage_path', mock_path)
+    return mock_path


 data_path = os.path.join(spack.test_path, 'data', 'patch')


-@pytest.mark.usefixtures('mock_apply')
-@pytest.mark.parametrize('filename,md5', [
-    (os.path.join(data_path, 'foo.tgz'), 'bff717ca9cbbb293bdf188e44c540758'),
-    (os.path.join(data_path, 'bar.txt'), 'f98bf6f12e995a053b7647b10d937912')
+@pytest.mark.parametrize('filename, sha256, archive_sha256', [
+    # compressed patch -- needs sha256 and archive_256
+    (os.path.join(data_path, 'foo.tgz'),
+     '252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866',
+     '4e8092a161ec6c3a1b5253176fcf33ce7ba23ee2ff27c75dbced589dabacd06e'),
+    # uncompressed patch -- needs only sha256
+    (os.path.join(data_path, 'foo.patch'),
+     '252c0af58be3d90e5dc5e0d16658434c9efa5d20a5df6c10bf72c2d77f780866',
+     None)
 ])
-def test_url_patch_expansion(mock_stage, filename, md5):
-    m = sys.modules['spack.patch']
+def test_url_patch(mock_stage, filename, sha256, archive_sha256):
+    # Make a patch object
     url = 'file://' + filename
-    patch = m.Patch.create(None, url, 0, md5=md5)
-    patch.apply(mock_stage)
+    m = sys.modules['spack.patch']
+    patch = m.Patch.create(
+        None, url, sha256=sha256, archive_sha256=archive_sha256)
+
+    # make a stage
+    with Stage(url) as stage:  # TODO: url isn't used; maybe refactor Stage
+        # TODO: there is probably a better way to mock this.
+        stage.mirror_path = mock_stage  # don't disrupt the spack install
+
+        # fake a source path
+        with working_dir(stage.path):
+            mkdirp('spack-expanded-archive')
+
+        with working_dir(stage.source_path):
+            # write a file to be patched
+            with open('foo.txt', 'w') as f:
+                f.write("""\
+first line
+second line
+""")
+            # write the expected result of patching.
+            with open('foo-expected.txt', 'w') as f:
+                f.write("""\
+zeroth line
+first line
+third line
+""")
+        # apply the patch and compare files
+        patch.apply(stage)
+
+        with working_dir(stage.source_path):
+            assert filecmp.cmp('foo.txt', 'foo-expected.txt')
+
+
+def test_patch_in_spec(builtin_mock, config):
+    """Test whether patches in a package appear in the spec."""
+    spec = Spec('patch')
+    spec.concretize()
+    assert 'patches' in list(spec.variants.keys())
+
+    # foo, bar, baz
+    assert (('b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c',
+             '7d865e959b2466918c9863afca942d0fb89d7c9ac0c99bafc3749504ded97730',
+             'bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c') ==
+            spec.variants['patches'].value)
+
+
+def test_patched_dependency(builtin_mock, config):
+    """Test whether patched dependencies work."""
+    spec = Spec('patch-a-dependency')
+    spec.concretize()
+    assert 'patches' in list(spec['libelf'].variants.keys())
+
+    # foo
+    assert (('b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c',) ==
+            spec['libelf'].variants['patches'].value)
+
+
+def test_multiple_patched_dependencies(builtin_mock, config):
+    """Test whether multiple patched dependencies work."""
+    spec = Spec('patch-several-dependencies')
+    spec.concretize()
+
+    # basic patch on libelf
+    assert 'patches' in list(spec['libelf'].variants.keys())
+    # foo
+    assert (('b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c',) ==
+            spec['libelf'].variants['patches'].value)
+
+    # URL patches
+    assert 'patches' in list(spec['fake'].variants.keys())
+    # urlpatch.patch, urlpatch.patch.gz
+    assert (('abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234',
+             '1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd') ==
+            spec['fake'].variants['patches'].value)
+
+
+def test_conditional_patched_dependencies(builtin_mock, config):
+    """Test whether conditional patched dependencies work."""
+    spec = Spec('patch-several-dependencies @1.0')
+    spec.concretize()
+
+    # basic patch on libelf
+    assert 'patches' in list(spec['libelf'].variants.keys())
+    # foo
+    assert (('b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c',) ==
+            spec['libelf'].variants['patches'].value)
+
+    # conditional patch on libdwarf
+    assert 'patches' in list(spec['libdwarf'].variants.keys())
+    # bar
+    assert (('7d865e959b2466918c9863afca942d0fb89d7c9ac0c99bafc3749504ded97730',) ==
+            spec['libdwarf'].variants['patches'].value)
+    # baz is conditional on libdwarf version
+    assert ('bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c'
+            not in spec['libdwarf'].variants['patches'].value)
+
+    # URL patches
+    assert 'patches' in list(spec['fake'].variants.keys())
+    # urlpatch.patch, urlpatch.patch.gz
+    assert (('abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234',
+             '1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd') ==
+            spec['fake'].variants['patches'].value)
+
+
+def test_conditional_patched_deps_with_conditions(builtin_mock, config):
+    """Test whether conditional patched dependencies with conditions work."""
+    spec = Spec('patch-several-dependencies @1.0 ^libdwarf@20111030')
+    spec.concretize()
+
+    # basic patch on libelf
+    assert 'patches' in list(spec['libelf'].variants.keys())
+    # foo
+    assert ('b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c'
+            in spec['libelf'].variants['patches'].value)
+
+    # conditional patch on libdwarf
+    assert 'patches' in list(spec['libdwarf'].variants.keys())
+    # bar
+    assert ('7d865e959b2466918c9863afca942d0fb89d7c9ac0c99bafc3749504ded97730'
+            in spec['libdwarf'].variants['patches'].value)
+    # baz is conditional on libdwarf version (no guarantee on order w/conds)
+    assert ('bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c'
+            in spec['libdwarf'].variants['patches'].value)
+
+    # URL patches
+    assert 'patches' in list(spec['fake'].variants.keys())
+    # urlpatch.patch, urlpatch.patch.gz
+    assert (('abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234',
+             '1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd') ==
+            spec['fake'].variants['patches'].value)
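The sha256 sums asserted in the tests above are simply the hashes of the mock patch files' contents: the new data files `foo`, `bar`, and `baz` each hold one word plus a newline, and those hashes are what ends up in the spec's `patches` variant. A quick check with plain `hashlib` (no Spack required):

```python
import hashlib

def patch_sha256(data):
    # Spack keys the `patches` variant on the sha256 of the patch contents
    return hashlib.sha256(data).hexdigest()

# the mock patch file 'foo' contains the single line "foo"
print(patch_sha256(b'foo\n'))
# b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c
```

Hashing contents rather than file names means the same patch applied from two different packages is recognized as the same patch.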


@@ -53,7 +53,7 @@ def saved_deps():
 @pytest.fixture()
 def set_dependency(saved_deps):
     """Returns a function that alters the dependency information
-    for a package.
+    for a package in the ``saved_deps`` fixture.
     """
     def _mock(pkg_name, spec, deptypes=all_deptypes):
         """Alters dependence information for a package.

@@ -67,7 +67,7 @@ def _mock(pkg_name, spec, deptypes=all_deptypes):
         saved_deps[pkg_name] = (pkg, pkg.dependencies.copy())
         cond = Spec(pkg.name)
-        dependency = Dependency(spec, deptypes)
+        dependency = Dependency(pkg, spec, type=deptypes)
         pkg.dependencies[spec.name] = {cond: dependency}
     return _mock


@@ -47,8 +47,7 @@ def __init__(
             description,
             values=(True, False),
             multi=False,
-            validator=None
-    ):
+            validator=None):
         """Initialize a package variant.

         Args:

@@ -220,10 +219,15 @@ def __init__(self, name, value):
     def from_node_dict(name, value):
         """Reconstruct a variant from a node dict."""
         if isinstance(value, list):
-            value = ','.join(value)
-            return MultiValuedVariant(name, value)
+            # read multi-value variants in and be faithful to the YAML
+            mvar = MultiValuedVariant(name, ())
+            mvar._value = tuple(value)
+            mvar._original_value = mvar._value
+            return mvar
         elif str(value).upper() == 'TRUE' or str(value).upper() == 'FALSE':
             return BoolValuedVariant(name, value)
         return SingleValuedVariant(name, value)

     def yaml_entry(self):

@@ -252,15 +256,16 @@ def _value_setter(self, value):
         # Store the original value
         self._original_value = value

+        if not isinstance(value, (tuple, list)):
             # Store a tuple of CSV string representations
             # Tuple is necessary here instead of list because the
             # values need to be hashed
-        t = re.split(r'\s*,\s*', str(value))
+            value = re.split(r'\s*,\s*', str(value))

         # With multi-value variants it is necessary
         # to remove duplicates and give an order
         # to a set
-        self._value = tuple(sorted(set(t)))
+        self._value = tuple(sorted(set(value)))

     def _cmp_key(self):
         return self.name, self.value
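The `_value_setter` change above only splits CSV *strings*; tuples and lists (such as a set of patch sha256 sums) now pass through unsplit. Either way the result is a deduplicated, sorted tuple, which is what keeps the `patches` variant hashable and order-stable. A standalone sketch of that behavior (a free function mirroring the setter's logic, not the actual class):

```python
import re

def multi_value(value):
    # mirror of the updated _value_setter: only split CSV strings;
    # tuples/lists (e.g. patch sha256 sums) pass through unsplit
    if not isinstance(value, (tuple, list)):
        value = re.split(r'\s*,\s*', str(value))
    # dedupe and sort into a hashable tuple
    return tuple(sorted(set(value)))

print(multi_value('baz, foo,bar,foo'))   # ('bar', 'baz', 'foo')
print(multi_value(('foo', 'bar')))       # ('bar', 'foo')
```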


@@ -0,0 +1 @@
+foo


@@ -0,0 +1,39 @@
##############################################################################
# Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *


class PatchADependency(Package):
    """Package that requires a patched version of a dependency."""

    homepage = "http://www.example.com"
    url = "http://www.example.com/patch-a-dependency-1.0.tar.gz"

    version('1.0', '0123456789abcdef0123456789abcdef')

    depends_on('libelf', patches=patch('foo.patch'))

    def install(self, spec, prefix):
        pass


@@ -0,0 +1,60 @@
##############################################################################
# Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *


class PatchSeveralDependencies(Package):
    """Package that requires multiple patches on a dependency."""

    homepage = "http://www.example.com"
    url = "http://www.example.com/patch-a-dependency-1.0.tar.gz"

    version('2.0', '0123456789abcdef0123456789abcdef')
    version('1.0', '0123456789abcdef0123456789abcdef')

    # demonstrate all the different ways to patch things

    # single patch file in repo
    depends_on('libelf', patches='foo.patch')

    # using a list of patches in one depends_on
    depends_on('libdwarf', patches=[
        patch('bar.patch'),                   # nested patch directive
        patch('baz.patch', when='@20111030')  # and with a conditional
    ], when='@1.0')                           # with a depends_on conditional

    # URL patches
    depends_on('fake', patches=[
        # uncompressed URL patch
        patch('http://example.com/urlpatch.patch',
              sha256='abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234'),
        # compressed URL patch requires separate archive sha
        patch('http://example.com/urlpatch2.patch.gz',
              archive_sha256='abcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcdabcd',
              sha256='1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd')
    ])

    def install(self, spec, prefix):
        pass


@@ -0,0 +1 @@
+bar


@@ -0,0 +1 @@
+baz


@@ -0,0 +1 @@
+foo


@@ -0,0 +1,41 @@
##############################################################################
# Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *


class Patch(Package):
    """Package that requires a patched version of a dependency."""

    homepage = "http://www.example.com"
    url = "http://www.example.com/patch-1.0.tar.gz"

    version('1.0', '0123456789abcdef0123456789abcdef')

    patch('foo.patch')
    patch('bar.patch')
    patch('baz.patch')

    def install(self, spec, prefix):
        pass


@@ -39,28 +39,36 @@ class Nauty(AutotoolsPackage):
     urls_for_patches = {
         '@2.6r7': [
             # Debian patch to fix the gt_numorbits declaration
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-fix-gt_numorbits.patch', 'a6e1ef4897aabd67c104fd1d78bcc334'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-fix-gt_numorbits.patch',
+             'c8e4546a7b262c92cee226beb1dc71d87d644b115375e9c8550598efcc00254f'),
             # Debian patch to add explicit extern declarations where needed
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-fix-include-extern.patch', '741034dec2d2f8b418b6e186aa3eb50f'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-fix-include-extern.patch',
+             'c52c62e4dc46532ad89632a3f59a9faf13dd7988e9ef29fc5e5b2a3e17449bb6'),
             # Debian patch to use zlib instead of invoking zcat through a pipe
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-zlib-blisstog.patch', '667e1ce341f2506482ad30afd04f17e3'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-zlib-blisstog.patch',
+             'b1210bfb41ddbeb4c956d660266f62e806026a559a4700ce78024a9db2b82168'),
             # Debian patch to improve usage and help information
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-help2man.patch', '4202e6d83362daa2c4c4ab0788e11ac5'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-help2man.patch',
+             'c11544938446a3eca70d55b0f1084ce56fb1fb415db1ec1b5a69fd310a02b16c'),
             # Debian patch to add libtool support for building a shared library
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-autotoolization.patch', 'ea75f19c8a980c4d6d4e07223785c751'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-autotoolization.patch',
+             '7f60ae3d8aeee830306db991c908efae461f103527a7899ce79d936bb15212b5'),
             # Debian patch to canonicalize header file usage
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-includes.patch', 'c6ce4209d1381fb5489ed552ef35d7dc'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-includes.patch',
+             '9a305f0cd3f1136a9885518bd7912c669d1ca4b2b43bd039d6fc5535b9679778'),
             # Debian patch to prefix "nauty-" to the names of the generic tools
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-tool-prefix.patch', 'e89d87b4450adc5d0009ce11438dc975'),  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-tool-prefix.patch',
+             '736266813a62b3151e0b81ded6578bd0f53f03fc8ffbc54c7c2a2c64ac07b25f'),
             # Fedora patch to detect availability of the popcnt
             # instruction at runtime
-            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-popcnt.patch', '8a32d31a7150c8f5f21ccb1f6dc857b1')  # noqa: E501
+            ('https://src.fedoraproject.org/rpms/nauty/raw/0f07d01caf84e9d30cb06b11af4860dd3837636a/f/nauty-popcnt.patch',
+             '0dc2e0374491dddf5757f0717d0ea3f949f85b540202385662f10c358b4a08e8')
         ]
     }

     # Iterate over patches
     for condition, urls in urls_for_patches.items():
-        for url, md5 in urls:
-            patch(url, when=condition, level=1, md5=md5)
+        for url, sha256 in urls:
+            patch(url, when=condition, level=1, sha256=sha256)

     depends_on('m4', type='build', when='@2.6r7')
     depends_on('autoconf', type='build', when='@2.6r7')


@ -43,34 +43,36 @@ class Nwchem(Package):
depends_on('python@2.7:2.8', type=('build', 'run')) depends_on('python@2.7:2.8', type=('build', 'run'))
# first hash is sha256 of the patch (required for URL patches),
# second is sha256 for the archive.
# patches for 6.6-27746: # patches for 6.6-27746:
urls_for_patches = { urls_for_patches = {
'@6.6': [ '@6.6': [
('http://www.nwchem-sw.org/images/Tddft_mxvec20.patch.gz', 'f91c6a04df56e228fe946291d2f38c9a'), ('http://www.nwchem-sw.org/images/Tddft_mxvec20.patch.gz', 'ae04d4754c25fc324329dab085d4cc64148c94118ee702a7e14fce6152b4a0c5', 'cdfa8a5ae7d6ee09999407573b171beb91e37e1558a3bfb2d651982a85f0bc8f'),
('http://www.nwchem-sw.org/images/Tools_lib64.patch.gz', 'b71e8dbad27f1c97b60a53ec34d3f6e0'), ('http://www.nwchem-sw.org/images/Tools_lib64.patch.gz', 'ef2eadef89c055c4651ea807079577bd90e1bc99ef6c89f112f1f0e7560ec9b4', '76b8d3e1b77829b683234c8307fde55bc9249b87410914b605a76586c8f32dae'),
('http://www.nwchem-sw.org/images/Config_libs66.patch.gz', 'cc4be792e7b5128c3f9b7b1167ade2cf'), ('http://www.nwchem-sw.org/images/Config_libs66.patch.gz', '56f9c4bab362d82fb30d97564469e77819985a38e15ccaf04f647402c1ee248e', 'aa17f03cbb22ad7d883e799e0fddad1b5957f5f30b09f14a1a2caeeb9663cc07'),
('http://www.nwchem-sw.org/images/Cosmo_meminit.patch.gz', '1d94685bf3b72d8ecd40c46334348ca7'), ('http://www.nwchem-sw.org/images/Cosmo_meminit.patch.gz', 'f05f09ca235ad222fe47d880bfd05a1b88d0148b990ca8c7437fa231924be04b', '569c5ee528f3922ee60ca831eb20ec6591633a36f80efa76cbbe41cabeb9b624'),
('http://www.nwchem-sw.org/images/Sym_abelian.patch.gz', 'b19cade61c787916a73a4aaf6e2445d6'), ('http://www.nwchem-sw.org/images/Sym_abelian.patch.gz', 'e3470fb5786ab30bf2eda3bb4acc1e4c48fb5e640a09554abecf7d22b315c8fd', 'aa693e645a98dbafbb990e26145d65b100d6075254933f36326cf00bac3c29e0'),
('http://www.nwchem-sw.org/images/Xccvs98.patch.gz', 'b9aecc516a3551dcf871cb2f066598cb'), ('http://www.nwchem-sw.org/images/Xccvs98.patch.gz', '75540e0436c12e193ed0b644cff41f5036d78c101f14141846083f03ad157afa', '1c0b0f1293e3b9b05e9e51e7d5b99977ccf1edb4b072872c8316452f6cea6f13'),
('http://www.nwchem-sw.org/images/Dplot_tolrho.patch.gz', '0a5bdad63d2d0ffe46b28db7ad6d9cec'), ('http://www.nwchem-sw.org/images/Dplot_tolrho.patch.gz', '8c30f92730d15f923ec8a623e3b311291eb2ba8b9d5a9884716db69a18d14f24', '2ebb1a5575c44eef4139da91f0e1e60057b2eccdba7f57a8fb577e840c326cbb'),
('http://www.nwchem-sw.org/images/Driver_smalleig.patch.gz', 'c3f609947220c0adb524b02c316b5564'), ('http://www.nwchem-sw.org/images/Driver_smalleig.patch.gz', 'a040df6f1d807402ce552ba6d35c9610d5efea7a9d6342bbfbf03c8d380a4058', 'dd65bfbae6b472b94c8ee81d74f6c3ece37c8fc8766ff7a3551d8005d44815b8'),
('http://www.nwchem-sw.org/images/Ga_argv.patch.gz', '7a665c981cfc17187455e1826f095f6f'), ('http://www.nwchem-sw.org/images/Ga_argv.patch.gz', '6fcd3920978ab95083483d5ed538cd9a6f2a80c2cafa0c5c7450fa5621f0a314', '8a78cb2af14314b92be9d241b801e9b9fed5527b9cb47a083134c7becdfa7cf1'),
    -            ('http://www.nwchem-sw.org/images/Raman_displ.patch.gz', 'ed334ca0b2fe81ce103ef8cada990c4c'),
    +            ('http://www.nwchem-sw.org/images/Raman_displ.patch.gz', 'ca4312cd3ed1ceacdc3a7d258bb05b7824c393bf44f44c28a789ebeb29a8dba4', '6a16f0f589a5cbb8d316f68bd2e6a0d46cd47f1c699a4b256a3973130061f6c3'),
    -            ('http://www.nwchem-sw.org/images/Ga_defs.patch.gz', '0c3cab4d5cbef5acac16ffc5e6f869ef'),
    +            ('http://www.nwchem-sw.org/images/Ga_defs.patch.gz', 'f8ac827fbc11f7d2a9d8ec840c6f79d4759ef782bd4d291f2e88ec81b1b230aa', 'c6f1a48338d196e1db22bcfc6087e2b2e6eea50a34d3a2b2d3e90cccf43742a9'),
    -            ('http://www.nwchem-sw.org/images/Zgesvd.patch.gz', '8fd5a11622968ef4351bd3d5cddce8f2'),
    +            ('http://www.nwchem-sw.org/images/Zgesvd.patch.gz', 'c333a94ceb2c35a490f24b007485ac6e334e153b03cfc1d093b6037221a03517', '4af592c047dc3e0bc4962376ae2c6ca868eb7a0b40a347ed9b88e887016ad9ed'),
    -            ('http://www.nwchem-sw.org/images/Cosmo_dftprint.patch.gz', '64dcf27f3c6ced2cadfb504fa66e9d08'),
    +            ('http://www.nwchem-sw.org/images/Cosmo_dftprint.patch.gz', '449d59983dc68c23b34e6581370b2fb3d5ea425b05c3182f0973e5b0e1a62651', 'd3b73431a68d6733eb7b669d471e18a83e03fa8e40c48e536fe8edecd99250ff'),
    -            ('http://www.nwchem-sw.org/images/Txs_gcc6.patch.gz', '56595a7252da051da13f94edc54fe059'),
    +            ('http://www.nwchem-sw.org/images/Txs_gcc6.patch.gz', '1dab87f23b210e941c765f7dd7cc2bed06d292a2621419dede73f10ba1ca1bcd', '139692215718cd7414896470c0cc8b7817a73ece1e4ca93bf752cf1081a195af'),
    -            ('http://www.nwchem-sw.org/images/Gcc6_optfix.patch.gz', 'c6642c21363c09223784b47b8636047d'),
    +            ('http://www.nwchem-sw.org/images/Gcc6_optfix.patch.gz', '8f8a5f8246bc1e42ef0137049acab4448a2e560339f44308703589adf753c148', '15cff43ab0509e0b0e83c49890032a848d6b7116bd6c8e5678e6c933f2d051ab'),
    -            ('http://www.nwchem-sw.org/images/Util_gnumakefile.patch.gz', 'af74ea2e32088030137001ce5cb047c5'),
    +            ('http://www.nwchem-sw.org/images/Util_gnumakefile.patch.gz', '173e17206a9099c3512b87e3f42441f5b089db82be1d2b306fe2a0070e5c8fad', '5dd82b9bd55583152295c999a0e4d72dd9d5c6ab7aa91117c2aae57a95a14ba1'),
    -            ('http://www.nwchem-sw.org/images/Util_getppn.patch.gz', '8dec8ee198bf5ec4c3a22a6dbf31683c'),
    +            ('http://www.nwchem-sw.org/images/Util_getppn.patch.gz', 'c4a23592fdcfb1fb6b65bc6c1906ac36f9966eec4899c4329bc8ce12015d2495', '8be418e1f8750778a31056f1fdf2a693fa4a12ea86a531f1ddf6f3620421027e'),
    -            ('http://www.nwchem-sw.org/images/Gcc6_macs_optfix.patch.gz', 'a891a2713aac8b0423c8096461c243eb'),
    +            ('http://www.nwchem-sw.org/images/Gcc6_macs_optfix.patch.gz', 'ff33d5f1ccd33385ffbe6ce7a18ec1506d55652be6e7434dc8065af64c879aaa', 'fade16098a1f54983040cdeb807e4e310425d7f66358807554e08392685a7164'),
    -            ('http://www.nwchem-sw.org/images/Notdir_fc.patch.gz', '2dc997d4ab3719ac7964201adbc6fd79')
    +            ('http://www.nwchem-sw.org/images/Notdir_fc.patch.gz', '54c722fa807671d6bf1a056586f0923593319d09c654338e7dd461dcd29ff118', 'a6a233951eb254d8aff5b243ca648def21fa491807a66c442f59c437f040ee69')
             ]
         }

         # Iterate over patches
         for condition, urls in urls_for_patches.items():
    -        for url, md5 in urls:
    -            patch(url, when=condition, level=0, md5=md5)
    +        for url, sha256, archive_sha256 in urls:
    +            patch(url, when=condition, level=0, sha256=sha256, archive_sha256=archive_sha256)

         def install(self, spec, prefix):
             scalapack = spec['scalapack'].libs
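The nwchem hunk above shows the two-checksum scheme in practice: each compressed patch now carries an `archive_sha256` (verifies the downloaded `.gz` file) in addition to the `sha256` of the decompressed patch contents (the value that goes into the `Spec` hash). A minimal standalone sketch of why both are needed — this is plain Python for illustration, not Spack code, and the toy patch contents are made up:

```python
import gzip
import hashlib


def sha256_hex(data: bytes) -> str:
    """Hex-encoded sha256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


# A toy patch and its gzip-compressed form, standing in for a
# downloaded *.patch.gz archive like the nwchem ones above.
patch_contents = b"--- a/file\n+++ b/file\n@@ -1 +1 @@\n-old\n+new\n"
archive = gzip.compress(patch_contents)

# One checksum verifies the download, the other the patch itself.
archive_sha256 = sha256_hex(archive)
content_sha256 = sha256_hex(gzip.decompress(archive))

# The digests differ because the bytes differ; hashing only the
# archive would not pin down the patch that actually gets applied.
assert archive_sha256 != content_sha256
assert content_sha256 == sha256_hex(patch_contents)
```

Hashing the decompressed contents means the `Spec` hash stays stable even if the same patch is later served re-compressed with different bytes.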


    @@ -43,19 +43,19 @@ def fedora_patch(commit, file, **kwargs):
             patch('{0}{1}'.format(prefix, file), **kwargs)

         # Upstream patches
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-000-add-all-flags-for-gethost-build.patch', when='@6.20.00', md5='05f85110bf2dd17324fc9825590df63e')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-000-add-all-flags-for-gethost-build.patch', when='@6.20.00', sha256='f8266916189ebbdfbad5c2c28ac00ed25f07be70f054d9830eb84ba84b3d03ef')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-001-delay-arginp-interpreting.patch', when='@6.20.00', md5='7df17b51be5c24bc02f854f3b4237324')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-001-delay-arginp-interpreting.patch', when='@6.20.00', sha256='57c7a9b0d94dd41e4276b57b0a4a89d91303d36180c1068b9e3ab8f6149b18dd')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-002-type-of-read-in-prompt-confirm.patch', when='@6.20.00', md5='27941364ec07e797b533902a6445e0de')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-002-type-of-read-in-prompt-confirm.patch', when='@6.20.00', sha256='837a6a82f815c0905cf7ea4c4ef0112f36396fc8b2138028204000178a1befa5')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-003-fix-out-of-bounds-read.patch', when='@6.20.00', md5='da300b7bf28667ee69bbdc5219f8e0b3')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-003-fix-out-of-bounds-read.patch', when='@6.20.00', sha256='f973bd33a7fd8af0002a9b8992216ffc04fdf2927917113e42e58f28b702dc14')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-004-do-not-use-old-pointer-tricks.patch', when='@6.20.00', md5='702a0011e96495acb93653733f36b073')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-004-do-not-use-old-pointer-tricks.patch', when='@6.20.00', sha256='333e111ed39f7452f904590b47b996812590b8818f1c51ad68407dc05a1b18b0')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-005-reset-fixes-numbering.patch', when='@6.20.00', md5='8a0fc5b74107b4d7ea7b10b1d6aebe9d')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-005-reset-fixes-numbering.patch', when='@6.20.00', sha256='d1b54b5c5432faed9791ffde813560e226896a68fc5933d066172bcf3b2eb8bd')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-006-cleanup-in-readme-files.patch', when='@6.20.00', md5='2c8fec7652af53229eb22535363e9eac')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-006-cleanup-in-readme-files.patch', when='@6.20.00', sha256='b4e7428ac6c2918beacc1b73f33e784ac520ef981d87e98285610b1bfa299d7b')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-007-look-for-tgetent-in-libtinfo.patch', when='@6.20.00', md5='69eacbbe9d9768164f1272c303df44aa')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-007-look-for-tgetent-in-libtinfo.patch', when='@6.20.00', sha256='e6c88ffc291c9d4bda4d6bedf3c9be89cb96ce7dc245163e251345221fa77216')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-008-guard-ascii-only-reversion.patch', when='@6.20.00', md5='0415789a4804cf6320cc83f5c8414a63')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-008-guard-ascii-only-reversion.patch', when='@6.20.00', sha256='7ee195e4ce4c9eac81920843b4d4d27254bec7b43e0b744f457858a9f156e621')
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-009-fix-regexp-for-backlash-quoting-tests.patch', when='@6.20.00', md5='90b3f10eb744c2b26155618d8232a4e9')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-009-fix-regexp-for-backlash-quoting-tests.patch', when='@6.20.00', sha256='d2358c930d5ab89e5965204dded499591b42a22d0a865e2149b8c0f1446fac34')

         # Downstream patches
    -    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-manpage-memoryuse.patch', md5='1fd35c430992aaa52dc90261e331acd5')  # noqa: E501
    +    fedora_patch('8a6066c901fb4fc75013dd488ba958387f00c74d', 'tcsh-6.20.00-manpage-memoryuse.patch', sha256='3a4e60fe56a450632140c48acbf14d22850c1d72835bf441e3f8514d6c617a9f')  # noqa: E501

         depends_on('ncurses')
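The tcsh hunk above only swaps checksums on plain `patch()` directives. For a package that maintains a patch on one of its *dependencies*, the `depends_on(..., patches=...)` interface described in the commit message would be used roughly as follows. This is an illustrative, non-runnable sketch: the package name, dependency, patch file, and conditions below are hypothetical, and the checksum is a placeholder.

    # Hypothetical package.py sketch (names are illustrative only).
    class MyApp(Package):
        # The patch's sha256 is folded into the dependency's Spec hash,
        # so this patched build of mylib is kept separate from the
        # "vanilla" mylib and does not disturb mylib's other dependents.
        depends_on('mylib@2:',
                   patches=[patch('mylib-threading-fix.patch',
                                  sha256='<sha256 of the patch file>',
                                  when='%gcc@6:')],
                   when='+threads')

Because only the patch's hash enters the `Spec`, removing the `patches=` argument later restores the vanilla dependency hash without touching `mylib` itself.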