<86>Feb 13 01:21:56 userdel[7601]: delete user 'rooter'
<86>Feb 13 01:21:56 userdel[7601]: removed shadow group 'rooter' owned by 'rooter'
<86>Feb 13 01:21:56 groupadd[7615]: group added to /etc/group: name=rooter, GID=567
<86>Feb 13 01:21:56 groupadd[7615]: group added to /etc/gshadow: name=rooter
<86>Feb 13 01:21:56 groupadd[7615]: new group: name=rooter, GID=567
<86>Feb 13 01:21:56 useradd[7625]: new user: name=rooter, UID=567, GID=567, home=/root, shell=/bin/bash
<86>Feb 13 01:21:56 userdel[7639]: delete user 'builder'
<86>Feb 13 01:21:56 userdel[7639]: removed group 'builder' owned by 'builder'
<86>Feb 13 01:21:56 userdel[7639]: removed shadow group 'builder' owned by 'builder'
<86>Feb 13 01:21:56 groupadd[7649]: group added to /etc/group: name=builder, GID=568
<86>Feb 13 01:21:56 groupadd[7649]: group added to /etc/gshadow: name=builder
<86>Feb 13 01:21:56 groupadd[7649]: new group: name=builder, GID=568
<86>Feb 13 01:21:56 useradd[7657]: new user: name=builder, UID=568, GID=568, home=/usr/src, shell=/bin/bash
<13>Feb 13 01:22:00 rpmi: libgdbm-1.8.3-alt10 1454943313 installed
<13>Feb 13 01:22:00 rpmi: libexpat-2.2.4-alt1 1503305341 installed
<13>Feb 13 01:22:00 rpmi: libtasn1-4.13-alt2 1521133848 installed
<13>Feb 13 01:22:00 rpmi: libp11-kit-0.23.9-alt5 1525798241 installed
<13>Feb 13 01:22:00 rpmi: rpm-macros-alternatives-0.5.0-alt1 sisyphus.219012.300 1546745004 installed
<13>Feb 13 01:22:00 rpmi: alternatives-0.5.0-alt1 sisyphus.219012.300 1546745004 installed
<13>Feb 13 01:22:00 rpmi: ca-certificates-2019.02.01-alt1 sisyphus+220384.200.1.1 1549032756 installed
<13>Feb 13 01:22:00 rpmi: ca-trust-0.1.1-alt2 1515595785 installed
<13>Feb 13 01:22:00 rpmi: p11-kit-trust-0.23.9-alt5 1525798241 installed
<13>Feb 13 01:22:00 rpmi: libcrypto1.1-1.1.0j-alt1 sisyphus.216647.100 1542743840 installed
<13>Feb 13 01:22:00 rpmi: libssl1.1-1.1.0j-alt1 sisyphus.216647.100 1542743840 installed
<13>Feb 13 01:22:00 rpmi: python3-3.6.8-alt1 sisyphus+220164.200.3.1 1548842470 installed
<13>Feb 13 01:22:01 rpmi: python3-base-3.6.8-alt1 sisyphus+220164.200.3.1 1548842470 installed
<13>Feb 13 01:22:01 rpmi: libpython3-3.6.8-alt1 sisyphus+220164.200.3.1 1548842470 installed
<13>Feb 13 01:22:01 rpmi: tests-for-installed-python3-pkgs-0.1.13.1-alt2 1535450458 installed
<13>Feb 13 01:22:01 rpmi: rpm-build-python3-0.1.13.1-alt2 1535450458 installed
<13>Feb 13 01:22:06 rpmi: python3-module-pkg_resources-1:40.6.3-alt1 sisyphus+219164.200.2.1 1548188195 installed
<13>Feb 13 01:22:06 rpmi: libtinfo-devel-6.1.20180407-alt2 sisyphus.215627.200 1540831969 installed
<13>Feb 13 01:22:06 rpmi: libncurses-devel-6.1.20180407-alt2 sisyphus.215627.200 1540831969 installed
<13>Feb 13 01:22:06 rpmi: python3-dev-3.6.8-alt1 sisyphus+220164.200.3.1 1548842470 installed
<13>Feb 13 01:22:06 rpmi: python-modules-curses-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:06 rpmi: libverto-0.3.0-alt1_5 1525957716 installed
<13>Feb 13 01:22:06 rpmi: libkeyutils-1.6-alt2 sisyphus.217337.100 1544003165 installed
<13>Feb 13 01:22:06 rpmi: libcom_err-1.44.5-alt1 sisyphus.218838.100 1546206092 installed
<86>Feb 13 01:22:06 groupadd[19869]: group added to /etc/group: name=_keytab, GID=499
<86>Feb 13 01:22:06 groupadd[19869]: group added to /etc/gshadow: name=_keytab
<86>Feb 13 01:22:06 groupadd[19869]: new group: name=_keytab, GID=499
<13>Feb 13 01:22:06 rpmi: libkrb5-1.16.3-alt1 sisyphus.219042.100 1547045738 installed
<13>Feb 13 01:22:06 rpmi: libtirpc-1.0.3-alt1 1532008015 installed
<13>Feb 13 01:22:06 rpmi: libnsl2-1.1.0-alt1_1 1511548748 installed
<13>Feb 13 01:22:06 rpmi: python-modules-email-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:06 rpmi: python-modules-unittest-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:06 rpmi: python-modules-nis-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-encodings-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-compiler-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-module-pkg_resources-1:40.6.3-alt1 sisyphus+219164.200.2.1 1548188195 installed
<13>Feb 13 01:22:07 rpmi: python-modules-xml-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-ctypes-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-multiprocessing-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-logging-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-tools-2to3-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-hotshot-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-bsddb-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-distutils-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-modules-json-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: libnsl2-devel-1.1.0-alt1_1 1511548748 installed
<13>Feb 13 01:22:07 rpmi: python-dev-2.7.15-alt1 sisyphus.217364.100 1544022396 installed
<13>Feb 13 01:22:07 rpmi: python-module-setuptools-1:40.6.3-alt1 sisyphus+219164.200.2.1 1548188195 installed
<13>Feb 13 01:22:07 rpmi: python-module-coverage-4.5.1-alt1 1527495634 installed
<13>Feb 13 01:22:07 rpmi: python3-module-setuptools-1:40.6.3-alt1 sisyphus+219164.200.2.1 1548188195 installed
<13>Feb 13 01:22:07 rpmi: python3-module-coverage-4.5.1-alt1 1527495634 installed
<13>Feb 13 01:22:07 rpmi: python3-modules-curses-3.6.8-alt1 sisyphus+220164.200.3.1 1548842470 installed
Building target platforms: x86_64
Building for target x86_64
Wrote: /usr/src/in/nosrpm/python-module-snakeoil-0.6.1-alt1.git20150323.1.1.nosrc.rpm
Installing python-module-snakeoil-0.6.1-alt1.git20150323.1.1.src.rpm
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.12200
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf python-module-snakeoil-0.6.1
+ echo 'Source #0 (python-module-snakeoil-0.6.1.tar):'
Source #0 (python-module-snakeoil-0.6.1.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/python-module-snakeoil-0.6.1.tar
+ cd python-module-snakeoil-0.6.1
+ /bin/chmod -c -Rf u+rwX,go-w .
+ cp -fR . 
../python3 + find ../python3 -type f -name '*.py' -exec sed -i 's|#!/usr/bin/env python|#!/usr/bin/env python3|' '{}' + + find ../python3 -type f -name '*.py' -exec sed -i 's|#!/usr/bin/python|#!/usr/bin/python3|' '{}' + + find ../python3 -type f -name '*.py' -exec 2to3 -w -n '{}' + RefactoringTool: Skipping optional fixer: buffer RefactoringTool: Skipping optional fixer: idioms RefactoringTool: Skipping optional fixer: set_literal RefactoringTool: Skipping optional fixer: ws_comma RefactoringTool: No changes to ../python3/setup.py RefactoringTool: Refactored ../python3/snakeoil/weakrefs.py RefactoringTool: No changes to ../python3/snakeoil/version.py RefactoringTool: No changes to ../python3/snakeoil/unittest_extensions.py RefactoringTool: Refactored ../python3/snakeoil/tar.py RefactoringTool: No changes to ../python3/snakeoil/struct_compat.py RefactoringTool: Refactored ../python3/snakeoil/stringio.py RefactoringTool: No changes to ../python3/snakeoil/sequences.py RefactoringTool: No changes to ../python3/snakeoil/pyflakes_extension.py RefactoringTool: Refactored ../python3/snakeoil/pickling.py RefactoringTool: Refactored ../python3/snakeoil/obj.py RefactoringTool: Refactored ../python3/snakeoil/namespaces.py RefactoringTool: No changes to ../python3/snakeoil/modules.py RefactoringTool: Refactored ../python3/snakeoil/mappings.py --- ../python3/snakeoil/weakrefs.py (original) +++ ../python3/snakeoil/weakrefs.py (refactored) @@ -5,7 +5,7 @@ Optimized WeakValCache implementation, and a __del__ alternative """ -from __future__ import print_function + __all__ = ("WeakValCache", "WeakRefFinalizer") @@ -167,10 +167,10 @@ # no more instances holding that class in memory for example. # as such, everything here should strongly ref what we're working # on. - target_classes = cls.__known_classes__.keys() + target_classes = list(cls.__known_classes__.keys()) pid = _getpid_func() for target_cls in target_classes: - for target_ref in target_cls.__finalizer_weakrefs__.get(pid, {}).values(): + for target_ref in list(target_cls.__finalizer_weakrefs__.get(pid, {}).values()): obj = target_ref() if obj is not None: obj.__finalizer__() --- ../python3/snakeoil/tar.py (original) +++ ../python3/snakeoil/tar.py (refactored) @@ -77,7 +77,7 @@ return self._gname def set_gname(self, val): - self._gname = intern(val) + self._gname = sys.intern(val) def del_gname(self): del self._gname @@ -88,7 +88,7 @@ return self._uname def set_uname(self, val): - self._uname = intern(val) + self._uname = sys.intern(val) def del_uname(self): del self._uname --- ../python3/snakeoil/stringio.py (original) +++ ../python3/snakeoil/stringio.py (refactored) @@ -55,11 +55,11 @@ bytes_readonly = _make_ro_cls(io.BytesIO, 'bytes_readonly') else: - from StringIO import StringIO as text_writable + from io import StringIO as text_writable bytes_writable = text_writable try: - from cStringIO import StringIO as text_readonly + from io import StringIO as text_readonly except ImportError: text_readonly = text_writable # note that we rewrite both classes... this is due to cStringIO allowing --- ../python3/snakeoil/pickling.py (original) +++ ../python3/snakeoil/pickling.py (refactored) @@ -15,7 +15,7 @@ # pylint: disable=wildcard-import,unused-wildcard-import try: - from cPickle import * + from pickle import * except ImportError: from pickle import * --- ../python3/snakeoil/obj.py (original) +++ ../python3/snakeoil/obj.py (refactored) @@ -76,7 +76,7 @@ try to proxy builtin objects like tuples, lists, dicts, sets, etc. 
""" -from __future__ import print_function + __all__ = ("DelayedInstantiation", "DelayedInstantiation_kls", "make_kls",) --- ../python3/snakeoil/namespaces.py (original) +++ ../python3/snakeoil/namespaces.py (refactored) @@ -36,7 +36,7 @@ """ try: fp = None - if isinstance(fd, basestring): + if isinstance(fd, str): fp = open(fd) fd = fp.fileno() --- ../python3/snakeoil/mappings.py (original) +++ ../python3/snakeoil/mappings.py (refactored) @@ -5,7 +5,7 @@ Miscellanious mapping related classes and functionality """ -from __future__ import print_function + __all__ = ( "autoconvert_py3k_methods_metaclass", "DictMixin", "LazyValDict", @@ -15,7 +15,7 @@ ) from collections import deque -from itertools import imap, chain, ifilterfalse, izip +from itertools import chain, filterfalse import operator from snakeoil.klass import get, contains, steal_docs @@ -52,7 +52,7 @@ return super(autoconvert_py3k_methods_metaclass, cls).__new__(cls, name, bases, d) -class DictMixin(object): +class DictMixin(object, metaclass=autoconvert_py3k_methods_metaclass): """ new style class replacement for :py:func:`UserDict.DictMixin` designed around iter* methods rather then forcing lists as DictMixin does @@ -79,7 +79,6 @@ __slots__ = () __externally_mutable__ = True - __metaclass__ = autoconvert_py3k_methods_metaclass disable_py3k_rewriting = True @@ -96,11 +95,11 @@ self.update(iterable) if kwargs: - self.update(kwargs.iteritems()) + self.update(iter(kwargs.items())) @steal_docs(dict) def __iter__(self): - return self.iterkeys() + return iter(self.keys()) def iteritems(self): for k in self: @@ -121,15 +120,15 @@ @steal_docs(dict) def keys(self): - return list(self.iterkeys()) + return list(self.keys()) @steal_docs(dict) def values(self): - return list(self.itervalues()) + return list(self.values()) @steal_docs(dict) def items(self): - return list(self.iteritems()) + return list(self.items()) @steal_docs(dict, True) def has_key(self, key): @@ -141,7 +140,7 @@ @steal_docs(dict) def itervalues(self): - return imap(self.__getitem__, self) + return map(self.__getitem__, self) iteritems = steal_docs(dict)(iteritems) @@ -157,7 +156,7 @@ @steal_docs(dict) def values(self): - return imap(self.__getitem__, self) + return map(self.__getitem__, self) @steal_docs(dict) def update(self, iterable): @@ -169,7 +168,7 @@ # default cmp actually operates based on key len comparison, oddly enough def __cmp__(self, other): - for k1, k2 in izip(sorted(self), sorted(other)): + for k1, k2 in zip(sorted(self), sorted(other)): c = cmp(k1, k2) if c != 0: return c @@ -228,7 +227,7 @@ # yes, a bit ugly, but this works and is py3k compatible # post conversion df = self.__delitem__ - for key in self.keys(): + for key in list(self.keys()): df(key) def __len__(self): @@ -237,7 +236,7 @@ c += 1 return c - def __nonzero__(self): + def __bool__(self): for _ in self: return True return False @@ -247,7 +246,7 @@ if not self.__externally_mutable__: raise AttributeError(self, "popitem") # do it this way so python handles the stopiteration; faster - for key, val in self.iteritems(): + for key, val in self.items(): del self[key] return key, val raise KeyError("container is empty") @@ -307,10 +306,10 @@ return iter(self._keys) def itervalues(self): - return imap(self.__getitem__, self.iterkeys()) + return map(self.__getitem__, iter(self.keys())) def iteritems(self): - return ((k, self[k]) for k in self.iterkeys()) + return ((k, self[k]) for k in self.keys()) def __contains__(self, key): if self._keys_func is not None: @@ -393,9 +392,9 @@ raise KeyError(key) def 
iterkeys(self): - for k in self.new.iterkeys(): + for k in self.new.keys(): yield k - for k in self.orig.iterkeys(): + for k in self.orig.keys(): if k not in self.blacklist and k not in self.new: yield k @@ -420,7 +419,7 @@ __setitem__ = clear = update = pop = popitem = setdefault = __delitem__ def __hash__(self): - k = self.items() + k = list(self.items()) k.sort(key=self._hash_key_grabber) return hash(tuple(k)) @@ -487,7 +486,7 @@ def iterkeys(self): s = set() - for k in ifilterfalse(s.__contains__, chain(*map(iter, self._dicts))): + for k in filterfalse(s.__contains__, chain(*list(map(iter, self._dicts)))): s.add(k) yield k @@ -566,7 +565,7 @@ self.update(sourcedict) def copy(self): - return PreservingFoldingDict(self._folder, self.iteritems()) + return PreservingFoldingDict(self._folder, iter(self.items())) def refold(self, folder=None): """Use the remembered original keys to update to a new folder. @@ -579,7 +578,7 @@ self._folder = folder oldDict = self._dict self._dict = {} - for key, value in oldDict.itervalues(): + RefactoringTool: Refactored ../python3/snakeoil/lists.py RefactoringTool: Refactored ../python3/snakeoil/klass.py RefactoringTool: Refactored ../python3/snakeoil/iterables.py for key, value in oldDict.values(): self._dict[self._folder(key)] = (key, value) def __getitem__(self, key): @@ -592,14 +591,14 @@ del self._dict[self._folder(key)] def iteritems(self): - return self._dict.itervalues() + return iter(self._dict.values()) def iterkeys(self): - for val in self._dict.itervalues(): + for val in self._dict.values(): yield val[0] def itervalues(self): - for val in self._dict.itervalues(): + for val in self._dict.values(): yield val[1] def __contains__(self, key): @@ -631,7 +630,7 @@ self.update(sourcedict) def copy(self): - return NonPreservingFoldingDict(self._folder, self.iteritems()) + return NonPreservingFoldingDict(self._folder, iter(self.items())) def __getitem__(self, key): return self._dict[self._folder(key)] @@ -646,10 +645,10 @@ return iter(self._dict) def itervalues(self): - return self._dict.itervalues() + return iter(self._dict.values()) def iteritems(self): - return self._dict.iteritems() + return iter(self._dict.items()) def __contains__(self, key): return self._folder(key) in self._dict @@ -956,7 +955,7 @@ del self[k] def __len__(self): - return len(self.keys()) + return len(list(self.keys())) o = SlottedDict --- ../python3/snakeoil/lists.py (original) +++ ../python3/snakeoil/lists.py (refactored) @@ -5,7 +5,7 @@ sequence related operations and classes """ -from __future__ import print_function + __all__ = ( "unstable_unique", "stable_unique", "iter_stable_unique", @@ -99,7 +99,7 @@ if is_py3k: _str_kls = (str, bytes) else: - _str_kls = basestring + _str_kls = str def native_iflatten_instance(l, skip_flattening=_str_kls): """ collapse [[1],2] into [1,2] @@ -114,7 +114,7 @@ iters = expandable_chain(l) try: while True: - x = iters.next() + x = next(iters) if (hasattr(x, '__iter__') and not ( isinstance(x, skip_flattening) or ( isinstance(x, _str_kls) and len(x) == 1))): @@ -140,7 +140,7 @@ iters = expandable_chain(l) try: while True: - x = iters.next() + x = next(iters) if hasattr(x, '__iter__') and not skip_func(x): iters.appendleft(x) else: --- ../python3/snakeoil/klass.py (original) +++ ../python3/snakeoil/klass.py (refactored) @@ -9,7 +9,7 @@ involved in writing classes. 
""" -from __future__ import print_function + __all__ = ( "generic_equality", "reflective_hash", "inject_richcmp_methods_from_cmp", @@ -281,7 +281,7 @@ scope.setdefault(key, func) -class chained_getter(object): +class chained_getter(object, metaclass=partial(generic_equality, real_type=caching.WeakInstMeta)): """ object that will do multi part lookup, regardless of if it's in the context @@ -324,7 +324,6 @@ __fifo_cache__ = deque() __inst_caching__ = True __attr_comparison__ = ("namespace",) - __metaclass__ = partial(generic_equality, real_type=caching.WeakInstMeta) def __init__(self, namespace): """ --- ../python3/snakeoil/iterables.py (original) +++ ../python3/snakeoil/iterables.py (refactored) @@ -5,7 +5,7 @@ Collection of functionality to make using iterators transparently easier """ -from __future__ import print_function + __all__ = ("expandable_chain", "caching_iter", "iter_sort") @@ -47,11 +47,11 @@ def __iter__(self): return self - def next(self): + def __next__(self): if self.iterables is not None: while self.iterables: try: - return self.iterables[0].next() + return next(self.iterables[0]) except StopIteration: self.iterables.popleft() self.iterRefactoringTool: Refactored ../python3/snakeoil/formatters.py RefactoringTool: Refactored ../python3/snakeoil/fileutils.py RefactoringTool: No changes to ../python3/snakeoil/errors.py RefactoringTool: Refactored ../python3/snakeoil/distutils_extensions.py RefactoringTool: No changes to ../python3/snakeoil/descriptors.py RefactoringTool: Refactored ../python3/snakeoil/dependant_methods.py ables = None @@ -148,7 +148,7 @@ self.iterable = None return cmp(self.cached_list, other) - def __nonzero__(self): + def __bool__(self): if self.cached_list: return True @@ -228,7 +228,7 @@ for x in iterables: try: x = iter(x) - l.append([x.next(), x]) + l.append([next(x), x]) except StopIteration: pass if len(l) == 1: --- ../python3/snakeoil/formatters.py (original) +++ ../python3/snakeoil/formatters.py (refactored) @@ -173,7 +173,7 @@ return True else: def _encoding_conversion_needed(self, val): - return isinstance(val, unicode) + return isinstance(val, str) def _force_encoding(self, val): return val.encode(self.encoding, 'replace') @@ -191,7 +191,7 @@ thing = thing(self) if thing is None: continue - if not isinstance(thing, basestring): + if not isinstance(thing, str): thing = str(thing) self._pos += len(thing) thing = self._force_encoding(thing) @@ -252,7 +252,7 @@ arg = arg(self) if arg is None: continue - if not isinstance(arg, basestring): + if not isinstance(arg, str): arg = str(arg) conversion_needed = self._encoding_conversion_needed(arg) while wrap and self._pos + len(arg) > self.width: --- ../python3/snakeoil/fileutils.py (original) +++ ../python3/snakeoil/fileutils.py (refactored) @@ -39,7 +39,7 @@ if compatibility.is_py3k: if isinstance(stream, (str, bytes)): stream = [stream] - elif isinstance(stream, basestring): + elif isinstance(stream, str): stream = [stream] for data in stream: @@ -85,7 +85,7 @@ return getattr(self.stream, attr) -class AtomicWriteFile_mixin(object): +class AtomicWriteFile_mixin(object, metaclass=WeakRefFinalizer): """File class that stores the changes in a tempfile. @@ -102,8 +102,6 @@ If this object falls out of memory without ever being discarded nor closed, the contents are discarded and a warning is issued. 
""" - - __metaclass__ = WeakRefFinalizer def __init__(self, fp, binary=False, perms=None, uid=-1, gid=-1): """ @@ -126,7 +124,7 @@ old_umask = None if perms: # give it just write perms - old_umask = os.umask(0200) + old_umask = os.umask(0o200) try: self._actual_init() finally: --- ../python3/snakeoil/distutils_extensions.py (original) +++ ../python3/snakeoil/distutils_extensions.py (refactored) @@ -55,7 +55,7 @@ self.add_defaults() # This bit is roughly equivalent to a MANIFEST.in template file. - for key, globs in self.distribution.package_data.items(): + for key, globs in list(self.distribution.package_data.items()): for pattern in globs: self.filelist.include_pattern(os.path.join(key, pattern)) --- ../python3/snakeoil/dependant_methods.py (original) +++ ../python3/snakeoil/dependant_methods.py (refactored) @@ -48,7 +48,7 @@ >>> # note, no output since finish has already been ran. """ -from __future__ import print_function + from snakeoil.lists import iflatten_instance from snakeoil.currying import pre_curry @@ -89,7 +89,7 @@ return s = [k, iflatten_instance(d.get(k, ()))] while s: - if isinstance(s[-1], basestring): + if isinstance(s[-1], str): yield s.pop(-1) continue exhausted = True @@ -109,7 +109,7 @@ stage_depends = cls.stage_depends # we use id instead of the cls itself to prevent strong ref issues. cls_id = id(cls) - for x in set(x for x in iflatten_instance(stage_depends.iteritems()) if x): + for x in set(x for x in RefactoringTool: Refactored ../python3/snakeoil/demandload.py RefactoringTool: Refactored ../python3/snakeoil/debug_imports.py RefactoringTool: No changes to ../python3/snakeoil/data_source.py RefactoringTool: No changes to ../python3/snakeoil/currying.py RefactoringTool: Refactored ../python3/snakeoil/containers.py RefactoringTool: Refactored ../python3/snakeoil/compatibility.py RefactoringTool: No changes to ../python3/snakeoil/caching_2to3.py RefactoringTool: Refactored ../python3/snakeoil/caching.py RefactoringTool: Refactored ../python3/snakeoil/bash.py iflatten_instance(iter(stage_depends.items())) if x): try: f = getattr(cls, x) except AttributeError: @@ -123,7 +123,7 @@ def __unwrap_stage_dependencies__(cls): stage_depends = cls.stage_depends - for x in set(x for x in iflatten_instance(stage_depends.iteritems()) if x): + for x in set(x for x in iflatten_instance(iter(stage_depends.items())) if x): try: f = getattr(cls, x) except AttributeError: --- ../python3/snakeoil/demandload.py (original) +++ ../python3/snakeoil/demandload.py (refactored) @@ -43,7 +43,7 @@ # There are some demandloaded imports below the definition of demandload. _allowed_chars = "".join((x.isalnum() or x in "_.") and " " or "a" - for x in map(chr, xrange(256))) + for x in map(chr, range(256))) py3k_translate = { "itertools": {"i%s" % k: k for k in ("filterfalse",)}, --- ../python3/snakeoil/debug_imports.py (original) +++ ../python3/snakeoil/debug_imports.py (refactored) @@ -10,8 +10,8 @@ per import cumulative. 
""" -from __future__ import print_function -import __builtin__ + +import builtins class intercept_import(object): @@ -33,23 +33,23 @@ self.stack.pop() def enable(self): - cur_import = __builtin__.__import__ + cur_import = builtins.__import__ if isinstance(cur_import, intercept_import): raise RuntimeError("an intercept is already active") self.orig_import = cur_import - __builtin__.__import__ = self + builtins.__import__ = self def disable(self): - if __builtin__.__import__ != self: + if builtins.__import__ != self: raise RuntimeError( "either not active, or a different intercept is in use") - __builtin__.__import__ = self.orig_import + builtins.__import__ = self.orig_import del self.orig_import if __name__ == "__main__": import __main__ - orig = dict(__main__.__dict__.iteritems()) + orig = dict(iter(__main__.__dict__.items())) del orig["intercept_import"] del orig["__builtin__"] del orig["__main__"] --- ../python3/snakeoil/containers.py (original) +++ ../python3/snakeoil/containers.py (refactored) @@ -274,7 +274,7 @@ def __iter__(self): return chain(iter(self._new), - ifilterfalse(self._new.__contains__, self._orig)) + filterfalse(self._new.__contains__, self._orig)) def __len__(self): return len(self._new.union(self._orig)) --- ../python3/snakeoil/compatibility.py (original) +++ ../python3/snakeoil/compatibility.py (refactored) @@ -34,7 +34,7 @@ __all__ = ("is_py3k", "is_py3k_like", "intern", "cmp", "sorted_cmp", "sort_cmp") -import ConfigParser as configparser +import configparser as configparser import sys @@ -99,7 +99,7 @@ else: # note that 2to3 screws this up... non issue however, since # this codepath won't be executed. - from __builtin__ import cmp, intern + from builtins import cmp, intern def sorted_cmp(sequence, func, key=None, reverse=False): return sorted(sequence, cmp=func, key=key, reverse=reverse) @@ -130,6 +130,6 @@ #exc_info format; class, instance, tb new_exception.__cause__ = exc_info[1] - raise new_exception.__class__, new_exception, exc_info[2] + raise new_exception.__class__(new_exception).with_traceback(exc_info[2]) IGNORED_EXCEPTIONS = (RuntimeError, MemoryError, SystemExit, KeyboardInterrupt) --- ../python3/snakeoil/caching.py (original) +++ ../python3/snakeoil/caching.py (refactored) @@ -107,7 +107,7 @@ def __call__(cls, *a, **kw): """disable caching via disable_inst_caching=True""" if cls.__inst_caching__ and not kw.pop("disable_inst_caching", False): - kwlist = kw.items() + kwlist = list(kw.items()) kwlist.sort() key = (a, tuple(kwlist)) try: --- ../python3/snakeoil/bash.py (original) +++ ../python3/snakeoil/RefactoringTool: No changes to ../python3/snakeoil/_fileutils.py RefactoringTool: No changes to ../python3/snakeoil/__init__.py RefactoringTool: No changes to ../python3/snakeoil/xml/__init__.py RefactoringTool: No changes to ../python3/snakeoil/test/test_weakrefs.py RefactoringTool: No changes to ../python3/snakeoil/test/test_stringio.py RefactoringTool: No changes to ../python3/snakeoil/test/test_source_hygene.py RefactoringTool: Refactored ../python3/snakeoil/test/test_slot_shadowing.py RefactoringTool: No changes to ../python3/snakeoil/test/test_py3k_eq_hash_inheritance.py RefactoringTool: Refactored ../python3/snakeoil/test/test_process.py RefactoringTool: Refactored ../python3/snakeoil/test/test_osutils.py bash.py (refactored) @@ -39,7 +39,7 @@ after a # that isn't at the start of a line. 
:return: yields lines w/ commenting stripped out """ - if isinstance(bash_source, basestring): + if isinstance(bash_source, str): bash_source = readlines_utf8(bash_source, True) for s in bash_source: s = s.strip() @@ -89,7 +89,7 @@ close = False infile = None - if isinstance(bash_source, basestring): + if isinstance(bash_source, str): f = open(bash_source, "r") close = True infile = bash_source @@ -273,7 +273,7 @@ if prev != pos: l.append(val[prev:pos]) if var in self.env: - if not isinstance(self.env[var], basestring): + if not isinstance(self.env[var], str): raise ValueError( "env key %r must be a string, not %s: %r" % ( var, type(self.env[var]), self.env[var])) --- ../python3/snakeoil/test/test_slot_shadowing.py (original) +++ ../python3/snakeoil/test/test_slot_shadowing.py (refactored) @@ -42,7 +42,7 @@ if slots is None: continue - if isinstance(slots, str) or isinstance(slots, unicode): + if isinstance(slots, str) or isinstance(slots, str): slots = (slots,) raw_slottings[slots] = parent @@ -53,7 +53,7 @@ if slots is None and not slotting: return - if isinstance(slots, str) or isinstance(slots, unicode): + if isinstance(slots, str) or isinstance(slots, str): if self.err_if_slots_is_str: raise self.failureException( "cls %r; slots is %r (should be a tuple or list)" % --- ../python3/snakeoil/test/test_process.py (original) +++ ../python3/snakeoil/test/test_process.py (refactored) @@ -23,10 +23,10 @@ process.find_binary, script_name) fp = os.path.join(self.dir, script_name) open(fp, "w").close() - os.chmod(fp, 0640) + os.chmod(fp, 0o640) self.assertRaises(process.CommandNotFound, process.find_binary, script_name) - os.chmod(fp, 0750) + os.chmod(fp, 0o750) self.assertIn(self.dir, process.find_binary(script_name)) os.unlink(fp) --- ../python3/snakeoil/test/test_osutils.py (original) +++ ../python3/snakeoil/test/test_osutils.py (refactored) @@ -111,38 +111,38 @@ # default settings path = pjoin(self.dir, 'foo', 'bar') self.assertTrue(osutils.ensure_dirs(path)) - self.check_dir(path, os.geteuid(), os.getegid(), 0777) + self.check_dir(path, os.geteuid(), os.getegid(), 0o777) def test_minimal_nonmodifying(self): path = pjoin(self.dir, 'foo', 'bar') - self.assertTrue(osutils.ensure_dirs(path, mode=0755)) - os.chmod(path, 0777) - self.assertTrue(osutils.ensure_dirs(path, mode=0755, minimal=True)) - self.check_dir(path, os.geteuid(), os.getegid(), 0777) + self.assertTrue(osutils.ensure_dirs(path, mode=0o755)) + os.chmod(path, 0o777) + self.assertTrue(osutils.ensure_dirs(path, mode=0o755, minimal=True)) + self.check_dir(path, os.geteuid(), os.getegid(), 0o777) def test_minimal_modifying(self): path = pjoin(self.dir, 'foo', 'bar') - self.assertTrue(osutils.ensure_dirs(path, mode=0750)) - self.assertTrue(osutils.ensure_dirs(path, mode=0005, minimal=True)) - self.check_dir(path, os.geteuid(), os.getegid(), 0755) + self.assertTrue(osutils.ensure_dirs(path, mode=0o750)) + self.assertTrue(osutils.ensure_dirs(path, mode=0o005, minimal=True)) + self.check_dir(path, os.geteuid(), os.getegid(), 0o755) def test_create_unwritable_subdir(self): path = pjoin(self.dir, 'restricted', 'restricted') # create the subdirs without 020 first self.assertTrue(osutils.ensure_dirs(os.path.dirname(path))) - RefactoringTool: Refactored ../python3/snakeoil/test/test_obj.py RefactoringTool: No changes to ../python3/snakeoil/test/test_modules.py RefactoringTool: Refactored ../python3/snakeoil/test/test_mappings.py self.assertTrue(osutils.ensure_dirs(path, mode=0020)) - self.check_dir(path, os.geteuid(), os.getegid(), 0020) + 
self.assertTrue(osutils.ensure_dirs(path, mode=0o020)) + self.check_dir(path, os.geteuid(), os.getegid(), 0o020) # unrestrict it osutils.ensure_dirs(path) - self.check_dir(path, os.geteuid(), os.getegid(), 0777) + self.check_dir(path, os.geteuid(), os.getegid(), 0o777) def test_mode(self): path = pjoin(self.dir, 'mode', 'mode') - self.assertTrue(osutils.ensure_dirs(path, mode=0700)) - self.check_dir(path, os.geteuid(), os.getegid(), 0700) + self.assertTrue(osutils.ensure_dirs(path, mode=0o700)) + self.check_dir(path, os.geteuid(), os.getegid(), 0o700) # unrestrict it osutils.ensure_dirs(path) - self.check_dir(path, os.geteuid(), os.getegid(), 0777) + self.check_dir(path, os.geteuid(), os.getegid(), 0o777) def test_gid(self): # abuse the portage group as secondary group @@ -154,11 +154,11 @@ raise SkipTest('you are not in the portage group') path = pjoin(self.dir, 'group', 'group') self.assertTrue(osutils.ensure_dirs(path, gid=portage_gid)) - self.check_dir(path, os.geteuid(), portage_gid, 0777) + self.check_dir(path, os.geteuid(), portage_gid, 0o777) self.assertTrue(osutils.ensure_dirs(path)) - self.check_dir(path, os.geteuid(), portage_gid, 0777) + self.check_dir(path, os.geteuid(), portage_gid, 0o777) self.assertTrue(osutils.ensure_dirs(path, gid=os.getegid())) - self.check_dir(path, os.geteuid(), os.getegid(), 0777) + self.check_dir(path, os.geteuid(), os.getegid(), 0o777) class SymlinkTest(TempDirMixin): --- ../python3/snakeoil/test/test_obj.py (original) +++ ../python3/snakeoil/test/test_obj.py (refactored) @@ -15,9 +15,9 @@ t = tuple([1, 2, 3]) o = make_DI(tuple, lambda: t) objs = [o, t] - self.assertEqual(*map(str, objs)) - self.assertEqual(*map(repr, objs)) - self.assertEqual(*map(hash, objs)) + self.assertEqual(*list(map(str, objs))) + self.assertEqual(*list(map(repr, objs))) + self.assertEqual(*list(map(hash, objs))) self.assertEqual(*objs) self.assertTrue(cmp(t, o) == 0) self.assertFalse(t < o) @@ -77,7 +77,7 @@ # now ensure we always get the same kls back for derivatives class foon(object): - def __nonzero__(self): + def __bool__(self): return True o = make_DI(foon, foon) --- ../python3/snakeoil/test/test_mappings.py (original) +++ ../python3/snakeoil/test/test_mappings.py (refactored) @@ -10,7 +10,7 @@ def a_dozen(): - return range(12) + return list(range(12)) class BasicDict(mappings.DictMixin): @@ -106,8 +106,8 @@ def test_keys(self): # Called twice because the first call will trigger a keyfunc call. - self.assertEqual(sorted(self.dict.keys()), list(xrange(12))) - self.assertEqual(sorted(self.dict.keys()), list(xrange(12))) + self.assertEqual(sorted(self.dict.keys()), list(range(12))) + self.assertEqual(sorted(self.dict.keys()), list(range(12))) def test_len(self): # Called twice because the first call will trigger a keyfunc call. 
@@ -134,25 +134,25 @@ def setUp(self): RememberingNegateMixin.setUp(self) - self.dict = mappings.LazyValDict(range(12), self.negate) + self.dict = mappings.LazyValDict(list(range(12)), self.negate) def tearDown(self): RememberingNegateMixin.tearDown(self) def test_itervalues(self): - self.assertEqual(sorted(self.dict.itervalues()), range(-11, 1)) + self.assertEqual(sorted(self.dict.values()), list(range(-11, 1))) def test_len(self): self.assertEqual(len(self.dict), 12) def test_iter(self): - self.assertEqual(list(self.dict), range(12)) + self.assertEqual(list(self.dict), list(range(12))) def test_contains(self): self.assertIn(1, self.dict) def test_has_key(self): - self.assertEqual(True, self.dict.has_key(1)) + self.assertEqual(True, 1 in self.dict) class LazyValDictWithFuncTest(TestCase, LazyValDictTestMixin, RememberingNegateMixin): @@ -244,21 +244,21 @@ class StackedDictTest(TestCase): - orig_dict = dict.fromkeys(xrange(100)) - new_dict = dict.fromkeys(xrange(100, 200)) + orig_dict = dict.fromkeys(range(100)) + new_dict = dict.fromkeys(range(100, 200)) def test_contains(self): std = mappings.StackedDict(self.orig_dict, self.new_dict) self.assertIn(1, std) - self.assertTrue(std.has_key(1)) + self.assertTrue(1 in std) def test_stacking(self): o = dict(self.orig_dict) std = mappings.StackedDict(o, self.new_dict) - for x in chain(*map(iter, (self.orig_dict, self.new_dict))): + for x in chain(*list(map(iter, (self.orig_dict, self.new_dict)))): self.assertIn(x, std) - for key in self.orig_dict.iterkeys(): + for key in self.orig_dict.keys(): del o[key] for x in self.orig_dict: self.assertNotIn(x, std) @@ -290,7 +290,7 @@ def test_keys(self): self.assertEqual( sorted(mappings.StackedDict(self.orig_dict, self.new_dict)), - sorted(self.orig_dict.keys() + self.new_dict.keys())) + sorted(list(self.orig_dict.keys()) + list(self.new_dict.keys()))) class IndeterminantDictTest(TestCase): @@ -314,16 +314,16 @@ def test_starter_dict(self): d = mappings.IndeterminantDict( - lambda key: False, starter_dict={}.fromkeys(xrange(100), True)) - for x in xrange(100): + lambda key: False, starter_dict={}.fromkeys(range(100), True)) + for x in range(100): self.assertEqual(d[x], True) - for x in xrange(100, 110): + for x in range(100, 110): self.assertEqual(d[x], False) def test_behaviour(self): val = [] d = mappings.IndeterminantDict( - lambda key: val.append(key), {}.fromkeys(xrange(10), True)) + lambda key: val.append(key), {}.fromkeys(range(10), True)) self.assertEqual(d[0], True) self.assertEqual(d[11], None) self.assertEqual(val, [11]) @@ -350,31 +350,31 @@ @staticmethod def gen_dict(): - return mappings.OrderedDict(enumerate(xrange(100))) + return mappings.OrderedDict(enumerate(range(100))) def test_items(self): - self.assertEqual(list(self.gen_dict().iteritems()), - list(enumerate(xrange(100)))) - self.assertEqual(self.gen_dict().items(), - list(enumerate(xrange(100)))) + self.assertEqual(list(self.gen_dict().items()), + list(enumerate(range(100)))) + self.assertEqual(list(self.gen_dict().items()), + list(enumerate(range(100)))) def test_values(self): - self.assertEqual(list(self.gen_dict().itervalues()), - list(xrange(100))) + self.assertEqual(list(self.gen_dict().values()), + list(range(100))) l = ["asdf", "fdsa", "Dawefa", "3419", "pas", "1"] l = [s + "12" for s in l] + l l = ["1231adsfasdfagqwer" + s for s in l] + l self.assertEqual( list(mappings.OrderedDict( - (v, k) for k, v in enumerate(l)).itervalues()), - list(xrange(len(l)))) + (v, k) for k, v in enumerate(l)).values()), + list(range(len(l)))) 
def test_keys(self): - self.assertEqual(list(self.gen_dict().iterkeys()), list(xrange(100))) - self.assertEqual(self.gen_dict().keys(), list(xrange(100))) + self.assertEqual(list(self.gen_dict().keys()), list(range(100))) + self.assertEqual(list(self.gen_dict().keys()), list(range(100))) def test_iter(self): - RefactoringTool: Refactored ../python3/snakeoil/test/test_lists.py self.assertEqual(list(self.gen_dict()), list(xrange(100))) + self.assertEqual(list(self.gen_dict()), list(range(100))) l = ["asdf", "fdsa", "Dawefa", "3419", "pas", "1"] l = [s + "12" for s in l] + l l = ["1231adsfasdfagqwer" + s for s in l] + l @@ -383,7 +383,7 @@ def test_del(self): d = self.gen_dict() del d[50] - self.assertEqual(list(d), list(range(50) + range(51, 100))) + self.assertEqual(list(d), list(list(range(50)) + list(range(51, 100)))) self.assertRaises(KeyError, operator.delitem, d, 50) self.assertRaises(KeyError, operator.delitem, d, 'spork') @@ -401,7 +401,7 @@ def testPreserve(self): dct = mappings.PreservingFoldingDict( - str.lower, {'Foo': 'bar', 'fnz': 'donkey'}.iteritems()) + str.lower, iter({'Foo': 'bar', 'fnz': 'donkey'}.items())) self.assertEqual(dct['fnz'], 'donkey') self.assertEqual(dct['foo'], 'bar') self.assertEqual(sorted(['bar', 'donkey']), sorted(dct.values())) @@ -411,9 +411,9 @@ keys = ['Foo', 'fnz'] keysList = list(dct) for key in keys: - self.assertIn(key, dct.keys()) + self.assertIn(key, list(dct.keys())) self.assertIn(key, keysList) - self.assertIn((key, dct[key]), dct.items()) + self.assertIn((key, dct[key]), list(dct.items())) self.assertEqual(len(keys), len(dct)) self.assertEqual(dct.pop('foo'), 'bar') self.assertNotIn('foo', dct) @@ -423,26 +423,26 @@ dct.refold(lambda _: _) self.assertNotIn('foo', dct) self.assertIn('Foo', dct) - self.assertEqual(dct.items(), [('Foo', 'bar')]) + self.assertEqual(list(dct.items()), [('Foo', 'bar')]) dct.clear() self.assertEqual({}, dict(dct)) def testNoPreserve(self): dct = mappings.NonPreservingFoldingDict( - str.lower, {'Foo': 'bar', 'fnz': 'monkey'}.iteritems()) + str.lower, iter({'Foo': 'bar', 'fnz': 'monkey'}.items())) self.assertEqual(sorted(['bar', 'monkey']), sorted(dct.values())) self.assertEqual(dct.copy(), dct) keys = ['foo', 'fnz'] keysList = [key for key in dct] for key in keys: - self.assertIn(key, dct.keys()) + self.assertIn(key, list(dct.keys())) self.assertIn(key, dct) self.assertIn(key, keysList) - self.assertIn((key, dct[key]), dct.items()) + self.assertIn((key, dct[key]), list(dct.items())) self.assertEqual(len(keys), len(dct)) self.assertEqual(dct.pop('foo'), 'bar') del dct['fnz'] - self.assertEqual(dct.keys(), []) + self.assertEqual(list(dct.keys()), []) dct.clear() self.assertEqual({}, dict(dct)) @@ -455,7 +455,7 @@ d = self.kls(lambda x: [x]) self.assertEqual(d[0], [0]) val = d[0] - self.assertEqual(d.items(), [(0, [0])]) + self.assertEqual(list(d.items()), [(0, [0])]) self.assertEqual(d[0], [0]) self.assertIdentical(d[0], val) @@ -498,7 +498,7 @@ def test_it(self): class foo(object): def __init__(self, **kwargs): - for attr, val in kwargs.iteritems(): + for attr, val in kwargs.items(): setattr(self, attr, val) obj = foo() d = self.kls(obj) --- ../python3/snakeoil/test/test_lists.py (original) +++ ../python3/snakeoil/test/test_lists.py (refactored) @@ -38,7 +38,7 @@ [1, 2, 3, o, 4]) def _generator(self): - for x in xrange(5, -1, -1): + for x in range(5, -1, -1): yield x def test_unstable_unique(self): @@ -51,14 +51,14 @@ self.assertTrue( res == [uc(1, 0), uc(0, 1)] or res == [uc(0, 1), uc(1, 0)], res) 
self.assertEqual(sorted(lists.unstable_unique(self._generator())), - sorted(xrange(6))) + sorted(range(6))) class ChainedListsTest(TestCase): @staticmethod def gen_cl(): - return lists.ChainedLists(range(3), range(3, 6), range(6, 100)) + return lists.ChainedLists(list(range(3)), list(range(3, 6)), list(range(6, 100))) def test_contains(self): cl = self.gen_cl() @@ -66,7 +66,7 @@ self.assertTrue(x in cl) def test_iter(self): - self.assertEqual(list(self.gen_cl()), list(xrange(100))) + self.assertEqual(list(self.gen_cl()), list(range(100))) def test_len(self): self.assertEqual(100, len(self.gen_cl())) @@ -86,12 +86,12 @@ def test_append(self): cl = self.gen_cl() - cl.append(range(10)) + cl.append(list(range(10))) self.assertEqual(110, len(cl)) def test_extend(self): cl = self.gen_cl() - cl.extend(range(10) for i in range(5)) + cl.extend(list(range(10)) for i in range(5)) self.assertEqual(150, len(cl)) @@ -99,15 +99,15 @@ func = staticmethod(lists.native_iflatten_instance) def test_it(self): - o = OrderedDict((k, None) for k in xrange(10)) + o = OrderedDict((k, None) for k in range(10)) for l, correct, skip in [ (["asdf", ["asdf", "asdf"], 1, None], - ["asdf", "asdf", "asdf", 1, None], basestring), - ([o, 1, "fds"], [o, 1, "fds"], (basestring, OrderedDict)), - ([o, 1, "fds"], range(10) + [1, "fds"], basestring), - ("fds", ["fds"], basestring), + ["asdf", "asdf", "asdf", 1, None], str), + ([o, 1, "fds"], [o, 1, "fds"], (str, OrderedDict)), + ([o, 1, "fds"], list(range(10)) + [1, "fds"], str), + ("fds", ["fds"], str), ("fds", ["f", "d", "s"], int), - ('', [''], basestring), + ('', [''], str), (1, [1], int), ]: iterator = self.func(l, skip) @@ -125,7 +125,7 @@ iters = [] iterator = self.func(iters) iters.append(iterator) - self.assertRaises(ValueError, iterator.next) + self.assertRaises(ValueError, iterator.__next__) # Regression test: this was triggered through demandload. # **{} is there to explicitly force a dict. @@ -136,13 +136,13 @@ func = staticmethod(lists.native_iflatten_func) def test_it(self): - o = OrderedDict((k, None) for k in xrange(10)) + o = OrderedDict((k, None) for k in range(10)) for l, correct, skip in [ (["asdf", ["asdf", "asdf"], 1, None], - ["asdf", "asdf", "asdf", 1, None], basestring), - ([o, 1, "fds"], [o, 1, "fds"], (basestring, OrderedDict)), - ([o, 1, "fds"], range(10) + [1, "fds"], basestring), - ("fds", ["fds"], basestring), + ["asdf", "asdf", "asdf", 1, None], str), + ([o, 1, "fds"], [o, 1, "fds"], (str, OrderedDict)), + ([o, 1, "fds"], list(range(10)) + [1, "fds"], str), + ("fds", ["fds"], str), (1, [1], int), ]: iterator = self.func(l, lambda x: isinstance(x, skip)) @@ -160,7 +160,7 @@ iters = [] iterator = self.func(iters, lambda x: False) iters.append(iterator) - self.assertRaises(ValueError, iterator.next) + self.assertRaises(ValueError, iterator.__next__) # Regression test: this was triggered through demandload. 
# **{} is there to explicitly force a dict to the underly cpy @@ -182,15 +182,15 @@ kls = staticmethod(lists.predicate_split) def test_simple(self): - false_l, true_l = self.kls(lambda x: x % 2 == 0, xrange(100)) - self.assertEqual(false_l, range(1, 100, 2)) - self.assertEqual(true_l, range(0, 100, 2)) + false_l, true_l = self.kls(lambda x: x % 2 == 0, range(100)) + self.assertEqual(false_l, list(range(1, 100, 2))) + self.assertEqual(true_l, list(range(0, RefactoringTool: Refactored ../python3/snakeoil/test/test_klass.py RefactoringTool: Refactored ../python3/snakeoil/test/test_iterables.py 100, 2))) def test_key(self): false_l, true_l = self.kls(lambda x: x %2 == 0, - ([0, x] for x in xrange(100)), + ([0, x] for x in range(100)), key=itemgetter(1)) - self.assertEqual(false_l, [[0, x] for x in xrange(1, 100, 2)]) + self.assertEqual(false_l, [[0, x] for x in range(1, 100, 2)]) self.assertEqual(true_l, [[0, x] for x in range(0, 100, 2)]) cpy_loaded_Test = mk_cpy_loadable_testcase( --- ../python3/snakeoil/test/test_klass.py (original) +++ ../python3/snakeoil/test/test_klass.py (refactored) @@ -32,14 +32,12 @@ def test_attrlist(self): def make_class(attr_list=None): - class foo(object): - __metaclass__ = self.kls - + class foo(object, metaclass=self.kls): if attr_list is not None: locals()['__attr_comparison__'] = attr_list self.assertRaises(TypeError, make_class) - self.assertRaises(TypeError, make_class, [u'foon']) + self.assertRaises(TypeError, make_class, ['foon']) self.assertRaises(TypeError, make_class, [None]) def test_instancemethod(self): @@ -121,9 +119,8 @@ eq=klass.native_generic_attr_eq) def test_it(self): - class c(object): + class c(object, metaclass=self.kls): __attr_comparison__ = ("foo", "bar") - __metaclass__ = self.kls def __init__(self, foo, bar): self.foo, self.bar = foo, bar @@ -147,8 +144,8 @@ def test_call(self): def mk_class(meta): - class c(object): - __metaclass__ = meta + class c(object, metaclass=meta): + pass return c self.assertRaises(TypeError, mk_class) @@ -443,9 +440,9 @@ l = [] class foo(object): @klass.cached_property - def blah(self, l=l, i=iter(xrange(5))): + def blah(self, l=l, i=iter(range(5))): l.append(None) - return i.next() + return next(i) f = foo() self.assertEqual(f.blah, 0) self.assertEqual(len(l), 1) @@ -458,9 +455,9 @@ def test_cached_property(self): l = [] - def named(self, l=l, i=iter(xrange(5))): + def named(self, l=l, i=iter(range(5))): l.append(None) - return i.next() + return next(i) class foo(object): blah = klass.cached_property_named("blah")(named) @@ -518,7 +515,7 @@ func = staticmethod(klass.cached_hash) def test_it(self): - now = long(time()) + now = int(time()) class cls(object): invoked = [] @self.func --- ../python3/snakeoil/test/test_iterables.py (original) +++ ../python3/snakeoil/test/test_iterables.py (refactored) @@ -10,72 +10,72 @@ class ExpandableChainTest(TestCase): def test_normal_function(self): - i = [iter(xrange(100)) for x in xrange(3)] + i = [iter(range(100)) for x in range(3)] e = expandable_chain() e.extend(i) - self.assertEqual(list(e), range(100)*3) + self.assertEqual(list(e), list(range(100))*3) for x in i + [e]: - self.assertRaises(StopIteration, x.next) + self.assertRaises(StopIteration, x.__next__) def test_extend(self): e = expandable_chain() - e.extend(xrange(100) for i in (1, 2)) - self.assertEqual(list(e), range(100)*2) + e.extend(range(100) for i in (1, 2)) + self.assertEqual(list(e), list(range(100))*2) self.assertRaises(StopIteration, e.extend, [[]]) def test_extendleft(self): - e = 
expandable_chain(xrange(20, 30)) - e.extendleft([xrange(10, 20), xrange(10)]) - self.assertEqual(list(e), range(30)) + e = expandable_chain(range(20, 30)) + e.extendleft([range(10, 20), range(10)]) + self.assertEqual(list(e), list(range(30))) self.assertRaises(StopIteration, e.extendleft, [[]]) def test_append(self): e = expandable_chain() - e.append(xrange(100)) - self.assertEqual(list(e), range(100)) + e.append(range(100)) + self.assertEqual(list(e), list(range(100))) self.assertRaises(StopIteration, e.append, []) def test_appendleft(self): - e = expandable_chain(xrange(10, 20)) - e.appendleft(xrange(10)) - self.assertEqual(list(e), range(20)) + e = expandable_chain(range(10, 20)) + e.appendleft(range(10)) + self.assertEqual(list(e), list(range(20))) self.assertRaises(StopIteration, e.appendleft, []) class CachingIterTest(TestCase): def test_iter_consumption(self): - i = iter(xrange(100)) + i = iter(range(100)) c = caching_iter(i) i2 = iter(c) - for _ in xrange(20): - i2.next() - self.assertEqual(i.next(), 20) + for _ in range(20): + next(i2) + self.assertEqual(next(i), 20) # note we consumed one ourselves self.assertEqual(c[20], 21) list(c) - self.assertRaises(StopIteration, i.next) - self.assertEqual(list(c), range(20) + range(21, 100)) + self.assertRaises(StopIteration, i.__next__) + self.assertEqual(list(c), list(range(20)) + list(range(21, 100))) def test_init(self): - self.assertEqual(caching_iter(list(xrange(100)))[0], 0) + self.assertEqual(caching_iter(list(range(100)))[0], 0) def test_full_consumption(self): - i = iter(xrange(100)) + i = iter(range(100)) c = caching_iter(i) - self.assertEqual(list(c), range(100)) + self.assertEqual(list(c), list(range(100))) # do it twice, to verify it returns properly - self.assertEqual(list(c), range(100)) + self.assertEqual(list(c), list(range(100))) def test_len(self): - self.assertEqual(100, len(caching_iter(xrange(100)))) + self.assertEqual(100, len(caching_iter(range(100)))) def test_hash(self): - self.assertEqual(hash(caching_iter(xrange(100))), + self.assertEqual(hash(caching_iter(range(100))), hash(tuple(range(100)))) def test_nonzero(self): - c = caching_iter(xrange(100)) + c = caching_iter(range(100)) self.assertEqual(bool(c), True) # repeat to check if it works when cached. 
self.assertEqual(bool(c), True) @@ -90,48 +90,48 @@ def test_cmp(self): get_inst = self._py3k_protection - self.assertEqual(get_inst(xrange(100)), tuple(xrange(100))) - self.assertNotEqual(get_inst(xrange(90)), tuple(xrange(100))) - self.assertTrue(get_inst(xrange(100)) > tuple(xrange(90))) - self.assertFalse(get_inst(xrange(90)) > tuple(xrange(100))) - self.assertTrue(get_inst(xrange(100)) >= tuple(xrange(100))) + self.assertEqual(get_inst(range(100)), tuple(range(100))) + self.assertNotEqual(get_inst(range(90)), tuple(range(100))) + self.assertTrue(get_inst(range(100)) > tuple(range(90))) + self.assertFalse(get_inst(range(90)) > tuple(range(100))) + self.assertTrue(get_inst(range(100)) >= tuple(range(100))) def test_sorter(self): get_inst = self._py3k_protection self.assertEqual( - get_inst(xrange(100, 0, -1), sorted), tuple(xrange(1, 101))) - c = caching_iter(xrange(100, 0, -1), sorted) + get_inst(range(100, 0, -1), sorted), tuple(range(1, 101))) + c = caching_iter(range(100, 0, -1), sorted) self.assertTrue(c) if compatibility.is_py3k: c = tuple(c) - self.assertEqual(c, tuple(xrange(1, 101))) - c = caching_iter(xrange(50, 0, -1), sorted) + self.assertEqual(c, tuple(range(1, 101))) + c = caching_iter(range(50, 0, -1), sorted) self.assertEqual(c[10], 11) if compatibility.is_py3k: c = tuple(c) - self.assertEqual(tuple(xrange(RefactoringTool: Refactored ../python3/snakeoil/test/test_formatters.py 1, 51)), c) + self.assertEqual(tuple(range(1, 51)), c) def test_getitem(self): - c = caching_iter(xrange(20)) + c = caching_iter(range(20)) self.assertEqual(19, c[-1]) self.assertRaises(IndexError, operator.getitem, c, -21) self.assertRaises(IndexError, operator.getitem, c, 21) def test_edgecase(self): - c = caching_iter(xrange(5)) + c = caching_iter(range(5)) self.assertEqual(c[0], 0) # do an off by one access- this actually has broke before self.assertEqual(c[2], 2) self.assertEqual(c[1], 1) - self.assertEqual(list(c), list(xrange(5))) + self.assertEqual(list(c), list(range(5))) def test_setitem(self): self.assertRaises( - TypeError, operator.setitem, caching_iter(xrange(10)), 3, 4) + TypeError, operator.setitem, caching_iter(range(10)), 3, 4) def test_str(self): # Just make sure this works at all. - self.assertTrue(str(caching_iter(xrange(10)))) + self.assertTrue(str(caching_iter(range(10)))) class iter_sortTest(TestCase): @@ -140,5 +140,5 @@ return sorted(l, key=operator.itemgetter(0)) self.assertEqual( list(iter_sort( - f, *[iter(xrange(x, x + 10)) for x in (30, 20, 0, 10)])), - list(xrange(40))) + f, *[iter(range(x, x + 10)) for x in (30, 20, 0, 10)])), + list(range(40))) --- ../python3/snakeoil/test/test_formatters.py (original) +++ ../python3/snakeoil/test/test_formatters.py (refactored) @@ -18,7 +18,7 @@ if compatibility.is_py3k: from io import BytesIO as StringIO else: - from StringIO import StringIO + from io import StringIO # protect against python issue 7567 for the curses module. issue7567 = protect_process @@ -34,7 +34,7 @@ # As many sporks as fit in 20 chars. sporks = ' '.join(3 * ('spork',)) for inputs, output in [ - ((u'\N{SNOWMAN}',), '?'), + (('\N{SNOWMAN}',), '?'), ((7 * 'spork ',), '%s\n%s\n%s' % (sporks, sporks, 'spork ')), (7 * ('spork ',), '%s \n%s \n%s' % (sporks, sporks, 'spork ')), ((30 * 'a'), 20 * 'a' + '\n' + 10 * 'a'), @@ -49,7 +49,7 @@ def test_first_prefix(self): # As many sporks as fit in 20 chars. 
for inputs, output in [ - ((u'\N{SNOWMAN}',), 'foon:?'), + (('\N{SNOWMAN}',), 'foon:?'), ((7 * 'spork ',), 'foon:spork spork\n' 'spork spork spork\n' @@ -70,7 +70,7 @@ def test_later_prefix(self): for inputs, output in [ - ((u'\N{SNOWMAN}',), '?'), + (('\N{SNOWMAN}',), '?'), ((7 * 'spork ',), 'spork spork spork\n' 'foon:spork spork\n' @@ -196,7 +196,7 @@ (esc, '31m', 'red', esc, '1m', 'boldred', esc, '39;49m', 'bold', esc, '0;10m', 'done')), ((42,), ('42',)), - ((u'\N{SNOWMAN}',), ('?',)) + (('\N{SNOWMAN}',), ('?',)) ): self._test_stream(stream, f, inputs, output) f.autoline = True @@ -246,25 +246,25 @@ def test_dumb_terminal(self): master, _out = _get_pty_pair() formatter = _with_term('dumb', formatters.get_formatter, master) - self.failUnless(isinstance(formatter, formatters.PlainTextFormatter)) + self.assertTrue(isinstance(formatter, formatters.PlainTextFormatter)) @issue7567 def test_smart_terminal(self): master, _out = _get_pty_pair() formatter = _with_term('xterm', formatters.get_formatter, master) - self.failUnless(isinstance(formatter, formatters.TerminfoFormatter)) + self.assertTrue(isinstance(formatter, formatters.TerminfoFormatter)) @issue7567 def test_not_a_tty(self): stream = TemporaryFile() formatter = _with_term('xterm', formatters.get_formatter, sRefactoringTool: Refactored ../python3/snakeoil/test/test_fileutils.py RefactoringTool: No changes to ../python3/snakeoil/test/test_descriptors.py RefactoringTool: Refactored ../python3/snakeoil/test/test_dependant_methods.py tream) - self.failUnless(isinstance(formatter, formatters.PlainTextFormatter)) + self.assertTrue(isinstance(formatter, formatters.PlainTextFormatter)) @issue7567 def test_no_fd(self): stream = StringIO() formatter = _with_term('xterm', formatters.get_formatter, stream) - self.failUnless(isinstance(formatter, formatters.PlainTextFormatter)) + self.assertTrue(isinstance(formatter, formatters.PlainTextFormatter)) cpy_loaded_Test = mk_cpy_loadable_testcase( --- ../python3/snakeoil/test/test_fileutils.py (original) +++ ../python3/snakeoil/test/test_fileutils.py (refactored) @@ -30,15 +30,15 @@ def test_perms(self): fp = pjoin(self.dir, 'target') - orig_um = os.umask(0777) + orig_um = os.umask(0o777) try: - af = self.kls(fp, perms=0644) + af = self.kls(fp, perms=0o644) af.write("dar") af.close() finally: exiting_umask = os.umask(orig_um) - self.assertEqual(exiting_umask, 0777) - self.assertEqual(os.stat(fp).st_mode & 04777, 0644) + self.assertEqual(exiting_umask, 0o777) + self.assertEqual(os.stat(fp).st_mode & 0o4777, 0o644) def test_del(self): fp = pjoin(self.dir, "target") @@ -147,8 +147,8 @@ class native_readfile_ascii_strict_Test(native_readfile_ascii_Test): func = staticmethod(fileutils.native_readfile_ascii_strict) test_cases = native_readfile_ascii_Test.test_cases + [ - (u'\xf2', 'latin', (ValueError, UnicodeDecodeError)), - (u'\ua000', 'utf8', UnicodeDecodeError), + ('\xf2', 'latin', (ValueError, UnicodeDecodeError)), + ('\ua000', 'utf8', UnicodeDecodeError), ] class cpy_readfile_ascii_strict_Test(native_readfile_ascii_strict_Test): @@ -166,7 +166,7 @@ func = staticmethod(fileutils.native_readfile_utf8_strict) default_encoding = 'utf8' test_cases = native_readfile_ascii_Test.test_cases + [ - u'\ua000fa', + '\ua000fa', ] class cpy_readfile_utf8_strict_Test(native_readfile_utf8_Test): @@ -175,10 +175,10 @@ class native_readfile_bytes_Test(native_readfile_Test): func = staticmethod(fileutils.native_readfile_bytes) default_encoding = None - test_cases = map( + test_cases = list(map( 
currying.post_curry(native_readfile_Test.convert_data, 'ascii'), - native_readfile_Test.test_cases) - test_cases.append(u'\ua000fa'.encode("utf8")) + native_readfile_Test.test_cases)) + test_cases.append('\ua000fa'.encode("utf8")) none_on_missing_ret_data = native_readfile_Test.convert_data( native_readfile_Test.none_on_missing_ret_data, 'ascii') --- ../python3/snakeoil/test/test_dependant_methods.py (original) +++ ../python3/snakeoil/test/test_dependant_methods.py (refactored) @@ -15,11 +15,10 @@ @staticmethod def generate_instance(methods, dependencies): - class Class(object): - __metaclass__ = dm.ForcedDepends + class Class(object, metaclass=dm.ForcedDepends): stage_depends = dict(dependencies) - locals().update(methods.iteritems()) + locals().update(iter(methods.items())) return Class() @@ -30,13 +29,13 @@ results = [] o = self.generate_instance( {str(x): currying.post_curry(func, results, x) for x in range(10)}, - {str(x): str(x - 1) for x in xrange(1, 10)}) + {str(x): str(x - 1) for x in range(1, 10)}) getattr(o, "9")() - self.assertEqual(results, range(10)) + self.assertEqual(results, list(range(10))) results = [] o = self.generate_instance( {str(x): currying.post_curry(func, results, x, False) for x in range(10)}, - {str(x): str(x - 1) for x in xrange(1, 10)}) + {str(x): str(x - 1) for x in range(1, 10)}) getattr(o, "9")() self.assertEqual(results, [0]) getattr(o, "9")() @@ -46,7 +45,7 @@ results = [] o = self.generatRefactoringTool: No changes to ../python3/snakeoil/test/test_demandload_usage.py RefactoringTool: Refactored ../python3/snakeoil/test/test_demandload.py RefactoringTool: No changes to ../python3/snakeoil/test/test_del_usage.py RefactoringTool: Refactored ../python3/snakeoil/test/test_data_source.py e_instance( {str(x): currying.post_curry(func, results, x) for x in range(10)}, - {str(x): str(x - 1) for x in xrange(1, 10)}) + {str(x): str(x - 1) for x in range(1, 10)}) getattr(o, "1")() self.assertEqual(results, [0, 1]) getattr(o, "2")() @@ -63,7 +62,7 @@ def test_stage_depends(self): results = [] methods = {str(x): currying.post_curry(func, results, x) for x in range(10)} - deps = {str(x): str(x - 1) for x in xrange(1, 10)} + deps = {str(x): str(x - 1) for x in range(1, 10)} deps["1"] = ["0", "a"] methods["a"] = currying.post_curry(func, results, "a") o = self.generate_instance(methods, deps) @@ -76,7 +75,7 @@ results = [] o = self.generate_instance( {str(x): currying.post_curry(func, results, x) for x in range(10)}, - {str(x): str(x - 1) for x in xrange(1, 10)}) + {str(x): str(x - 1) for x in range(1, 10)}) getattr(o, '2')(ignore_deps=True) self.assertEqual([2], results) --- ../python3/snakeoil/test/test_demandload.py (original) +++ ../python3/snakeoil/test/test_demandload.py (refactored) @@ -118,13 +118,13 @@ def test_demand_compile_regexp(self): for kwargs, scope in (({}, globals()), ({'scope': {}}, {})): demandload.demand_compile_regexp('foo', 'frob', **kwargs) - self.assertEqual(scope.keys(), ['foo']) + self.assertEqual(list(scope.keys()), ['foo']) self.assertEqual('frob', scope['foo'].pattern) self.assertEqual('frob', scope['foo'].pattern) # verify it's delayed via a bad regex. demandload.demand_compile_regexp('foo', 'f(', **kwargs) - self.assertEqual(scope.keys(), ['foo']) + self.assertEqual(list(scope.keys()), ['foo']) # should blow up on accessing an attribute. 
obj = scope['foo'] self.assertRaises(sre_constants.error, getattr, obj, 'pattern') --- ../python3/snakeoil/test/test_data_source.py (original) +++ ../python3/snakeoil/test/test_data_source.py (refactored) @@ -60,7 +60,7 @@ def _mk_data(self, size=(100000)): return ''.join("%s" % (x % 10) - for x in xrange(size)) + for x in range(size)) def test_transfer_to_data_source(self): data = self._mk_data() @@ -117,7 +117,7 @@ return data_source.local_source(self.fp, mutable=mutable) def test_bytes_fileobj(self): - data = u"foonani\xf2".encode("utf8") + data = "foonani\xf2".encode("utf8") obj = self.get_obj(data=data) # this will blow up if tries to ascii decode it. f = obj.bytes_fileobj() @@ -125,11 +125,11 @@ f.close() def test_bytes_fileobj_create(self): - data = u"foonani\xf2".encode("utf8") + data = "foonani\xf2".encode("utf8") obj = self.get_obj(test_creation=True, mutable=True) # this will blow up if tries to ascii decode it. f = obj.bytes_fileobj(True) - self.assertEqual(f.read(), u''.encode("utf8")) + self.assertEqual(f.read(), ''.encode("utf8")) f.write(data) f.close() f = obj.bytes_fileobj() @@ -150,7 +150,7 @@ return data_source.bz2_source(self.fp, mutable=mutable) def test_bytes_fileobj(self): - data = u"foonani\xf2".encode("utf8") + data = "foonani\xf2".encode("utf8") obj = self.get_obj(data=data) # this will blow up if tries to ascii decode it. f = obj.bytes_fileobj() @@ -163,7 +163,7 @@ supports_mutable = False def get_obj(self, data="foonani", mutable=False): - if isinstance(data, basestring): + if isinstance(data, str): data = data.encode("utf8") return data_source.invokable_data_source( partial(self._get_data, data)) @@ -187,7 +187,7 @@ self.text_mode) def _get_data(self, data='foonani'): - if isinstance(data, basesRefactoringTool: No changes to ../python3/snakeoil/test/test_currying.py RefactoringTool: Refactored ../python3/snakeoil/test/test_containers.py tring): + if isinstance(data, str): if not self.text_mode: return data.encode("utf8") elif self.text_mode: --- ../python3/snakeoil/test/test_containers.py (original) +++ ../python3/snakeoil/test/test_containers.py (refactored) @@ -12,7 +12,7 @@ class InvertedContainsTest(TestCase): def setUp(self): - self.set = containers.InvertedContains(range(12)) + self.set = containers.InvertedContains(list(range(12))) def test_basic(self): self.assertFalse(7 in self.set) @@ -48,51 +48,51 @@ class TestSetMethods(TestCase): def test_and(self): - c = BasicSet(xrange(100)) - s = set(xrange(25, 75)) - r = BasicSet(xrange(25, 75)) + c = BasicSet(range(100)) + s = set(range(25, 75)) + r = BasicSet(range(25, 75)) self.assertEqual(c & s, r) self.assertEqual(s & c, r._data) def test_xor(self): - c = BasicSet(xrange(100)) - s = set(xrange(25, 75)) - r = BasicSet(chain(xrange(25), xrange(75, 100))) + c = BasicSet(range(100)) + s = set(range(25, 75)) + r = BasicSet(chain(range(25), range(75, 100))) self.assertEqual(c ^ s, r) self.assertEqual(s ^ c, r._data) def test_or(self): - c = BasicSet(xrange(50)) - s = set(xrange(50, 100)) - r = BasicSet(xrange(100)) + c = BasicSet(range(50)) + s = set(range(50, 100)) + r = BasicSet(range(100)) self.assertEqual(c | s, r) self.assertEqual(s | c, r._data) def test_add(self): - c = BasicSet(xrange(50)) - s = set(xrange(50, 100)) - r = BasicSet(xrange(100)) + c = BasicSet(range(50)) + s = set(range(50, 100)) + r = BasicSet(range(100)) self.assertEqual(c + s, r) self.assertEqual(s + c, r._data) def test_sub(self): - c = BasicSet(xrange(100)) - s = set(xrange(50, 150)) - r1 = BasicSet(xrange(50)) - r2 = set(xrange(100, 
150)) + c = BasicSet(range(100)) + s = set(range(50, 150)) + r1 = BasicSet(range(50)) + r2 = set(range(100, 150)) self.assertEqual(c - s, r1) self.assertEqual(s - c, r2) class LimitedChangeSetTest(TestCase): def setUp(self): - self.set = containers.LimitedChangeSet(range(12)) + self.set = containers.LimitedChangeSet(list(range(12))) def test_validator(self): def f(val): self.assertTrue(isinstance(val, int)) return val - self.set = containers.LimitedChangeSet(range(12), key_validator=f) + self.set = containers.LimitedChangeSet(list(range(12)), key_validator=f) self.set.add(13) self.set.add(14) self.set.remove(11) @@ -167,7 +167,7 @@ self.set.commit() self.assertFalse(0 in self.set) self.assertEqual(11, len(self.set)) - self.assertEqual(sorted(list(self.set)), range(1, 12)) + self.assertEqual(sorted(list(self.set)), list(range(1, 12))) self.assertEqual(0, self.set.changes_count()) self.set.add(0) self.test_basic(1) @@ -187,18 +187,18 @@ str(containers.LimitedChangeSet([7])), 'LimitedChangeSet([7])') def test__eq__(self): - c = containers.LimitedChangeSet(xrange(99)) + c = containers.LimitedChangeSet(range(99)) c.add(99) - self.assertEqual(c, containers.LimitedChangeSet(xrange(100))) - self.assertEqual(containers.LimitedChangeSet(xrange(100)), - set(xrange(100))) + self.assertEqual(c, containers.LimitedChangeSet(range(100))) + self.assertEqual(containers.LimitedChangeSet(range(100)), + set(range(100))) self.assertNotEqual(containers.LimitedChangeSet([]), object()) class LimitedChangeSetWithBlacklistTest(TestCase): def setUp(self): - self.set = containers.LimitedChangeSet(range(12), [3, 13]) + self.set = containers.LimitedChangeSet(list(range(12)), [3, RefactoringTool: Refactored ../python3/snakeoil/test/test_compatibility.py RefactoringTool: Refactored ../python3/snakeoil/test/test_chksum_defaults.py 13]) def test_basic(self): self.assertTrue(0 in self.set) @@ -227,11 +227,11 @@ self.assertTrue(15 in self.set) def test_iter(self): - self.assertEqual(range(12), sorted(self.set)) + self.assertEqual(list(range(12)), sorted(self.set)) self.set.add(5) - self.assertEqual(range(12), sorted(self.set)) + self.assertEqual(list(range(12)), sorted(self.set)) self.set.add(12) - self.assertEqual(range(13), sorted(self.set)) + self.assertEqual(list(range(13)), sorted(self.set)) def test_len(self): self.assertEqual(12, len(self.set)) @@ -274,7 +274,7 @@ self.assertNotIn(4, c) def test_init(self): - self.assertEqual(self.kls(xrange(5))[4], 1) + self.assertEqual(self.kls(range(5))[4], 1) c = self.kls([1, 2, 3, 1]) self.assertEqual(c[2], 1) self.assertEqual(c[1], 2) --- ../python3/snakeoil/test/test_compatibility.py (original) +++ ../python3/snakeoil/test/test_compatibility.py (refactored) @@ -1,7 +1,7 @@ # Copyright: 2006 Brian Harring # License: BSD/GPL2 -import __builtin__ as builtins +import builtins as builtins from operator import itemgetter from snakeoil.test import TestCase @@ -56,7 +56,7 @@ @staticmethod def get_list(): - return range(100) + return list(range(100)) def test_it(self): f = self.func @@ -80,11 +80,11 @@ mangled = [incomparable_obj([x]) for x in l] # finally, verify it combines key w/ cmp properly. 
self.assertEqual(sorted(l, reverse=True), - map(zeroth, f(mangled, (lambda x, y: cmp(x, y)), - key=zeroth, reverse=True))) + list(map(zeroth, f(mangled, (lambda x, y: cmp(x, y)), + key=zeroth, reverse=True)))) if self.unchanging: - self.assertEqual(self.get_list(), map(zeroth, mangled)) + self.assertEqual(self.get_list(), list(map(zeroth, mangled))) class sort_cmp_test(TestCase): --- ../python3/snakeoil/test/test_chksum_defaults.py (original) +++ ../python3/snakeoil/test/test_chksum_defaults.py (refactored) @@ -26,7 +26,7 @@ def setUp(self): self.get_chf() fd, self.fn = tempfile.mkstemp() - for i in xrange(multi): + for i in range(multi): os.write(fd, data.encode()) os.close(fd) @@ -56,7 +56,7 @@ if self.chf_type == 'size': return for x in extra_chksums.get(self.chf_type, ()): - self.assertEqual(self.chf.str2long(x), long(x, 16)) + self.assertEqual(self.chf.str2long(x), int(x, 16)) def test_long2str(self): self.assertEqual(self.chf.long2str(self.expected_long), @@ -64,7 +64,7 @@ if self.chf_type == 'size': return for x in extra_chksums.get(self.chf_type, ()): - self.assertEqual(self.chf.long2str(long(x, 16)), x) + self.assertEqual(self.chf.long2str(int(x, 16)), x) checksums = { "rmd160": "b83ad488d624e7911f886420ab230f78f6368b9f", @@ -74,19 +74,19 @@ "sha512": "cdc2b749d28cd9c5fca45d3ca6b65661445decd992da93054fd6f4f3e4013ca8b44b0ba159d1cf1f58f9af2b9d267343b9e10f611494c0850fdcebe0379135c6", "whirlpool": "3f683be80ee004962cfbd1ddb99437f5f3c9f0fd024e18525b6aa080c9fd9d060415d9a8383462b9ddc065f176f5cb257728c33d8e12bbdd47216320350943aa", } -checksums.update((k, (long(v, 16), v)) for k, v in checksums.iteritems()) -checksums["size"] = (long(len(data) * multi), str(long(len(data) * multi))) +checksums.update((k, (int(v, 16), v)) for k, v in checksums.items()) +checksums["size"] = (int(len(data) * multi), str(int(len(data) * multi))) extra_chksums = { "md5": ["2dfd84279314a178d0fa842af3a40e25577e1bc"] } -for k, v in checksums.iteritems(): +for k, v in checksums.items(): extra_chksums.setdefault(k, []).extend((''.rjust(len(v[1]), '0'), '01'.rjust(len(v[1]), '0'))) # trick: create subclasRefactoringTool: No changes to ../python3/snakeoil/test/test_chksum.py RefactoringTool: Refactored ../python3/snakeoil/test/test_caching.py RefactoringTool: Refactored ../python3/snakeoil/test/test_bash.py RefactoringTool: Refactored ../python3/snakeoil/test/mixins.py RefactoringTool: Refactored ../python3/snakeoil/test/__init__.py RefactoringTool: Refactored ../python3/snakeoil/sphinx_utils/generate_api_rsts.py RefactoringTool: Refactored ../python3/snakeoil/process/__init__.py RefactoringTool: No changes to ../python3/snakeoil/osutils/native_readdir.py RefactoringTool: Refactored ../python3/snakeoil/osutils/__init__.py ses for each checksum with a useful class name. 
-for chf_type, expected in checksums.iteritems(): +for chf_type, expected in checksums.items(): expectedsum = expected[0] expectedstr = expected[1] globals()[chf_type + 'ChksumTest'] = type( --- ../python3/snakeoil/test/test_caching.py (original) +++ ../python3/snakeoil/test/test_caching.py (refactored) @@ -8,13 +8,11 @@ def gen_test(WeakInstMeta): - class weak_slotted(object): - __metaclass__ = WeakInstMeta + class weak_slotted(object, metaclass=WeakInstMeta): __inst_caching__ = True __slots__ = ('one',) - class weak_inst(object): - __metaclass__ = WeakInstMeta + class weak_inst(object, metaclass=WeakInstMeta): __inst_caching__ = True counter = 0 def __new__(cls, *args, **kwargs): --- ../python3/snakeoil/test/test_bash.py (original) +++ ../python3/snakeoil/test/test_bash.py (refactored) @@ -4,7 +4,7 @@ import os -from StringIO import StringIO +from io import StringIO from snakeoil.test.mixins import mk_named_tempfile from snakeoil.test import TestCase --- ../python3/snakeoil/test/mixins.py (original) +++ ../python3/snakeoil/test/mixins.py (refactored) @@ -19,7 +19,7 @@ def setUp(self): self.dir = tempfile.mkdtemp() # force it, since sticky bits spread. - os.chmod(self.dir, 0700) + os.chmod(self.dir, 0o700) def tearDown(self): # change permissions back or rmtree can't kill it @@ -27,20 +27,20 @@ return for root, dirs, _files in os.walk(self.dir): for directory in dirs: - os.chmod(os.path.join(root, directory), 0700) + os.chmod(os.path.join(root, directory), 0o700) shutil.rmtree(self.dir) def tempdir_decorator(func): def f(self, *args, **kwargs): self.dir = tempfile.mkdtemp() try: - os.chmod(self.dir, 0700) + os.chmod(self.dir, 0o700) return func(self, *args, **kwargs) finally: if os.path.exists(self.dir): for root, dirs, _files in os.walk(self.dir): for directory in dirs: - os.chmod(os.path.join(root, directory), 0777) + os.chmod(os.path.join(root, directory), 0o777) shutil.rmtree(self.dir) f.__name__ = func.__name__ return f --- ../python3/snakeoil/test/__init__.py (original) +++ ../python3/snakeoil/test/__init__.py (refactored) @@ -51,7 +51,7 @@ @classmethod def parse(cls, todo): - if isinstance(todo, basestring): + if isinstance(todo, str): return cls(reason=todo) errors, reason = todo try: --- ../python3/snakeoil/sphinx_utils/generate_api_rsts.py (original) +++ ../python3/snakeoil/sphinx_utils/generate_api_rsts.py (refactored) @@ -77,7 +77,7 @@ sys.stdout.write("regenerating rst for %s\n" % (src,)) with open(out_path, "w") as f: generate_rst(src, module, f) - os.chmod(out_path, 0644) + os.chmod(out_path, 0o644) os.utime(out_path, (cur_time, cur_time)) --- ../python3/snakeoil/process/__init__.py (original) +++ ../python3/snakeoil/process/__init__.py (refactored) @@ -47,7 +47,7 @@ return _get_linux_proc_count() core_count[physical_id] = int(cores) - return sum(core_count.itervalues()) + return sum(core_count.values()) def _get_linux_proc_count(): try: @@ -184,7 +184,7 @@ def _native_closerange(from_fd, to_fd): - for fd in xrange(from_fd, to_fd): + for fd in range(from_fd, to_fd): try: os.close(fd) except EnvironmentError: --- ../python3/snakeoil/osutils/__init__.py (original) +++ ../python3/snakeoil/osutils/__init__.py (refactored) @@ -85,7 +85,7 @@ return False return True -def ensure_dirs(path, gid=-1, uid=-1, mode=0777, minimal=True): +def ensure_dirs(path, gid=-1, uid=-1, mode=0o777, minimal=True): """ ensure dirs exist, creating as needed withRefactoringTool: Refactored ../python3/snakeoil/compression/_util.py RefactoringTool: Refactored ../python3/snakeoil/compression/_bzip2.py 
RefactoringTool: No changes to ../python3/snakeoil/compression/__init__.py RefactoringTool: Refactored ../python3/snakeoil/chksum/defaults.py (optional) gid, uid, and mode. @@ -111,7 +111,7 @@ try: um = os.umask(0) # if the dir perms would lack +wx, we have to force it - force_temp_perms = ((mode & 0300) != 0300) + force_temp_perms = ((mode & 0o300) != 0o300) resets = [] apath = normpath(os.path.abspath(path)) sticky_parent = False @@ -125,9 +125,9 @@ # if it's a subdir, we need +wx at least if apath != base: - if (st.st_mode & 0300) != 0300: + if (st.st_mode & 0o300) != 0o300: try: - os.chmod(base, (st.st_mode | 0300)) + os.chmod(base, (st.st_mode | 0o300)) except OSError: return False resets.append((base, st.st_mode)) @@ -137,7 +137,7 @@ # nothing exists. try: if force_temp_perms: - if not _safe_mkdir(base, 0700): + if not _safe_mkdir(base, 0o700): return False resets.append((base, mode)) else: @@ -169,7 +169,7 @@ if minimal: if mode != (st.st_mode & mode): os.chmod(path, st.st_mode | mode) - elif mode != (st.st_mode & 07777): + elif mode != (st.st_mode & 0o7777): os.chmod(path, mode) except OSError: return False @@ -268,13 +268,11 @@ # IMO, it shouldn't, but opening/closing everytime around is expensive -class FsLock(object): +class FsLock(object, metaclass=WeakRefFinalizer): """ fnctl based filesystem lock """ - - __metaclass__ = WeakRefFinalizer __slots__ = ("path", "fd", "create") def __init__(self, path, create=False): --- ../python3/snakeoil/compression/_util.py (original) +++ ../python3/snakeoil/compression/_util.py (refactored) @@ -35,9 +35,7 @@ return _drive_process(args, 'decompression', data) -class _process_handle(object): - - __metaclass__ = WeakRefFinalizer +class _process_handle(object, metaclass=WeakRefFinalizer): def __init__(self, handle, args, is_read=False): self.mode = 'wb' @@ -51,12 +49,12 @@ def _open_handle(self, handle): self._allow_reopen = None close = False - if isinstance(handle, basestring): + if isinstance(handle, str): if self.is_read: self._allow_reopen = handle handle = open(handle, mode=self.mode) close = True - elif not isinstance(handle, (long, int)): + elif not isinstance(handle, int): if not hasattr(handle, 'fileno'): raise TypeError( "handle %r isn't a string, integer, and lacks a fileno " --- ../python3/snakeoil/compression/_bzip2.py (original) +++ ../python3/snakeoil/compression/_bzip2.py (refactored) @@ -76,7 +76,7 @@ if parallelize and parallelizable: return _util.compress_handle(lbzip2_path, handle, level=level, extra_args=lbzip2_compress_args) - elif native and isinstance(handle, basestring): + elif native and isinstance(handle, str): return BZ2File(handle, mode='w', compresslevel=level) return _compress_handle(handle, level=level) @@ -84,7 +84,7 @@ if parallelize and parallelizable: return _util.decompress_handle(lbzip2_path, handle, extra_args=lbzip2_decompress_args) - elif (native and isinstance(handle, basestring) + elif (native and isinstance(handle, str) and sys.version_info[:3] >= (3, 3)): # note that <3.3, bz2file doesn't handle multiple streams. # thus don't use it. 
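A note on the metaclass hunks above (for example _process_handle and FsLock): Python 2 picks the metaclass up from a __metaclass__ class attribute, which Python 3 silently ignores, so the converter has to move it into the class header as a metaclass= keyword. The following is a minimal stand-alone sketch of that rewrite, not snakeoil code; Registry and Plugin are invented names.

    # Minimal illustration of the __metaclass__ -> metaclass= rewrite seen in
    # the diffs above; Registry and Plugin are invented names, not snakeoil code.

    class Registry(type):
        """Toy metaclass that records every class it creates."""
        created = []

        def __new__(mcs, name, bases, namespace):
            cls = super().__new__(mcs, name, bases, namespace)
            mcs.created.append(cls)
            return cls

    # Python 2 spelling (silently ignored on Python 3):
    #     class Plugin(object):
    #         __metaclass__ = Registry
    #
    # Python 3 spelling, matching what the refactoring produces:
    class Plugin(object, metaclass=Registry):
        pass

    assert Plugin in Registry.created

Because the Python 2 form is an ordinary attribute assignment, code that keeps it still compiles under Python 3 but quietly loses the metaclass, which is why the refactoring rewrites the class header rather than the attribute.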
--- ../python3/snakeoil/chksum/defaults.py (original) +++ ../python3/snakeoil/chksuRefactoringTool: Refactored ../python3/snakeoil/chksum/_whirlpool.py m/defaults.py (refactored) @@ -13,12 +13,13 @@ from functools import partial import hashlib import threading -import Queue +import queue from snakeoil.data_source import base as base_data_source from snakeoil import modules from snakeoil.compatibility import intern, is_py3k from snakeoil.demandload import demandload +import sys demandload( 'os', 'snakeoil.process:get_proc_count', @@ -46,13 +47,13 @@ def chksum_loop_over_file(filename, chfs, parallelize=True): chfs = [chf() for chf in chfs] loop_over_file(filename, [chf.update for chf in chfs], parallelize=parallelize) - return [long(chf.hexdigest(), 16) for chf in chfs] + return [int(chf.hexdigest(), 16) for chf in chfs] def loop_over_file(handle, callbacks, parallelize=True): m = None close_f = True - if isinstance(handle, basestring): + if isinstance(handle, str): m, f = mmap_or_open_for_read(handle) elif isinstance(handle, base_data_source): f = handle.bytes_fileobj() @@ -71,7 +72,7 @@ try: if parallelize: - queues = [Queue.Queue(8) for _ in callbacks] + queues = [queue.Queue(8) for _ in callbacks] threads = [threading.Thread(target=chf_thread, args=(queue, functor)) for queue, functor in zip(queues, callbacks)] @@ -126,7 +127,7 @@ @staticmethod def str2long(val): - return long(val, 16) + return int(val, 16) def __call__(self, filename): return chksum_loop_over_file(filename, [self.obj])[0] @@ -289,7 +290,7 @@ @staticmethod def str2long(val): - return long(val) + return int(val) def __call__(self, file_obj): if isinstance(file_obj, base_data_source): @@ -297,7 +298,7 @@ file_obj = file_obj.path else: file_obj = file_obj.text_fileobj() - if isinstance(file_obj, basestring): + if isinstance(file_obj, str): try: st_size = os.lstat(file_obj).st_size except OSError: @@ -305,8 +306,8 @@ return st_size # seek to the end. file_obj.seek(0, 2) - return long(file_obj.tell()) + return int(file_obj.tell()) chksum_types["size"] = SizeChksummer() -chksum_types = {intern(k): v for k, v in chksum_types.iteritems()} +chksum_types = {sys.intern(k): v for k, v in chksum_types.items()} --- ../python3/snakeoil/chksum/_whirlpool.py (original) +++ ../python3/snakeoil/chksum/_whirlpool.py (refactored) @@ -50,9 +50,9 @@ # if it's unicode.. 
def update(self, arg): """update(arg)""" - if isinstance(arg, unicode): + if isinstance(arg, str): arg = str(arg) - self.ctx.WhirlpoolAdd(map(ord, arg), len(arg) * 8) + self.ctx.WhirlpoolAdd(list(map(ord, arg)), len(arg) * 8) self.digest_status = 0 else: def update(self, arg): @@ -722,19 +722,19 @@ bufferPos += 1 if bufferPos > 32: if bufferPos < 64: - for i in xrange(64 - bufferPos): + for i in range(64 - bufferPos): self.buffer[bufferPos+i] = 0 self._processBuffer() bufferPos = 0 if bufferPos < 32: - for i in xrange(32 - bufferPos): + for i in range(32 - bufferPos): self.buffer[bufferPos+i] = 0 bufferPos = 32 - for i in xrange(32): + for i in range(32): self.buffer[32+i] = self.bitLength[i] self._processBuffer() digest = '' - for i in xrange(8): + for i in range(8): digest += chr((self.hash[i] >> 56) % 0x100) digest += chr((self.hash[i] >> 48) % 0x100) digest += chr((self.hash[i] >> 40) % 0x100) @@ -755,7 +755,7 @@ buffr = self.buffer buf_cnt = 0 - for i in xrange(8): + for i in range(8): block[i] = ((buffr[buf_cnt ] & 0xff) << 56) ^ \ RefactoringTool: Refactored ../python3/snakeoil/chksum/__init__.py RefactoringTool: Refactored ../python3/lintplugin/snakeoil_lint.py RefactoringTool: Refactored ../python3/doc/conf.py RefactoringTool: Files that were modified: RefactoringTool: ../python3/setup.py RefactoringTool: ../python3/snakeoil/weakrefs.py RefactoringTool: ../python3/snakeoil/version.py RefactoringTool: ../python3/snakeoil/unittest_extensions.py RefactoringTool: ../python3/snakeoil/tar.py RefactoringTool: ../python3/snakeoil/struct_compat.py RefactoringTool: ../python3/snakeoil/stringio.py RefactoringTool: ../python3/snakeoil/sequences.py RefactoringTool: ../python3/snakeoil/pyflakes_extension.py RefactoringTool: ../python3/snakeoil/pickling.py RefactoringTool: ../python3/snakeoil/obj.py RefactoringTool: ../python3/snakeoil/namespaces.py RefactoringTool: ../python3/snakeoil/modules.py RefactoringTool: ../python3/snakeoil/mappings.py RefactoringTool: ../python3/snakeoil/lists.py RefactoringTool: ../python3/snakeoil/klass.py RefactoringTool: ../python3/snakeoil/iterables.py RefactoringTool: ../python3/snakeoil/formatters.py RefactoringTool: ../python3/snakeoil/fileutils.py RefactoringTool: ../python3/snakeoil/errors.py RefactoringTool: ../python3/snakeoil/distutils_extensions.py RefactoringTool: ../python3/snakeoil/descriptors.py RefactoringTool: ../python3/snakeoil/dependant_methods.py RefactoringTool: ../python3/snakeoil/demandload.py RefactoringTool: ../python3/snakeoil/debug_imports.py RefactoringTool: ../python3/snakeoil/data_source.py RefactoringTool: ../python3/snakeoil/currying.py RefactoringTool: ../python3/snakeoil/containers.py RefactoringTool: ../python3/snakeoil/compatibility.py RefactoringTool: ../python3/snakeoil/caching_2to3.py RefactoringTool: ../python3/snakeoil/caching.py RefactoringTool: ../python3/snakeoil/bash.py RefactoringTool: ../python3/snakeoil/_fileutils.py RefactoringTool: ../python3/snakeoil/__init__.py RefactoringTool: ../python3/snakeoil/xml/__init__.py RefactoringTool: ../python3/snakeoil/test/test_weakrefs.py RefactoringTool: ../python3/snakeoil/test/test_stringio.py RefactoringTool: ../python3/snakeoil/test/test_source_hygene.py RefactoringTool: ../python3/snakeoil/test/test_slot_shadowing.py RefactoringTool: ../python3/snakeoil/test/test_py3k_eq_hash_inheritance.py RefactoringTool: ../python3/snakeoil/test/test_process.py RefactoringTool: ../python3/snakeoil/test/test_osutils.py RefactoringTool: ../python3/snakeoil/test/test_obj.py RefactoringTool: 
../python3/snakeoil/test/test_modules.py RefactoringTool: ../python3/snakeoil/test/test_mappings.py RefactoringTool: ../python3/snakeoil/test/test_lists.py RefactoringTool: ../python3/snakeoil/test/test_klass.py RefactoringTool: ../python3/snakeoil/test/test_iterables.py RefactoringTool: ../python3/snakeoil/test/test_formatters.py RefactoringTool: ../python3/snakeoil/test/test_fileutils.py RefactoringTool: ../python3/snakeoil/test/test_descriptors.py RefactoringTool: ../python3/snakeoil/test/test_dependant_methods.py RefactoringTool: ../python3/snakeoil/test/test_demandload_usage.py RefactoringTool: ../python3/snakeoil/test/test_demandload.py RefactoringTool: ../python3/snakeoil/test/test_del_usage.py RefactoringTool: ../python3/snakeoil/test/test_data_source.py RefactoringTool: ../python3/snakeoil/test/test_currying.py RefactoringTool: ../python3/snakeoil/test/test_containers.py RefactoringTool: ../python3/snakeoil/test/test_compatibility.py RefactoringTool: ../python3/snakeoil/test/test_chksum_defaults.py RefactoringTool: ../python3/snakeoil/test/test_chksum.py RefactoringTool: ../python3/snakeoil/test/test_caching.py RefactoringTool: ../python3/snakeoil/test/test_bash.py RefactoringTool: ../python3/snakeoil/test/mixins.py RefactoringTool: ../python3/snakeoil/test/__init__.py RefactoringTool: ../python3/snakeoil/sphinx_utils/generate_api_rsts.py RefactoringTool: ../python3/snakeoil/process/__init__.py RefactoringTool: ../python3/snakeoil/osutils/native_readdir.py RefactoringTool: ../python3/snakeoil/osutils/__init__.py RefactoringTool: ../python3/snakeoil/compression/_util.py RefactoringTool: ../python3/snakeoil/compression/_bzip2.py RefactoringTool: ../python3/snakeoil/compression/__init__.py RefactoringTool: ../python3/snakeoil/chksum/defaults.py RefactoringTool: ../python3/snakeoil/chksum/_whirlpool.py RefactoringTool: ../python3/snakeoil/chksum/__init__.py RefactoringTool: ../python3/lintplugin/snakeoil_lint.py RefactoringTool: ../python3/doc/conf.py ((buffr[buf_cnt+1] & 0xff) << 48) ^ \ ((buffr[buf_cnt+2] & 0xff) << 40) ^ \ @@ -766,11 +766,11 @@ ((buffr[buf_cnt+7] & 0xff) ) buf_cnt += 8 - for i in xrange(8): + for i in range(8): x = K[i] = self.hash[i] state[i] = block[i] ^ x - for r in xrange(1, R+1): + for r in range(1, R+1): L[0] = CDo(K, 0) ^ rc[r] L[1] = CDo(K, 1) L[2] = CDo(K, 2) @@ -790,10 +790,10 @@ L[5] = CDo(state, 5) ^ K[5] L[6] = CDo(state, 6) ^ K[6] L[7] = CDo(state, 7) ^ K[7] - for i in xrange(8): + for i in range(8): state[i] = L[i] # apply the Miyaguchi-Preneel compression function - for i in xrange(8): + for i in range(8): self.hash[i] ^= state[i] ^ block[i] return --- ../python3/snakeoil/chksum/__init__.py (original) +++ ../python3/snakeoil/chksum/__init__.py (refactored) @@ -128,16 +128,14 @@ parallelize=parallelize) -class LazilyHashedPath(object): +class LazilyHashedPath(object, metaclass=klass.immutable_instance): """Given a pathway, compute chksums on demand via attribute access.""" - - __metaclass__ = klass.immutable_instance def __init__(self, path, **initial_values): f = object.__setattr__ f(self, 'path', path) - for attr, val in initial_values.iteritems(): + for attr, val in initial_values.items(): f(self, attr, val) def __getattr__(self, attr): --- ../python3/lintplugin/snakeoil_lint.py (original) +++ ../python3/lintplugin/snakeoil_lint.py (refactored) @@ -1,9 +1,9 @@ """Pylint plugin checking for trailing whitespace.""" -from __future__ import print_function + import sys -import __builtin__ as builtins +import builtins as builtins from pylint import 
interfaces, checkers if hasattr(interfaces, 'IASTNGChecker'): --- ../python3/doc/conf.py (original) +++ ../python3/doc/conf.py (refactored) @@ -49,10 +49,10 @@ master_doc = 'index' # General information about the project. -project = u'snakeoil' +project = 'snakeoil' authors_list = ['Brian Harring', 'Tim Harder'] authors = ', '.join(authors_list) -copyright = u'2010-2015, ' + authors +copyright = '2010-2015, ' + authors # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the @@ -190,7 +190,7 @@ # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ - ('index', 'snakeoil.tex', u'snakeoil Documentation', + ('index', 'snakeoil.tex', 'snakeoil Documentation', authors, 'manual'), ] @@ -223,7 +223,7 @@ # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ - ('index', 'snakeoil', u'snakeoil Documentation', authors_list, 1) + ('index', 'snakeoil', 'snakeoil Documentation', authors_list, 1) ] + find . -type f -name '*.py' -exec sed -i 's|#!/usr/bin/python3|#!/usr/bin/python|' '{}' + + exit 0 Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.31824 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd python-module-snakeoil-0.6.1 + CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2' + export CFLAGS + CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2' + export CXXFLAGS + FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2' + export FFLAGS + /usr/bin/python setup.py build --debug running build running build_py creating build creating build/lib.linux-x86_64-2.7 creating build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/_fileutils.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/bash.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/caching.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/caching_2to3.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/compatibility.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/containers.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/currying.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/data_source.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/debug_imports.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/demandload.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/dependant_methods.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/descriptors.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/distutils_extensions.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/errors.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/fileutils.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/formatters.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/iterables.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/klass.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/lists.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/mappings.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/modules.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/namespaces.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/obj.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/pickling.py -> build/lib.linux-x86_64-2.7/snakeoil copying 
snakeoil/pyflakes_extension.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/sequences.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/stringio.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/struct_compat.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/tar.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/unittest_extensions.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/version.py -> build/lib.linux-x86_64-2.7/snakeoil copying snakeoil/weakrefs.py -> build/lib.linux-x86_64-2.7/snakeoil creating build/lib.linux-x86_64-2.7/snakeoil/xml copying snakeoil/xml/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/xml creating build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/mixins.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_bash.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_caching.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_chksum.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_chksum_defaults.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_compatibility.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_containers.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_currying.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_data_source.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_del_usage.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_demandload.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_demandload_usage.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_dependant_methods.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_descriptors.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_fileutils.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_formatters.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_iterables.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_klass.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_lists.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_mappings.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_modules.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_obj.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_osutils.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_process.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_py3k_eq_hash_inheritance.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_slot_shadowing.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_source_hygene.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_stringio.py -> build/lib.linux-x86_64-2.7/snakeoil/test copying snakeoil/test/test_weakrefs.py -> build/lib.linux-x86_64-2.7/snakeoil/test creating build/lib.linux-x86_64-2.7/snakeoil/sphinx_utils copying snakeoil/sphinx_utils/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/sphinx_utils copying snakeoil/sphinx_utils/generate_api_rsts.py -> build/lib.linux-x86_64-2.7/snakeoil/sphinx_utils creating 
build/lib.linux-x86_64-2.7/snakeoil/process copying snakeoil/process/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/process creating build/lib.linux-x86_64-2.7/snakeoil/osutils copying snakeoil/osutils/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/osutils copying snakeoil/osutils/native_readdir.py -> build/lib.linux-x86_64-2.7/snakeoil/osutils creating build/lib.linux-x86_64-2.7/snakeoil/compression copying snakeoil/compression/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/compression copying snakeoil/compression/_bzip2.py -> build/lib.linux-x86_64-2.7/snakeoil/compression copying snakeoil/compression/_util.py -> build/lib.linux-x86_64-2.7/snakeoil/compression creating build/lib.linux-x86_64-2.7/snakeoil/chksum copying snakeoil/chksum/__init__.py -> build/lib.linux-x86_64-2.7/snakeoil/chksum copying snakeoil/chksum/_whirlpool.py -> build/lib.linux-x86_64-2.7/snakeoil/chksum copying snakeoil/chksum/defaults.py -> build/lib.linux-x86_64-2.7/snakeoil/chksum generating _verinfo running build_ext building 'snakeoil._posix' extension creating build/temp.linux-x86_64-2.7 creating build/temp.linux-x86_64-2.7/src x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/posix.c -o build/temp.linux-x86_64-2.7/src/posix.o src/posix.c: In function 'snakeoil_closerange': src/posix.c:784:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (entry = readdir(dir_handle)) { ^~~~~ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/posix.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/_posix.so building 'snakeoil._klass' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/klass.c -o build/temp.linux-x86_64-2.7/src/klass.o src/klass.c: In function 'snakeoil_mapping_slot_update': src/klass.c:1129:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (item = PyIter_Next(iterator)) { ^~~~ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/klass.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/_klass.so building 'snakeoil._caching' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/caching.c -o build/temp.linux-x86_64-2.7/src/caching.o x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/caching.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/_caching.so building 'snakeoil._lists' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/lists.c -o build/temp.linux-x86_64-2.7/src/lists.o src/lists.c: In function 'snakeoil_iflatten_func_iternext': src/lists.c:133:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (n = PyList_GET_SIZE(self->iterables)) { ^ src/lists.c: In function 'snakeoil_iflatten_instance_iternext': 
src/lists.c:338:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (n = PyList_GET_SIZE(self->iterables)) { ^ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/lists.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/_lists.so building 'snakeoil.osutils._readdir' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/readdir.c -o build/temp.linux-x86_64-2.7/src/readdir.o src/readdir.c: In function 'snakeoil_readdir_actual_listdir': src/readdir.c:52:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (entry = readdir(the_dir)) { ^~~~~ src/readdir.c: In function 'snakeoil_readdir_listdir': src/readdir.c:175:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (entry = readdir(the_dir)) { ^~~~~ src/readdir.c: In function 'snakeoil_readdir_read_dir': src/readdir.c:224:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (entry = readdir(the_dir)) { ^~~~~ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/readdir.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/osutils/_readdir.so building 'snakeoil._formatters' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/formatters.c -o build/temp.linux-x86_64-2.7/src/formatters.o src/formatters.c: In function 'PTF_set_first_prefix': src/formatters.c:81:12: warning: unused variable 'tmp' [-Wunused-variable] PyObject *tmp; ^~~ src/formatters.c: In function 'PTF_set_later_prefix': src/formatters.c:99:12: warning: unused variable 'tmp' [-Wunused-variable] PyObject *tmp; ^~~ src/formatters.c: In function 'PTF_set_encoding': src/formatters.c:123:12: warning: unused variable 'tmp' [-Wunused-variable] PyObject *tmp; ^~~ src/formatters.c: In function '_write_prefix': src/formatters.c:360:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (arg = PyIter_Next(iter)) { ^~~ src/formatters.c: In function 'PTF_write': src/formatters.c:518:9: warning: suggest parentheses around assignment used as truth value [-Wparentheses] while (arg = PyIter_Next(iterator)) { ^~~ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/formatters.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/_formatters.so building 'snakeoil.chksum._whirlpool_cdo' extension x86_64-alt-linux-gcc -pthread -pipe -frecord-gcc-switches -Wall -g -O3 -fwrapv -pipe -frecord-gcc-switches -Wall -g -O2 -fPIC -Wall -fno-strict-aliasing -g -Iinclude -I/usr/include/python2.7 -c src/whirlpool_cdo.c -o build/temp.linux-x86_64-2.7/src/whirlpool_cdo.o src/whirlpool_cdo.c: In function 'init_whirlpool_cdo': src/whirlpool_cdo.c:154:12: warning: variable 'm' set but not used [-Wunused-but-set-variable] PyObject *m; ^ x86_64-alt-linux-gcc -pthread -shared -L/usr/lib64/nsl -lnsl -pipe -frecord-gcc-switches -Wall -g -O2 -g build/temp.linux-x86_64-2.7/src/whirlpool_cdo.o -lpython2.7 -o build/lib.linux-x86_64-2.7/snakeoil/chksum/_whirlpool_cdo.so + pushd ../python3 
~/RPM/BUILD/python3 ~/RPM/BUILD/python-module-snakeoil-0.6.1
+ CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2'
+ export CFLAGS
+ CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2'
+ export CXXFLAGS
+ FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2'
+ export FFLAGS
+ /usr/bin/python3 setup.py build --debug
Traceback (most recent call last):
  File "setup.py", line 9, in <module>
    from snakeoil import distutils_extensions as snk_distutils
  File "/usr/src/RPM/BUILD/python3/snakeoil/distutils_extensions.py", line 40, in <module>
    class sdist(dst_sdist.sdist):
  File "/usr/src/RPM/BUILD/python3/snakeoil/distutils_extensions.py", line 44, in sdist
    default_format = dict(dst_sdist.sdist.default_format)
AttributeError: type object 'sdist' has no attribute 'default_format'
error: Bad exit status from /usr/src/tmp/rpm-tmp.31824 (%build)
RPM build errors:
    Bad exit status from /usr/src/tmp/rpm-tmp.31824 (%build)
Command exited with non-zero status 1
20.47user 0.69system 0:21.07elapsed 100%CPU (0avgtext+0avgdata 50436maxresident)k
0inputs+0outputs (0major+140021minor)pagefaults 0swaps
hsh-rebuild: rebuild of `python-module-snakeoil-0.6.1-alt1.git20150323.1.1.src.rpm' failed.
Command exited with non-zero status 1
41.26user 6.36system 0:47.61elapsed 100%CPU (0avgtext+0avgdata 122156maxresident)k
96inputs+0outputs (0major+662600minor)pagefaults 0swaps
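The %build failure above is not a 2to3 artefact: distutils_extensions.py copies dst_sdist.sdist.default_format, a dict that Python 2.7's distutils defines as {'posix': 'gztar', 'nt': 'zip'} but that the Python 3.6 distutils installed here no longer provides (its sdist simply defaults to gztar everywhere). Below is a hedged sketch of a class body that tolerates both variants; it is not the upstream fix, and the gztar fallback is an assumption about the intended default.

    # Sketch only: accept both the Python 2.7 distutils sdist (which has
    # default_format) and the Python 3.6 one (which dropped the attribute).
    from distutils.command import sdist as dst_sdist

    class sdist(dst_sdist.sdist):
        """sdist variant that still imports when default_format is gone."""

        # Assumed fallback: mirror the newer distutils behaviour of always
        # producing gztar when the old per-platform mapping is absent.
        default_format = dict(getattr(dst_sdist.sdist, 'default_format',
                                      {'posix': 'gztar'}))

On the Python 3.6 interpreter used in this build the fallback branch would be taken, so the class definition imports cleanly and setup.py can at least get past the import at line 9 of the traceback.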