<86>Apr 25 12:32:01 userdel[72320]: delete user 'rooter'
<86>Apr 25 12:32:01 userdel[72320]: removed group 'rooter' owned by 'rooter'
<86>Apr 25 12:32:01 userdel[72320]: removed shadow group 'rooter' owned by 'rooter'
<86>Apr 25 12:32:01 groupadd[72336]: group added to /etc/group: name=rooter, GID=549
<86>Apr 25 12:32:01 groupadd[72336]: group added to /etc/gshadow: name=rooter
<86>Apr 25 12:32:01 groupadd[72336]: new group: name=rooter, GID=549
<86>Apr 25 12:32:01 useradd[72347]: new user: name=rooter, UID=549, GID=549, home=/root, shell=/bin/bash
<86>Apr 25 12:32:02 userdel[72369]: delete user 'builder'
<86>Apr 25 12:32:02 userdel[72369]: removed group 'builder' owned by 'builder'
<86>Apr 25 12:32:02 userdel[72369]: removed shadow group 'builder' owned by 'builder'
<86>Apr 25 12:32:02 groupadd[72384]: group added to /etc/group: name=builder, GID=550
<86>Apr 25 12:32:02 groupadd[72384]: group added to /etc/gshadow: name=builder
<86>Apr 25 12:32:02 groupadd[72384]: new group: name=builder, GID=550
<86>Apr 25 12:32:02 useradd[72397]: new user: name=builder, UID=550, GID=550, home=/usr/src, shell=/bin/bash
/usr/src/in/srpm/python3-module-xappy-0.6.0-alt2.src.rpm: The use of such a license name is ambiguous: GPL
<13>Apr 25 12:32:05 rpmi: libgdbm-1.8.3-alt10 1454943334 installed
<13>Apr 25 12:32:05 rpmi: libexpat-2.2.4-alt1 1503305345 installed
<13>Apr 25 12:32:05 rpmi: libp11-kit-0.23.15-alt1 sisyphus+226408.100.2.1 1554288204 installed
<13>Apr 25 12:32:05 rpmi: libtasn1-4.16.0-alt1 sisyphus+245480.100.1.1 1580825062 installed
<13>Apr 25 12:32:05 rpmi: rpm-macros-alternatives-0.5.1-alt1 sisyphus+226946.100.1.1 1554830426 installed
<13>Apr 25 12:32:05 rpmi: alternatives-0.5.1-alt1 sisyphus+226946.100.1.1 1554830426 installed
<13>Apr 25 12:32:05 rpmi: ca-certificates-2020.01.23-alt1 sisyphus+244791.300.2.1 1580285500 installed
<13>Apr 25 12:32:05 rpmi: ca-trust-0.1.2-alt1 sisyphus+233348.100.1.1 1561653823 installed
<13>Apr 25 12:32:05 rpmi: p11-kit-trust-0.23.15-alt1 sisyphus+226408.100.2.1 1554288204 installed
<13>Apr 25 12:32:05 rpmi: libcrypto1.1-1.1.1g-alt1 sisyphus+249982.60.8.1 1587743711 installed
<13>Apr 25 12:32:05 rpmi: libssl1.1-1.1.1g-alt1 sisyphus+249982.60.8.1 1587743711 installed
<13>Apr 25 12:32:05 rpmi: python3-3.8.2-alt1 sisyphus+244999.100.3.1 1585218480 installed
<13>Apr 25 12:32:06 rpmi: python3-base-3.8.2-alt1 sisyphus+244999.100.3.1 1585218480 installed
<13>Apr 25 12:32:07 rpmi: libpython3-3.8.2-alt1 sisyphus+244999.100.3.1 1585218480 installed
<13>Apr 25 12:32:07 rpmi: tests-for-installed-python3-pkgs-0.1.13.1-alt2 1535450458 installed
<13>Apr 25 12:32:07 rpmi: rpm-build-python3-0.1.13.1-alt2 1535450458 installed
<13>Apr 25 12:32:10 rpmi: libverto-0.3.0-alt1_7 sisyphus+225932.100.1.1 1553994919 installed
<13>Apr 25 12:32:10 rpmi: libkeyutils-1.6-alt2 sisyphus+226520.100.2.1 1554512089 installed
<13>Apr 25 12:32:10 rpmi: libcom_err-1.44.6-alt1 sisyphus+224154.100.1.1 1552091678 installed
<86>Apr 25 12:32:10 groupadd[99735]: group added to /etc/group: name=_keytab, GID=499
<86>Apr 25 12:32:10 groupadd[99735]: group added to /etc/gshadow: name=_keytab
<86>Apr 25 12:32:10 groupadd[99735]: new group: name=_keytab, GID=499
<13>Apr 25 12:32:10 rpmi: libkrb5-1.17.1-alt1 sisyphus+242784.100.1.1 1576137330 installed
<13>Apr 25 12:32:10 rpmi: libtirpc-1.2.6-alt1 sisyphus+250076.100.1.1 1587038270 installed
<13>Apr 25 12:32:10 rpmi: libnsl2-1.1.0-alt1_1 1511548749 installed
<13>Apr 25 12:32:10 rpmi: python-modules-encodings-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:10 rpmi: python-modules-compiler-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:10 rpmi: python-modules-email-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:10 rpmi: python-modules-unittest-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-modules-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-modules-nis-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-modules-ctypes-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-modules-multiprocessing-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-modules-logging-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
<13>Apr 25 12:32:11 rpmi: python-tools-2to3-2.7.17-alt4 sisyphus+244873.100.2.1 1581419544 installed
Building target platforms: i586
Building for target i586
Wrote: /usr/src/in/nosrpm/python3-module-xappy-0.6.0-alt2.nosrc.rpm
<13>Apr 25 12:32:16 rpmi: python3-module-pkg_resources-1:41.4.0-alt1 sisyphus+238787.100.2.1 1570608044 installed
<13>Apr 25 12:32:16 rpmi: libtinfo-devel-6.1.20180407-alt2 sisyphus+222164.200.1.1 1550686226 installed
<13>Apr 25 12:32:16 rpmi: libncurses-devel-6.1.20180407-alt2 sisyphus+222164.200.1.1 1550686226 installed
<13>Apr 25 12:32:16 rpmi: python3-dev-3.8.2-alt1 sisyphus+244999.100.3.1 1585218480 installed
<13>Apr 25 12:32:16 rpmi: python3-module-setuptools-1:41.4.0-alt1 sisyphus+238787.100.2.1 1570608044 installed
Installing python3-module-xappy-0.6.0-alt2.src.rpm
Building target platforms: i586
Building for target i586
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.17818
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf python3-module-xappy-0.6.0
+ echo 'Source #0 (python3-module-xappy-0.6.0.tar):'
Source #0 (python3-module-xappy-0.6.0.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/python3-module-xappy-0.6.0.tar
+ cd python3-module-xappy-0.6.0
+ /bin/chmod -c -Rf u+rwX,go-w .
+ echo 'Patch #0 (port-on-python3.patch):'
Patch #0 (port-on-python3.patch):
+ /usr/bin/patch -p1
patching file xappy/cachemanager/generic.py
+ find -type f -name '*.py' -exec 2to3 -w -n '{}' +
RefactoringTool: Skipping optional fixer: buffer
RefactoringTool: Skipping optional fixer: idioms
RefactoringTool: Skipping optional fixer: set_literal
RefactoringTool: Skipping optional fixer: ws_comma
RefactoringTool: No changes to ./xappy/utils.py
RefactoringTool: No changes to ./xappy/unittests/xappytest.py
RefactoringTool: Refactored ./xappy/unittests/weight_params.py
RefactoringTool: Refactored ./xappy/unittests/weight_external.py
RefactoringTool: Refactored ./xappy/unittests/weight_action.py
RefactoringTool: Refactored ./xappy/unittests/valuemapsource_1.py
RefactoringTool: Refactored ./xappy/unittests/terms_for_field.py
--- ./xappy/unittests/weight_params.py (original)
+++ ./xappy/unittests/weight_params.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 def result_ids(results):
     return [int(i.id) for i in results]
@@ -23,7 +23,7 @@
         self.indexpath = os.path.join(self.tempdir, 'foo')
         iconn = xappy.IndexerConnection(self.indexpath)
         iconn.add_field_action('text', xappy.FieldActions.INDEX_FREETEXT,)
-        for i in xrange(5):
+        for i in range(5):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('text', 'foo ' * (i + 1)))
             doc.fields.append(xappy.Field('text', ' '.join('one two three four five'.split()[i:])))
--- ./xappy/unittests/weight_external.py (original)
+++ ./xappy/unittests/weight_external.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *
 import xapian

 class ExternalWeightReadingFromValue(xappy.ExternalWeightSource):
@@ -57,7 +57,7 @@
         iconn.add_field_action('name', xappy.FieldActions.INDEX_FREETEXT,)
         iconn.add_field_action('exact', xappy.FieldActions.INDEX_EXACT,)
         iconn.add_field_action('weight', xappy.FieldActions.WEIGHT,)
-        for i in xrange(5):
+        for i in range(5):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('name', 'bruno is a nice guy'))
             doc.fields.append(xappy.Field('name', ' '.join('one two three four five'.split()[i:])))
--- ./xappy/unittests/weight_action.py (original)
+++ ./xappy/unittests/weight_action.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestWeightAction(TestCase):
     def pre_test(self):
@@ -22,7 +22,7 @@
         iconn.add_field_action('name', xappy.FieldActions.INDEX_FREETEXT,)
         iconn.add_field_action('exact', xappy.FieldActions.INDEX_EXACT,)
         iconn.add_field_action('weight', xappy.FieldActions.WEIGHT,)
-        for i in xrange(5):
+        for i in range(5):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('name', 'bruno is a nice guy'))
             doc.fields.append(xappy.Field('name', ' '.join('one two three four five'.split()[i:])))
--- ./xappy/unittests/valuemapsource_1.py (original)
+++ ./xappy/unittests/valuemapsource_1.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 # Facets used in documents
 facets = [
@@ -72,7 +72,7 @@
             self.iconn.add_field_action(name, xappy.FieldActions.SORTABLE)
         for values in docvalues:
             doc = xappy.UnprocessedDocument()
-            for name, value in values.iteritems():
+            for name, value in values.items():
                 doc.fields.append(xappy.Field(name, value))
             self.iconn.add(doc)
         self.iconn.flush()
--- ./xappy/unittests/terms_for_field.py (original)
+++ ./xappy/unittests/terms_for_field.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestGetTermsForField(TestCase):
     def pre_test(self):
RefactoringTool: Refactored ./xappy/unittests/store_only.py
RefactoringTool: Refactored ./xappy/unittests/spell_correct_1.py
RefactoringTool: Refactored ./xappy/unittests/sort.py
RefactoringTool: Refactored ./xappy/unittests/similar.py
RefactoringTool: Refactored ./xappy/unittests/searchresults_slice.py
RefactoringTool: Refactored ./xappy/unittests/searchconn_process.py
--- ./xappy/unittests/store_only.py (original)
+++ ./xappy/unittests/store_only.py (refactored)
@@ -14,7 +14,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestStoreOnly(TestCase):
--- ./xappy/unittests/spell_correct_1.py (original)
+++ ./xappy/unittests/spell_correct_1.py (refactored)
@@ -13,14 +13,14 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestSpellCorrect(TestCase):
     def pre_test(self):
         self.indexpath = os.path.join(self.tempdir, 'foo')
         iconn = xappy.IndexerConnection(self.indexpath)
         iconn.add_field_action('name', xappy.FieldActions.INDEX_FREETEXT, spell=True,)
-        for i in xrange(5):
+        for i in range(5):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('name', 'bruno is a nice guy'))
             iconn.add(doc)
--- ./xappy/unittests/sort.py (original)
+++ ./xappy/unittests/sort.py (refactored)
@@ -17,7 +17,7 @@
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

 from datetime import date
-from xappytest import *
+from .xappytest import *

 class TestSortBy(TestCase):
     def pre_test(self):
@@ -34,7 +34,7 @@
                 {'name': 'b', 'date': date(2008, 6, 1)}]
         for row in data:
             doc = xappy.UnprocessedDocument()
-            for field, value in row.items():
+            for field, value in list(row.items()):
                 doc.fields.append(xappy.Field(field, value))
             iconn.add(doc)
         iconn.flush()
--- ./xappy/unittests/similar.py (original)
+++ ./xappy/unittests/similar.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestSimilar(TestCase):
     def pre_test(self):
@@ -24,7 +24,7 @@
         iconn.add_field_action('text', xappy.FieldActions.STORE_CONTENT)

         self.docs = {}
-        for i in xrange(32):
+        for i in range(32):
             doc = xappy.UnprocessedDocument()
             if i % 2: # freq = 16
                 doc.fields.append(xappy.Field('text', 'termA'))
--- ./xappy/unittests/searchresults_slice.py (original)
+++ ./xappy/unittests/searchresults_slice.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 def _to_ids(res):
     """Get a set of ids from a SearchResults object.
@@ -27,7 +27,7 @@
         iconn = xappy.IndexerConnection(self.indexpath)
         iconn.add_field_action('a', xappy.FieldActions.INDEX_EXACT)

-        for i in xrange(5):
+        for i in range(5):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('a', str(i)))
             iconn.add(doc)
--- ./xappy/unittests/searchconn_process.py (original)
+++ ./xappy/unittests/searchconn_process.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestSearchConnProcess(TestCase):
     def pre_test(self):
RefactoringTool: Refactored ./xappy/unittests/range_speed.py
@@ -41,8 +41,8 @@
         self.assertEqual([t.term for t in spdoc._doc.termlist()],
                          [t.term for t in ipdoc._doc.termlist()])
-        self.assertEqual([(v.num, v.value) for v in spdoc._doc.values()],
-                         [(v.num, v.value) for v in ipdoc._doc.values()])
+        self.assertEqual([(v.num, v.value) for v in list(spdoc._doc.values())],
+                         [(v.num, v.value) for v in list(ipdoc._doc.values())])
         self.assertEqual(spdoc.data, ipdoc.data)

 if __name__ == '__main__':
--- ./xappy/unittests/range_speed.py (original)
+++ ./xappy/unittests/range_speed.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *
 import random
 import time
 import xapian
@@ -28,7 +28,7 @@
         iconn.add_field_action('price', xappy.FieldActions.SORTABLE, type='float')
         iconn.add_field_action('price_text', xappy.FieldActions.INDEX_EXACT)
         iconn.add_field_action('price_ranges', xappy.FieldActions.SORTABLE, type='float',
-                               ranges=[(x * 10, (x + 1) * 10) for x in xrange(10)])
+                               ranges=[(x * 10, (x + 1) * 10) for x in range(10)])

         # Set the random seed, so that test runs are repeatable.
         random.seed(42)
@@ -40,7 +40,7 @@
         self.doccount = 100

         # make documents with random price and add the price as text as well
-        for _ in xrange(self.doccount):
+        for _ in range(self.doccount):
             doc = xappy.UnprocessedDocument()
             val = float(random.randint(0, 100))
             strval = conv(val)
@@ -72,7 +72,7 @@
         text_rangeq = self.sconn.query_composite(self.sconn.OP_OR,
                 (self.sconn.query_field('price_text', conv(x))
-                 for x in xrange(self.range_bottom, self.range_top + 1)))
+                 for x in range(self.range_bottom, self.range_top + 1)))

         accel_range_rangeq = self.sconn.query_range('price_ranges',
             self.range_bottom, self.range_top)
@@ -104,16 +104,16 @@
         #self.check_equal_results(r5, r9, "range_rangeq", "approx_range_rangeq")
         return
-        print "range: ", t1, range_q
-        print "text: ", t2, text_q
-        print "accel_range: ", t3, accel_range_q
-        print "accel_range_cons: ", t4, accel_range_q_cons
+        print("range: ", t1, range_q)
+        print("text: ", t2, text_q)
+        print("accel_range: ", t3, accel_range_q)
+        print("accel_range_cons: ", t4, accel_range_q_cons)
-        print "text_range: ", t5, text_rangeq
-        print "range_range: ", t6, range_rangeq
-        print "accel_range_range: ", t7, accel_range_rangeq
-        print "accel_range_range_cons: ", t8, accel_range_rangeq_cons
-        print "APPROX_range_range: ", t9, approx_range_rangeq
+        print("text_range: ", t5, text_rangeq)
+        print("range_range: ", t6, range_rangeq)
+        print("accel_range_range: ", t7, accel_range_rangeq)
+        print("accel_range_range_cons: ", t8, accel_range_rangeq_cons)
+        print("APPROX_range_range: ", t9, approx_range_rangeq)

     def check_equal_results(self, r1, r2, name1, name2):
         r1_ids = set((x.id for x in r1))
@@ -125,16 +125,16 @@
         ids1_unique = ids1 - ids2
         ids2_unique = ids2 - ids1
         if ids1_unique or ids2_unique:
-            print "results for %s and %s differ" % (name1, name2)
+            print("results for %s and %s differ" % (name1, name2))
         if ids1_unique:
-            print "ids only in %s: " % name1, ids1_unique
+            print("ids only in %s: " % name1, ids1_unique)
         if ids2_unique:
-            print "ids only in %s: " % name2, ids2_unique
+            print("ids only in %s: " % name2, ids2_unique)
         for i in ids1 ^ ids2:
             d = self.sconn.get_document(i)
-            print "value: ", xapian.sortable_unserialise(d.get_value('price', 'collsort'))
-            print "termlist: ", map (lambda t: t.term, d._doc.termlist())
+            print("value: ", xapian.sortable_unserialise(d.get_value('price', 'collsort')))
+            print("termlist: ", [t.term for t in d._doc.termlist()])

     def search_repeater(self, query):
         """Run a search repeatedly, timing it.
@@ -145,7 +145,7 @@
         """
         now = time.time()
-        for _ in xrange(self.repeats):
+        for _ in range(self.repeats):
             r = query.search(0, self.results)
         return (time.time() - now) / self.repeats, r
RefactoringTool: Refactored ./xappy/unittests/range_accel.py
RefactoringTool: Refactored ./xappy/unittests/query_serialise.py
RefactoringTool: Refactored ./xappy/unittests/query_id.py
RefactoringTool: Refactored ./xappy/unittests/query_all.py
RefactoringTool: Refactored ./xappy/unittests/multiple_caches.py
--- ./xappy/unittests/range_accel.py (original)
+++ ./xappy/unittests/range_accel.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *
 import xapian

 class RangeAccelIndexTest(TestCase):
@@ -132,12 +132,12 @@
     def pre_test(self):
         self.dbpath = os.path.join(self.tempdir, 'db')
         self.iconn = xappy.IndexerConnection(self.dbpath)
-        ranges = [(x, x + 1) for x in xrange(10)]
+        ranges = [(x, x + 1) for x in range(10)]
         self.iconn.add_field_action('foo', xappy.FieldActions.SORTABLE,
                                     type='float', ranges=ranges)
         self.iconn.add_field_action('bar', xappy.FieldActions.FACET,
                                     type='float', ranges=ranges)
-        for val in xrange(10):
+        for val in range(10):
             doc = xappy.UnprocessedDocument()
             sval = val + 0.5
             doc.fields.append(xappy.Field('foo', sval))
--- ./xappy/unittests/query_serialise.py (original)
+++ ./xappy/unittests/query_serialise.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 def dedent(str):
     """Convert all indentation and newlines into single spaces.
--- ./xappy/unittests/query_id.py (original)
+++ ./xappy/unittests/query_id.py (refactored)
@@ -16,7 +16,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestQueryId(TestCase):
     def pre_test(self):
--- ./xappy/unittests/query_all.py (original)
+++ ./xappy/unittests/query_all.py (refactored)
@@ -13,14 +13,14 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestQueryAll(TestCase):
     def pre_test(self):
         self.indexpath = os.path.join(self.tempdir, 'foo')
         self.dbsize = 32
         iconn = xappy.IndexerConnection(self.indexpath)
-        for i in xrange(self.dbsize):
+        for i in range(self.dbsize):
             iconn.add(xappy.UnprocessedDocument())
         iconn.flush()
         iconn.close()
@@ -42,7 +42,7 @@
         self.assertEqual(len(results), self.dbsize)
         self.assertEqual(results.startrank, 0)
         self.assertEqual(results.endrank, self.dbsize)
-        for i in xrange(self.dbsize):
+        for i in range(self.dbsize):
             self.assertEqual(results[i]._doc.get_docid(), i + 1)
             self.assertEqual(results[i].rank, i)
             self.assertEqual(results[i].weight, wt)
RefactoringTool: Refactored ./xappy/unittests/indexer_errors.py
RefactoringTool: Refactored ./xappy/unittests/imgseek.py
RefactoringTool: Refactored ./xappy/unittests/general1.py
RefactoringTool: Refactored ./xappy/unittests/freetext_1.py
RefactoringTool: Refactored ./xappy/unittests/field_groups.py
RefactoringTool: Refactored ./xappy/unittests/field_associations.py
--- ./xappy/unittests/multiple_caches.py (original)
+++ ./xappy/unittests/multiple_caches.py (refactored)
@@ -17,7 +17,7 @@
 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 # SOFTWARE.
-from __future__ import with_statement
+

 import unittest
 import tempfile, shutil
--- ./xappy/unittests/indexer_errors.py (original)
+++ ./xappy/unittests/indexer_errors.py (refactored)
@@ -16,7 +16,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 from xappy.errors import DuplicatedIdError
--- ./xappy/unittests/imgseek.py (original)
+++ ./xappy/unittests/imgseek.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TImgSeek(object):
     """Test of the image similarity search action.
--- ./xappy/unittests/general1.py (original)
+++ ./xappy/unittests/general1.py (refactored)
@@ -15,7 +15,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestPercent(TestCase):
     def pre_test(self):
@@ -71,7 +71,7 @@
         try:
             iconn.add_field_action('date', xappy.FieldActions.SORTABLE, type='float')
             self.assertTrue(False)
-        except xappy.IndexerError, e:
+        except xappy.IndexerError as e:
             self.assertEqual(str(e), "Field 'date' is already marked for "
                              "sorting, with a different sort type")
@@ -86,7 +86,7 @@
         try:
             iconn.process(doc)
             self.assertTrue(False)
-        except xappy.IndexerError, e:
+        except xappy.IndexerError as e:
             self.assertEqual(str(e), "Unknown sort type 'unknown' for field "
                              "'price2'")
@@ -94,7 +94,7 @@
         """Add some content to the database.
         """
-        for i in xrange(200):
+        for i in range(200):
             doc = xappy.UnprocessedDocument()
             doc.append('author', 'Richard Boulton')
             doc.append('category', 'Cat %d' % ((i + 5) % 20))
--- ./xappy/unittests/freetext_1.py (original)
+++ ./xappy/unittests/freetext_1.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestFreeText(TestCase):
     def pre_test(self):
@@ -30,7 +30,7 @@
         iconn.add_field_action('b', xappy.FieldActions.STORE_CONTENT)
         iconn.add_field_action('c', xappy.FieldActions.STORE_CONTENT)

-        for i in xrange(32):
+        for i in range(32):
             doc = xappy.UnprocessedDocument()
             if i % 2:
                 doc.fields.append(xappy.Field('a', 'termA'))
--- ./xappy/unittests/field_groups.py (original)
+++ ./xappy/unittests/field_groups.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestFieldGroups(TestCase):
     def pre_test(self):
--- ./xappy/unittests/field_associations.py (original)
+++ ./xappy/unittests/field_associations.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestFieldAssociations(TestCase):
     def pre_test(self):
RefactoringTool: Refactored ./xappy/unittests/facets.py
RefactoringTool: Refactored ./xappy/unittests/facet_query_type_1.py
RefactoringTool: Refactored ./xappy/unittests/facet_hierarchy_1.py
RefactoringTool: Refactored ./xappy/unittests/exact_index_terms.py
RefactoringTool: Refactored ./xappy/unittests/emptydb_search.py
RefactoringTool: Refactored ./xappy/unittests/dociter.py
--- ./xappy/unittests/facets.py (original)
+++ ./xappy/unittests/facets.py (refactored)
@@ -14,7 +14,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 # Facets used in documents
 facets = [
@@ -75,7 +75,7 @@
             self.iconn.add_field_action(name, xappy.FieldActions.FACET)
         for values in docvalues:
             doc = xappy.UnprocessedDocument()
-            for name, value in values.iteritems():
+            for name, value in values.items():
                 doc.fields.append(xappy.Field(name, value))
             self.iconn.add(doc)
         self.iconn.flush()
--- ./xappy/unittests/facet_query_type_1.py (original)
+++ ./xappy/unittests/facet_query_type_1.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 # Facets used in documents
 facets = [
@@ -71,7 +71,7 @@
             self.iconn.add_field_action(name, xappy.FieldActions.FACET)
         for values in docvalues:
             doc = xappy.UnprocessedDocument()
-            for name, value in values.iteritems():
+            for name, value in values.items():
                 doc.fields.append(xappy.Field(name, value))
             self.iconn.add(doc)
         self.iconn.set_facet_for_query_type('type1', 'colour', self.iconn.FacetQueryType_Preferred)
--- ./xappy/unittests/facet_hierarchy_1.py (original)
+++ ./xappy/unittests/facet_hierarchy_1.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 # Facets used in documents and their parent facets (or None for top-level facets)
 facets = {
@@ -86,7 +86,7 @@
         iconn.add_field_action(name, xappy.FieldActions.INDEX_EXACT)
         iconn.add_field_action(name, xappy.FieldActions.STORE_CONTENT)
         iconn.add_field_action(name, xappy.FieldActions.FACET)
-    for name, parents in facets.iteritems():
+    for name, parents in facets.items():
         for parent in parents:
             if parent:
                 iconn.add_subfacet(name, parent)
--- ./xappy/unittests/exact_index_terms.py (original)
+++ ./xappy/unittests/exact_index_terms.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestGetTermsForField(TestCase):
     def pre_test(self):
--- ./xappy/unittests/emptydb_search.py (original)
+++ ./xappy/unittests/emptydb_search.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestEmptyDbSearch(TestCase):
     def pre_test(self):
--- ./xappy/unittests/dociter.py (original)
+++ ./xappy/unittests/dociter.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *
RefactoringTool: Refactored ./xappy/unittests/docids.py
RefactoringTool: Refactored ./xappy/unittests/docbuild.py
RefactoringTool: Refactored ./xappy/unittests/diversity.py
RefactoringTool: Refactored ./xappy/unittests/distance.py
RefactoringTool: Refactored ./xappy/unittests/difference.py

 class RangeTest(TestCase):
     def pre_test(self, *args):
@@ -23,7 +23,7 @@
         # make documents with simple text
         # Add them with document IDs in decreasing order.
-        for i in xrange(10):
+        for i in range(10):
             doc = xappy.UnprocessedDocument()
             doc.fields.append(xappy.Field('text', "Hello world %d" % i))
             pdoc = iconn.process(doc)
--- ./xappy/unittests/docids.py (original)
+++ ./xappy/unittests/docids.py (refactored)
@@ -17,7 +17,7 @@
 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 # SOFTWARE.
-from xappytest import *
+from .xappytest import *

 class TestDocId(TestCase):
     def pre_test(self):
--- ./xappy/unittests/docbuild.py (original)
+++ ./xappy/unittests/docbuild.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestDocBuild(TestCase):
     def test_build_doc_fields(self):
--- ./xappy/unittests/diversity.py (original)
+++ ./xappy/unittests/diversity.py (refactored)
@@ -13,7 +13,7 @@
 # You should have received a copy of the GNU General Public License along
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import *
+from .xappytest import *

 class TestDiversity(TestCase):
     def pre_test(self):
@@ -26,7 +26,7 @@
         iconn.add_field_action('i', xappy.FieldActions.STORE_CONTENT)

         self.docs = {}
-        for i in xrange(32):
+        for i in range(32):
             doc = xappy.UnprocessedDocument()
             if i % 2: # freq = 16
                 doc.fields.append(xappy.Field('text', 'termA'))
--- ./xappy/unittests/distance.py (original)
+++ ./xappy/unittests/distance.py (refactored)
@@ -14,7 +14,7 @@
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

-from xappytest import *
+from .xappytest import *
 from xappy.fieldactions import FieldActions
 import xapian
--- ./xappy/unittests/difference.py (original)
+++ ./xappy/unittests/difference.py (refactored)
@@ -14,7 +14,7 @@
 # with this program; if not, write to the Free Software Foundation, Inc.,
 # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-from xappytest import * +from .xappytest import * from xappy.fieldactions import FieldActions class DifferenceSearchTest(TestCase): @@ -22,14 +22,14 @@ def pre_test(self): self.dbpath = os.path.join(self.tempdir, 'db') self.iconn = xappy.IndexerConnection(self.dbpath) - ranges = [(x, x + 1) for x in xrange(10)] + ranges = [(x, x + 1) for x in range(10)] self.iconn.add_field_action('foo', xappy.FieldActions.SORTABLE, type='float', ranges=ranges) self.iconn.add_field_action('foo', xappy.FieldActions.STORE_CONTENT) self.iconn.add_field_action('bar', xappy.FieldActions.FACET, type='float', ranges=ranges) self.iconn.add_field_action('bar', xappy.FieldActions.STORE_CONTENT) - for val in xrange(10): + for val in range(10): doc = xappy.UnprocessedDocument() sval = val + 0.5 doc.fields.append(xappy.Field('foo', sval)) @@ -100,8 +100,8 @@ difference_func=difference_test) res = self.sconn.search(query, 0, 10) dist = self.make_dist_comp(val, field) - filtered = filter(lambda x: dist(x) < 3, res) - self.assert_(len(filtered) == 6) + filtered = [x for x in res if dist(x) < 3] + self.assertTrue(len(filtered) == 6) RefactoringTool: Refactored ./xappy/unittests/db_type_compat1.py RefactoringTool: Refactored ./xappy/unittests/db_type1.py RefactoringTool: Refactored ./xappy/unittests/colour.py RefactoringTool: Refactored ./xappy/unittests/collapse.py RefactoringTool: Refactored ./xappy/unittests/cluster.py RefactoringTool: Refactored ./xappy/unittests/calc_hash.py def test_cuttoff_facet_approx(self): self.cutoff_test(5, 'bar', 'facet') --- ./xappy/unittests/db_type_compat1.py (original) +++ ./xappy/unittests/db_type_compat1.py (refactored) @@ -13,7 +13,7 @@ # You should have received a copy of the GNU General Public License along # with this program; if not, write to the Free Software Foundation, Inc., # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
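The hunks above show two of 2to3's most frequent fixes: `xrange` no longer exists in Python 3 (plain `range` is itself lazy), and `filter()` now returns an iterator, so `len()` on its result fails unless it is materialized, which is why the fixer rewrites the xappy test with a list comprehension. A minimal standalone sketch of both patterns (illustrative only, not xappy code):

```python
# Python 2: xrange(10) was the lazy form; Python 3: range(10) is already lazy.
nums = list(range(10))

# Python 2: filter() returned a list, so len() worked directly.
# Python 3: filter() returns an iterator; use a list comprehension
# (as 2to3 does above) or wrap it in list().
filtered = [x for x in nums if x < 3]
assert len(filtered) == 3

# Equivalent result via filter(), materialized for len():
filtered2 = list(filter(lambda x: x < 3, nums))
assert filtered == filtered2
```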
-from xappytest import * +from .xappytest import * class TestTypeCompat(TestCase): """Test compatibility with different types. --- ./xappy/unittests/db_type1.py (original) +++ ./xappy/unittests/db_type1.py (refactored) @@ -13,7 +13,7 @@ # You should have received a copy of the GNU General Public License along # with this program; if not, write to the Free Software Foundation, Inc., # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. -from xappytest import * +from .xappytest import * class TestDbType(TestCase): """Tests of specifying the type of a database. --- ./xappy/unittests/colour.py (original) +++ ./xappy/unittests/colour.py (refactored) @@ -15,7 +15,7 @@ # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import itertools -from xappytest import * +from .xappytest import * from xappy.fieldactions import FieldActions import xappy.colour import xapian @@ -86,7 +86,7 @@ count += 1 # we may have some rounding errors, so allow a bit of slack fudge = count * xapian.ColourWeight.trigger - self.assert_(995 <= cumfreq-fudge <= 1005) + self.assertTrue(995 <= cumfreq-fudge <= 1005) class ClusterTestCase(TestCase): @@ -97,7 +97,7 @@ (126, 126, 126), (50, 200, 20)] - lab_cols = itertools.imap(xappy.colour.rgb2lab, clustercols) + lab_cols = map(xappy.colour.rgb2lab, clustercols) clusters = xappy.colour.cluster_coords(lab_cols) self.assertEqual(2, len(list(clusters))) @@ -172,7 +172,7 @@ terms = set(terms) colourterm = xappy.colour.rgb2term(colour, 50) rgb_term = prefix + colourterm - self.assert_(rgb_term in terms) + self.assertTrue(rgb_term in terms) def test_palette_query(self): --- ./xappy/unittests/collapse.py (original) +++ ./xappy/unittests/collapse.py (refactored) @@ -15,7 +15,7 @@ # You should have received a copy of the GNU General Public License along # with this program; if not, write to the Free Software Foundation, Inc., # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. 
-from xappytest import * +from .xappytest import * class TestCollapse(TestCase): def pre_test(self): @@ -24,7 +24,7 @@ iconn.add_field_action('key1', xappy.FieldActions.COLLAPSE) iconn.add_field_action('key2', xappy.FieldActions.COLLAPSE) - for i in xrange(10): + for i in range(10): doc = xappy.UnprocessedDocument() doc.append('key1', str(i % 5)) doc.append('key2', str(i % 7)) --- ./xappy/unittests/cluster.py (original) +++ ./xappy/unittests/cluster.py (refactored) @@ -13,7 +13,7 @@ # You should have received a copy of the GNU General Public License along # with this program; if not, write to the Free Software Foundation, Inc., # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. -from xappytest import * +from .xappytest import * class TestCluster(TestCase): def pre_test(self): @@ -26,7 +26,7 @@ iconn.add_field_action('num', xappy.FieldActions.STORE_CONTENT) self.docs = {} - for i in xrange(32): + for i in range(32): doc = xappy.UnprocessedDocument() if i % 2: # freq = 16 doc.fields.append(xappy.Field('text', 'termA')) --- ./xappy/unittests/calc_hash.py (original) +++ ./xappy/unittests/calc_hash.py (refactored) @@ -14,7 +14,7 @@ # You should have received a copy of the GNU General Public License along # with this program; if not, write to the Free Software Foundation, Inc., # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. RefactoringTool: Refactored ./xappy/unittests/cachemanager.py RefactoringTool: Refactored ./xappy/unittests/cached_searches.py -from xappytest import * +from .xappytest import * class TestCalcHash(TestCase): --- ./xappy/unittests/cachemanager.py (original) +++ ./xappy/unittests/cachemanager.py (refactored) @@ -17,7 +17,7 @@ # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # SOFTWARE.
-from xappytest import * +from .xappytest import * from xappy.cachemanager import * import random @@ -33,11 +33,11 @@ self.assertEqual(cacheditems, items) if exhaustive: - for j in xrange(len(cacheditems)): + for j in range(len(cacheditems)): subitems = man.get_hits(queryid, j) self.assertEqual(items[j:], subitems) if (j % 99 == 0): - for k in xrange(j, self.maxdocid, 10): + for k in range(j, self.maxdocid, 10): subitems = man.get_hits(queryid, j, k) self.assertEqual(items[j:k], subitems) @@ -59,13 +59,13 @@ # queryids, an array, to be indexed by (docid - 1). # Each item needs to be a new array, and will be filled with the # queryid and rank for the items for this docid. - queryids = map(list, [[]] * self.maxdocid) + queryids = list(map(list, [[]] * self.maxdocid)) # List of all the docids in use. docids = list(range(1, self.maxdocid + 1)) querystrs = [] - for i in xrange(1, self.maxqueryid + 1): + for i in range(1, self.maxqueryid + 1): count = random.randint(i, i * 100) items = random.sample(docids, count) random.shuffle(items) @@ -81,7 +81,7 @@ man.flush() self.assertEqual(list(sorted(man.iter_query_strs())), querystrs) - self.assertEqual(list(sorted(ids.keys())), range(0, self.maxqueryid)) + self.assertEqual(list(sorted(ids.keys())), list(range(0, self.maxqueryid))) self.assertEqual(list(man.iter_queryids()), list(sorted(ids.keys()))) man = XapianCacheManager(self.dbpath, chunksize=100) @@ -94,22 +94,22 @@ # first queryid: afterwards, just do a check that the sum of the docids # is right. 
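The `queryids = list(map(list, [[]] * self.maxdocid))` change above is needed because Python 3's `map()` is lazy and can only be consumed once; the `map(list, ...)` part also quietly matters, since `[[]] * n` creates n references to one shared list. A small standalone sketch of both behaviours (illustrative, not xappy code):

```python
# Python 3's map() returns a one-shot iterator, not a list.
m = map(str, [1, 2, 3])
assert list(m) == ['1', '2', '3']
assert list(m) == []  # the iterator is now exhausted

# The idiom from the diff: [[]] * n aliases ONE empty list n times...
shared = [[]] * 3
shared[0].append('x')
assert shared == [['x'], ['x'], ['x']]  # all three entries are the same list

# ...while map(list, ...) copies each element, giving independent lists.
independent = list(map(list, [[]] * 3))
independent[0].append('x')
assert independent == [['x'], [], []]
```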
exhaustive = True - for queryid, items in ids.iteritems(): + for queryid, items in ids.items(): self.check_matches(man, queryid, items, exhaustive) exhaustive = False - for i in xrange(100): + for i in range(100): queryid = random.randint(0, self.maxqueryid - 1) # Remove some of the items - ranks = random.sample(range(len(ids[queryid])), + ranks = random.sample(list(range(len(ids[queryid]))), random.randint(0, min(5, len(ids[queryid])))) ranks.sort(reverse=True) ranks_and_docids = [] for rank in ranks: docid = ids[queryid][rank] items = queryids[docid - 1] - for i in xrange(len(items)): + for i in range(len(items)): if items[i][0] == queryid: del items[i] break --- ./xappy/unittests/cached_searches.py (original) +++ ./xappy/unittests/cached_searches.py (refactored) @@ -17,7 +17,7 @@ # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # SOFTWARE. -from xappytest import * +from .xappytest import * from xappy.cachemanager import * import random @@ -36,7 +36,7 @@ iconn.add_field_action('f1', xappy.FieldActions.FACET) iconn.add_field_action('f2', xappy.FieldActions.FACET, type='float') - for i in xrange(self.doccount): + for i in range(self.doccount): doc = xappy.UnprocessedDocument() doc.append('text', 'hello') RefactoringTool: Refactored ./xappy/searchresults.py if i > self.doccount / 2: @@ -48,11 +48,11 @@ # Make a cache, and set the hits for some queries.
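The `ids.iteritems()` → `ids.items()` rewrites above reflect that Python 3 dropped `iteritems`/`iterkeys`/`itervalues`; `items()` now returns a lightweight view object rather than a list. A minimal sketch of the Python 3 behaviour (illustrative, not xappy code):

```python
d = {'a': 1, 'b': 2}

# Python 2: d.iteritems() was the lazy form, d.items() built a list.
# Python 3: only d.items() exists, and it returns a view.
total = sum(v for _, v in d.items())
assert total == 3

# Unlike Python 2's list-returning items(), a view reflects later mutation.
view = d.items()
d['c'] = 3
assert ('c', 3) in view
```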
man = XapianCacheManager(self.cachepath) man.set_hits(man.get_or_make_queryid('hello'), - range(self.doccount, 0, -10)) - - uncached_world_order = list(xrange(self.doccount / 2 + 2, + list(range(self.doccount, 0, -10))) + + uncached_world_order = list(range(self.doccount / 2 + 2, self.doccount + 1)) - world_order = list(xrange(1, self.doccount + 1)) + world_order = list(range(1, self.doccount + 1)) random.shuffle(world_order) man.set_hits(man.get_or_make_queryid('world'), world_order) man.set_stats(man.get_or_make_queryid('world'), 5) @@ -114,12 +114,12 @@ results = sconn.search(query_hello, 0, self.doccount) results = [int(result.id, 16) for result in results] - expected = list(xrange(self.doccount)) + expected = list(range(self.doccount)) expected.remove(16) expected.remove(49) self.assertEqual(results, expected) - expected2 = list(xrange(self.doccount - 1, 0, -10)) + expected2 = list(range(self.doccount - 1, 0, -10)) expected2.remove(49) # Test that merge_with_cached works @@ -133,7 +133,7 @@ search(0, self.doccount) resultids = [int(result.id, 16) for result in results] self.assertEqual(resultids[:11], expected2) - self.assertEqual(resultids[:20], expected2 + range(9)) + self.assertEqual(resultids[:20], expected2 + list(range(9))) # Try a search with a cache. results = sconn.search(query_hello.merge_with_cached(cached_id), @@ -165,7 +165,7 @@ # Try searches for each of the sub ranges. expected2_full = expected2 + sorted(set(expected) - set(expected2)) - for i in xrange(len(expected) + 10): + for i in range(len(expected) + 10): results = sconn.search(query_hello.merge_with_cached(cached_id), i, i + 10) results = [int(result.id, 16) for result in results] @@ -201,10 +201,10 @@ self.assertEqual(results, [i - 1 for i in world_order[:2]]) # Try pure cache hits at non-0 start offset. 
- for i in xrange(100): + for i in range(100): results = sconn.search(query_world.merge_with_cached(world_queryid), i, i + 10) - for j in xrange(len(results)): + for j in range(len(results)): self.assertEqual(int(results[j].id, 16), world_order[i + j] - 1) results = [int(result.id, 16) for result in results] self.assertEqual(results, [i - 1 for i in world_order[i:i + 10]]) --- ./xappy/searchresults.py (original) +++ ./xappy/searchresults.py (refactored) @@ -24,12 +24,12 @@ """ __docformat__ = "restructuredtext en" -from datastructures import UnprocessedDocument, ProcessedDocument -import errors -from fieldactions import FieldActions -from fields import Field -import highlight -from utils import get_significant_digits, add_to_dict_of_dicts +from .datastructures import UnprocessedDocument, ProcessedDocument +from . import errors +from .fieldactions import FieldActions +from .fields import Field +from . import highlight +from .utils import get_significant_digits, add_to_dict_of_dicts class SearchResultContext(object): """A context used by SearchResult objects to get various pieces of @@ -111,7 +111,7 @@ """ actions = self._conn._field_actions[field]._actions - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_FREETEXT: for kwargs in kwargslist: try: @@ -124,10 +124,10 @@ """Add the associations found in assocs to those in self. """ - for fieldname, tvoffsetlist in assocs.iteritems(): + for fieldname, tvoffsetlist in assocs.items(): if fields is not None and fieldname not in fields: continue - for (tv, offset), weight in tvoffsetlist.iteritems(): + for (tv, offset), weight in tvoffsetlist.items(): if tv[0] == 'T': term = tv[1:] try: @@ -159,7 +159,7 @@ # Iterate through the stored content, extracting the set of terms and # values which are relevant to each piece. 
fields = set(fields) - for field, values in self.data.iteritems(): + for field, values in self.data.items(): if field not in fields: continue unpdoc = UnprocessedDocument() @@ -197,7 +197,7 @@ except KeyError: continue is_ft = None - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_FREETEXT: is_ft = True for kwargs in kwargslist: @@ -242,7 +242,7 @@ # for each field. relevant_items = {} field_scores = {} - for field, values in self.data.iteritems(): + for field, values in self.data.items(): if field not in allowset: continue hl = highlight.Highlighter(language_code=self._get_language(field)) @@ -288,7 +288,7 @@ # Build a list of the fields which match the query, counting the number # of clauses they match. scoreditems = [(-score, field) - for field, score in field_scores.iteritems()] + for field, score in field_scores.items()] scoreditems.sort() if groupnumbers: @@ -310,7 +310,7 @@ result = [] for score, field in scoreditems: - fielddata = [(-weight, self.data[field][offset], groupnum) for offset, (weight, groupnum) in relevant_offsets[field].iteritems()] + fielddata = [(-weight, self.data[field][offset], groupnum) for offset, (weight, groupnum) in relevant_offsets[field].items()] del relevant_offsets[field] fielddata.sort() result.append((field, tuple((data, groupnum) for weight, data, groupnum in fielddata))) @@ -366,9 +366,9 @@ with some relevant data will also be returned. """ - if isinstance(allow, basestring): + if isinstance(allow, str): allow = (allow, ) - if isinstance(deny, basestring): + if isinstance(deny, str): deny = (deny, ) if allow is not None and len(allow) == 0: allow = None @@ -433,7 +433,7 @@ # of (field, data) item. Sort in decreasing order of score, and # increasing alphabetical order if the score is the same. 
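Throughout `searchresults.py` and `searchconnection.py`, 2to3 replaces `basestring` with `str`: Python 2 used `basestring` as the common base of `str` and `unicode`, while Python 3 has a single text type, so the "accept a string or a sequence of strings" idiom becomes a plain `isinstance(x, str)` check. A hedged sketch of that normalization pattern (the helper name here is invented for illustration):

```python
def normalize_fields(allow):
    # Python 2: isinstance(allow, basestring) caught both str and unicode.
    # Python 3: str covers all text, so a bare string is wrapped in a tuple
    # and any other sequence passes through untouched.
    if isinstance(allow, str):
        allow = (allow,)
    return allow

assert normalize_fields('title') == ('title',)
assert normalize_fields(('title', 'body')) == ('title', 'body')
```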
scoreditems = [(-score, field) - for field, score in fieldscores.iteritems()] + for field, score in fieldscores.items()] scoreditems.sort() result = [] @@ -446,7 +446,7 @@ relevant_offsets = {} for score, field in scoreditems: - for offset, weight in fieldassocs[field].iteritems(): + for offset, weight in fieldassocs[field].items(): relevant_offsets.setdefault(field, {})[offset] = weight, None groupnums = self._grouplu.get((field, offset), None) if groupnums is not None: @@ -455,14 +455,14 @@ relevant_offsets.setdefault(groupfield, {})[groupoffset] = weight, gn for score, field in scoreditems: - fielddata = [(-weight, self.data[field][offset], groupnum) for offset, (weight, groupnum) in relevant_offsets[field].iteritems()] + fielddata = [(-weight, self.data[field][offset], groupnum) for offset, (weight, groupnum) in relevant_offsets[field].items()] RefactoringTool: Refactored ./xappy/searchconnection.py del relevant_offsets[field] fielddata.sort() result.append((field, tuple((data, groupnum) for weight, data, groupnum in fielddata))) else: # Not grouped - just return the relevant data for each field. for score, field in scoreditems: - fielddata = [(-weight, self.data[field][offset]) for offset, weight in fieldassocs[field].iteritems()] + fielddata = [(-weight, self.data[field][offset]) for offset, weight in fieldassocs[field].items()] fielddata.sort() result.append((field, tuple(data for weight, data in fielddata))) return tuple(result) @@ -716,7 +716,7 @@ """ if isinstance(index_or_slice, slice): start, stop, step = index_or_slice.indices(len(self._ordering)) - return map(self.get_hit, xrange(start, stop, step)) + return list(map(self.get_hit, range(start, stop, step))) else: return self.get_hit(index_or_slice) --- ./xappy/searchconnection.py (original) +++ ./xappy/searchconnection.py (refactored) @@ -25,28 +25,28 @@ """ __docformat__ = "restructuredtext en" -import _checkxapian +from . 
import _checkxapian import os as _os -import cPickle as _cPickle +import pickle as _cPickle import math import inspect import itertools import xapian -from cache_search_results import CacheResultOrdering -import cachemanager -from cachemanager.xapian_manager import cache_manager_slot_start -from datastructures import UnprocessedDocument, ProcessedDocument -from fieldactions import ActionContext, FieldActions, \ +from .cache_search_results import CacheResultOrdering +from . import cachemanager +from .cachemanager.xapian_manager import cache_manager_slot_start +from .datastructures import UnprocessedDocument, ProcessedDocument +from .fieldactions import ActionContext, FieldActions, \ ActionSet, SortableMarshaller, convert_range_to_term, \ _get_imgterms -import fieldmappings -import errors -from indexerconnection import IndexerConnection, PrefixedTermIter, \ +from . import fieldmappings +from . import errors +from .indexerconnection import IndexerConnection, PrefixedTermIter, \ DocumentIter, SynonymIter, _allocate_id -from query import Query -from searchresults import SearchResults, SearchResultContext -from mset_search_results import FacetResults, NoFacetResults, \ +from .query import Query +from .searchresults import SearchResults, SearchResultContext +from .mset_search_results import FacetResults, NoFacetResults, \ MSetResultOrdering, ResultStats, MSetTermWeightGetter class ExternalWeightSource(object): @@ -139,7 +139,7 @@ actions = self._field_actions[field]._actions except KeyError: actions = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.SORT_AND_COLLAPSE: for kwargs in kwargslist: return kwargs['type'] @@ -150,8 +150,8 @@ Returns a sequence of 2-tuples, (fieldname, searchbydefault) """ - for field, actions in self._field_actions.actions.iteritems(): - for action, kwargslist in actions.iteritems(): + for field, actions in self._field_actions.actions.items(): + for action, kwargslist in 
actions.items(): if action == FieldActions.INDEX_FREETEXT: for kwargs in kwargslist: return kwargs['type'] @@ -168,7 +168,7 @@ try: config_str = self._index.get_metadata('_xappy_config') break - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: # Don't call self.reopen() since that calls _load_config()! self._index.reopen() @@ -191,7 +191,7 @@ # Backwards compatibility; there used to only be one parent. for key in self._facet_hierarchy: parents = self._facet_hierarchy[key] - if isinstance(parents, basestring): + if isinstance(parents, str): parents = [parents] self._facet_hierarchy[key] = parents except ValueError: @@ -268,9 +268,9 @@ for handler, userdata in self._close_handlers: try: handler(indexpath, userdata) - except Exception, e: + except Exception as e: import sys, traceback - print >>sys.stderr, "WARNING: unhandled exception in handler called by SearchConnection.close(): %s" % traceback.format_exception_only(type(e), e) + print("WARNING: unhandled exception in handler called by SearchConnection.close(): %s" % traceback.format_exception_only(type(e), e), file=sys.stderr) def process(self, document): """Process an UnprocessedDocument with the settings in this database. @@ -413,7 +413,7 @@ else: assert end is not None test_fn = (lambda r: r[1] <= end) - valid_ranges = filter(test_fn, ranges) + valid_ranges = list(filter(test_fn, ranges)) if len(valid_ranges) == 0: return Query(_conn=self, _ranges=query_ranges) * 0, \ self._RANGE_NONE @@ -799,7 +799,7 @@ if not ranges: errors.SearchError("Cannot do approximate difference search " "on fields with no ranges") - if isinstance(difference_func, basestring): + if isinstance(difference_func, str): difference_func = eval('lambda x, y: ' + difference_func) result = self._difference_accel_query(ranges, range_accel_prefix, val, difference_func, num) @@ -808,7 +808,7 @@ else: # not approx # NOTE - very slow: needs to be implemented in C++. 
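The `except xapian.DatabaseModifiedError, e:` → `except xapian.DatabaseModifiedError as e:` hunks above are a pure syntax change: Python 3 removed the comma form and requires `as` to bind the exception. A standalone example of the required form (illustrative only):

```python
# Python 2 accepted "except ValueError, e"; Python 3 requires "as".
try:
    int('not a number')
except ValueError as e:
    msg = str(e)

# The bound name carries the original error message.
assert 'not a number' in msg
```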
- if isinstance(difference_func, basestring): + if isinstance(difference_func, str): difference_func = eval('lambda x, y: ' + difference_func) class DifferenceWeight(ExternalWeightSource): " An exteral weighting source for differences" @@ -838,14 +838,14 @@ """ coords1 = xapian.LatLongCoords() - if isinstance(location1, basestring): + if isinstance(location1, str): coords1.insert(xapian.LatLongCoord.parse_latlong(location1)) else: for coord in location1: coords1.insert(xapian.LatLongCoord.parse_latlong(coord)) coords2 = xapian.LatLongCoords() - if isinstance(location2, basestring): + if isinstance(location2, str): coords2.insert(xapian.LatLongCoord.parse_latlong(location2)) else: for coord in location2: @@ -881,7 +881,7 @@ # Build the list of coordinates coords = xapian.LatLongCoords() - if isinstance(centre, basestring): + if isinstance(centre, str): coords.insert(xapian.LatLongCoord.parse_latlong(centre)) else: for coord in centre: @@ -929,7 +929,7 @@ serialised = self._make_parent_func_repr("query_image_similarity") import xapian.imgseek - if len(filter(lambda x: x is not None, (image, docid, xapid))) != 1: + if len([x for x in (image, docid, xapid) if x is not None]) != 1: raise errors.SearchError( "Exactly one of image, docid or xapid is required for" " query_image_similarity().") @@ -1017,7 +1017,7 @@ except KeyError: actions = {} facettype = None - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.FACET: for kwargs in kwargslist: facettype = kwargs.get('type', None) @@ -1027,7 +1027,7 @@ break if facettype == 'float': - if isinstance(val, basestring): + if isinstance(val, str): val = [float(v) for v in val.split(',', 2)] assert(len(val) == 2) try: @@ -1100,9 +1100,9 @@ if self._index is None: raise errors.SearchError("SearchConnection has been closed") - if isinstance(allow, basestring): + if isinstance(allow, str): allow = (allow, ) - if isinstance(deny, basestring): + if isinstance(deny, 
str): deny = (deny, ) if allow is not None and len(allow) == 0: allow = None @@ -1112,9 +1112,9 @@ raise errors.SearchError("Cannot specify both `allow` and `deny` " "(got %r and %r)" % (allow, deny)) - if isinstance(default_allow, basestring): + if isinstance(default_allow, str): default_allow = (default_allow, ) - if isinstance(default_deny, basestring): + if isinstance(default_deny, str): default_deny = (default_deny, ) if default_allow is not None and len(default_allow) == 0: default_allow = None @@ -1138,7 +1138,7 @@ actions = self._field_actions[field]._actions except KeyError: actions = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_EXACT: # FIXME - need patched version of xapian to add exact prefixes #qp.add_exact_prefix(field, self._field_mappings.get_prefix(field)) @@ -1170,7 +1170,7 @@ actions = self._field_actions[field]._actions except KeyError: actions = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_FREETEXT: qp.add_prefix('', self._field_mappings.get_prefix(field)) # FIXME - set stemming options for the default prefix @@ -1204,7 +1204,7 @@ self._qp_flags_synonym | self._qp_flags_bool, prefix) - except xapian.QueryParserError, e: + except xapian.QueryParserError as e: # If we got a parse error, retry without boolean operators (since # these are the usual cause of the parse error). q1 = self._query_parse_with_prefix(qp, string, @@ -1219,7 +1219,7 @@ base_flags | self._qp_flags_bool, prefix) - except xapian.QueryParserError, e: + except xapian.QueryParserError as e: # If we got a parse error, retry without boolean operators (since # these are the usual cause of the parse error). 
q2 = self._query_parse_with_prefix(qp, string, base_flags, prefix) @@ -1310,7 +1310,7 @@ serialised = self._make_parent_func_repr("query_field") # need to check on field type, and stem / split as appropriate - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action in (FieldActions.INDEX_EXACT, FieldActions.FACET,): if value is None: @@ -1458,11 +1458,11 @@ if allow is not None and deny is not None: raise errors.SearchError("Cannot specify both `allow` and `deny`") - if isinstance(ids, (basestring, ProcessedDocument, UnprocessedDocument)): + if isinstance(ids, (str, ProcessedDocument, UnprocessedDocument)): ids = (ids, ) - if isinstance(allow, basestring): + if isinstance(allow, str): allow = (allow, ) - if isinstance(deny, basestring): + if isinstance(deny, str): deny = (deny, ) # Set "allow" to contain a list of all the fields to use. @@ -1478,7 +1478,7 @@ actions = self._field_actions[field]._actions except KeyError: actions = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_FREETEXT: prefixes[self._field_mappings.get_prefix(field)] = field @@ -1517,7 +1517,7 @@ try: eterms = self._perform_expand(ids, prefixes, simterms, tempdb) break; - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: self.reopen() return eterms, prefixes @@ -1561,7 +1561,7 @@ # (binary, lexicographical) sorted order, though. pl = combined_db.postlist('Q' + id) try: - xapid = pl.next() + xapid = next(pl) rset.add_document(xapid.docid) except StopIteration: pass @@ -1613,7 +1613,7 @@ def next(self, minweight): try: - self.current = self.alldocs.next() + self.current = next(self.alldocs) except StopIteration: self.current = None @@ -1682,7 +1682,7 @@ particularly efficient query. 
""" - if isinstance(docid, basestring): + if isinstance(docid, str): terms = ['Q' + docid] else: terms = ['Q' + docid for docid in docid] @@ -1755,7 +1755,7 @@ qp.FLAG_SPELLING_CORRECTION) corrected = qp.get_corrected_query_string() if len(corrected) == 0: - if isinstance(querystr, unicode): + if isinstance(querystr, str): # Encode as UTF-8 for consistency - this happens automatically # to values passed to Xapian. return querystr.encode('utf-8') @@ -1793,7 +1793,7 @@ ends a prefix, even if followed by capital letters. """ - for p in xrange(len(term)): + for p in range(len(term)): if not term[p].isupper(): return term[:p] elif term[p] == 'R': @@ -1897,7 +1897,7 @@ try: mset = enq.get_mset(0, 0) break - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: self.reopen() return mset.get_max_possible() @@ -2005,7 +2005,7 @@ actions = self._field_actions[field]._actions except KeyError: actions = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.FACET: # filter out non-top-level facets that aren't subfacets # of a facet in the query @@ -2035,7 +2035,7 @@ actions = self._field_actions[field]._actions except KeyError: continue - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.FACET: slot = self._field_mappings.get_slot(field, 'facet') facettype = self._field_type_from_kwargslist(kwargslist) @@ -2058,7 +2058,7 @@ return enq def _apply_sort_parameters(self, enq, sortby): - if isinstance(sortby, basestring): + if isinstance(sortby, str): params = self._get_sort_slot_and_dir(sortby) if len(params) == 2: enq.set_sort_by_value_then_relevance(*params) @@ -2073,7 +2073,7 @@ # Get the coords coords = xapian.LatLongCoords() - RefactoringTool: Refactored ./xappy/query.py RefactoringTool: Refactored ./xappy/perftest/harness.py RefactoringTool: Refactored ./xappy/perftest/cachemanager.py if 
isinstance(sortby.centre, basestring): + if isinstance(sortby.centre, str): coords.insert(xapian.LatLongCoord.parse_latlong(sortby.centre)) else: for coord in sortby.centre: @@ -2270,7 +2270,7 @@ if collapse is not None: collapse_slotnum = self._apply_collapse_parameters(enq, collapse, collapse_max) if getfacets: - for facetspy in facetspies.itervalues(): + for facetspy in facetspies.values(): enq.add_matchspy(facetspy) # Set percentage and weight cutoffs @@ -2289,7 +2289,7 @@ try: mset = enq.get_mset(startrank, real_maxitems, checkatleast) break - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: self.reopen() else: mset = None @@ -2396,12 +2396,12 @@ if docid is not None: postlist = self._index.postlist('Q' + docid) try: - plitem = postlist.next() + plitem = next(postlist) except StopIteration: # Unique ID not found raise KeyError('Unique ID %r not found' % docid) try: - postlist.next() + next(postlist) raise errors.IndexerError("Multiple documents " "found with same unique ID: %r" % docid) except StopIteration: @@ -2416,7 +2416,7 @@ result = ProcessedDocument(self._field_mappings) result._doc = self._index.get_document(xapid) return result - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: self.reopen() def iter_synonyms(self, prefix=""): @@ -2464,7 +2464,7 @@ while True: try: return self._index.get_metadata(key) - except xapian.DatabaseModifiedError, e: + except xapian.DatabaseModifiedError as e: self.reopen() def iter_terms_for_field(self, field, starts_with=''): @@ -2498,7 +2498,7 @@ if default_weight < 0: raise ValueError("default_weight must be >= 0") ps.set_default_weight(default_weight) - for k, v in weightmap.items(): + for k, v in list(weightmap.items()): if v < 0: raise ValueError("weights in weightmap must be >= 0") ps.add_mapping(k, v) --- ./xappy/query.py (original) +++ ./xappy/query.py (refactored) @@ -23,7 +23,7 @@ """ __docformat__ = "restructuredtext en" -import _checkxapian 
+from . import _checkxapian import copy import xapian @@ -154,7 +154,7 @@ one which will match no documents. """ - queries = tuple(filter(lambda x: x is not None, queries)) + queries = tuple([x for x in queries if x is not None]) # Special cases for 0 or 1 subqueries - don't build pointless # combinations. --- ./xappy/perftest/harness.py (original) +++ ./xappy/perftest/harness.py (refactored) @@ -156,11 +156,11 @@ performing the standard cleanup process. """ - for timer in self.timers.keys(): + for timer in list(self.timers.keys()): self.stop_timer(timer) self.post_test() shutil.rmtree(self.tempdir) - print self.format_timers() + print(self.format_timers()) def hash_data(self): """Get a hash, or some other identifier, for the data which will be --- ./xappy/perftest/cachemanager.py (original) +++ ./xappy/perftest/cachemanager.py (refactored) @@ -20,7 +20,7 @@ from harness import * from xappy.cachemanager import * -import cPickle +import pickle import random import time @@ -40,41 +40,41 @@ class TestCacheManager(PerfTestCase): def hash_data(self): - return ','.join("%s,%s" % (key, value) for key, value in sorted(config.iteritems())) + return ','.join("%s,%s" % (key, value) for key, value in sorted(config.items())) def build_data(self): """Build the datafile. 
""" - print "Building datafile containing %d queries" % config['cacheditems'] + print("Building datafile containing %d queries" % config['cacheditems']) random.seed(config['seed']) datafile = os.path.join(self.builtdatadir, 'data') fd = open(datafile, 'w') def make_query_id(): - return ''.join('%x' % random.randint(1, 16) for i in xrange(16)) - - for i in xrange(config['cacheditems']): + return ''.join('%x' % random.randint(1, 16) for i in range(16)) + + for i in range(config['cacheditems']): qid = make_query_id() itemcount = random.randint(config['minitemspercache'], config['maxitemspercache']) - ids = random.sample(xrange(1, config['docids'] + 1), itemcount) - p = cPickle.dumps(ids, 2) + ids = random.sample(range(1, config['docids'] + 1), itemcount) + p = pickle.dumps(ids, 2) fd.write('%d\nQuery(%s)\n' % (len(p), qid) + p) if (i + 1) % 1000 == 0: - print "%d queries added" % (i + 1) + print("%d queries added" % (i + 1)) fd.close() - print "Building database containing %d documents" % config['docids'] + print("Building database containing %d documents" % config['docids']) iconn = xappy.IndexerConnection(self.get_origdbpath(), dbtype='chert') iconn.add_field_action('text', xappy.FieldActions.INDEX_FREETEXT) - for i in xrange(1, config['docids'] + 1): + for i in range(1, config['docids'] + 1): doc = xappy.UnprocessedDocument() doc.append('text', 'doc %d' % i) iconn.add(doc) if i % 1000 == 0: - print "%d documents added" % i + print("%d documents added" % i) iconn.flush() def iter_data(self): @@ -94,7 +94,7 @@ qid = fd.readline() data = fd.read(bytes) assert len(data) == bytes - data = cPickle.loads(data) + data = pickle.loads(data) yield qid, data def get_origdbpath(self): @@ -111,7 +111,7 @@ #cache = XapianSelfInvertingCacheManager(cachepath) # Copy the database into the temporary directory. 
- print "Copying pre-prepared database" + print("Copying pre-prepared database") self.start_timer('copydb', 'Copy initial database into place') shutil.copytree(self.get_origdbpath(), dbpath) self.stop_timer('copydb') @@ -122,7 +122,7 @@ self.assertEqual(list(cache.keys()), []) # Add the hits to the cache. - print "Adding hits to CacheManager" + print("Adding hits to CacheManager") self.start_timer('set_hits', 'Initial population of cache with %d queries' % config['cacheditems']) @@ -138,8 +138,8 @@ # Performing searches directly against the cache. qidstrs = random.sample(qidstrs, config['searches']) - print "Performing %d pure-cache searches" % len(qidstrs) - for repeat in xrange(2): + print("Performing %d pure-cache searches" % len(qidstrs)) + for repeat in range(2): self.reset_timers(('purecachesearch', 'getqid1', 'gethits1')) self.start_timer('purecachesearch', 'Timing pure-cached searches') for qidstr in qidstrs: @@ -151,7 +151,7 @@ self.stop_timer('gethits1') self.stop_timer('purecachesearch') - print "Preparing to apply cache to database" + print("Preparing to apply cache to database") self.start_timer('apply_cache', 'Apply cached items to the database') iconn = xappy.IndexerConnection(dbpath) iconn.set_cache_manager(cache) @@ -159,7 +159,7 @@ cache.prepare_iter_by_docid() self.stop_timer('prepare_cache') - print "Applying cache to database" + print("Applying cache to database") self.start_timer('do_apply_cache', '... 
apply') iconn.apply_cached_items() self.stop_timer('do_apply_cache') @@ -170,8 +170,8 @@ # Performing searches without cache on the database sconn = xappy.SearchConnection(dbpath) - print "Performing %d searches without cache, getting top 100 results" % len(qidstrs) - for repeat in xrange(2): + print("Performing %d searches without cache, getting top 100 results" % len(qidstrs)) + for repeat in range(2): self.reset_timers(('nocachesearch1',)) self.start_timer('nocachesearch1', 'No-cache searches, getting results 0-100') for num, qidstr in enumerate(qidstrs): @@ -179,8 +179,8 @@ query.search(0, 100) self.stop_timer('nocachesearch1') - print "Performing %d searches without cache, getting results 10000-10100" % len(qidstrs) - for repeat in xrange(2): + print("Performing %d searches without cache, getting results 10000-10100" % len(qidstrs)) + for repeat in range(2): self.reset_timers(('nocachesearch2',)) self.start_timer('nocachesearch2', 'No-cache searches, getting results 10000-10100') for num, qidstr in enumerate(qidstrs): @@ -189,8 +189,8 @@ self.stop_timer('nocachesearch2') # Performing cached searches on the database - print "Performing %d searches with cache, getting top 100 results" % len(qidstrs) - for repeat in xrange(2): + print("Performing %d searches with cache, getting top 100 results" % len(qidstrs)) + for repeat in range(2): self.reset_timers(('cachedsearch1', 'getqid2', 'gethits2')) self.start_timer('cachedsearch1', 'Cached searches, getting results 0-100') for num, qidstr in enumerate(qidstrs): @@ -204,8 +204,8 @@ self.stop_timer('gethits2') self.stop_timer('cachedsearch1') - print "Performing %d searches with cache, getting results 10000-10100" % len(qidstrs) - for repeat in xrange(2): + print("Performing %d searches with cache, getting results 10000-10100" % len(qidstrs)) + for repeat in range(2): self.reset_timers(('cachedsearch2', 'getqid3', 'gethits3')) self.start_timer('cachedsearch2', 'Cached searches, getting results 10000-10100') for num, 
qidstr in enumerate(qidstrs): @@ -222,18 +222,18 @@ iconn.close() dbcopypath = os.path.join(self.builtdatadir, 'dbcopy') - deldocids = random.sample(xrange(1, config['docids'] + 1), + deldocids = random.sample(range(1, config['docids'] + 1), config['deldocs']) - print "Copying database for nocache delete test" + print("Copying database for nocache delete test") if os.path.exists(dbcopypath): shutil.rmtree(dbcopypath) shutil.copytree(dbpath, dbcopypath) iconn = xappy.IndexerConnection(dbcopypath) # Deleting some documents without the cache - print "Delete documents without cache" + print("Delete documents without cache") self.start_timer('deldocsnocache', 'Deleting %d documents without cache attached' % len(deldocids)) for docid in deldocids: iconn.delete(xapid=docid) @@ -244,7 +244,7 @@ iconn.close() - print "Copying database for cached delete test" + print("Copying database for cached delete test") if os.path.exists(dbcopypath): shutil.rmtree(dbcopypath) shutil.copytree(dbpath, dbcopypath) @@ -252,7 +252,7 @@ iconn.set_cache_manager(cache) # Deleting some documents without the cache - print "Delete documents with cache" + print("Delete documents with cache") self.start_timer('deldocscached', 'Deleting %d documents with cache attached' % len(deldocids)) for docid in deldocids: iconn.delete(xapid=docid) @@ -263,7 +263,7 @@ iconn.close() - print "Finished run" + print("Finished run") if __name__ == '__main__': RefactoringTool: No changes to ./xappy/parsedate.py RefactoringTool: Refactored ./xappy/mset_search_results.py --- ./xappy/mset_search_results.py (original) +++ ./xappy/mset_search_results.py (refactored) @@ -24,14 +24,14 @@ """ __docformat__ = "restructuredtext en" -import _checkxapian - -import errors -from fieldactions import FieldActions -from indexerconnection import IndexerConnection +from . import _checkxapian + +from . 
import errors +from .fieldactions import FieldActions +from .indexerconnection import IndexerConnection import math import re -from searchresults import SearchResult +from .searchresults import SearchResult import xapian class MSetTermWeightGetter(object): @@ -53,8 +53,8 @@ self.context = context self.it = iter(mset) - def next(self): - return SearchResult(self.it.next(), self.context) + def __next__(self): + return SearchResult(next(self.it), self.context) class MSetResultOrdering(object): @@ -190,7 +190,7 @@ collapse_bins = {} # Fill collapse_bins. - for i in xrange(self.mset.get_firstitem() + len(self.mset)): + for i in range(self.mset.get_firstitem() + len(self.mset)): hit = self.mset.get_hit(i) category = hit.collapse_key try: @@ -209,12 +209,12 @@ # 1. utilities = dict((k, v / pqc_sum) for (k, v) - in utilities.iteritems()) + in utilities.items()) # Calculate scores for the potential next hits. These are the top # weighted hits in each category. potentials = {} - for category, l in collapse_bins.iteritems(): + for category, l in collapse_bins.items(): wt = l[0][1] # weight of the top item score = wt * utilities.get(category, 0.01) # current utility of the category potentials[category] = (l[0][0], score, wt) @@ -227,7 +227,7 @@ # Pick the next category to use, by finding the maximum score # (breaking ties by choosing the highest ranked one in the original # order). 
- next_cat, (next_i, next_score, next_wt) = max(potentials.iteritems(), key=lambda x: (x[1][1], -x[1][0])) + next_cat, (next_i, next_score, next_wt) = max(iter(potentials.items()), key=lambda x: (x[1][1], -x[1][0])) # Update the utility of the chosen category utilities[next_cat] = (1.0 - next_wt) * utilities.get(next_cat, 0.01) @@ -255,8 +255,8 @@ tophits = [] nottophits = [] - clusterstarts = dict(((c[0], None) for c in clusters.itervalues())) - for i in xrange(self.mset.get_firstitem() + len(self.mset)): + clusterstarts = dict(((c[0], None) for c in clusters.values())) + for i in range(self.mset.get_firstitem() + len(self.mset)): if i in clusterstarts: tophits.append(i) else: @@ -274,7 +274,7 @@ """ prefixes = {} - if isinstance(fields, basestring): + if isinstance(fields, str): fields = [fields] if len(fields) != 1: return None @@ -285,7 +285,7 @@ except KeyError: return None - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.SORTABLE: return self._conn._field_mappings.get_slot(field, 'collsort') if action == FieldActions.WEIGHT: RefactoringTool: No changes to ./xappy/memutils.py RefactoringTool: No changes to ./xappy/marshall.py RefactoringTool: Refactored ./xappy/indexerconnection.py @@ -300,14 +300,14 @@ """ prefixes = {} - if isinstance(fields, basestring): + if isinstance(fields, str): fields = [fields] for field in fields: try: actions = self._conn._field_actions[field]._actions except KeyError: continue - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.INDEX_FREETEXT: prefix = self._conn._field_mappings.get_prefix(field) prefixes[prefix] = None @@ -316,7 +316,7 @@ FieldActions.FACET,): prefix = self._conn._field_mappings.get_prefix(field) prefixes[prefix] = None - prefix_re = re.compile('|'.join([re.escape(x) + '[^A-Z]' for x in 
list(prefixes.keys())])) class decider(xapian.ExpandDecider): def __call__(self, term): return prefix_re.match(term) is not None @@ -352,7 +352,7 @@ sim_count = 0 new_order = [] end = min(self.mset.get_firstitem() + len(self.mset), maxcount) - for i in xrange(end): + for i in range(end): if full: new_order.append(i) continue @@ -389,15 +389,15 @@ for hit in nottophits: new_order.append(hit.rank) if end != self.mset.get_firstitem() + len(self.mset): - new_order.extend(range(end, - self.mset.get_firstitem() + len(self.mset))) + new_order.extend(list(range(end, + self.mset.get_firstitem() + len(self.mset)))) assert len(new_order) == self.mset.get_firstitem() + len(self.mset) if reordered: return ReorderedMSetResultOrdering(self.mset, new_order, self.context) else: - assert new_order == range(self.mset.get_firstitem() + - len(self.mset)) + assert new_order == list(range(self.mset.get_firstitem() + + len(self.mset))) return self @@ -433,8 +433,8 @@ self.context = context self.it = iter(self.order) - def next(self): - index = self.it.next() + def __next__(self): + index = next(self.it) msetitem = self.mset.get_hit(index) return SearchResult(msetitem, self.context) @@ -530,15 +530,15 @@ else: ranges = xapian.NumericRanges(facetspy.get_values(), desired_num_of_categories) - values = tuple(sorted(ranges.get_ranges_as_dict().iteritems())) + values = tuple(sorted(ranges.get_ranges_as_dict().items())) else: try: values = tuple((item.term, item.termfreq) - for item in facetspy.values()) + for item in list(facetspy.values())) except AttributeError: # backwards compatibility values = facetspy.get_values_as_dict() - values = tuple(sorted(values.iteritems())) + values = tuple(sorted(values.items())) score = math.fabs(len(values) - desired_num_of_categories) if len(values) <= 1: score = 1000 @@ -558,11 +558,11 @@ `SearchResults.get_suggested_facets()`. 
""" - if isinstance(required_facets, basestring): + if isinstance(required_facets, str): required_facets = [required_facets] scores = [] - for field in self.facetvalues.iterkeys(): + for field in self.facetvalues.keys(): score = self.facetscore[field] scores.append((score, field)) --- ./xappy/indexerconnection.py (original) +++ ./xappy/indexerconnection.py (refactored) @@ -24,19 +24,19 @@ """ __docformat__ = "restructuredtext en" -import _checkxapian -import cPickle +from . import _checkxapian +import pickle import xapian -import cachemanager -from datastructures import * -from cachemanager.xapian_manager import BASE_CACHE_SLOT, \ +from . import cachemanager +from .datastructures import * +from .cachemanager.xapian_manager import BASE_CACHE_SLOT, \ cache_manager_slot_start, get_caches, set_caches -import errors -from fieldactions import ActionContext, FieldActions, ActionSet -import fieldmappings -import memutils +from . import errors +from .fieldactions import ActionContext, FieldActions, ActionSet +from . import fieldmappings +from . import memutils import os def _allocate_id(index, next_docid): @@ -183,7 +183,7 @@ """ assert self._index is not None - config_str = cPickle.dumps(( + config_str = pickle.dumps(( self._field_actions.actions, self._field_mappings.serialise(), self._facet_hierarchy, @@ -209,20 +209,20 @@ mappings, self._facet_hierarchy, self._facet_query_table, - self._next_docid) = cPickle.loads(config_str) + self._next_docid) = pickle.loads(config_str) self._field_actions = ActionSet() self._field_actions.actions = actions # Backwards compatibility; there used to only be one parent. 
for key in self._facet_hierarchy: parents = self._facet_hierarchy[key] - if isinstance(parents, basestring): + if isinstance(parents, str): parents = [parents] self._facet_hierarchy[key] = parents except ValueError: # Backwards compatibility - configuration used to lack _facet_hierarchy and _facet_query_table (actions, mappings, - self._next_docid) = cPickle.loads(config_str) + self._next_docid) = pickle.loads(config_str) self._field_actions = ActionSet() self._field_actions.actions = actions self._facet_hierarchy = {} @@ -274,7 +274,7 @@ """ if self._index is None: raise errors.IndexerError("IndexerConnection has been closed") - return self._field_actions.keys() + return list(self._field_actions.keys()) def process(self, document, store_only=False): """Process an UnprocessedDocument with the settings in this database. @@ -439,7 +439,7 @@ # Copy any cached query items over to the new document. olddoc, _ = self._get_xapdoc(id, xapid) if olddoc is not None: - for value in olddoc.values(): + for value in list(olddoc.values()): if value.num < BASE_CACHE_SLOT: continue newdoc.add_value(value.num, value.value) @@ -562,7 +562,7 @@ """ if self._index is None: raise errors.IndexerError("IndexerConnection has been closed") - return [k for k, v in self._facet_hierarchy.iteritems() if facet in v] + return [k for k, v in self._facet_hierarchy.items() if facet in v] FacetQueryType_Preferred = 1; FacetQueryType_Never = 2; @@ -607,7 +607,7 @@ if query_type not in self._facet_query_table: return None facet_dict = self._facet_query_table[query_type] - return set([facet for facet, assoc in facet_dict.iteritems() if assoc == association]) + return set([facet for facet, assoc in facet_dict.items() if assoc == association]) def set_metadata(self, key, value): """Set an item of metadata stored in the connection. 
@@ -655,7 +655,7 @@ if xapid is None: postlist = self._index.postlist('Q' + docid) try: - plitem = postlist.next() + plitem = next(postlist) except StopIteration: return None, None return self._index.get_document(plitem.docid), plitem.docid RefactoringTool: Refactored ./xappy/highlight.py @@ -685,7 +685,7 @@ # date; multiple caches are therefore not really suitable for use in # production systems - they are however useful for experimenting with # different caching algorithms. - for value in doc.values(): + for value in list(doc.values()): base_slot = self._cache_manager_slot_start upper_slot = self._cache_manager_slot_start + self.cache_manager.num_cached_queries() if not (base_slot <= value.num < upper_slot): @@ -812,7 +812,7 @@ assert isinstance(self.cache_manager, cachemanager.KeyValueStoreCacheManager) - for k in self.cache_manager.keys(): + for k in list(self.cache_manager.keys()): self._index.set_metadata(k, self.cache_manager[k]) self._index.set_metadata('_xappy_hasintcache', '1') self._open_internal_cache() @@ -917,12 +917,12 @@ raise errors.IndexerError("IndexerConnection has been closed") postlist = self._index.postlist('Q' + id) try: - plitem = postlist.next() + plitem = next(postlist) except StopIteration: # Unique ID not found raise KeyError('Unique ID %r not found' % id) try: - postlist.next() + next(postlist) raise errors.IndexerError("Multiple documents " #pragma: no cover "found with same unique ID: %r"% id) except StopIteration: @@ -985,7 +985,7 @@ raise errors.IndexerError("IndexerConnection has been closed") if 'facets' in _checkxapian.missing_features: raise errors.IndexerError("Facets unsupported with this release of xapian") - return self._facet_hierarchy.iteritems() + return iter(self._facet_hierarchy.items()) def iter_facet_query_types(self, association): """Get an iterator over query types and their associated facets. 
@@ -1040,7 +1040,7 @@ def __iter__(self): return self - def next(self): + def __next__(self): """Get the next term with the specified prefix. """ @@ -1085,11 +1085,11 @@ def __iter__(self): return self - def next(self): + def __next__(self): """Get the next document. """ - posting = self._postingiter.next() + posting = next(self._postingiter) result = ProcessedDocument(self._connection._field_mappings) result._doc = self._connection._index.get_document(posting.docid) return result @@ -1113,11 +1113,11 @@ def __iter__(self): return self - def next(self): + def __next__(self): """Get the next synonym. """ - synkey = self._syniter.next() + synkey = next(self._syniter) pos = 0 for char in synkey: if char.isupper(): pos += 1 @@ -1145,18 +1145,18 @@ IndexerConnection.FacetQueryType_Never). """ - self._table_iter = facet_query_table.iteritems() + self._table_iter = iter(facet_query_table.items()) self._association = association def __iter__(self): return self - def next(self): + def __next__(self): """Get the next (query type, facet set) 2-tuple. """ - query_type, facet_dict = self._table_iter.next() - facet_list = [facet for facet, association in facet_dict.iteritems() if association == self._association] + query_type, facet_dict = next(self._table_iter) + facet_list = [facet for facet, association in facet_dict.items() if association == self._association] if len(facet_list) == 0: - return self.next() + return next(self) return (query_type, set(facet_list)) --- ./xappy/highlight.py (original) +++ ./xappy/highlight.py (refactored) @@ -112,7 +112,7 @@ Returns a list of utf-8 encoded simple strings. 
""" - if isinstance(text, unicode)RefactoringTool: No changes to ./xappy/fields.py RefactoringTool: Refactored ./xappy/fieldmappings.py RefactoringTool: Refactored ./xappy/fieldactions.py : + if isinstance(text, str): text = text.encode('utf-8') words = self._split_re.findall(text) @@ -142,7 +142,7 @@ """ - for p in xrange(len(term)): + for p in range(len(term)): if term[p].islower(): return term[p:] elif term[p] == 'R': @@ -215,7 +215,7 @@ # select high-scoring blocks first, down to zero-scoring chars = 0 - for count in xrange(3, -1, -1): + for count in range(3, -1, -1): for b in blocks: if b[3] >= count: b[4] = True @@ -238,7 +238,7 @@ # trim down to maxlen l = 0 - for i in xrange (len (words2)): + for i in range (len (words2)): l += len (words2[i]) if l >= maxlen: words2[i:] = ['..'] --- ./xappy/fieldmappings.py (original) +++ ./xappy/fieldmappings.py (refactored) @@ -22,7 +22,7 @@ """ __docformat__ = "restructuredtext en" -import cPickle as _cPickle +import pickle as _cPickle class FieldMappings(object): """Mappings from field names to term prefixes, slot values, etc. @@ -84,7 +84,7 @@ If the prefix is not found, return None. """ - for key, val in self._prefixes.iteritems(): + for key, val in self._prefixes.items(): if val == prefix: return key return None --- ./xappy/fieldactions.py (original) +++ ./xappy/fieldactions.py (refactored) @@ -24,16 +24,16 @@ import collections -import _checkxapian -import errors -import fields -import marshall +from . import _checkxapian +from . import errors +from . import fields +from . import marshall import xapian try: import xapian.imgseek except ImportError: pass -import parsedate +from . import parsedate def _act_store_content(fieldname, doc, field, context, link_associations=True): """Perform the STORE_CONTENT action. 
@@ -187,7 +187,7 @@ if not imgterms: actions = conn._field_actions[fieldname]._actions params = {} - for action, kwargslist in actions.iteritems(): + for action, kwargslist in actions.items(): if action == FieldActions.IMGSEEK: params = kwargslist[0] break @@ -321,7 +321,7 @@ """ try: value = parsedate.date_from_string(value) - except ValueError, e: + except ValueError as e: raise self._err("Value supplied to field %r must be a " "valid date: was %r: error is '%s'" % (fieldname, value, str(e))) @@ -535,7 +535,7 @@ info = self._action_info[action] # Check parameter names - for key in kwargs.keys(): + for key in list(kwargs.keys()): if key not in info[1]: raise errors.IndexerError("Unknown parameter name for action %r: %r" % (info[0], key)) @@ -625,7 +625,7 @@ if 'slot' in info[3]: purposes = info[3]['slot'] - if isinstance(purposes, basestring): + if isinstance(purposes, str): field_mappings.add_slot(self._fieldname, purposes) else: slotnum = None @@ -661,7 +661,7 @@ context.currfield_assoc = None # First, store the content, if we're going to, so it can be referred to # in the "associations" table. - for actiontype, actionlist in self._actions.iteritems(): + for actiontype, actionlist in self._actions.items(): if actiontype != FieldActions.STORE_CONTENT: continue info = self._action_info[actiontype] @@ -672,7 +672,7 @@ return # Then do all the other actions. 
- for actiontype, actionlist in self._actions.iteritems(): + for actiontype, actionlist in self._actions.items(): if actiontype == FieldActions.STORE_CONTENT: continue info = self._action_info[actiontype] RefactoringTool: No changes to ./xappy/errors.py RefactoringTool: Refactored ./xappy/datastructures.py RefactoringTool: Refactored ./xappy/colour_data.py RefactoringTool: Refactored ./xappy/colour.py --- ./xappy/datastructures.py (original) +++ ./xappy/datastructures.py (refactored) @@ -27,10 +27,10 @@ from hashlib import sha1 as hashlib_sha1 except ImportError: from sha import sha as hashlib_sha1 -import errors -from fields import Field, FieldGroup +from . import errors +from .fields import Field, FieldGroup import xapian -import cPickle +import pickle class UnprocessedDocument(object): """A unprocessed document to be passed to the indexer. @@ -86,7 +86,7 @@ if isinstance(args[0], (Field, FieldGroup)): self.fields.append(args[0]) return - if not isinstance(args[0], basestring): + if not isinstance(args[0], str): # We assume we have a sequence of parameters for creating # Fields, to go in a FieldGroup fields = [] @@ -114,7 +114,7 @@ for field in fields: if isinstance(field, (Field, FieldGroup)): self.fields.append(field) - elif isinstance(field[0], basestring): + elif isinstance(field[0], str): self.fields.append(Field(*field)) else: self.fields.append(FieldGroup(field)) @@ -265,7 +265,7 @@ else: yield term[len(prefix):] try: - item = tl.next() + item = next(tl) except StopIteration: break @@ -312,7 +312,7 @@ unpacked[1] = self._assocs if self._groups is not None: unpacked[2] = self._groups - self._doc.set_data(cPickle.dumps(tuple(unpacked), 2)) + self._doc.set_data(pickle.dumps(tuple(unpacked), 2)) self._data = None self._assocs = None self._groups = None @@ -323,7 +323,7 @@ rawdata = self._doc.get_data() if rawdata == '': return ({}, {}, []) - unpacked = cPickle.loads(rawdata) + unpacked = pickle.loads(rawdata) if isinstance(unpacked, dict): # Backwards 
compatibility return unpacked, {}, [] @@ -393,15 +393,15 @@ ungrouped = {} groups = {} - for field, vals in self.data.iteritems(): - for offset in xrange(len(vals)): + for field, vals in self.data.items(): + for offset in range(len(vals)): groupnums = grouplu.get((field, offset), None) if groupnums is None: ungrouped.setdefault(field, []).append(vals[offset]) else: for gn in groupnums: groups.setdefault(gn, {}).setdefault(field, []).append(vals[offset]) - groupnums = list(groups.iterkeys()) + groupnums = list(groups.keys()) groupnums.sort() sortedgroups = [] for groupnum in groupnums: @@ -451,7 +451,7 @@ sha1.update(self.id) sha1.update(self._doc.get_data()) sha1.update("\0".join("%s\0%s" % (t.term, t.wdf) for t in self._doc.termlist())) - sha1.update("\0".join("%d\0%s" % (v.num, v.value) for v in self._doc.values())) + sha1.update("\0".join("%d\0%s" % (v.num, v.value) for v in list(self._doc.values()))) return sha1.hexdigest() def _get_assocs(self): @@ -505,7 +505,7 @@ location = xapian.LatLongCoords.unserialise(location) else: coords = xapian.LatLongCoords() - if isinstance(location, basestring): + if isinstance(location, str): coords.insert(xapian.LatLongCoord.parse_latlong(location)) else: for coord in location: --- ./xappy/colour_data.py (original) +++ ./xappy/colour_data.py (refactored) @@ -392,7 +392,7 @@ } -colour_names = list(rgb_data.iterkeys()) +colour_names = list(rgb_data.keys()) colour_names.sort(key = len, reverse=True) default_spread = 0.04 --- ./xappy/colour.py (original) +++ ./xappy/colour.py (refactored) @@ -55,7 +55,7 @@ import math # Third-party modules -import colour_data +from . 
import colour_data import colormath import colormath.color_objects import numpy @@ -75,9 +75,9 @@ min_l = min_a = min_b = 10000000.0 max_l = max_a = max_b = -10000000.0 - for x in xrange(256): - for y in xrange(256): - for z in xrange(256): + for x in range(256): + for y in range(256): + for z in range(256): rgb = colormath.color_objects.RGBColor(*rgb_coords) lab = rgb.convert_to('lab') min_l = min(min_l, lab.lab_l) @@ -254,7 +254,7 @@ if coord_fun is None: source = coord_list else: - source = map(coord_fun, coord_list) + source = list(map(coord_fun, coord_list)) if len(source) < 2: yield coord_list else: @@ -269,7 +269,7 @@ sfreqs = sorted(enumerate(coord_list), key=keyf) groups = itertools.groupby(sfreqs, keyf) for k, group in groups: - yield map(operator.itemgetter(1), group) + yield list(map(operator.itemgetter(1), group)) def cluster_terms(terms, step_count, distance_factor=0.05): """Clusters terms by converting them to corresponding lab coordinates. @@ -288,8 +288,8 @@ are replaced by the average weight amongst them all. 
""" - average = sum(terms_and_weights.itervalues()) / len(terms_and_weights) - for t in terms_and_weights.iterkeys(): + average = sum(terms_and_weights.values()) / len(terms_and_weights) + for t in terms_and_weights.keys(): terms_and_weights[t] = average # this could all be pushed down into numpy, but colormath needs a @@ -315,9 +315,9 @@ int(min(i + bucket_index_distance, step_count)) + 1) for i in bucket] - for x in xrange(*ranges[0]): - for y in xrange(*ranges[1]): - for z in xrange(*ranges[2]): + for x in range(*ranges[0]): + for y in range(*ranges[1]): + for z in range(*ranges[2]): lab = bucket2lab((x, y, z), step_count) lab_obj = colormath.color_objects.LabColor(*lab) yield ((x, y, z), origin.delta_e(lab_obj)) @@ -371,13 +371,13 @@ def query_from_clusters(sconn, field, clusters, step_count, averaging=False): import xapian - from query import Query + from .query import Query prefix = sconn._field_mappings.get_prefix(field) def term_subqs(ts): return [Query(xapian.Query(prefix + term)) * weight for - term, weight in ts.iteritems()] + term, weight in ts.items()] subqs = [] @@ -401,7 +401,7 @@ terms = colour_terms_cache[step_count] except KeyError: terms = {} - for colourname, rgb in rgb_data.iteritems(): + for colourname, rgb in rgb_data.items(): terms[colourname] = rgb2term(rgb, step_count) colour_terms_cache[step_count] = terms return terms @@ -422,7 +422,7 @@ """ weights = collections.defaultdict(float) - for colour_name, rgb in rgb_data.iteritems(): + for colour_name, rgb in rgb_data.items(): count = text.count(colour_name) spread = colour_spreads[colour_name] terms_and_weights([(rgb, count, spread)], step_count, weights) @@ -458,7 +458,7 @@ # the corresponding coordinates come from the dimensions. 
if len(facets) == 0: - from query import Query + from .query import Query return Query() fieldname = facets[0].fieldname @@ -480,7 +480,7 @@ spread = 0.05 def make_clusters(): - for l in xrange(1, count+1): + for l in range(1, count+1): rgbs = palette_array[labels == l] mean_weight = scipy.ndimage.mean(facet_weights, labels=labels, index=l) cluster_vals = [ ((int(x[:2], 16), int(x[2:4], 16), int(x[4:], 16)), @@ -560,8 +560,8 @@ particular value of step_count. """ - for x in xrange(step_count): - for y in xrange(step_count): - for z in xrange(step_count): + for x in range(step_count): + for y in range(step_count): + for z in range(step_count): rgb = bucket2rgb((x, y, z), step_count) yield rgb RefactoringTool: Refactored ./xappy/cachemanager/generic.py RefactoringTool: Refactored ./xappy/cachemanager/xapian_manager.py RefactoringTool: Refactored ./xappy/cachemanager/verify_cache.py RefactoringTool: Refactored ./xappy/cachemanager/queryinvert.py --- ./xappy/cachemanager/generic.py (original) +++ ./xappy/cachemanager/generic.py (refactored) @@ -25,7 +25,7 @@ """ __docformat__ = "restructuredtext en" -import cPickle +import pickle import operator from collections import MutableMapping as DictMixin try: @@ -34,10 +34,10 @@ from md5 import md5 try: - from numpy_inverter import NumpyInverterMixIn + from .numpy_inverter import NumpyInverterMixIn InverterMixIn = NumpyInverterMixIn except ImportError: - from inmemory_inverter import InMemoryInverterMixIn + from .inmemory_inverter import InMemoryInverterMixIn InverterMixIn = InMemoryInverterMixIn def sort_facets(facets): @@ -48,9 +48,9 @@ """ if isinstance(facets, dict): - facets = facets.iteritems() + facets = iter(facets.items()) return tuple(sorted((fieldname, - tuple(sorted(valfreqs.iteritems() if isinstance(valfreqs, dict) else valfreqs))) + tuple(sorted(iter(valfreqs.items()) if isinstance(valfreqs, dict) else valfreqs))) for fieldname, valfreqs in facets)) class CacheManager(object): @@ -313,8 +313,8 @@ - 'F': Followed 
by a queryid, contains the facets for that query. """ - encode = staticmethod(lambda x: cPickle.dumps(x, 2)) - decode = staticmethod(cPickle.loads) + encode = staticmethod(lambda x: pickle.dumps(x, 2)) + decode = staticmethod(pickle.loads) encode_int = encode decode_int = decode @@ -390,7 +390,7 @@ v = self['I'] if len(v) == 0: return iter(()) - return xrange(self.decode_int(v)) + return range(self.decode_int(v)) def iter_query_strs(self): for query_id in self.iter_queryids(): @@ -705,7 +705,7 @@ self.set_facets(queryid, facets) return newfacets = dict(self.decode(data)) - for fieldname, new_valfreqs in facets.iteritems(): + for fieldname, new_valfreqs in facets.items(): try: existing_valfreqs = newfacets[fieldname] except KeyError: @@ -719,7 +719,7 @@ except KeyError: pass existing_valfreqs[value] = freq - newfacets[fieldname] = tuple(existing_valfreqs.iteritems()) + newfacets[fieldname] = tuple(existing_valfreqs.items()) self[key] = self.encode(sort_facets(newfacets)) --- ./xappy/cachemanager/xapian_manager.py (original) +++ ./xappy/cachemanager/xapian_manager.py (refactored) @@ -25,7 +25,7 @@ """ __docformat__ = "restructuredtext en" -import generic +from . import generic import os import shutil try: --- ./xappy/cachemanager/verify_cache.py (original) +++ ./xappy/cachemanager/verify_cache.py (refactored) @@ -39,7 +39,7 @@ `msg` is the message describing the failure. """ - print "Error:", msg + print("Error:", msg) self.failcount +=1 if self.failcount >= self.failmax: raise RuntimeError("Too many failures - aborting verification") @@ -50,7 +50,7 @@ `msg` is the message. """ - print msg + print(msg) def verify(dbpath, fail_cb, info_cb): """Verify that a cache stored in a database has been applied correctly. 
RefactoringTool: Refactored ./xappy/cachemanager/numpy_inverter.py RefactoringTool: No changes to ./xappy/cachemanager/inmemory_inverter.py RefactoringTool: Refactored ./xappy/cachemanager/__init__.py RefactoringTool: Refactored ./xappy/cache_search_results.py RefactoringTool: No changes to ./xappy/_checkxapian.py RefactoringTool: Refactored ./xappy/__init__.py RefactoringTool: Refactored ./utils/replay_search_log.py --- ./xappy/cachemanager/queryinvert.py (original) +++ ./xappy/cachemanager/queryinvert.py (refactored) @@ -30,7 +30,7 @@ """ import tempfile -from itertools import groupby, izip, repeat +from itertools import groupby, repeat from operator import itemgetter import numpy as np @@ -66,9 +66,9 @@ for (key, values) in sequence: # we need to iterate the values sequence as well as allowing caller # to iterate, so it must be materialized - valuecopy = np.fromiter(izip(values, + valuecopy = np.fromiter(zip(values, repeat(key), - xrange(len(values))), self.dtype) + range(len(values))), self.dtype) buf = valuecopy.tostring() del valuecopy self.tf.write(buf) @@ -80,7 +80,7 @@ def __iter__(self): a = np.memmap(self.tf, dtype=self.dtype, mode='r') for (k, vals) in groupby(a, itemgetter(0)): - yield (int(k), map(lambda x: (int(x[1]), int(x[2])), vals)) + yield (int(k), [(int(x[1]), int(x[2])) for x in vals]) del a def close(self): @@ -109,15 +109,15 @@ count = 0 r = random.Random() lambd = 1.0 / (average_docs_per_query - min_docs_per_query) - for i in xrange(querycount): + for i in range(querycount): newlen = int(r.expovariate(lambd)) yield (i, (np.random.randint(0, max_docid, newlen))) if (i + 1) % 10000 == 0: - print "creating %s.." % (i + 1) + print("creating %s.." % (i + 1)) count += newlen - print "iterated %s queries with %s doc references" % (querycount, count) + print("iterated %s queries with %s doc references" % (querycount, count)) - print "testing..." 
+ print("testing...")
inverse_iter = InverseIterator(input_iter())
count = 0
@@ -127,8 +127,8 @@
count += newlen
i += 1
if i % 10000 == 0:
- print "iterated %s.." % i
- print "iterated %s docs with %s query references" % (querycount, count)
+ print("iterated %s.." % i)
+ print("iterated %s docs with %s query references" % (querycount, count))
if __name__ == "__main__" :
runtest()
--- ./xappy/cachemanager/numpy_inverter.py (original)
+++ ./xappy/cachemanager/numpy_inverter.py (refactored)
@@ -24,7 +24,7 @@
"""
__docformat__ = "restructuredtext en"
-import queryinvert
+from . import queryinvert
class NumpyInverterMixIn(object):
"""Inverting implementation which uses a numpy array for storage.
--- ./xappy/cachemanager/__init__.py (original)
+++ ./xappy/cachemanager/__init__.py (refactored)
@@ -24,9 +24,9 @@
"""
__docformat__ = "restructuredtext en"
-from generic import CacheManager, KeyValueStoreCacheManager
+from .generic import CacheManager, KeyValueStoreCacheManager
try:
- from xapian_manager import \
+ from .xapian_manager import \
XapianCacheManager, \
XapianSelfInvertingCacheManager
except ImportError:
--- ./xappy/cache_search_results.py (original)
+++ ./xappy/cache_search_results.py (refactored)
@@ -22,7 +22,7 @@
"""
__docformat__ = "restructuredtext en"
-from searchresults import SearchResult
+from .searchresults import SearchResult
try:
import simplejson as json
except ImportError:
@@ -58,8 +58,8 @@
self.context = context
self.it = enumerate(xapids)
- def next(self):
- rank, xapid = self.it.next()
+ def __next__(self):
+ rank, xapid = next(self.it)
msetitem = CacheMSetItem(self.context.conn, rank, xapid)
return SearchResult(msetitem, self.context)
--- ./xappy/__init__.py (original)
+++ ./xappy/__init__.py (refactored)
@@ -28,11 +28,11 @@
__version__ = '0.6.0'
-import _checkxapian
-from datastructures import UnprocessedDocument, ProcessedDocument
-from errors import *
-from fieldactions import FieldActions
-from fields import Field, FieldGroup
-from indexerconnection import IndexerConnection
-from query import Query
-from searchconnection import SearchConnection, ExternalWeightSource
+from . import _checkxapian
+from .datastructures import UnprocessedDocument, ProcessedDocument
+from .errors import *
+from .fieldactions import FieldActions
+from .fields import Field, FieldGroup
+from .indexerconnection import IndexerConnection
+from .query import Query
+from .searchconnection import SearchConnection, ExternalWeightSource
--- ./utils/replay_search_log.py (original)
RefactoringTool: Refactored ./utils/dump_field_actions.py
RefactoringTool: Refactored ./test.py
+++ ./utils/replay_search_log.py (refactored)
@@ -20,12 +20,12 @@
def display_time(starttime, count):
endtime = time.time()
- print "%d,%.5f" % (count, endtime - starttime)
+ print("%d,%.5f" % (count, endtime - starttime))
def replay_from_file(conn, fd):
starttime = time.time()
count = 0
- print "Searches,Total Time (seconds)"
+ print("Searches,Total Time (seconds)")
for line in fd:
line = line.strip()
queryrepr, args, kwargs = eval(line)
@@ -44,7 +44,7 @@
def run_from_commandline():
import sys
if len(sys.argv) != 3:
- print usage.strip()
+ print(usage.strip())
sys.exit(1)
dbpath = sys.argv[1]
--- ./utils/dump_field_actions.py (original)
+++ ./utils/dump_field_actions.py (refactored)
@@ -30,7 +30,7 @@
}
def dump_field_actions(fieldname, actions, fieldmappings):
- for actiontype, kwargslist in actions._actions.iteritems():
+ for actiontype, kwargslist in actions._actions.items():
info = xappy.FieldActions._action_info[actiontype]
prefix = None
slot = None
@@ -46,11 +46,11 @@
if slot is not None:
extra.append("slot=%s" % slot)
extra = ', '.join(extra)
- print "(%r, %s, %s) %s" % (fieldname, actionname, repr(kwargs), extra)
+ print("(%r, %s, %s) %s" % (fieldname, actionname, repr(kwargs), extra))
def dump_actions(conn, fieldname):
if fieldname is None:
- fields = conn._field_actions.keys()
+ fields = list(conn._field_actions.keys())
fields.sort()
else:
fields = (fieldname, )
@@ -65,7 +65,7 @@
def run_from_commandline():
import sys
if len(sys.argv) < 2 or len(sys.argv) > 3:
- print usage.strip()
+ print(usage.strip())
sys.exit(1)
dbpath = sys.argv[1]
--- ./test.py (original)
+++ ./test.py (refactored)
@@ -110,7 +110,7 @@
globs = {
'__file__': moddir,
}
- for key in mod.__dict__.keys():
+ for key in list(mod.__dict__.keys()):
if not key.startswith('__'):
globs[key] = mod.__dict__[key]
return doctest.DocFileSuite(testpath,
@@ -178,7 +178,7 @@
"""Cleanup after running a test.
"""
- for key, val in list(dtobj.globs.iteritems()):
+ for key, val in list(dtobj.globs.items()):
if hasattr(val, '__module__') and \
val.__module__ is not None and \
val.__module__.startswith('xappy'):
@@ -247,7 +247,7 @@
# Check that the module imported came from the expected path.
if os.path.splitext(mod.__file__)[0] != modpath:
- print "Couldn't import module `%s`: got module of same name, from wrong path (%r)" % (modname, mod.__file__)
+ print("Couldn't import module `%s`: got module of same name, from wrong path (%r)" % (modname, mod.__file__))
continue
# Add module to test suite.
@@ -262,8 +262,8 @@
suite.addTest(create_docfile_suite(mod, moddir, modpath % num))
num += 1
- except ImportError, e:
- print "Couldn't import module `%s`: %s" % (modname, e)
+ except ImportError as e:
+ print("Couldn't import module `%s`: %s" % (modname, e))
traceback.print_exc()
# Add any other files with doctests in them.
@@ -285,8 +285,8 @@
mod = __import__(modpath, None, None, [''])
test = loader.loadTestsFromModule(mod)
suite.addTest(test)
- except ImportError, e:
- print "Skipping test module %s (%s)" % (modpath, str(e))
+ except ImportError as e:
+ print("Skipping test module %s (%s)" % (modpath, str(e)))
return modules, suite
@@ -305,7 +305,7 @@
if arg in modnames:
newnames.append(arg)
else:
- print "Module `%s' not known" % arg
+ print("Module `%s' not known" % arg)
sys.exit(1)
modnames = newnames
@@ -371,12 +371,12 @@
return stats
RefactoringTool: No changes to ./setup.py
RefactoringTool: Refactored ./research/expand_prefixes.py
RefactoringTool: No changes to ./perftest/setuppaths.py
RefactoringTool: Refactored ./perftest/searcher.py
RefactoringTool: Refactored ./perftest/perftest.py
def display_coverage(stats):
- print "Coverage report:"
+ print("Coverage report:")
max_filename_len = max(len(stat[0]) for stat in stats)
for filename, percent, total, missed in stats:
msg = "%r%s %5.1f%% of %d" % (filename, ' ' * (max_filename_len - len(filename)), percent, total)
if len(missed) != 0:
- for pos in xrange(len(missed)):
+ for pos in range(len(missed)):
if missed[pos][0] == missed[pos][1]:
missed[pos] = str(missed[pos][0])
elif missed[pos][0] + 1 == missed[pos][1]:
@@ -384,7 +384,7 @@
else:
missed[pos] = "%d-%d" % tuple(missed[pos])
msg += "\t Missed: %s" % ','.join(missed)
- print msg
+ print(msg)
def run(specific_mods, use_coverage=False, use_profiling=False):
if use_profiling:
--- ./research/expand_prefixes.py (original)
+++ ./research/expand_prefixes.py (refactored)
@@ -13,7 +13,7 @@
z = 'Z'
else:
z = ''
- for i in xrange(len(term)):
+ for i in range(len(term)):
if term[i] == ':':
return term[:i], z + term[i+1:]
if not term[i].isupper():
@@ -28,7 +28,7 @@
return term
sconn = xappy.SearchConnection("in")
-fieldname = dict((v, k) for (k, v) in sconn._field_mappings._prefixes.iteritems())
+fieldname = dict((v, k) for (k, v) in sconn._field_mappings._prefixes.items())
newdb = xapian.WritableDatabase("out", xapian.DB_CREATE_OR_OVERWRITE)
count = 10000
--- ./perftest/searcher.py (original)
+++ ./perftest/searcher.py (refactored)
@@ -19,7 +19,7 @@
import os
import sys
import time
-import thread
+import _thread
import threading
import getopt
import xappy
@@ -121,7 +121,7 @@
self.mutex.release()
def run(self):
- for i in xrange(self.threads):
+ for i in range(self.threads):
runner = TestRunner(self, i + 1)
runner.start()
runner.join()
--- ./perftest/perftest.py (original)
+++ ./perftest/perftest.py (refactored)
@@ -47,7 +47,7 @@
import shutil
import sys
import time
-import urllib
+import urllib.request, urllib.parse, urllib.error
import setuppaths
import indexer
@@ -62,7 +62,7 @@
class Config(object):
def __init__(self, **kwargs):
- for key, val in kwargs.iteritems():
+ for key, val in kwargs.items():
setattr(self, key, val)
def usage(exitval):
@@ -95,10 +95,10 @@
elif opt == '--usedb':
config.usedb = val
else:
- print("Unknown option %r" % opt)
+ print(("Unknown option %r" % opt))
usage(1)
- except getopt.GetoptError, e:
- print("Bad options: %r" % str(e))
+ except getopt.GetoptError as e:
+ print(("Bad options: %r" % str(e)))
usage(1)
if len(argv) != 1:
@@ -120,7 +120,7 @@
if os.path.exists(indexlogpath):
os.unlink(indexlogpath)
- print "Starting index run (creating %s)" % dbpath
+ print("Starting index run (creating %s)" % dbpath)
indexer.index_file(inputfile=testrun.inputfile,
dbpath=dbpath,
logpath=indexlogpath,
@@ -128,7 +128,7 @@
description=testrun.description,
maxdocs=testrun.maxdocs,
logspeed=testrun.logspeed)
- print "Ending index run"
+ print("Ending index run")
def do_search(config, testrun):
dbpath = testrun.dbpath(config)
@@ -141,10 +141,10 @@
os.path.exists(searchlogfile):
continue
- print "Starting search run (logging to %s)" % searchlogfile
+ print("Starting search run (logging to %s)" % searchlogfile)
tests = searcher.QueryTests(queryfile, dbpath, searchlogfile, concurrency, **extraargs)
tests.run()
- print "Ending search run"
+ print("Ending search run")
def analyse_index(config):
RefactoringTool: Refactored ./perftest/parseargs.py
RefactoringTool: Refactored ./perftest/parse_wikipedia/wiki2dump.py
if analyse_indexlogs is None:
@@ -169,7 +169,7 @@
alltimes[filenameprefix] = (testrun.description, [])
alltimes[filenameprefix][1].append(("flush=%d" % testrun.flushspeed, times))
- for desc in alltimes.iterkeys():
+ for desc in alltimes.keys():
outprefix = os.path.join(config.outdir, 'index_comparison_%s_' % (desc, ))
analyse_indexlogs.generate_comparison_figures(alltimes[desc][1], outprefix, alltimes[desc][0])
@@ -259,7 +259,7 @@
return os.path.join(config.outdir, 'index_%s_' % self._index_pathbit())
def extraargs_pathbit(self, extraargs):
- extraargs_list = list(extraargs.iteritems())
+ extraargs_list = list(extraargs.items())
extraargs_list.sort()
extraargs = []
for key, val in extraargs_list:
--- ./perftest/parseargs.py (original)
+++ ./perftest/parseargs.py (refactored)
@@ -9,7 +9,7 @@
class Config(object):
def __init__(self, **kwargs):
- for key, val in kwargs.iteritems():
+ for key, val in kwargs.items():
setattr(self, key, val)
def usage(exitval):
@@ -38,10 +38,10 @@
elif opt == '--searchruns':
config.searchruns = int(val)
else:
- print("Unknown option %r" % opt)
+ print(("Unknown option %r" % opt))
usage(1)
- except getopt.GetoptError, e:
- print("Bad options: %r" % str(e))
+ except getopt.GetoptError as e:
+ print(("Bad options: %r" % str(e)))
usage(1)
if len(argv) != 1:
--- ./perftest/parse_wikipedia/wiki2dump.py (original)
+++ ./perftest/parse_wikipedia/wiki2dump.py (refactored)
@@ -56,20 +56,20 @@
it = xml.getItems()
for item in it:
if item.type == item.DATA:
- if item.nodeNames[-2:] == [u'contributor', u'username']:
+ if item.nodeNames[-2:] == ['contributor', 'username']:
self.contributor_name = item.data
- elif item.nodeNames[-2:] == [u'contributor', u'id']:
+ elif item.nodeNames[-2:] == ['contributor', 'id']:
self.contributor_id = item.data # FIXME - convert (checked) to int
- elif item.nodeNames[-1] == u'id':
+ elif item.nodeNames[-1] == 'id':
self.id = item.data # FIXME - convert (checked) to int
- if item.nodeNames[-1] == u'timestamp':
+ if item.nodeNames[-1] == 'timestamp':
self.timestamp = item.data # FIXME - convert (checked) to datetime
- if item.nodeNames[-1] == u'comment':
+ if item.nodeNames[-1] == 'comment':
self.comment = item.data
- if item.nodeNames[-1] == u'text':
+ if item.nodeNames[-1] == 'text':
self.text = item.data
elif item.type == item.START:
- if item.nodeNames[-1] == u'minor':
+ if item.nodeNames[-1] == 'minor':
self.minor = True
if len(self.text) > 0 and self.text[0] == '#':
@@ -131,7 +131,7 @@
text = maxrev.text.replace('\n', '\n=')
result.append("text=%s" % text)
- return (u'\n'.join(result), redirect)
+ return ('\n'.join(result), redirect)
def parse(infile, outfile, redirfile):
infile_size = os.path.getsize(infile)
@@ -157,7 +157,7 @@
continue
if state == 1:
if item.type == item.START:
- if item.nodeNames[-1] == u'page':
+ if item.nodeNames[-1] == 'page':
page = Page(item.expand())
(dump, redirect) = page.dump()
if redirect:
@@ -170,9 +170,9 @@
pos = infh.tell()
percent = 100.0 * pos / infile_size
if redirect:
- print "Processed %f%%: %r (redirect to %r)" % (percent, page.title, redirect)
RefactoringTool: Refactored ./perftest/parse_wikipedia/XMLUtils.py
+ print("Processed %f%%: %r (redirect to %r)" % (percent, page.title, redirect))
else:
- print "Processed %f%%: %r" % (percent, page.title)
+ print("Processed %f%%: %r" % (percent, page.title))
infh.close()
outfh.close()
@@ -180,9 +180,9 @@
# Start
if len(sys.argv) != 4:
- print """
+ print("""
Usage: ./wiki2dump.py
- """.strip()
+ """.strip())
sys.exit(0)
infile = sys.argv[1]
@@ -190,5 +190,5 @@
redirfile = sys.argv[3]
try:
parse(infile, outfile, redirfile)
-except UserError, e:
- print e
+except UserError as e:
+ print(e)
--- ./perftest/parse_wikipedia/XMLUtils.py (original)
+++ ./perftest/parse_wikipedia/XMLUtils.py (refactored)
@@ -82,14 +82,14 @@
requiretags = set(requiretags)
if nospacetags.intersection(spacetags):
- raise ValueError, "Tags may not be shared between nospacetags and spacetags"
+ raise ValueError("Tags may not be shared between nospacetags and spacetags")
if nospacetags.intersection(ignoretags):
- raise ValueError, "Tags may not be shared between nospacetags and ignoretags"
+ raise ValueError("Tags may not be shared between nospacetags and ignoretags")
if spacetags.intersection(ignoretags):
- raise ValueError, "Tags may not be shared between spacetags and ignoretags"
+ raise ValueError("Tags may not be shared between spacetags and ignoretags")
if requiretags is not None and ignoretags.intersection(requiretags):
- raise ValueError, "Tags may not be shared between ignoretags and requiretags"
+ raise ValueError("Tags may not be shared between ignoretags and requiretags")
if caseInsensitiveTags:
if requiretags is not None:
@@ -111,7 +111,7 @@
Convert a parsedXML object to text, according to the rules set up in this getter.
"""
- text = u''
+ text = ''
ignoring = 0 # Count of number of ignore tags we're within
required = 0 # Count of number of require tags we're within
if self._requiretags is None:
@@ -129,9 +129,9 @@
if name in self._nospacetags:
pass
elif name in self._spacetags:
- text += u' '
+ text += ' '
else:
- text += u' '
+ text += ' '
elif item.type == ParsedXmlItem.END:
name = item.nodeNames[-1]
if self._caseInsensitiveTags:
@@ -142,9 +142,9 @@
if name in self._nospacetags:
pass
elif name in self._spacetags:
- text += u' '
+ text += ' '
else:
- text += u' '
+ text += ' '
if self._requiretags is not None and name in self._requiretags:
required -= 1
elif item.type == ParsedXmlItem.DATA:
@@ -196,16 +196,16 @@
return default
def __str__(self):
- extra=u''
+ extra=''
if self.atts is not None:
- for i in xrange(len(self.atts)):
+ for i in range(len(self.atts)):
att = self.atts.item(i)
- extra += u" %s='%s'" % (att.name, att.nodeValue)
+ extra += " %s='%s'" % (att.name, att.nodeValue)
if self.data is not None:
- extra += u' data=%s' % repr(self.data)
+ extra += ' data=%s' % repr(self.data)
if len(extra) != 0:
- extra = u',' + extra
+ extra = ',' + extra
- return u'(%s, %s%s)' % (
+ return '(%s, %s%s)' % (
self.typenames[self.type],
repr(self.nodeNames),
extra
@@ -229,9 +229,9 @@
"""
if self.type == self.START:
ret = ['<%s' % self.nodeNames[-1]]
- for i in xrange(len(self.atts)):
+ for i in range(len(self.atts)):
att = self.atts.item (i)
- ret.append (u' %s="%s"' % (att.name, HTMLUtils.encodeText (att.nodeValue)))
+ ret.append (' %s="%s"' % (att.name, HTMLUtils.encodeText (att.nodeValue)))
ret.append ('>')
return ''.join (ret)
@@ -249,7 +249,7 @@
"""
try:
return callable()
- except xml.parsers.expat.ExpatError, e:
+ except xml.parsers.expat.ExpatError as e:
context = ''
if xmlString:
try:
@@ -263,7 +263,7 @@
raise Errors.UserError("at line %%s, column %%s%s: %%s" % context,
e.lineno, e.offset,
xml.parsers.expat.ErrorString(e.code))
- except xml.sax.SAXParseException, e:
+ except xml.sax.SAXParseException as e:
context = ''
if xmlString:
try:
@@ -277,7 +277,7 @@
raise Errors.UserError("at line %%s, column %%s%s: %%s" % context,
e.getLineNumber(), e.getColumnNumber(),
e.getMessage())
- except ValueError, e:
+ except ValueError as e:
raise Errors.UserError("Parse error: %s", str(e))
@@ -364,7 +364,7 @@
Parse a fragment of XML, storing the resulting DOM node in this ParsedXml object.
"""
- if isinstance(xmlString, unicode):
+ if isinstance(xmlString, str):
# Convert to utf-8
xmlString = xmlString.encode('utf-8')
result = []
@@ -397,7 +397,7 @@
'Method needed to satisfy iterator protocol.'
return self
- def next(self):
+ def __next__(self):
'Move to next element, or throw StopIteration.'
if len(self._nodelist) == 0:
raise StopIteration
@@ -439,7 +439,7 @@
else:
pass
subpos += 1
- result.data = u''.join(resultdata)
+ result.data = ''.join(resultdata)
self._nodelist[-1][1] = subpos
return result
else:
@@ -542,10 +542,10 @@
# If we've been supplied a filename instead of a file handle, try
# opening it.
- if isinstance(fh, basestring):
+ if isinstance(fh, str):
try:
handle = open(fh)
- except IOError, e:
+ except IOError as e:
raise Errors.UserError("Can't open file %s: %s", fh, str(e))
fh = handle
@@ -622,7 +622,7 @@
self._expandedXml = None
def callable():
while True:
- self._nextItem = self._events.next()
+ self._nextItem = next(self._events)
if self._nextItem[0] == 'START_ELEMENT':
break
try:
@@ -634,7 +634,7 @@
'Method needed to satisfy iterator protocol.'
return self
- def next(self):
+ def __next__(self):
'Move to next element, or throw StopIteration.'
# If we've got an expanded iterator, pass through items from
@@ -642,7 +642,7 @@
# nodenames to it.
if self._expandedIter is not None:
try:
- item = self._expandedIter.next()
+ item = next(self._expandedIter)
self._expandedIterStarted = True
newNodeNames = []
newNodeNames.extend(self._nodeNames[:-1])
@@ -664,7 +664,7 @@
if self._events is None:
raise StopIteration
RefactoringTool: Refactored ./perftest/parse_wikipedia/HTMLUtils.py
def callable():
- self._nextItem = self._events.next()
+ self._nextItem = next(self._events)
try:
_convertParseExceptions(callable)
except StopIteration:
@@ -687,7 +687,7 @@
characters = [self._lastNode.data]
def callable():
while self._nextItem is None:
- self._nextItem = self._events.next()
+ self._nextItem = next(self._events)
if self._nextItem[0] == 'CHARACTERS':
characters.append(self._nextItem[1].data)
self._nextItem = None
@@ -735,7 +735,7 @@
self._expandedIter = self._expandedXml.getItems()
self._expandedIterStarted = False
try:
- self._expandedIter.next()
+ next(self._expandedIter)
except StopIteration:
self._expandedIter = None
self._expandedXml = None
@@ -758,8 +758,8 @@
while len(nodeNames) != 0:
if self._nextItem is not None:
(self._lastEvent, self._lastNode) = self._nextItem
- self._nextItem = self._events.next()
- print self._nextItem, nodeNames
+ self._nextItem = next(self._events)
+ print(self._nextItem, nodeNames)
(event, node) = self._nextItem
if event == 'START_ELEMENT':
nodeNames.append(node.nodeName)
--- ./perftest/parse_wikipedia/HTMLUtils.py (original)
+++ ./perftest/parse_wikipedia/HTMLUtils.py (refactored)
@@ -16,9 +16,9 @@
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
-import urllib
+import urllib.request, urllib.parse, urllib.error
import re
-import htmlentitydefs
+import html.entities
def __booltonum(val):
"""Convert any boolean values to '0' or '1'.
@@ -46,7 +46,7 @@
"""
if isinstance(paramdict, dict):
paramlist = [(key, __booltonum(val))
- for (key, val) in paramdict.iteritems()
+ for (key, val) in paramdict.items()
if key not in kwargs and val is not None]
else:
paramlist = [(key, __booltonum(val))
@@ -54,10 +54,10 @@
if key not in kwargs and val is not None]
paramlist.extend([(key, val)
- for (key, val) in kwargs.iteritems()
+ for (key, val) in kwargs.items()
if val is not None])
paramlist.sort()
- return urllib.urlencode (paramlist)
+ return urllib.parse.urlencode (paramlist)
def encodeAttribValue(value):
"""
@@ -90,7 +90,7 @@
"""
result = []
if isinstance(value, str):
- value = unicode(value, 'utf-8')
+ value = str(value, 'utf-8')
for char in value:
if (char >= '0' and char <= '9') or (char >= 'a' and char <= 'z') or (char >= 'A' and char <= 'Z') or char in './#_':
result.append(char)
@@ -134,7 +134,7 @@
1
"""
- if not isinstance(value, basestring):
+ if not isinstance(value, str):
return value
result = value.replace('&', '&')
result = result.replace('#', '#')
@@ -156,11 +156,11 @@
>>> percentEncode('q=M&%&%; S')
'q=M%26%25%26%25%3B+S'
- >>> percentEncode(u'\u00a3')
+ >>> percentEncode(u'\\u00a3')
'%C2%A3'
"""
result = []
- if isinstance(value, unicode):
+ if isinstance(value, str):
value = value.encode('utf-8')
assert isinstance(value, str)
for char in value:
@@ -190,7 +190,7 @@
"""
value = value
RefactoringTool: No changes to ./perftest/parse_wikipedia/Errors.py
RefactoringTool: No changes to ./perftest/indexer.py
RefactoringTool: Refactored ./perftest/imgsearch.py
RefactoringTool: Refactored ./perftest/imgidx.py
RefactoringTool: Refactored ./perftest/gen_queries.py
RefactoringTool: Refactored ./perftest/analyse_searchlogs.py
- if isinstance(value, unicode):
+ if isinstance(value, str):
value = value.encode('utf-8')
value = value.replace('+', ' ')
i = 0
@@ -237,18 +237,18 @@
def _subst (mo):
ms = mo.group(0).lower()
if ms.startswith ('&#x'):
- return unichr (int (ms[3:-1], 16))
+ return chr (int (ms[3:-1], 16))
elif ms.startswith ('&#'):
- return unichr (int (ms[2:-1]))
+ return chr (int (ms[2:-1]))
elif ms.startswith ('&'):
try:
- return unichr (htmlentitydefs.name2codepoint[ms[1:-1]])
+ return chr (html.entities.name2codepoint[ms[1:-1]])
except KeyError:
return ms
else:
return ''
- return _ent_re.sub (_subst, unicode (s))
+ return _ent_re.sub (_subst, str (s))
_js_slash = { '"':'\\"', '\n':'\\n', '\r':'\\r' }
def js_encode (s):
--- ./perftest/imgsearch.py (original)
+++ ./perftest/imgsearch.py (refactored)
@@ -7,7 +7,7 @@
start_clock = time.clock()
results = sconn.search(q, 0, 10)
end_clock = time.clock()
- print "Elapsed: ", end_clock - start_clock
+ print("Elapsed: ", end_clock - start_clock)
return results
def main(index):
@@ -18,11 +18,11 @@
search(q, sconn)
results = search(q, sconn)
tdoc = sconn.get_document(target)
- print "target: ",
- print ''
- print "results:"
+ print("target: ", end=' ')
+ print('')
+ print("results:")
for r in results:
- print ''
+ print('')
if __name__ == '__main__':
import sys
--- ./perftest/imgidx.py (original)
+++ ./perftest/imgidx.py (refactored)
@@ -18,19 +18,19 @@
try:
i = iconn.add(doc)
except:
- print "problem with: ", path
+ print("problem with: ", path)
count += 1
if count % 1000 == 0:
- print count
+ print(count)
return iconn
if __name__ == "__main__":
import sys
import time
start_time = time.clock()
- print "starting clock: ", start_time
+ print("starting clock: ", start_time)
iconn = main(*sys.argv[1:])
end_time = time.clock()
- print "ending clock: ", end_time, " elapsed: ", end_time - start_time, " doc_count: ", iconn.get_doccount()
+ print("ending clock: ", end_time, " elapsed: ", end_time - start_time, " doc_count: ", iconn.get_doccount())
--- ./perftest/gen_queries.py (original)
+++ ./perftest/gen_queries.py (refactored)
@@ -34,24 +34,24 @@
for word in words:
self.words.append(word)
- def next(self):
+ def __next__(self):
return random.choice(self.words)
class QueryGenerator:
def __init__(self, source, distribution):
self.source = source
self.distribution = []
- for key, val in distribution.iteritems():
+ for key, val in distribution.items():
self.distribution.extend((key,) * val)
def __iter__(self):
return self
- def next(self):
+ def __next__(self):
qlen = random.choice(self.distribution)
qterms = []
while qlen > 0:
- qterms.append(self.source.next())
+ qterms.append(next(self.source))
qlen -= 1
return ' '.join(qterms)
--- ./perftest/analyse_searchlogs.py (original)
+++ ./perftest/analyse_searchlogs.py (refactored)
@@ -24,10 +24,10 @@
query_v_time.sort()
query_v_time.reverse()
- print "Average speed: %f seconds" % (sum((row.time for row in log)) / len(log))
- print "Slowest queries:"
+ print("Average speed: %f seconds" % (sum((row.time for row in log)) / len(log)))
+ print("Slowest queries:")
for time, query in query_v_time[:10]:
- print "%s: %f seconds" % (query, time)
+ print("%s: %f seconds" % (query, time))
RefactoringTool: Refactored ./perftest/analyse_indexlogs.py
RefactoringTool: Refactored ./libs/get_xapian.py
RefactoringTool: Refactored ./external_posting_source/sortdatabase/test_sortdatabase.py
def generate_figures(log, outprefix, pretitle):
--- ./perftest/analyse_indexlogs.py (original)
+++ ./perftest/analyse_indexlogs.py (refactored)
@@ -23,7 +23,7 @@
def get_av_docspersec(times, interval, docscale):
docspersec = []
docs = []
- for i in xrange(interval, len(times)):
+ for i in range(interval, len(times)):
docinterval = times[i].docs - times[i-interval].docs
timeinterval = times[i].time - times[i-interval].time
docspersec.append(docinterval / timeinterval)
@@ -195,8 +195,8 @@
fd = open(filename)
times = []
reader = csv.reader(fd)
- descline = reader.next()
- headings = reader.next()
+ descline = next(reader)
+ headings = next(reader)
assert(','.join(headings) == 'Documents Added,Time(seconds),dbsize(bytes),inputsize(bytes)')
for row in reader:
newrow = logrow(*row)
--- ./libs/get_xapian.py (original)
+++ ./libs/get_xapian.py (refactored)
@@ -28,7 +28,7 @@
import sys
import tarfile
import tempfile
-import urllib2
+import urllib.request, urllib.error, urllib.parse
try:
import hashlib
@@ -117,7 +117,7 @@
if not os.path.isdir(destdir):
os.makedirs(destdir)
- fd = urllib2.urlopen(url)
+ fd = urllib.request.urlopen(url)
tmpfd, tmpname = tempfile.mkstemp(dir=destdir, prefix='xappy')
try:
os.write(tmpfd, fd.read())
@@ -155,7 +155,7 @@
the archive couldn't be downloaded
"""
- print("Checking for %s" % name)
+ print(("Checking for %s" % name))
# Get the path that the package should be downloaded to
filepath = os.path.join(package_dir, archivename)
@@ -164,18 +164,18 @@
if os.path.exists(filepath):
calculated_hash = calc_sha_hash(filepath)
if expected_hash != calculated_hash:
- print("Package of %s at '%s' has wrong hash (probably an old version) - discarding" % (name, archivename))
- print("(Got %s, expected %s)" % (calculated_hash, expected_hash))
+ print(("Package of %s at '%s' has wrong hash (probably an old version) - discarding" % (name, archivename)))
+ print(("(Got %s, expected %s)" % (calculated_hash, expected_hash)))
os.unlink(filepath)
# Download the package if needed.
if not os.path.exists(filepath):
- print("Downloading %s from %s" % (name, url))
+ print(("Downloading %s from %s" % (name, url)))
download_file(url, filepath)
calculated_hash = calc_sha_hash(filepath)
if expected_hash != calculated_hash:
- print("Package of %s at '%s' has wrong hash - cannot continue" % (name, archivename))
- print("(Got %s, expected %s)" % (calculated_hash, expected_hash))
+ print(("Package of %s at '%s' has wrong hash - cannot continue" % (name, archivename)))
+ print(("(Got %s, expected %s)" % (calculated_hash, expected_hash)))
os.unlink(filepath)
return None
@@ -191,17 +191,17 @@
if archivepath is None:
return False
- print("Unpacking %s" % name)
+ print(("Unpacking %s" % name))
archivedir = unpack_tar_archive(archivepath, package_dir)
if target_location != '':
target_path = os.path.join(package_dir, target_location)
if os.path.exists(target_path):
- print("Removing old unpacked copy of archive from %s" %
- target_location)
+ print(("Removing old unpacked copy of archive from %s" %
+ target_location))
shutil.rmtree(target_path)
- print("Moving %s to %s" % (name, target_location))
+ print(("Moving %s to %s" % (name, target_location)))
shutil.move(archivedir, target_path)
return True
--- ./external_posting_source/sortdatabase/test_sortdatabase.py (original)
+++ ./external_posting_source/sortdatabase/test_sortdatabase.py (refactored)
@@ -4,14 +4,14 @@
import numpy
import random
mm=numpy.memmap(filename, numpy.int32, 'w+', shape=(50,))
RefactoringTool: Refactored ./external_posting_source/sortdatabase/make_order.py
RefactoringTool: Refactored ./external_posting_source/decreasing_weight/verifydb.py
RefactoringTool: Refactored ./external_posting_source/decreasing_weight/time_dws.py
RefactoringTool: Refactored ./external_posting_source/decreasing_weight/test_dws.py
RefactoringTool: Refactored ./external_posting_source/decreasing_weight/dws.py
RefactoringTool: Refactored ./examples/search.py
- for num in xrange(length):
+ for num in range(length):
mm[num] = num + 1 random.shuffle(mm) def make_sample_db(dbpath, length): import xapian db = xapian.WritableDatabase(dbpath, xapian.DB_CREATE) - for num in xrange(1, length + 1): + for num in range(1, length + 1): doc = xapian.Document() doc.add_term("T%d" % num) db.add_document(doc) --- ./external_posting_source/sortdatabase/make_order.py (original) +++ ./external_posting_source/sortdatabase/make_order.py (refactored) @@ -4,7 +4,7 @@ value (optionally reversed) """ -from __future__ import with_statement + import numpy import xapian import xappy @@ -47,7 +47,7 @@ new_xapids.tofile(of) def describe(): - print """ + print(""" usage: python make_order.py [] where: : name of an index @@ -55,7 +55,7 @@ : purpose of the value : file containing the docids ordered by weight : order in descending order of weight (optional - anything here triggers reversing) - """ + """) if __name__ == "__main__": import sys --- ./external_posting_source/decreasing_weight/verifydb.py (original) +++ ./external_posting_source/decreasing_weight/verifydb.py (refactored) @@ -9,9 +9,9 @@ def main(dbname): conn = xappy.SearchConnection(dbname) if conn.get_doccount() == conn._index.get_lastdocid(): - print "Database %s has contiguous doc ids starting at 1." % dbname + print("Database %s has contiguous doc ids starting at 1." % dbname) else: - print "WARNING: Database %s has non-contiguous doc ids!" % dbname + print("WARNING: Database %s has non-contiguous doc ids!" 
% dbname) if __name__ == '__main__': import sys --- ./external_posting_source/decreasing_weight/time_dws.py (original) +++ ./external_posting_source/decreasing_weight/time_dws.py (refactored) @@ -22,7 +22,7 @@ timer = timeit.Timer(execute_string, setup_string) try: time = timer.timeit(10) - print st, warm, time, res_count + print(st, warm, time, res_count) except: timer.print_exc() --- ./external_posting_source/decreasing_weight/test_dws.py (original) +++ ./external_posting_source/decreasing_weight/test_dws.py (refactored) @@ -1,7 +1,7 @@ """ unit tests for dws """ -from __future__ import with_statement + import os import shutil import tempfile @@ -19,7 +19,7 @@ iconn = xappy.IndexerConnection(self.index) self.count = 100 self.weights = numpy.array( - [x/float(self.count) for x in xrange(self.count, 0 ,-1)], + [x/float(self.count) for x in range(self.count, 0 ,-1)], 'float32') for w in self.weights: iconn.add(xappy.UnprocessedDocument()) @@ -42,19 +42,19 @@ def test_cutoff_next(self): source = self.query._Query__refs[0] source.reset() - for _ in xrange(50): + for _ in range(50): source.next(0) - self.failIf(source.at_end()) + self.assertFalse(source.at_end()) source.next(0.7) - self.failUnless(source.at_end()) + self.assertTrue(source.at_end()) def test_cutoff_skip_to(self): source = self.query._Query__refs[0] source.reset() source.skip_to(77, 0) - self.failIf(source.at_end()) + self.assertFalse(source.at_end()) source.skip_to(80, 0.7) - self.failUnless(source.at_end()) + self.assertTrue(source.at_end()) class VDWSTestCase(sourceTest, unittest.TestCase): --- ./external_posting_source/decreasing_weight/dws.py (original) +++ ./external_posting_source/decreasing_weight/dws.py (refactored) @@ -36,7 +36,7 @@ source.reset() return source -FILE, VECTOR, CACHED_VECTOR = range(3) +FILE, VECTOR, CACHED_VECTOR = list(range(3)) _VCACHE = Vector_DWS_Cache() def make_page_rank_query(conn, source_type): --- ./examples/search.py (original) RefactoringTool: Refactored 
./examples/fileindex.py RefactoringTool: Refactored ./eggsetup.py RefactoringTool: Refactored ./coverage.py +++ ./examples/search.py (refactored) @@ -28,26 +28,26 @@ dbpath = 'foo' search = ' '.join(argv[1:]) sconn = open_index(dbpath) - print "Searching %d documents for \"%s\"" % ( + print("Searching %d documents for \"%s\"" % ( sconn.get_doccount(), search - ) + )) q = sconn.query_parse(search, default_op=sconn.OP_AND) results = sconn.search(q, 0, 10) if results.estimate_is_exact: - print "Found %d results" % results.matches_estimated + print("Found %d results" % results.matches_estimated) else: - print "Found approximately %d results" % results.matches_estimated + print("Found approximately %d results" % results.matches_estimated) for result in results: - print result.data['path'][0] + print(result.data['path'][0]) try: summary = result.summarise('text', hl=('*', '*'), maxlen=300) summary = ' '.join(_whitespace_re.split(summary)) - print summary + print(summary) except KeyError: pass - print + print() if __name__ == '__main__': main(sys.argv) --- ./examples/fileindex.py (original) +++ ./examples/fileindex.py (refactored) @@ -50,7 +50,7 @@ contents = fd.read() fd.close() try: - contents = unicode(contents) + contents = str(contents) except UnicodeDecodeError: return doc.fields.append(xappy.Field('text', contents)) @@ -89,7 +89,7 @@ create_index(dbpath) iconn = open_index(dbpath) count = index_path(iconn, docpath) - print "Indexed %d documents." % count + print("Indexed %d documents." 
% count) if __name__ == '__main__': main(sys.argv) --- ./eggsetup.py (original) +++ ./eggsetup.py (refactored) @@ -24,4 +24,4 @@ """ import setuptools -execfile('setup.py') +exec(compile(open('setup.py', "rb").read(), 'setup.py', 'exec')) --- ./coverage.py (original) +++ ./coverage.py (refactored) @@ -64,6 +64,7 @@ import string import symbol import sys +import atexit import threading import token import types @@ -72,7 +73,7 @@ # Python version compatibility try: - strclass = basestring # new to 2.3 + strclass = str # new to 2.3 except: strclass = str @@ -221,9 +222,9 @@ return 0 # If this line is excluded, or suite_spots maps this line to # another line that is exlcuded, then we're excluded. - elif self.excluded.has_key(lineno) or \ - self.suite_spots.has_key(lineno) and \ - self.excluded.has_key(self.suite_spots[lineno][1]): + elif lineno in self.excluded or \ + lineno in self.suite_spots and \ + self.suite_spots[lineno][1] in self.excluded: return 0 # Otherwise, this is an executable line. 
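One change above is not behavior-preserving: in fileindex.py, 2to3 turns unicode(contents) into str(contents), but when contents is bytes, str() produces the repr rather than decoding, and it never raises UnicodeDecodeError, so the except branch becomes unreachable. A small demonstration of the difference:

```python
data = "héllo".encode("utf-8")

# Python 3: str() on bytes returns the repr and raises nothing.
assert str(data) == "b'h\\xc3\\xa9llo'"

# Python 2's unicode(contents) decoded with the default (ASCII) codec,
# which is what made the except UnicodeDecodeError branch reachable:
try:
    data.decode("ascii")
    decoded_ok = True
except UnicodeDecodeError:
    decoded_ok = False
assert decoded_ok is False
```

The faithful Python 3 port would call an explicit decode (and catch its UnicodeDecodeError) instead of str().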
else: @@ -252,8 +253,8 @@ lastprev = self.getLastLine(prevsuite) firstelse = self.getFirstLine(suite) for l in range(lastprev+1, firstelse): - if self.suite_spots.has_key(l): - self.doSuite(None, suite, exclude=self.excluded.has_key(l)) + if l in self.suite_spots: + self.doSuite(None, suite, exclude=l in self.excluded) break else: self.doSuite(None, suite) @@ -364,9 +365,9 @@ def help(self, error=None): #pragma: no cover if error: - print error - print - print __doc__ + print(error) + print() + print(__doc__) sys.exit(1) def command_line(self, argv, help_fn=None): @@ -386,13 +387,13 @@ '-x': 'execute', '-o:': 'omit=', } - short_opts = string.join(map(lambda o: o[1:], optmap.keys()), '') - long_opts = optmap.values() + short_opts = string.join([o[1:] for o in list(optmap.keys())], '') + long_opts = list(optmap.values()) options, args = getopt.getopt(argv, short_opts, long_opts) for o, a in options: - if optmap.has_key(o): + if o in optmap: settings[optmap[o]] = 1 - elif optmap.has_key(o + ':'): + elif o + ':' in optmap: settings[optmap[o + ':']] = a elif o[2:] in long_opts: settings[o[2:]] = 1 @@ -433,11 +434,11 @@ self.start() import __main__ sys.path[0] = os.path.dirname(sys.argv[0]) - execfile(sys.argv[0], __main__.__dict__) + exec(compile(open(sys.argv[0], "rb").read(), sys.argv[0], 'exec'), __main__.__dict__) if settings.get('collect'): self.collect() if not args: - args = self.cexecuted.keys() + args = list(self.cexecuted.keys()) ignore_errors = settings.get('ignore-errors') show_missing = settings.get('show-missing') @@ -535,7 +536,7 @@ import marshal cexecuted = marshal.load(cache) cache.close() - if isinstance(cexecuted, types.DictType): + if isinstance(cexecuted, dict): return cexecuted else: return {} @@ -555,15 +556,15 @@ self.merge_data(cexecuted) def merge_data(self, new_data): - for file_name, file_data in new_data.items(): - if self.cexecuted.has_key(file_name): + for file_name, file_data in list(new_data.items()): + if file_name in self.cexecuted: 
self.merge_file_data(self.cexecuted[file_name], file_data) else: self.cexecuted[file_name] = file_data def merge_file_data(self, cache_data, new_data): - for line_number in new_data.keys(): - if not cache_data.has_key(line_number): + for line_number in list(new_data.keys()): + if line_number not in cache_data: cache_data[line_number] = new_data[line_number] def abs_file(self, filename): @@ -597,7 +598,7 @@ # normalized case). See [GDR 2001-12-04b, 3.3]. def canonical_filename(self, filename): - if not self.canonical_filename_cache.has_key(filename): + if filename not in self.canonical_filename_cache: f = filename if os.path.isabs(f) and not os.path.exists(f): f = os.path.basename(f) @@ -615,12 +616,12 @@ # canonicalizing filenames on the way. Clear the "c" map. def canonicalize_filenames(self): - for filename, lineno in self.c.keys(): + for filename, lineno in list(self.c.keys()): if filename == '': # Can't do anything useful with exec'd strings, so skip them. continue f = self.canonical_filename(filename) - if not self.cexecuted.has_key(f): + if f not in self.cexecuted: self.cexecuted[f] = {} self.cexecuted[f][lineno] = 1 self.c = {} @@ -645,7 +646,7 @@ # statements that cross lines. 
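The coverage.py hunks above all follow from the Python 3 dict protocol: has_key() is gone, membership is spelled with `in`, and keys()/values()/items() return live views, which is why 2to3 wraps them in list() wherever a real list is needed (sorting, or iterating while the dict is mutated). A compact illustration with a made-up settings dict:

```python
settings = {"execute": 1, "omit=": "perftest/"}  # illustrative contents

# dict.has_key(k) is removed; membership is `k in d`.
assert "execute" in settings
assert "collect" not in settings

# keys() is now a view; list() materialises it so it can be sorted.
opts = list(settings.keys())
opts.sort()
assert opts == ["execute", "omit="]
```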
def analyze_morf(self, morf): - if self.analysis_cache.has_key(morf): + if morf in self.analysis_cache: return self.analysis_cache[morf] filename = self.morf_filename(morf) ext = os.path.splitext(filename)[1] @@ -660,7 +661,7 @@ lines, excluded_lines, line_map = self.find_executable_statements( source.read(), exclude=self.exclude_re ) - except SyntaxError, synerr: + except SyntaxError as synerr: raise CoverageException( "Couldn't parse '%s' as Python source: '%s' at line %d" % (filename, synerr.msg, synerr.lineno) @@ -779,9 +780,9 @@ visitor = StatementFindingAstVisitor(statements, excluded, suite_spots) compiler.walk(ast, visitor, walker=visitor) - lines = statements.keys() + lines = list(statements.keys()) lines.sort() - excluded_lines = excluded.keys() + excluded_lines = list(excluded.keys()) excluded_lines.sort() return lines, excluded_lines, suite_spots
RefactoringTool: No changes to ./build.py RefactoringTool: Files that were modified: RefactoringTool: ./xappy/utils.py RefactoringTool: ./xappy/unittests/xappytest.py RefactoringTool: ./xappy/unittests/weight_params.py RefactoringTool: ./xappy/unittests/weight_external.py RefactoringTool: ./xappy/unittests/weight_action.py RefactoringTool: ./xappy/unittests/valuemapsource_1.py RefactoringTool: ./xappy/unittests/terms_for_field.py RefactoringTool: ./xappy/unittests/store_only.py RefactoringTool: ./xappy/unittests/spell_correct_1.py RefactoringTool: ./xappy/unittests/sort.py RefactoringTool: ./xappy/unittests/similar.py RefactoringTool: ./xappy/unittests/searchresults_slice.py RefactoringTool: ./xappy/unittests/searchconn_process.py RefactoringTool: ./xappy/unittests/range_speed.py RefactoringTool: ./xappy/unittests/range_accel.py RefactoringTool: ./xappy/unittests/query_serialise.py RefactoringTool: ./xappy/unittests/query_id.py RefactoringTool: ./xappy/unittests/query_all.py RefactoringTool: ./xappy/unittests/multiple_caches.py RefactoringTool: ./xappy/unittests/indexer_errors.py RefactoringTool: ./xappy/unittests/imgseek.py RefactoringTool: ./xappy/unittests/general1.py
RefactoringTool: ./xappy/unittests/freetext_1.py RefactoringTool: ./xappy/unittests/field_groups.py RefactoringTool: ./xappy/unittests/field_associations.py RefactoringTool: ./xappy/unittests/facets.py RefactoringTool: ./xappy/unittests/facet_query_type_1.py RefactoringTool: ./xappy/unittests/facet_hierarchy_1.py RefactoringTool: ./xappy/unittests/exact_index_terms.py RefactoringTool: ./xappy/unittests/emptydb_search.py RefactoringTool: ./xappy/unittests/dociter.py RefactoringTool: ./xappy/unittests/docids.py RefactoringTool: ./xappy/unittests/docbuild.py RefactoringTool: ./xappy/unittests/diversity.py RefactoringTool: ./xappy/unittests/distance.py RefactoringTool: ./xappy/unittests/difference.py RefactoringTool: ./xappy/unittests/db_type_compat1.py RefactoringTool: ./xappy/unittests/db_type1.py RefactoringTool: ./xappy/unittests/colour.py RefactoringTool: ./xappy/unittests/collapse.py RefactoringTool: ./xappy/unittests/cluster.py RefactoringTool: ./xappy/unittests/calc_hash.py RefactoringTool: ./xappy/unittests/cachemanager.py RefactoringTool: ./xappy/unittests/cached_searches.py RefactoringTool: ./xappy/searchresults.py RefactoringTool: ./xappy/searchconnection.py RefactoringTool: ./xappy/query.py RefactoringTool: ./xappy/perftest/harness.py RefactoringTool: ./xappy/perftest/cachemanager.py RefactoringTool: ./xappy/parsedate.py RefactoringTool: ./xappy/mset_search_results.py RefactoringTool: ./xappy/memutils.py RefactoringTool: ./xappy/marshall.py RefactoringTool: ./xappy/indexerconnection.py RefactoringTool: ./xappy/highlight.py RefactoringTool: ./xappy/fields.py RefactoringTool: ./xappy/fieldmappings.py RefactoringTool: ./xappy/fieldactions.py RefactoringTool: ./xappy/errors.py RefactoringTool: ./xappy/datastructures.py RefactoringTool: ./xappy/colour_data.py RefactoringTool: ./xappy/colour.py RefactoringTool: ./xappy/cachemanager/generic.py RefactoringTool: ./xappy/cachemanager/xapian_manager.py RefactoringTool: ./xappy/cachemanager/verify_cache.py
RefactoringTool: ./xappy/cachemanager/queryinvert.py RefactoringTool: ./xappy/cachemanager/numpy_inverter.py RefactoringTool: ./xappy/cachemanager/inmemory_inverter.py RefactoringTool: ./xappy/cachemanager/__init__.py RefactoringTool: ./xappy/cache_search_results.py RefactoringTool: ./xappy/_checkxapian.py RefactoringTool: ./xappy/__init__.py RefactoringTool: ./utils/replay_search_log.py RefactoringTool: ./utils/dump_field_actions.py RefactoringTool: ./test.py RefactoringTool: ./setup.py RefactoringTool: ./research/expand_prefixes.py RefactoringTool: ./perftest/setuppaths.py RefactoringTool: ./perftest/searcher.py RefactoringTool: ./perftest/perftest.py RefactoringTool: ./perftest/parseargs.py RefactoringTool: ./perftest/parse_wikipedia/wiki2dump.py RefactoringTool: ./perftest/parse_wikipedia/XMLUtils.py RefactoringTool: ./perftest/parse_wikipedia/HTMLUtils.py RefactoringTool: ./perftest/parse_wikipedia/Errors.py RefactoringTool: ./perftest/indexer.py RefactoringTool: ./perftest/imgsearch.py RefactoringTool: ./perftest/imgidx.py RefactoringTool: ./perftest/gen_queries.py RefactoringTool: ./perftest/analyse_searchlogs.py RefactoringTool: ./perftest/analyse_indexlogs.py RefactoringTool: ./libs/get_xapian.py RefactoringTool: ./external_posting_source/sortdatabase/test_sortdatabase.py RefactoringTool: ./external_posting_source/sortdatabase/make_order.py RefactoringTool: ./external_posting_source/decreasing_weight/verifydb.py RefactoringTool: ./external_posting_source/decreasing_weight/time_dws.py RefactoringTool: ./external_posting_source/decreasing_weight/test_dws.py RefactoringTool: ./external_posting_source/decreasing_weight/dws.py RefactoringTool: ./examples/search.py RefactoringTool: ./examples/fileindex.py RefactoringTool: ./eggsetup.py RefactoringTool: ./coverage.py RefactoringTool: ./build.py RefactoringTool: Warnings/messages while refactoring: RefactoringTool: ### In file ./xappy/cachemanager/queryinvert.py ### RefactoringTool: Line 70: You should use 'operator.mul(key)' here.
@@ -816,7 +817,7 @@ return "%d" % start else: return "%d-%d" % (start, end) - ret = string.join(map(stringify, pairs), ", ") + ret = string.join(list(map(stringify, pairs)), ", ") return ret # Backward compatibility with version 1. @@ -827,13 +828,13 @@ def analysis2(self, morf): filename, statements, excluded, line_map = self.analyze_morf(morf) self.canonicalize_filenames() - if not self.cexecuted.has_key(filename): + if filename not in self.cexecuted: self.cexecuted[filename] = {} missing = [] for line in statements: lines = line_map.get(line, [line, line]) for l in range(lines[0], lines[1]+1): - if self.cexecuted[filename].has_key(l): + if l in self.cexecuted[filename]: break else: missing.append(line) @@ -871,7 +872,7 @@ return cmp(self.morf_name(x), self.morf_name(y)) def report(self, morfs, show_missing=1, ignore_errors=0, file=None, omit_prefixes=[]): - if not isinstance(morfs, types.ListType): + if not isinstance(morfs, list): morfs = [morfs] # On windows, the shell doesn't expand wildcards. Do it here.
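Note that the stringify hunk above is only half-fixed: 2to3 wraps the map() call in list(), but leaves string.join() alone, and the string module has no join() in Python 3, so the refactored line would raise AttributeError at runtime (the short_opts line in command_line() has the same problem). The idiomatic Python 3 spelling drops the string module entirely; a sketch of the same range-formatting logic:

```python
# Mirrors coverage.py's stringify over (start, end) line-number pairs.
pairs = [(1, 1), (3, 5), (9, 9)]

def stringify(pair):
    start, end = pair
    return "%d" % start if start == end else "%d-%d" % (start, end)

# str.join is a method on the separator; no string module needed,
# and join() accepts the map iterator directly.
ret = ", ".join(map(stringify, pairs))
assert ret == "1, 3-5, 9"
```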
globbed = [] @@ -885,7 +886,7 @@ morfs = self.filter_by_prefix(morfs, omit_prefixes) morfs.sort(self.morf_name_compare) - max_name = max([5,] + map(len, map(self.morf_name, morfs))) + max_name = max([5,] + list(map(len, list(map(self.morf_name, morfs))))) fmt_name = "%%- %ds " % max_name fmt_err = fmt_name + "%s: %s" header = fmt_name % "Name" + " Stmts Exec Cover" @@ -895,8 +896,8 @@ fmt_coverage = fmt_coverage + " %s" if not file: file = sys.stdout - print >>file, header - print >>file, "-" * len(header) + print(header, file=file) + print("-" * len(header), file=file) total_statements = 0 total_executed = 0 for morf in morfs: @@ -912,7 +913,7 @@ args = (name, n, m, pc) if show_missing: args = args + (readable,) - print >>file, fmt_coverage % args + print(fmt_coverage % args, file=file) total_statements = total_statements + n total_executed = total_executed + m except KeyboardInterrupt: #pragma: no cover @@ -920,9 +921,9 @@ except: if not ignore_errors: typ, msg = sys.exc_info()[:2] - print >>file, fmt_err % (name, typ, msg) + print(fmt_err % (name, typ, msg), file=file) if len(morfs) > 1: - print >>file, "-" * len(header) + print("-" * len(header), file=file) if total_statements > 0: pc = 100.0 * total_executed / total_statements else: @@ -930,7 +931,7 @@ args = ("TOTAL", total_statements, total_executed, pc) if show_missing: args = args + ("",) - print >>file, fmt_coverage % args + print(fmt_coverage % args, file=file) # annotate(morfs, ignore_errors). @@ -1043,7 +1044,7 @@ import atexit atexit.register(the_coverage.save) except ImportError: - sys.exitfunc = the_coverage.save + atexit.register(the_coverage.save) # Command-line interface. 
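The final coverage.py hunk replaces the long-removed sys.exitfunc hook with atexit.register(), matching the try/except ImportError fallback just above it. Because atexit handlers only fire at interpreter shutdown, the simplest way to observe the new behavior is a child interpreter (the printed strings here are arbitrary):

```python
import subprocess
import sys

# sys.exitfunc is gone in Python 3; atexit.register() is the replacement.
code = (
    "import atexit\n"
    "atexit.register(lambda: print('saved'))\n"
    "print('running')\n"
)
out = subprocess.run([sys.executable, "-c", code],
                     capture_output=True, text=True).stdout

# The registered handler runs after the main script finishes.
assert out.splitlines() == ["running", "saved"]
```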
if __name__ == '__main__': ++ find ./ -name '*.py' + sed -i 's|#!/usr/bin/env python.*|#!/usr/bin/env python3|' ./xappy/utils.py ./xappy/unittests/xappytest.py ./xappy/unittests/weight_params.py ./xappy/unittests/weight_external.py ./xappy/unittests/weight_action.py ./xappy/unittests/valuemapsource_1.py ./xappy/unittests/terms_for_field.py ./xappy/unittests/store_only.py ./xappy/unittests/spell_correct_1.py ./xappy/unittests/sort.py ./xappy/unittests/similar.py ./xappy/unittests/searchresults_slice.py ./xappy/unittests/searchconn_process.py ./xappy/unittests/range_speed.py ./xappy/unittests/range_accel.py ./xappy/unittests/query_serialise.py ./xappy/unittests/query_id.py ./xappy/unittests/query_all.py ./xappy/unittests/multiple_caches.py ./xappy/unittests/indexer_errors.py ./xappy/unittests/imgseek.py ./xappy/unittests/general1.py ./xappy/unittests/freetext_1.py ./xappy/unittests/field_groups.py ./xappy/unittests/field_associations.py ./xappy/unittests/facets.py ./xappy/unittests/facet_query_type_1.py ./xappy/unittests/facet_hierarchy_1.py ./xappy/unittests/exact_index_terms.py ./xappy/unittests/emptydb_search.py ./xappy/unittests/dociter.py ./xappy/unittests/docids.py ./xappy/unittests/docbuild.py ./xappy/unittests/diversity.py ./xappy/unittests/distance.py ./xappy/unittests/difference.py ./xappy/unittests/db_type_compat1.py ./xappy/unittests/db_type1.py ./xappy/unittests/colour.py ./xappy/unittests/collapse.py ./xappy/unittests/cluster.py ./xappy/unittests/calc_hash.py ./xappy/unittests/cachemanager.py ./xappy/unittests/cached_searches.py ./xappy/unittests/__init__.py ./xappy/searchresults.py ./xappy/searchconnection.py ./xappy/query.py ./xappy/perftest/harness.py ./xappy/perftest/cachemanager.py ./xappy/parsedate.py ./xappy/mset_search_results.py ./xappy/memutils.py ./xappy/marshall.py ./xappy/indexerconnection.py ./xappy/highlight.py ./xappy/fields.py ./xappy/fieldmappings.py ./xappy/fieldactions.py ./xappy/errors.py ./xappy/datastructures.py 
./xappy/colour_data.py ./xappy/colour.py ./xappy/cachemanager/generic.py ./xappy/cachemanager/xapian_manager.py ./xappy/cachemanager/verify_cache.py ./xappy/cachemanager/queryinvert.py ./xappy/cachemanager/numpy_inverter.py ./xappy/cachemanager/inmemory_inverter.py ./xappy/cachemanager/__init__.py ./xappy/cache_search_results.py ./xappy/_checkxapian.py ./xappy/__init__.py ./utils/replay_search_log.py ./utils/dump_field_actions.py ./test.py ./setup.py ./research/expand_prefixes.py ./perftest/setuppaths.py ./perftest/searcher.py ./perftest/perftest.py ./perftest/parseargs.py ./perftest/parse_wikipedia/wiki2dump.py ./perftest/parse_wikipedia/XMLUtils.py ./perftest/parse_wikipedia/HTMLUtils.py ./perftest/parse_wikipedia/Errors.py ./perftest/indexer.py ./perftest/imgsearch.py ./perftest/imgidx.py ./perftest/gen_queries.py ./perftest/analyse_searchlogs.py ./perftest/analyse_indexlogs.py ./libs/get_xapian.py ./external_posting_source/sortdatabase/test_sortdatabase.py ./external_posting_source/sortdatabase/make_order.py ./external_posting_source/decreasing_weight/verifydb.py ./external_posting_source/decreasing_weight/time_dws.py ./external_posting_source/decreasing_weight/test_dws.py ./external_posting_source/decreasing_weight/dws.py ./examples/search.py ./examples/fileindex.py ./eggsetup.py ./coverage.py ./build.py + exit 0 Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.82641 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd python3-module-xappy-0.6.0 + CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -march=i586 -mtune=generic' + export CFLAGS + CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -march=i586 -mtune=generic' + export CXXFLAGS + FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -march=i586 -mtune=generic' + export FFLAGS + /usr/bin/python3 setup.py build running build running build_py creating build creating build/lib creating build/lib/xappy copying xappy/__init__.py -> build/lib/xappy copying xappy/_checkxapian.py -> 
build/lib/xappy copying xappy/cache_search_results.py -> build/lib/xappy copying xappy/colour.py -> build/lib/xappy copying xappy/colour_data.py -> build/lib/xappy copying xappy/datastructures.py -> build/lib/xappy copying xappy/errors.py -> build/lib/xappy copying xappy/fieldactions.py -> build/lib/xappy copying xappy/fieldmappings.py -> build/lib/xappy copying xappy/fields.py -> build/lib/xappy copying xappy/highlight.py -> build/lib/xappy copying xappy/indexerconnection.py -> build/lib/xappy copying xappy/marshall.py -> build/lib/xappy copying xappy/memutils.py -> build/lib/xappy copying xappy/mset_search_results.py -> build/lib/xappy copying xappy/parsedate.py -> build/lib/xappy copying xappy/query.py -> build/lib/xappy copying xappy/searchconnection.py -> build/lib/xappy copying xappy/searchresults.py -> build/lib/xappy copying xappy/utils.py -> build/lib/xappy creating build/lib/xappy/cachemanager copying xappy/cachemanager/__init__.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/inmemory_inverter.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/numpy_inverter.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/queryinvert.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/verify_cache.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/xapian_manager.py -> build/lib/xappy/cachemanager copying xappy/cachemanager/generic.py -> build/lib/xappy/cachemanager + exit 0 Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.14798 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + /bin/chmod -Rf u+rwX -- /usr/src/tmp/python3-module-xappy-buildroot + : + /bin/rm -rf -- /usr/src/tmp/python3-module-xappy-buildroot + cd python3-module-xappy-0.6.0 + CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -march=i586 -mtune=generic' + export CFLAGS + CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -march=i586 -mtune=generic' + export CXXFLAGS + FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 
-march=i586 -mtune=generic' + export FFLAGS + /usr/bin/python3 setup.py install --skip-build --root=/usr/src/tmp/python3-module-xappy-buildroot --force running install running install_lib creating /usr/src/tmp/python3-module-xappy-buildroot creating /usr/src/tmp/python3-module-xappy-buildroot/usr creating /usr/src/tmp/python3-module-xappy-buildroot/usr/lib creating /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3 creating /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages creating /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy creating /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/generic.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/xapian_manager.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/verify_cache.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/queryinvert.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/numpy_inverter.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/inmemory_inverter.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/cachemanager/__init__.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager copying build/lib/xappy/utils.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/searchresults.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying 
build/lib/xappy/searchconnection.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/query.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/parsedate.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/mset_search_results.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/memutils.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/marshall.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/indexerconnection.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/highlight.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/fields.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/fieldmappings.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/fieldactions.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/errors.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/datastructures.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/colour_data.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/colour.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/cache_search_results.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/_checkxapian.py -> 
/usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy copying build/lib/xappy/__init__.py -> /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py to generic.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py to xapian_manager.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/verify_cache.py to verify_cache.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py to queryinvert.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py to numpy_inverter.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/inmemory_inverter.py to inmemory_inverter.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py to __init__.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/utils.py to utils.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py to searchresults.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py to searchconnection.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/query.py to query.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/parsedate.py to parsedate.cpython-38.pyc byte-compiling 
/usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py to mset_search_results.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py to memutils.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/marshall.py to marshall.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py to indexerconnection.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/highlight.py to highlight.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fields.py to fields.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldmappings.py to fieldmappings.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py to fieldactions.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/errors.py to errors.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py to datastructures.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour_data.py to colour_data.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py to colour.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py to cache_search_results.cpython-38.pyc byte-compiling /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/_checkxapian.py to _checkxapian.cpython-38.pyc byte-compiling 
/usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py to __init__.cpython-38.pyc
running install_egg_info
Writing /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy-0.6.0-py3.8.egg-info
+ /usr/lib/rpm/brp-alt
Cleaning files in /usr/src/tmp/python3-module-xappy-buildroot (auto)
Verifying and fixing files in /usr/src/tmp/python3-module-xappy-buildroot (binconfig,pkgconfig,libtool,desktop)
Checking contents of files in /usr/src/tmp/python3-module-xappy-buildroot/ (default)
Compressing files in /usr/src/tmp/python3-module-xappy-buildroot (auto)
Adjusting library links in /usr/src/tmp/python3-module-xappy-buildroot
./usr/lib:
Verifying ELF objects in /usr/src/tmp/python3-module-xappy-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal)
Bytecompiling python modules in /usr/src/tmp/python3-module-xappy-buildroot using /usr/bin/python2.7
Bytecompiling python modules with optimization in /usr/src/tmp/python3-module-xappy-buildroot using /usr/bin/python2.7 -O
Bytecompiling python3 modules in /usr/src/tmp/python3-module-xappy-buildroot using /usr/bin/python3
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.pyc
unlink /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-38.pyc
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/inmemory_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/verify_cache.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/_checkxapian.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour_data.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/errors.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldmappings.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fields.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/highlight.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/marshall.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/parsedate.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/query.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/utils.py
Bytecompiling python3 modules with optimization in /usr/src/tmp/python3-module-xappy-buildroot using /usr/bin/python3 -O
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/inmemory_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/verify_cache.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/_checkxapian.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour_data.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/errors.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldmappings.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fields.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/highlight.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/marshall.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/parsedate.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/query.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/utils.py
Bytecompiling python3 modules with optimization-2 in /usr/src/tmp/python3-module-xappy-buildroot using /usr/bin/python3 -OO
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/inmemory_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/verify_cache.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/_checkxapian.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour_data.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/errors.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldmappings.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fields.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/highlight.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/marshall.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/parsedate.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/query.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py
compile /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/utils.py
Hardlinking identical .pyc and .opt-?.pyc files
'./usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.pyc'
'./usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.opt-1.pyc' => './usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.pyc'
Hardlinking identical .pyc and .pyo files
Processing files: python3-module-xappy-0.6.0-alt2
Executing(%doc): /bin/sh -e /usr/src/tmp/rpm-tmp.20120
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd python3-module-xappy-0.6.0
+ DOCDIR=/usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ export DOCDIR
+ rm -rf /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ /bin/mkdir -p /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ cp -prL AUTHORS ChangeLog README /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ chmod -R go-w /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ chmod -R a+rX /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-0.6.0
+ exit 0
Finding Provides (using /usr/lib/rpm/find-provides)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.hXNhH8
find-provides: running scripts (alternatives,debuginfo,lib,pam,perl,pkgconfig,python,python3,shell)
Finding Requires (using /usr/lib/rpm/find-requires)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.RFfSW6
find-requires: running scripts (cpp,debuginfo,files,lib,pam,perl,pkgconfig,pkgconfiglib,python,python3,rpmlib,shebang,shell,static,symlinks,systemd-services)
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=31 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=32 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=33 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=34 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=35 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=36 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=37 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/__init__.py: line=38 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/_checkxapian.py: line=51 IGNORE (for REQ=slight and deep=8) module=xapian.imgseek
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py: line=25 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py: line=27 IGNORE (for REQ=slight and deep=8) module=simplejson
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cache_search_results.py: line=29 IGNORE (for REQ=slight and deep=8) module=json
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py: line=27 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py: line=29 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py: line=32 IGNORE (for REQ=slight and deep=8) module=hashlib
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py: line=34 IGNORE (for REQ=slight and deep=8) module=md5
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py: line=37 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py: line=40 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py: line=27 possible relative import from ., UNIMPLEMENTED
python3.req: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py: skipping itertools
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py: line=101 IGNORE (for REQ=slight and deep=8) module=random
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/verify_cache.py: line=146 IGNORE (for REQ=slight and deep=8) module=sys
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py: line=28 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py: line=32 IGNORE (for REQ=slight and deep=8) module=simplejson
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py: line=34 IGNORE (for REQ=slight and deep=8) module=json
python3.req: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py: skipping itertools
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py: line=58 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py: line=373 IGNORE (for REQ=slight and deep=8) module=xapian
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py: line=374 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/colour.py: line=461 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py: line=27 IGNORE (for REQ=slight and deep=8) module=hashlib
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py: line=29 IGNORE (for REQ=slight and deep=8) module=sha
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py: line=30 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/datastructures.py: line=31 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/errors.py: line=62 IGNORE (for REQ=slight and deep=8) module=xapian
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=27 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=28 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=29 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=30 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=33 IGNORE (for REQ=slight and deep=8) module=xapian.imgseek
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=36 possible relative import from ., UNIMPLEMENTED
/usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/fieldactions.py: line=206 IGNORE (for REQ=slight and deep=12) module=xapian.imgseek /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=27 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=31 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=32 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=33 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=36 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=37 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=38 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/indexerconnection.py: line=39 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py: line=61 IGNORE (for REQ=slight and deep=12) module=ctypes /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/memutils.py: line=62 IGNORE 
(for REQ=slight and deep=12) module=ctypes.wintypes /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py: line=27 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py: line=29 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py: line=30 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py: line=31 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/mset_search_results.py: line=34 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/query.py: line=26 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=28 possible relative import from ., UNIMPLEMENTED python3.req: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: skipping itertools /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=36 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=37 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=38 
possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=39 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=40 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=43 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=44 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=45 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=47 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=48 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=49 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=272 IGNORE (for REQ=slight and deep=19) module=sys /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=272 IGNORE (for REQ=slight and deep=19) module=traceback /usr/lib/rpm/python3.req.py: 
/usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=930 IGNORE (for REQ=slight and deep=11) module=xapian.imgseek /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchconnection.py: line=1705 IGNORE (for REQ=slight and deep=11) module=xappy /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=27 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=28 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=29 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=30 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=31 possible relative import from ., UNIMPLEMENTED /usr/lib/rpm/python3.req.py: /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/searchresults.py: line=32 possible relative import from ., UNIMPLEMENTED shebang.req.files: executable script /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/__init__.py is not executable shebang.req.files: executable script /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/generic.py is not executable shebang.req.files: executable script /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/inmemory_inverter.py is not executable shebang.req.files: executable script 
/usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/numpy_inverter.py is not executable
shebang.req.files: executable script /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/queryinvert.py is not executable
shebang.req.files: executable script /usr/src/tmp/python3-module-xappy-buildroot/usr/lib/python3/site-packages/xappy/cachemanager/xapian_manager.py is not executable
Provides: python3(xappy), python3(xappy._checkxapian), python3(xappy.cache_search_results), python3(xappy.cachemanager), python3(xappy.cachemanager.generic), python3(xappy.cachemanager.inmemory_inverter), python3(xappy.cachemanager.numpy_inverter), python3(xappy.cachemanager.queryinvert), python3(xappy.cachemanager.verify_cache), python3(xappy.cachemanager.xapian_manager), python3(xappy.colour), python3(xappy.colour_data), python3(xappy.datastructures), python3(xappy.errors), python3(xappy.fieldactions), python3(xappy.fieldmappings), python3(xappy.fields), python3(xappy.highlight), python3(xappy.indexerconnection), python3(xappy.marshall), python3(xappy.memutils), python3(xappy.mset_search_results), python3(xappy.parsedate), python3(xappy.query), python3(xappy.searchconnection), python3(xappy.searchresults), python3(xappy.utils)
Requires: /usr/lib/python3/site-packages, python3(collections) < 0, python3(colormath) < 0, python3(colormath.color_objects) < 0, python3(copy) < 0, python3(datetime) < 0, python3(inspect) < 0, python3(math) < 0, python3(numpy) < 0, python3(operator) < 0, python3(os) < 0, python3(pickle) < 0, python3(re) < 0, python3(scipy.cluster) < 0, python3(scipy.ndimage) < 0, python3(shutil) < 0, python3(tempfile) < 0, python3(threading) < 0, python3(xapian) < 0
Processing files: python3-module-xappy-docs-0.6.0-alt2
Executing(%doc): /bin/sh -e /usr/src/tmp/rpm-tmp.2229
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd python3-module-xappy-0.6.0
+ DOCDIR=/usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ export DOCDIR
+ rm -rf /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ /bin/mkdir -p /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ cp -prL docs/cachedresults.rst docs/colour.rst docs/image.rst docs/introduction.rst docs/queries.rst docs/running_perftest.txt docs/weighting.rst examples /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ chmod -R go-w /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ chmod -R a+rX /usr/src/tmp/python3-module-xappy-buildroot/usr/share/doc/python3-module-xappy-docs-0.6.0
+ exit 0
Finding Provides (using /usr/lib/rpm/find-provides)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.78rxy7
find-provides: running scripts (alternatives,debuginfo,lib,pam,perl,pkgconfig,python,python3,shell)
Finding Requires (using /usr/lib/rpm/find-requires)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.em7Le8
find-requires: running scripts (cpp,debuginfo,files,lib,pam,perl,pkgconfig,pkgconfiglib,python,python3,rpmlib,shebang,shell,static,symlinks,systemd-services)
Wrote: /usr/src/RPM/RPMS/noarch/python3-module-xappy-0.6.0-alt2.noarch.rpm
Wrote: /usr/src/RPM/RPMS/noarch/python3-module-xappy-docs-0.6.0-alt2.noarch.rpm
33.36user 0.83system 0:54.18elapsed 63%CPU (0avgtext+0avgdata 33104maxresident)k
0inputs+0outputs (0major+215674minor)pagefaults 0swaps
/.out/python3-module-xappy-0.6.0-alt2.noarch.rpm: The use of such a license name is ambiguous: GPL
/.out/python3-module-xappy-docs-0.6.0-alt2.noarch.rpm: The use of such a license name is ambiguous: GPL
42.38user 4.96system 1:14.09elapsed 63%CPU (0avgtext+0avgdata 108536maxresident)k
0inputs+0outputs (0major+679374minor)pagefaults 0swaps
--- python3-module-xappy-0.6.0-alt2.noarch.rpm.repo	2019-12-30 08:20:52.000000000 +0000
+++ python3-module-xappy-0.6.0-alt2.noarch.rpm.hasher	2020-04-25 12:33:13.586750816 +0000
@@ -1,65 +1,65 @@
 /usr/lib/python3/site-packages/xappy 40755
-/usr/lib/python3/site-packages/xappy-0.6.0-py3.7.egg-info 100644
+/usr/lib/python3/site-packages/xappy-0.6.0-py3.8.egg-info 100644
 /usr/lib/python3/site-packages/xappy/__init__.py 100644
 /usr/lib/python3/site-packages/xappy/__pycache__ 40755
-/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-37.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/__init__.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/_checkxapian.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/cache_search_results.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/colour_data.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/datastructures.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/errors.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldactions.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fieldmappings.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/fields.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/highlight.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/indexerconnection.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/marshall.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/memutils.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/mset_search_results.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/parsedate.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/query.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchconnection.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/searchresults.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/__pycache__/utils.cpython-38.pyc 100644
 /usr/lib/python3/site-packages/xappy/_checkxapian.py 100644
@@ -69,23 +69,23 @@
 /usr/lib/python3/site-packages/xappy/cachemanager/__pycache__ 40755
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-37.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-37.opt-1.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-37.opt-2.pyc 100644
-/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-37.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/__init__.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/generic.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/inmemory_inverter.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/numpy_inverter.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/queryinvert.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/verify_cache.cpython-38.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-38.opt-1.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-38.opt-2.pyc 100644
+/usr/lib/python3/site-packages/xappy/cachemanager/__pycache__/xapian_manager.cpython-38.pyc 100644
 /usr/lib/python3/site-packages/xappy/cachemanager/generic.py 100644