<86>Nov 26 10:14:33 userdel[2111777]: delete user 'rooter'
<86>Nov 26 10:14:33 userdel[2111777]: removed group 'rooter' owned by 'rooter'
<86>Nov 26 10:14:33 userdel[2111777]: removed shadow group 'rooter' owned by 'rooter'
<86>Nov 26 10:14:33 groupadd[2111810]: group added to /etc/group: name=rooter, GID=708
<86>Nov 26 10:14:33 groupadd[2111810]: group added to /etc/gshadow: name=rooter
<86>Nov 26 10:14:33 groupadd[2111810]: new group: name=rooter, GID=708
<86>Nov 26 10:14:33 useradd[2111827]: new user: name=rooter, UID=708, GID=708, home=/root, shell=/bin/bash
<86>Nov 26 10:14:33 userdel[2111867]: delete user 'builder'
<86>Nov 26 10:14:33 userdel[2111867]: removed group 'builder' owned by 'builder'
<86>Nov 26 10:14:33 userdel[2111867]: removed shadow group 'builder' owned by 'builder'
<86>Nov 26 10:14:33 groupadd[2111898]: group added to /etc/group: name=builder, GID=709
<86>Nov 26 10:14:33 groupadd[2111898]: group added to /etc/gshadow: name=builder
<86>Nov 26 10:14:33 groupadd[2111898]: new group: name=builder, GID=709
<86>Nov 26 10:14:33 useradd[2111926]: new user: name=builder, UID=709, GID=709, home=/usr/src, shell=/bin/bash
warning: Macro %cmake_insource not found
<13>Nov 26 10:14:35 rpmi: libuv-1.44.2-alt1 sisyphus+303845.100.1.1 1658053887 installed
<13>Nov 26 10:14:35 rpmi: libjsoncpp24-1.9.4-alt2 sisyphus+286441.100.1.1 1633444234 installed
<13>Nov 26 10:14:35 rpmi: libexpat-2.5.0-alt1 sisyphus+309227.100.1.1 1667075766 installed
<13>Nov 26 10:14:35 rpmi: libidn2-2.3.4-alt1 sisyphus+309023.100.1.1 1666791089 installed
<13>Nov 26 10:14:35 rpmi: libxxhash-0.8.0-alt2 sisyphus+277476.100.2.1 1625621318 installed
<13>Nov 26 10:14:35 rpmi: liblz4-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667413000 installed
<13>Nov 26 10:14:35 rpmi: gcc-c++-common-1.4.27-alt1 sisyphus+278099.1300.1.1 1626028636 installed
<13>Nov 26 10:14:35 rpmi: libstdc++12-devel-12.1.1-alt2 sisyphus+307182.100.1.1 1663782147 installed
<13>Nov 26 10:14:37 rpmi: gcc12-c++-12.1.1-alt2 sisyphus+307182.100.1.1 1663782147 installed
<13>Nov 26 10:14:37 rpmi: rpm-macros-cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345623 installed
<13>Nov 26 10:14:37 rpmi: cmake-modules-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Nov 26 10:14:37 rpmi: librhash-1.3.5-alt3 sisyphus+286141.40.2.1 1632982456 installed
<13>Nov 26 10:14:37 rpmi: publicsuffix-list-dafsa-20221003-alt1 sisyphus+308013.100.1.1 1665137688 installed
<13>Nov 26 10:14:37 rpmi: libpsl-0.21.1-alt2 sisyphus+279461.100.1.1 1626547555 installed
<13>Nov 26 10:14:37 rpmi: libnghttp2-1.51.0-alt1 sisyphus+310565.100.1.1 1669296600 installed
<13>Nov 26 10:14:37 rpmi: openldap-common-2.6.3-alt1 sisyphus+306372.60.8.1 1663095223 installed
<13>Nov 26 10:14:37 rpmi: libverto-0.3.2-alt1_1 sisyphus+279289.100.1.3 1626493872 installed
<13>Nov 26 10:14:37 rpmi: liblmdb-0.9.29-alt1.1 sisyphus+306630.100.1.1 1663072361 installed
<13>Nov 26 10:14:37 rpmi: libkeyutils-1.6.3-alt1 sisyphus+266061.100.1.1 1612919567 installed
<13>Nov 26 10:14:37 rpmi: libcom_err-1.46.4.0.5.4cda-alt1 sisyphus+283826.100.1.1 1629975361 installed
<13>Nov 26 10:14:37 rpmi: libbrotlicommon-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Nov 26 10:14:37 rpmi: libbrotlidec-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Nov 26 10:14:37 rpmi: libp11-kit-0.24.1-alt1 sisyphus+293720.100.1.1 1642535281 installed
<13>Nov 26 10:14:37 rpmi: libtasn1-4.19.0-alt1 sisyphus+305700.100.1.1 1661359628 installed
<13>Nov 26 10:14:37 rpmi: rpm-macros-alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Nov 26 10:14:37 rpmi: alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Nov 26 10:14:37 rpmi: ca-certificates-2022.09.15-alt1 sisyphus+306895.200.1.1 1663268411 installed
<13>Nov 26 10:14:37 rpmi: ca-trust-0.1.4-alt1 sisyphus+308690.100.1.1 1666182992 installed
<13>Nov 26 10:14:37 rpmi: p11-kit-trust-0.24.1-alt1 sisyphus+293720.100.1.1 1642535281 installed
<13>Nov 26 10:14:37 rpmi: libcrypto1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657027052 installed
<13>Nov 26 10:14:37 rpmi: libssl1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657027052 installed
<86>Nov 26 10:14:37 groupadd[2127932]: group added to /etc/group: name=_keytab, GID=499
<86>Nov 26 10:14:37 groupadd[2127932]: group added to /etc/gshadow: name=_keytab
<86>Nov 26 10:14:37 groupadd[2127932]: new group: name=_keytab, GID=499
<13>Nov 26 10:14:37 rpmi: libkrb5-1.19.4-alt1 sisyphus+310092.100.2.1 1668703628 installed
<86>Nov 26 10:14:37 groupadd[2128325]: group added to /etc/group: name=sasl, GID=498
<86>Nov 26 10:14:37 groupadd[2128325]: group added to /etc/gshadow: name=sasl
<86>Nov 26 10:14:37 groupadd[2128325]: new group: name=sasl, GID=498
<13>Nov 26 10:14:37 rpmi: libsasl2-3-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097332 installed
<13>Nov 26 10:14:37 rpmi: libldap2-2.6.3-alt1 sisyphus+306372.60.8.1 1663095246 installed
<13>Nov 26 10:14:37 rpmi: libcurl-7.86.0-alt1 sisyphus+309443.100.1.1 1667479842 installed
<13>Nov 26 10:14:37 rpmi: libarchive13-3.6.1-alt1 sisyphus+309072.100.1.1 1666870190 installed
<13>Nov 26 10:14:38 rpmi: cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345623 installed
<13>Nov 26 10:14:38 rpmi: ctest-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345623 installed
<13>Nov 26 10:14:38 rpmi: libsasl2-devel-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097332 installed
<13>Nov 26 10:14:38 rpmi: libssl-devel-1.1.1q-alt1 sisyphus+303203.100.1.1 1657027052 installed
<13>Nov 26 10:14:38 rpmi: gcc-c++-12-alt1 sisyphus+300988.300.1.1 1654033914 installed
<13>Nov 26 10:14:38 rpmi: liblz4-devel-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667413000 installed
<13>Nov 26 10:14:38 rpmi: libxxhash-devel-0.8.0-alt2 sisyphus+277476.100.2.1 1625621318 installed
Building target platforms: i586
Building for target i586
Wrote: /usr/src/in/nosrpm/librdkafka-1.9.2-alt1.nosrc.rpm (w1.gzdio)
Installing librdkafka-1.9.2-alt1.src.rpm
Building target platforms: i586
Building for target i586
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.44694
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf librdkafka-1.9.2
+ echo 'Source #0 (librdkafka-1.9.2.tar):'
Source #0 (librdkafka-1.9.2.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/librdkafka-1.9.2.tar
+ cd librdkafka-1.9.2
+ /bin/chmod -c -Rf u+rwX,go-w .
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.44694
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd librdkafka-1.9.2
+ mkdir -p .
+ cmake -DCMAKE_SKIP_INSTALL_RPATH:BOOL=yes '-DCMAKE_C_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic' '-DCMAKE_CXX_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic' '-DCMAKE_Fortran_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic' -DCMAKE_INSTALL_PREFIX=/usr -DINCLUDE_INSTALL_DIR:PATH=/usr/include -DLIB_INSTALL_DIR:PATH=/usr/lib -DSYSCONF_INSTALL_DIR:PATH=/etc -DSHARE_INSTALL_PREFIX:PATH=/usr/share -DLIB_DESTINATION=lib -DLIB_SUFFIX= -S . -B .
-- The C compiler identification is GNU 12.1.1
-- The CXX compiler identification is GNU 12.1.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pow in m
-- Looking for pow in m - found
-- Checking for module 'libsasl2'
-- Found libsasl2, version 2.1.27
-- Found LZ4: /usr/lib/liblz4.so (found version "1.9.4")
-- Found OpenSSL: /usr/lib/libcrypto.so (found version "1.1.1q")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Configuring done
-- Generating done
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_Fortran_FLAGS
    INCLUDE_INSTALL_DIR
    LIB_DESTINATION
    LIB_INSTALL_DIR
    LIB_SUFFIX
    SHARE_INSTALL_PREFIX
    SYSCONF_INSTALL_DIR

-- Build files have been written to: /usr/src/RPM/BUILD/librdkafka-1.9.2
+ make -j8
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdcrc32.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdfnv1a.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 1%] Building C object src/CMakeFiles/rdkafka.dir/crc32c.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 1%] Building C object src/CMakeFiles/rdkafka.dir/rdaddr.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdbuf.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdavl.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_event.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_buf.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_feature.c.o
make[2]: Leaving directory
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_lz4.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_metadata_cache.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cgrp.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_conf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 8%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msg.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_metadata.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_broker.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_writer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_op.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_offset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_pattern.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 8%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_reader.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_roundrobin_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_partition.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_range_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 11%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_queue.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_plain.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_subscription.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sticky_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_interceptor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_header.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_timer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_topic.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignment.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_transport.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_request.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_aux.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_background.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_idempotence.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 19%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cert.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdlist.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 19%] Building C object 
src/CMakeFiles/rdkafka.dir/rdkafka_coord.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_error.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 22%] Building C object src/CMakeFiles/rdkafka.dir/rdmurmur2.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_cgrp.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdports.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdrand.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 22%] Building C object src/CMakeFiles/rdkafka.dir/rdlog.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdregex.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_txnmgr.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdstring.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdvarint.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdmap.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock.c.o
In file included from /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rd.h:73,
                 from /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_int.h:46,
                 from /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_mock.c:34:
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_mock.c: In function 'rd_kafka_mock_cluster_new':
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_mock.c:2407:57: warning: format '%lx' expects argument of type 'long unsigned int', but argument 4 has type 'int' [-Wformat=]
 2407 |         rd_snprintf(mcluster->id, sizeof(mcluster->id), "mockCluster%lx",
      |                                                         ^~~~~~~~~~~~~~~~
 2408 |                     (intptr_t)mcluster >> 2);
      |                     ~~~~~~~~~~~~~~~~~~~~~~~
      |                                           |
      |                                           int
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdposix.h:87:36: note: in definition of macro 'rd_snprintf'
   87 | #define rd_snprintf(...) snprintf(__VA_ARGS__)
      |                                    ^~~~~~~~~~~
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_mock.c:2407:71: note: format string is defined here
 2407 |         rd_snprintf(mcluster->id, sizeof(mcluster->id), "mockCluster%lx",
      |                                                                     ~~^
      |                                                                     |
      |                                                                     long unsigned int
      |                                                                     %x
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/snappy.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread_extra.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdunittest.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_admin.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_handlers.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rddl.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 27%] Building C object src/CMakeFiles/rdkafka.dir/cJSON.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 27%] Building C object src/CMakeFiles/rdkafka.dir/rdxxhash.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_plugin.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_ssl.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rdhdrhistogram.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_cyrus.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_scram.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_oauthbearer.c.o
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 30%] Linking C shared library librdkafka.so
make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 30%] Built target rdkafka
make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2'
[ 30%] Building C object
examples/CMakeFiles/producer.dir/producer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building C object examples/CMakeFiles/consumer.dir/consumer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building C object examples/CMakeFiles/misc.dir/misc.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 34%] Linking C executable producer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Built target producer make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 33%] Building C object examples/CMakeFiles/rdkafka_performance.dir/rdkafka_performance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 33%] Building C object examples/CMakeFiles/rdkafka_example.dir/rdkafka_example.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 34%] Linking C executable consumer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Built target consumer make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Linking C executable misc make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Built target misc make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Building C object examples/CMakeFiles/rdkafka_complex_consumer_example.dir/rdkafka_complex_consumer_example.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Building C object tests/interceptor_test/CMakeFiles/interceptor_test.dir/interceptor_test.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 31%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConfImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Linking C shared library interceptor_test.so make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target interceptor_test make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Linking C executable rdkafka_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target rdkafka_example make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 31%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConsumerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Linking C executable rdkafka_complex_consumer_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target rdkafka_complex_consumer_example make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HandleImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HeadersImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Linking C executable rdkafka_performance make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Built target rdkafka_performance make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MessageImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/KafkaConsumerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/QueueImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/RdKafka.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MetadataImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicPartitionImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ProducerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Linking CXX shared library librdkafka++.so make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Built target rdkafka++ make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0000-unittests.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0001-multiobj.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' 
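Note on the gcc -Wformat warning emitted above while compiling rdkafka_mock.c: on this 32-bit i586 target intptr_t is a 32-bit int, so the argument supplied for the "%lx" conversion (which expects unsigned long) has type int. The snippet below is a minimal, self-contained sketch written for this note only; the variable names are placeholders and it is not the upstream librdkafka code or its eventual fix. It reproduces the mismatch and shows two conventional ways to make the conversion specifier and the argument type agree.

/* Illustration of the -Wformat mismatch reported for rdkafka_mock.c:2407
 * and two portable ways to resolve it.  Placeholder code, not librdkafka. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    char id[32];
    void *p = id;   /* stand-in for the pointer whose bits seed the id */

    /* Warns on ILP32 targets such as i586, where intptr_t is a 32-bit int,
     * because "%lx" expects an unsigned long argument:
     *   snprintf(id, sizeof(id), "mockCluster%lx", (intptr_t)p >> 2);
     */

    /* Option 1: cast the value to the exact type the conversion expects. */
    snprintf(id, sizeof(id), "mockCluster%lx",
             (unsigned long)((uintptr_t)p >> 2));
    puts(id);

    /* Option 2: keep a pointer-sized unsigned type and use the matching
     * <inttypes.h> conversion macro, correct on both 32- and 64-bit targets. */
    snprintf(id, sizeof(id), "mockCluster%" PRIxPTR, (uintptr_t)p >> 2);
    puts(id);

    return 0;
}

Either form keeps the conversion specifier and argument type consistent regardless of pointer width; this is the only gcc diagnostic that appears in the build output above.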
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0002-unkpart.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 45%] Building C object tests/CMakeFiles/test-runner.dir/0003-msgmaxsize.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 45%] Building C object tests/CMakeFiles/test-runner.dir/0004-conf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0005-order.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0006-symbols.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0007-autotopic.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 47%] Building C object tests/CMakeFiles/test-runner.dir/0009-mock_cluster.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 47%] Building C object tests/CMakeFiles/test-runner.dir/0008-reqacks.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 48%] Building C object tests/CMakeFiles/test-runner.dir/0011-produce_batch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 48%] Building C object tests/CMakeFiles/test-runner.dir/0012-produce_consume.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0013-null-msgs.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0015-offset_seeks.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0014-reconsume-191.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Building CXX object examples/CMakeFiles/rdkafka_example_cpp.dir/rdkafka_example.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 50%] Building C object tests/CMakeFiles/test-runner.dir/0016-client_swname.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Building C object tests/CMakeFiles/test-runner.dir/0018-cgrp_term.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Building CXX object examples/CMakeFiles/openssl_engine_example_cpp.dir/openssl_engine_example.cpp.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 50%] Building C object tests/CMakeFiles/test-runner.dir/0017-compression.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Building C object tests/CMakeFiles/test-runner.dir/0020-destroy_hang.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 53%] Building C object tests/CMakeFiles/test-runner.dir/0021-rkt_destroy.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 53%] Building C object tests/CMakeFiles/test-runner.dir/0022-consume_batch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Building CXX object examples/CMakeFiles/rdkafka_complex_consumer_example_cpp.dir/rdkafka_complex_consumer_example.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Building CXX object examples/CMakeFiles/producer_cpp.dir/producer.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Building C object tests/CMakeFiles/test-runner.dir/0019-list_groups.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 54%] Building C object tests/CMakeFiles/test-runner.dir/0025-timers.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0028-long_topicnames.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 54%] Building C object tests/CMakeFiles/test-runner.dir/0026-consume_pause.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0029-assign_offset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0030-offset_commit.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Building CXX object examples/CMakeFiles/kafkatest_verifiable_client.dir/kafkatest_verifiable_client.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 57%] Building C object tests/CMakeFiles/test-runner.dir/0031-get_offsets.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 52%] Linking CXX executable openssl_engine_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Built target openssl_engine_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 55%] Linking CXX executable producer_cpp make[2]: 
Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Built target producer_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 57%] Building C object tests/CMakeFiles/test-runner.dir/0033-regex_subscribe.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0034-offset_reset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0036-partial_fetch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 60%] Building C object tests/CMakeFiles/test-runner.dir/0040-io_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 59%] Building C object tests/CMakeFiles/test-runner.dir/0037-destroy_hang_local.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0035-api_version.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Linking CXX executable rdkafka_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0041-fetch_max_bytes.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Built target rdkafka_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 59%] Building C object tests/CMakeFiles/test-runner.dir/0038-performance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0043-no_connection.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0042-many_topics.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 60%] Building C object tests/CMakeFiles/test-runner.dir/0039-event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 55%] Linking CXX executable rdkafka_complex_consumer_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Built target rdkafka_complex_consumer_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 62%] Building C object tests/CMakeFiles/test-runner.dir/0044-partition_cnt.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 63%] Building C object tests/CMakeFiles/test-runner.dir/0046-rkt_cache.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0050-subscribe_adds.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 65%] Building C object tests/CMakeFiles/test-runner.dir/0051-assign_adds.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 62%] Building C object tests/CMakeFiles/test-runner.dir/0045-subscribe_update.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 63%] Building C object tests/CMakeFiles/test-runner.dir/0047-partial_buf_tmout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0048-partitioner.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0049-consume_conn_close.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building C object tests/CMakeFiles/test-runner.dir/0052-msg_timestamps.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building C object tests/CMakeFiles/test-runner.dir/0055-producer_latency.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 67%] Building C object tests/CMakeFiles/test-runner.dir/0056-balanced_group_mt.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0053-stats_cb.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building C object tests/CMakeFiles/test-runner.dir/0062-stats_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0054-offset_time.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building CXX object tests/CMakeFiles/test-runner.dir/0061-consumer_lag.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 70%] Building C object tests/CMakeFiles/test-runner.dir/0064-interceptors.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building CXX object tests/CMakeFiles/test-runner.dir/0060-op_prio.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0058-log.cpp.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 67%] Building CXX object tests/CMakeFiles/test-runner.dir/0057-invalid_topic.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 72%] Building C object tests/CMakeFiles/test-runner.dir/0068-produce_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 72%] Building C object tests/CMakeFiles/test-runner.dir/0069-consumer_add_parts.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 73%] Building C object tests/CMakeFiles/test-runner.dir/0072-headers_ut.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0059-bsearch.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0073-headers.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0074-producev.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0075-retry.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0066-plugins.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0067-empty_topic.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 75%] Building C object tests/CMakeFiles/test-runner.dir/0076-produce_retry.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Linking CXX executable kafkatest_verifiable_client make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Built target kafkatest_verifiable_client make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 75%] Building C object tests/CMakeFiles/test-runner.dir/0077-compaction.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 73%] Building CXX object tests/CMakeFiles/test-runner.dir/0070-null_empty.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building C object tests/CMakeFiles/test-runner.dir/0079-fork.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 78%] Building C object tests/CMakeFiles/test-runner.dir/0083-cb_event.c.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0065-yield.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 78%] Building C object tests/CMakeFiles/test-runner.dir/0084-destroy_flags.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 70%] Building CXX object tests/CMakeFiles/test-runner.dir/0063-clusterid.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 79%] Building C object tests/CMakeFiles/test-runner.dir/0086-purge.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 80%] Building C object tests/CMakeFiles/test-runner.dir/0089-max_poll_interval.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 80%] Building C object tests/CMakeFiles/test-runner.dir/0090-idempotence.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 79%] Building C object tests/CMakeFiles/test-runner.dir/0088-produce_metadata_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 81%] Building C object tests/CMakeFiles/test-runner.dir/0091-max_poll_interval_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building C object tests/CMakeFiles/test-runner.dir/0093-holb.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building C object tests/CMakeFiles/test-runner.dir/0080-admin_ut.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 81%] Building C object tests/CMakeFiles/test-runner.dir/0092-mixed_msgver.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building CXX object tests/CMakeFiles/test-runner.dir/0078-c_from_cpp.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 77%] Building C object tests/CMakeFiles/test-runner.dir/0081-admin.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building C object tests/CMakeFiles/test-runner.dir/0094-idempotence_msg_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building C object tests/CMakeFiles/test-runner.dir/0099-commit_metadata.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 77%] Building CXX object tests/CMakeFiles/test-runner.dir/0082-fetch_max_bytes.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 85%] Building C object 
tests/CMakeFiles/test-runner.dir/0102-static_group_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0100-thread_interceptors.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 86%] Building C object tests/CMakeFiles/test-runner.dir/0104-fetch_from_follower_mock.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building C object tests/CMakeFiles/test-runner.dir/0106-cgrp_sess_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 79%] Building CXX object tests/CMakeFiles/test-runner.dir/0085-headers.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0101-fetch-from-follower.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 83%] Building CXX object tests/CMakeFiles/test-runner.dir/0098-consumer-txn.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 85%] Building C object tests/CMakeFiles/test-runner.dir/0103-transactions.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building C object tests/CMakeFiles/test-runner.dir/0107-topic_recreate.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building C object tests/CMakeFiles/test-runner.dir/0112-assign_unknown_part.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building CXX object tests/CMakeFiles/test-runner.dir/0095-all_brokers_down.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 83%] Building CXX object tests/CMakeFiles/test-runner.dir/0097-ssl_verify.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 86%] Building C object tests/CMakeFiles/test-runner.dir/0105-transactions_mock.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0114-sticky_partitioning.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 91%] Building C object tests/CMakeFiles/test-runner.dir/0117-mock_errors.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 91%] Building C object tests/CMakeFiles/test-runner.dir/0118-commit_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building C object tests/CMakeFiles/test-runner.dir/0120-asymmetric_subscription.c.o make[2]: Leaving 
directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0110-batch_size.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0111-delay_create_topics.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building CXX object tests/CMakeFiles/test-runner.dir/0109-auto_create_topics.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building C object tests/CMakeFiles/test-runner.dir/0121-clusterid.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0116-kafkaconsumer_close.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0115-producer_auth.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 93%] Building C object tests/CMakeFiles/test-runner.dir/0123-connections_max_idle.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 93%] Building C object tests/CMakeFiles/test-runner.dir/0122-buffer_cleaning_after_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0124-openssl_invalid_engine.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 95%] Building C object tests/CMakeFiles/test-runner.dir/0129-fetch_aborted_msgs.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0126-oauthbearer_oidc.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0125-immediate_flush.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 96%] Building C object tests/CMakeFiles/test-runner.dir/0130-store_offsets.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 96%] Building C object tests/CMakeFiles/test-runner.dir/0131-connect_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 98%] Building C object tests/CMakeFiles/test-runner.dir/rusage.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building C object tests/CMakeFiles/test-runner.dir/0132-strategy_ordering.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem_ctrl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building CXX object tests/CMakeFiles/test-runner.dir/0119-consumer_auth.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 98%] Building CXX object tests/CMakeFiles/test-runner.dir/testcpp.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building CXX object tests/CMakeFiles/test-runner.dir/8000-idle.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 95%] Building CXX object tests/CMakeFiles/test-runner.dir/0128-sasl_callback_queue.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0113-cooperative_rebalance.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building C object tests/CMakeFiles/test-runner.dir/test.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Linking CXX executable test-runner make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target test-runner make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + exit 0 Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.41300 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + /bin/chmod -Rf u+rwX -- /usr/src/tmp/librdkafka-buildroot + : + /bin/rm -rf -- /usr/src/tmp/librdkafka-buildroot + PATH=/usr/libexec/rpm-build:/usr/src/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/games + cd librdkafka-1.9.2 + make 'INSTALL=/usr/libexec/rpm-build/install -p' install DESTDIR=/usr/src/tmp/librdkafka-buildroot make: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Built target rdkafka make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka++ make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Built target rdkafka++ make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate 
compiler generated dependencies of target consumer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target consumer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_performance make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target rdkafka_performance make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target rdkafka_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_complex_consumer_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Built target rdkafka_complex_consumer_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target openssl_engine_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Built target openssl_engine_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target misc make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target misc make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target rdkafka_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_complex_consumer_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Built target rdkafka_complex_consumer_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target kafkatest_verifiable_client make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Built target kafkatest_verifiable_client make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target test-runner make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Built target test-runner make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target interceptor_test make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target interceptor_test make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Install the project... 
-- Install configuration: "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/cmake/RdKafka/RdKafkaConfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/cmake/RdKafka/RdKafkaConfigVersion.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/cmake/RdKafka/FindLZ4.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/cmake/RdKafka/RdKafkaTargets.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/cmake/RdKafka/RdKafkaTargets-noconfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig/rdkafka.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/librdkafka.so.1 -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/librdkafka.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka_mock.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig/rdkafka++.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/librdkafka++.so.1 -- Set runtime path of "/usr/src/tmp/librdkafka-buildroot/usr/lib/librdkafka++.so.1" to "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib/librdkafka++.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafkacpp.h make: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + mkdir -p /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig + cp /usr/src/RPM/SOURCES/rdkafka.pc /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig/ + /usr/bin/subst 's|@VERSION@|1.9.2|g' /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig/rdkafka++.pc /usr/src/tmp/librdkafka-buildroot/usr/lib/pkgconfig/rdkafka.pc + rm -f '/usr/src/tmp/librdkafka-buildroot/usr/lib/*.a' + rm -f /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt + /usr/lib/rpm/brp-alt Cleaning files in /usr/src/tmp/librdkafka-buildroot (auto) mode of './usr/lib/librdkafka++.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) mode of './usr/lib/librdkafka.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) Verifying and fixing files in /usr/src/tmp/librdkafka-buildroot (binconfig,pkgconfig,libtool,desktop,gnuconfig) /usr/lib/pkgconfig/rdkafka.pc: Libs: '-L${libdir} -lrdkafka' --> '-lrdkafka' /usr/lib/pkgconfig/rdkafka++.pc: Cflags: '-I${includedir}' --> '' /usr/lib/pkgconfig/rdkafka++.pc: Libs: '-L${libdir} -lrdkafka++' --> '-lrdkafka++' Checking contents of files in /usr/src/tmp/librdkafka-buildroot/ (default) Compressing files in /usr/src/tmp/librdkafka-buildroot (auto) Adjusting library links in /usr/src/tmp/librdkafka-buildroot ./usr/lib: (from :0) librdkafka.so.1 -> librdkafka.so.1 librdkafka++.so.1 -> librdkafka++.so.1 Verifying ELF objects in /usr/src/tmp/librdkafka-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal) verify-elf: WARNING: ./usr/lib/librdkafka.so.1: uses non-LFS functions: fcntl ftruncate open readdir stat Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.38630 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd librdkafka-1.9.2 + ctest -VV -R RdKafkaTestBrokerLess UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl Test project /usr/src/RPM/BUILD/librdkafka-1.9.2 Constructing a list of tests Done constructing a list of tests Updating 
test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 3 Start 3: RdKafkaTestBrokerLess 3: Test command: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test-runner "-p5" "-l" 3: Test timeout computed to be: 10000000 3: [
/ 0.000s] Test config file test.conf not found 3: [
/ 0.000s] Setting test timeout to 10s * 1.0 3: [
/ 0.000s] Git version: HEAD 3: [
/ 0.000s] Broker version: 2.4.0.0 (2.4.0.0) 3: [
/ 0.000s] Tests to run : all 3: [
/ 0.000s] Test mode : bare 3: [
/ 0.000s] Test scenario: default 3: [
/ 0.000s] Test filter : local tests only 3: [
/ 0.000s] Test timeout multiplier: 2.7 3: [
/ 0.000s] Action on test failure: continue other tests 3: [
/ 0.000s] Current directory: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests 3: [
/ 0.000s] Setting test timeout to 30s * 2.7 3: [0000_unittests / 0.000s] ================= Running test 0000_unittests ================= 3: [0000_unittests / 0.000s] ==== Stats written to file stats_0000_unittests_1235868082292016411.json ==== 3: [0000_unittests / 0.000s] builtin.features = snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer 3: %7|1669457747.421|OPENSSL|rdkafka#producer-1| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: [
/ 0.004s] Too many tests running (5 >= 5): postponing 0025_timers start... 3: [0022_consume_batch_local / 0.000s] ================= Running test 0022_consume_batch_local ================= 3: [0022_consume_batch_local / 0.000s] ==== Stats written to file stats_0022_consume_batch_local_6493973738779588733.json ==== 3: [0022_consume_batch_local / 0.000s] [ do_test_consume_batch_oauthbearer_cb:170 ] 3: [0009_mock_cluster / 0.000s] ================= Running test 0009_mock_cluster ================= 3: [0009_mock_cluster / 0.000s] ==== Stats written to file stats_0009_mock_cluster_4292714131845048922.json ==== 3: [0009_mock_cluster / 0.000s] Using topic "rdkafkatest_rnd4adb1aad4f31c995_0009_mock_cluster" 3: [0009_mock_cluster / 0.000s] Test config file test.conf not found 3: %5|1669457747.423|CONFWARN|MOCK#producer-3| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0009_mock_cluster / 0.000s] Test config file test.conf not found 3: [0009_mock_cluster / 0.000s] Setting test timeout to 30s * 2.7 3: [0009_mock_cluster / 0.001s] Created kafka instance 0009_mock_cluster#producer-4 3: %5|1669457747.424|CONFWARN|0022_consume_batch_local#consumer-2| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0022_consume_batch_local / 0.002s] Created kafka instance 0022_consume_batch_local#consumer-2 3: [0022_consume_batch_local / 0.002s] Refresh callback called 3: %3|1669457747.424|ERROR|0022_consume_batch_local#consumer-2| [thrd:app]: Failed to acquire SASL OAUTHBEARER token: Refresh called 3: [0004_conf / 0.000s] ================= Running test 0004_conf ================= 3: [0004_conf / 0.000s] ==== Stats written to file stats_0004_conf_3328316595409909515.json ==== 3: [0004_conf / 0.000s] Test config file test.conf not found 3: [0004_conf / 0.000s] Setting test timeout to 10s * 2.7 3: [0004_conf / 0.000s] Using topic "rdkafkatest_0004" 3: [0004_conf / 0.000s] : on_new() called 3: [0006_symbols / 0.000s] ================= Running test 0006_symbols ================= 3: [0006_symbols / 0.000s] ==== Stats written to file stats_0006_symbols_8248670497074694154.json ==== 3: [0006_symbols / 0.000s] 0006_symbols: duration 0.000ms 3: [0006_symbols / 0.000s] ================= Test 0006_symbols PASSED ================= 3: %4|1669457747.429|CONFWARN|0009_mock_cluster#consumer-5| [thrd:app]: Configuration property dr_msg_cb is a producer property and will be ignored by this consumer instance 3: [0009_mock_cluster / 0.006s] Created kafka instance 0009_mock_cluster#consumer-5 3: [0009_mock_cluster / 0.006s] Test config file test.conf not found 3: [0009_mock_cluster / 0.006s] Produce to rdkafkatest_rnd4adb1aad4f31c995_0009_mock_cluster [-1]: messages #0..100 3: [
/ 0.014s] Too many tests running (5 >= 5): postponing 0033_regex_subscribe_local start... 3: [0025_timers / 0.000s] ================= Running test 0025_timers ================= 3: [0025_timers / 0.000s] ==== Stats written to file stats_0025_timers_8813758905872613363.json ==== 3: [0025_timers / 0.000s] Test config file test.conf not found 3: [0025_timers / 0.000s] Setting test timeout to 200s * 2.7 3: %5|1669457747.440|CONFWARN|0025_timers#consumer-7| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 0.010s] Created kafka instance 0025_timers#consumer-7 3: [0025_timers / 0.010s] rd_kafka_new(): duration 9.522ms 3: [0025_timers / 0.010s] Starting wait loop for 10 expected stats_cb calls with an interval of 600ms 3: %7|1669457747.443|INIT|my id#producer-6| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-6 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS SNAPPY SOCKEM, debug 0x80c) 3: %4|1669457747.443|CONFWARN|my id#producer-6| [thrd:app]: Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1669457747.443|CONFWARN|my id#producer-6| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.014s] Created kafka instance my id#producer-6 3: %7|1669457747.443|TOPIC|my id#producer-6| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1669457747.443|TOPPARNEW|my id#producer-6| [thrd:app]: NEW rdkafkatest_0004 [-1] 0xf55007f0 refcnt 0xf550083c (at rd_kafka_topic_new0:468) 3: %7|1669457747.443|METADATA|my id#producer-6| [thrd:app]: Hinted cache of 1/1 topic(s) being queried 3: %7|1669457747.443|METADATA|my id#producer-6| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1669457747.443|DESTROY|my id#producer-6| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1669457747.443|DESTROY|my id#producer-6| [thrd:main]: Destroy internal 3: %7|1669457747.443|DESTROY|my id#producer-6| [thrd:main]: Removing all topics 3: %7|1669457747.443|TOPPARREMOVE|my id#producer-6| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0xf55007f0 3: %7|1669457747.443|DESTROY|my id#producer-6| [thrd:main]: rdkafkatest_0004 [-1]: 0xf55007f0 DESTROY_FINAL 3: %7|1669457747.444|INIT|my id#producer-8| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-8 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS SNAPPY SOCKEM, debug 0x80c) 3: %4|1669457747.444|CONFWARN|my id#producer-8| [thrd:app]: Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1669457747.444|CONFWARN|my id#producer-8| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.015s] Created kafka instance my id#producer-8 3: %7|1669457747.444|TOPIC|my id#producer-8| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1669457747.444|TOPPARNEW|my id#producer-8| [thrd:app]: NEW rdkafkatest_0004 [-1] 0xf55007a0 refcnt 0xf55007ec (at rd_kafka_topic_new0:468) 3: %7|1669457747.444|METADATA|my id#producer-8| [thrd:app]: Hinted cache of 1/1 topic(s) 
being queried 3: %7|1669457747.444|METADATA|my id#producer-8| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1669457747.444|DESTROY|my id#producer-8| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1669457747.444|DESTROY|my id#producer-8| [thrd:main]: Destroy internal 3: %7|1669457747.444|DESTROY|my id#producer-8| [thrd:main]: Removing all topics 3: %7|1669457747.444|TOPPARREMOVE|my id#producer-8| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0xf55007a0 3: %7|1669457747.444|DESTROY|my id#producer-8| [thrd:main]: rdkafkatest_0004 [-1]: 0xf55007a0 DESTROY_FINAL 3: [0004_conf / 0.015s] Incremental S2F tests 3: [0004_conf / 0.015s] Set: generic,broker,queue,cgrp 3: [0004_conf / 0.015s] Now: generic,broker,queue,cgrp 3: [0004_conf / 0.015s] Set: -broker,+queue,topic 3: [0004_conf / 0.015s] Now: generic,topic,queue,cgrp 3: [0004_conf / 0.015s] Set: -all,security,-fetch,+metadata 3: [0004_conf / 0.015s] Now: metadata,security 3: [0004_conf / 0.015s] Error reporting for S2F properties 3: [0004_conf / 0.015s] Ok: Invalid value "invalid-value" for configuration property "debug" 3: [0004_conf / 0.015s] Verifying that ssl.ca.location is not overwritten (#3566) 3: %3|1669457747.444|SSL|rdkafka#producer-9| [thrd:app]: error:02001002:system library:fopen:No such file or directory: fopen('/?/does/!/not/exist!','r') 3: %3|1669457747.444|SSL|rdkafka#producer-9| [thrd:app]: error:2006D080:BIO routines:BIO_new_file:no such file 3: [0004_conf / 0.015s] rd_kafka_new() failed as expected: ssl.ca.location failed: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib 3: [0004_conf / 0.015s] Canonical tests 3: [0004_conf / 0.015s] Set: request.required.acks=0 expect 0 (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=-1 expect -1 (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=1 expect 1 (topic) 3: [0004_conf / 0.015s] Set: acks=3 expect 3 (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=393 expect 393 (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=bad expect (null) (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=all expect -1 (topic) 3: [0004_conf / 0.015s] Set: request.required.acks=all expect -1 (global) 3: [0004_conf / 0.015s] Set: acks=0 expect 0 (topic) 3: [0004_conf / 0.015s] Set: sasl.mechanisms=GSSAPI expect GSSAPI (global) 3: [0004_conf / 0.015s] Set: sasl.mechanisms=PLAIN expect PLAIN (global) 3: [0004_conf / 0.015s] Set: sasl.mechanisms=GSSAPI,PLAIN expect (null) (global) 3: [0004_conf / 0.015s] Set: sasl.mechanisms= expect (null) (global) 3: [0004_conf / 0.015s] Set: linger.ms=12555.3 expect 12555.3 (global) 3: [0004_conf / 0.015s] Set: linger.ms=1500.000 expect 1500 (global) 3: [0004_conf / 0.015s] Set: linger.ms=0.0001 expect 0.0001 (global) 3: %5|1669457747.444|CONFWARN|rdkafka#producer-10| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %4|1669457747.445|CONFWARN|rdkafka#producer-12| [thrd:app]: Configuration property partition.assignment.strategy is a consumer property and will be ignored by this producer instance 3: %5|1669457747.445|CONFWARN|rdkafka#producer-12| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.016s] Ok: `acks` must be set to `all` when `enable.idempotence` is true 3: [0004_conf / 0.016s] Ok: Java TrustStores are not supported, use `ssl.ca.location` and a certificate file instead. 
See https://github.com/edenhill/librdkafka/wiki/Using-SSL-with-librdkafka for more information. 3: [0004_conf / 0.016s] Ok: Java JAAS configuration is not supported, see https://github.com/edenhill/librdkafka/wiki/Using-SASL-with-librdkafka for more information. 3: [0004_conf / 0.016s] Ok: Internal property "interceptors" not settable 3: %5|1669457747.445|CONFWARN|rdkafka#producer-13| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1669457747.445|TOPICCONF|rdkafka#producer-13| [thrd:app]: Incompatible configuration settings for topic "mytopic": `acks` must be set to `all` when `enable.idempotence` is true 3: %5|1669457747.445|CONFWARN|rdkafka#producer-14| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1669457747.445|CONFWARN|rdkafka#producer-15| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1669457747.446|CONFWARN|rdkafka#producer-16| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1669457747.446|TOPICCONF|rdkafka#producer-16| [thrd:app]: Incompatible configuration settings for topic "mytopic": `queuing.strategy` must be set to `fifo` when `enable.idempotence` is true 3: %4|1669457747.446|CONFWARN|rdkafka#consumer-17| [thrd:app]: Configuration property queue.buffering.max.ms is a producer property and will be ignored by this consumer instance 3: [0004_conf / 0.017s] Instance config linger.ms=123 3: [0004_conf / 0.017s] Instance config group.id=test1 3: [0004_conf / 0.017s] Instance config enable.auto.commit=false 3: [0009_mock_cluster / 0.024s] SUM(POLL): duration 0.000ms 3: [0009_mock_cluster / 0.024s] PRODUCE: duration 17.678ms 3: [0004_conf / 0.021s] [ do_test_default_topic_conf:381 ] 3: [0004_conf / 0.021s] [ do_test_default_topic_conf:381: PASS (0.00s) ] 3: [0004_conf / 0.021s] [ do_message_timeout_linger_checks:447 ] 3: %5|1669457747.455|CONFWARN|rdkafka#producer-18| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.026s] #0 "default and L and M": rd_kafka_new() succeeded 3: %5|1669457747.455|CONFWARN|rdkafka#producer-19| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.026s] #1 "set L such that L=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: %5|1669457747.455|CONFWARN|rdkafka#producer-22| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.026s] #5 "set M such that L>=M": rd_kafka_new() succeeded 3: [0004_conf / 0.026s] #6 "set L and M such that L>=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: [0004_conf / 0.026s] [ do_message_timeout_linger_checks:447: PASS (0.01s) ] 3: [0004_conf / 0.026s] 0004_conf: duration 26.421ms 3: [0004_conf / 0.026s] ================= Test 0004_conf PASSED ================= 3: [
/ 0.040s] Too many tests running (5 >= 5): postponing 0034_offset_reset_mock start... 3: [0033_regex_subscribe_local / 0.000s] ================= Running test 0033_regex_subscribe_local ================= 3: [0033_regex_subscribe_local / 0.000s] ==== Stats written to file stats_0033_regex_subscribe_local_8480503606161425257.json ==== 3: %7|1669457747.466|INIT|rdkafka#producer-1| [thrd:app]: librdkafka v1.9.2 (0x10902ff) rdkafka#producer-1 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS SNAPPY SOCKEM, debug 0x201) 3: %5|1669457747.466|CONFWARN|rdkafka#producer-1| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %7|1669457747.466|DESTROY|rdkafka#producer-1| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:app]: Interrupting timers 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:app]: Sending TERMINATE to internal main thread 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:app]: Joining internal main thread 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread terminating 3: %7|1669457747.466|DESTROY|rdkafka#producer-1| [thrd:main]: Destroy internal 3: %7|1669457747.466|BROADCAST|rdkafka#producer-1| [thrd:main]: Broadcasting state change 3: %7|1669457747.466|DESTROY|rdkafka#producer-1| [thrd:main]: Removing all topics 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:main]: Purging reply queue 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:main]: Decommissioning internal broker 3: %7|1669457747.466|TERMINATE|rdkafka#producer-1| [thrd:main]: Join 1 broker thread(s) 3: %7|1669457747.470|BROADCAST|rdkafka#producer-1| [thrd::0/internal]: Broadcasting state change 3: %7|1669457747.470|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread termination done 3: %7|1669457747.470|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying op queues 3: %7|1669457747.470|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying SSL CTX 3: %7|1669457747.471|TERMINATE|rdkafka#producer-1| [thrd:app]: Termination done: freeing resources 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: empty tqh[0] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 2,1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,0,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 2,0,1 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:345: unittest_sysqueue 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sysqueue: PASS 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:393: ut_strcasestr: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:409: ut_strcasestr 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:590: ut_string_split: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:616: ut_string_split 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: string: PASS 3: [0009_mock_cluster / 0.073s] PRODUCE.DELIVERY.WAIT: duration 49.101ms 3: [0009_mock_cluster / 0.073s] Produce to rdkafkatest_rnd4adb1aad4f31c995_0009_mock_cluster [-1]: messages #0..100 3: [0009_mock_cluster / 0.073s] SUM(POLL): duration 0.001ms 3: [0009_mock_cluster / 0.073s] PRODUCE: duration 0.088ms 3: [0009_mock_cluster / 0.083s] PRODUCE.DELIVERY.WAIT: duration 10.045ms 3: [0009_mock_cluster / 0.084s] ASSIGN.PARTITIONS: duration 0.194ms 3: [0009_mock_cluster / 0.084s] CONSUME: assigned 4 partition(s) 3: [0009_mock_cluster / 0.084s] CONSUME: consume 100 messages 3: [0025_timers / 0.111s] rd_kafka_poll(): duration 100.936ms 3: [0033_regex_subscribe_local / 0.137s] 0033_regex_subscribe_local: duration 137.156ms 3: [0033_regex_subscribe_local / 0.137s] ================= Test 0033_regex_subscribe_local PASSED ================= 3: [0025_timers / 0.211s] rd_kafka_poll(): duration 100.075ms 3: [
/ 0.282s] Too many tests running (5 >= 5): postponing 0037_destroy_hang_local start... 3: [0034_offset_reset_mock / 0.000s] ================= Running test 0034_offset_reset_mock ================= 3: [0034_offset_reset_mock / 0.000s] ==== Stats written to file stats_0034_offset_reset_mock_8595706861553721397.json ==== 3: [0034_offset_reset_mock / 0.000s] [ offset_reset_errors:201 ] 3: %5|1669457747.703|CONFWARN|MOCK#producer-24| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0034_offset_reset_mock / 0.003s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.017s] Created kafka instance 0034_offset_reset_mock#producer-25 3: [0034_offset_reset_mock / 0.017s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.017s] Produce to topic [0]: messages #0..10 3: [0034_offset_reset_mock / 0.017s] SUM(POLL): duration 0.001ms 3: [0034_offset_reset_mock / 0.017s] PRODUCE: duration 0.021ms 3: [0034_offset_reset_mock / 0.026s] PRODUCE.DELIVERY.WAIT: duration 8.822ms 3: [0034_offset_reset_mock / 0.030s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.030s] Setting test timeout to 300s * 2.7 3: [0034_offset_reset_mock / 0.034s] Created kafka instance 0034_offset_reset_mock#consumer-26 3: [0034_offset_reset_mock / 0.034s] Waiting for up to 5000ms for metadata update 3: [0025_timers / 0.312s] rd_kafka_poll(): duration 100.883ms 3: [0034_offset_reset_mock / 0.056s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 0.056s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 0.056s] METADATA.WAIT: duration 22.105ms 3: [0025_timers / 0.412s] rd_kafka_poll(): duration 100.074ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:457: unittest_untyped_map: 500000 map_get iterations took 312.097ms = 0us/get 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:474: unittest_untyped_map: Total time over 100000 entries took 467.512ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:477: unittest_untyped_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 2 person Hedvig Lindahl 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 1 person Roy McPhearsome 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:323: unittest_typed_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: map: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1353: do_unittest_write_read 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1518: do_unittest_write_split_seek 3: [0025_timers / 0.512s] rd_kafka_poll(): duration 100.076ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1608: do_unittest_write_read_payload_correctness 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1676: do_unittest_write_iov 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1866: do_unittest_erase 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdbuf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdvarint: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:411: unittest_rd_crc32c: Calculate CRC32C using software 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:422: unittest_rd_crc32c: Calculate CRC32C using software 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:429: unittest_rd_crc32c 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: crc32c: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:1985: unittest_msgq_order: FIFO: testing in FIFO mode 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2172: unittest_msg_seq_wrap 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 2 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 2us, 0.5000us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 3 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 1.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.5000us/msg over 2 messages in 1us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 1us, 0.1111us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: 
single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 6 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 7 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 8 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 1.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.2500us/msg over 4 messages in 1us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: many messages 3: [0025_timers / 0.608s] Call #0: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 0.608s] rd_kafka_poll(): duration 95.920ms 3: [0025_timers / 0.708s] rd_kafka_poll(): duration 100.076ms 3: [0025_timers / 0.808s] rd_kafka_poll(): duration 100.075ms 3: [0025_timers / 0.908s] rd_kafka_poll(): duration 100.075ms 3: [0022_consume_batch_local / 1.006s] refresh_called = 1 3: [0022_consume_batch_local / 1.006s] 0022_consume_batch_local: duration 1005.953ms 3: [0022_consume_batch_local / 1.006s] ================= Test 0022_consume_batch_local PASSED ================= 3: [
/ 1.010s] Too many tests running (5 >= 5): postponing 0039_event_log start... 3: [0037_destroy_hang_local / 0.000s] ================= Running test 0037_destroy_hang_local ================= 3: [0037_destroy_hang_local / 0.000s] ==== Stats written to file stats_0037_destroy_hang_local_3886836612900714077.json ==== 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: [0037_destroy_hang_local / 0.000s] Setting test timeout to 30s * 2.7 3: [0037_destroy_hang_local / 0.000s] Using topic "rdkafkatest_legacy_consumer_early_destroy" 3: [0037_destroy_hang_local / 0.000s] legacy_consumer_early_destroy: pass #0 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: [0025_timers / 1.011s] rd_kafka_poll(): duration 103.553ms 3: %5|1669457748.445|CONFWARN|0037_destroy_hang_local#consumer-27| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.017s] Created kafka instance 0037_destroy_hang_local#consumer-27 3: [0037_destroy_hang_local / 0.025s] legacy_consumer_early_destroy: pass #1 3: [0037_destroy_hang_local / 0.025s] Test config file test.conf not found 3: %5|1669457748.453|CONFWARN|0037_destroy_hang_local#consumer-28| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.025s] Created kafka instance 0037_destroy_hang_local#consumer-28 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4315956 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 4423us, 0.0010us/msg 3: [0025_timers / 1.112s] rd_kafka_poll(): duration 100.072ms 3: [0025_timers / 1.208s] Call #1: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.208s] rd_kafka_poll(): duration 96.625ms 3: [0009_mock_cluster / 1.226s] CONSUME: duration 1141.947ms 3: [0009_mock_cluster / 1.226s] CONSUME: consumed 100/100 messages (0/-1 EOFs) 3: [0009_mock_cluster / 1.231s] 0009_mock_cluster: duration 1231.221ms 3: [0009_mock_cluster / 1.231s] ================= Test 0009_mock_cluster PASSED ================= 3: [
/ 1.236s] Too many tests running (5 >= 5): postponing 0039_event start... 3: [0039_event_log / 0.000s] ================= Running test 0039_event_log ================= 3: [0039_event_log / 0.000s] ==== Stats written to file stats_0039_event_log_2970629129593071331.json ==== 3: [0039_event_log / 0.000s] Created kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.000s] rd_kafka_set_log_queue(rk, eventq): duration 0.001ms 3: [0039_event_log / 0.000s] Got log event: level: 7 ctx: queue fac: WAKEUPFD: msg: [thrd:app]: 0:65534/bootstrap: Enabled low-latency ops queue wake-ups 3: [0039_event_log / 0.000s] Destroying kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.008s] 0039_event_log: duration 7.819ms 3: [0039_event_log / 0.008s] ================= Test 0039_event_log PASSED ================= 3: [
/ 1.244s] Too many tests running (5 >= 5): postponing 0043_no_connection start... 3: [0039_event / 0.000s] ================= Running test 0039_event ================= 3: [0039_event / 0.000s] ==== Stats written to file stats_0039_event_8482251602669971397.json ==== 3: [0039_event / 0.004s] Created kafka instance 0039_event#producer-30 3: %3|1669457748.672|FAIL|0039_event#producer-30| [thrd:0:65534/bootstrap]: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0039_event / 0.012s] Got Error event: _TRANSPORT: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0039_event / 0.012s] Destroying kafka instance 0039_event#producer-30 3: [0039_event / 0.020s] 0039_event: duration 19.982ms 3: [0039_event / 0.020s] ================= Test 0039_event PASSED ================= 3: [
/ 1.268s] Too many tests running (5 >= 5): postponing 0045_subscribe_update_mock start... 3: [0043_no_connection / 0.000s] ================= Running test 0043_no_connection ================= 3: [0043_no_connection / 0.000s] ==== Stats written to file stats_0043_no_connection_7849266753140410022.json ==== 3: [0043_no_connection / 0.000s] Test config file test.conf not found 3: [0043_no_connection / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1669457748.691|CONFWARN|0043_no_connection#producer-31| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0043_no_connection / 0.005s] Created kafka instance 0043_no_connection#producer-31 3: [0043_no_connection / 0.005s] Test config file test.conf not found 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [-1]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.111ms 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [0]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.106ms 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [1]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.105ms 3: [0025_timers / 1.310s] rd_kafka_poll(): duration 101.260ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: many messages 3: [0034_offset_reset_mock / 1.056s] #0: injecting _TRANSPORT, expecting NO_ERROR 3: [0034_offset_reset_mock / 1.056s] Bringing down the broker 3: %6|1669457748.756|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:44345/bootstrap]: 127.0.0.1:44345/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1000ms in state UP) 3: %3|1669457748.757|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:44345/bootstrap]: 127.0.0.1:44345/1: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT) 3: %6|1669457748.757|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:44345: Disconnected (after 997ms in state UP) 3: %3|1669457748.757|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:44345: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.058s] ASSIGN.PARTITIONS: duration 1.402ms 3: [0034_offset_reset_mock / 1.058s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: 127.0.0.1:44345/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1000ms in state UP) 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: 127.0.0.1:44345/1: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:44345: Disconnected (after 997ms in state UP) 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: 2/2 brokers are down 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:44345: 
Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457748.758|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:44345/bootstrap]: 127.0.0.1:44345/1: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0034_offset_reset_mock / 1.058s] #0: Ignoring Error event: 127.0.0.1:44345/1: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 100001 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 17us, 0.0002us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 50001 messages into destq with 265289 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1448us, 0.0290us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 20001 messages into destq with 315290 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2046us, 0.1023us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 335291 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2045us, 0.0346us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 394420 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 4212us, 0.0485us/msg 3: [0025_timers / 1.410s] rd_kafka_poll(): duration 100.075ms 3: %3|1669457748.941|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:44345: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0034_offset_reset_mock / 1.242s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:44345: Connect to ipv4#127.0.0.1:44345 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0025_timers / 1.510s] rd_kafka_poll(): duration 100.070ms 3: [0025_timers / 1.610s] rd_kafka_poll(): duration 100.074ms 3: [0025_timers / 1.710s] rd_kafka_poll(): duration 100.077ms 3: [0025_timers / 1.808s] Call #2: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.808s] rd_kafka_poll(): duration 98.337ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4000001 messages into destq with 481243 messages 3: [0025_timers / 1.908s] rd_kafka_poll(): duration 100.073ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 4994us, 0.0012us/msg 3: [0025_timers / 2.008s] rd_kafka_poll(): duration 100.065ms 3: [0037_destroy_hang_local / 1.026s] 0037_destroy_hang_local: duration 1025.702ms 3: [0037_destroy_hang_local / 1.026s] ================= Test 0037_destroy_hang_local PASSED ================= 
3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0034us/msg over 4315956 messages in 14762us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: issue #2508 3: [0025_timers / 2.109s] rd_kafka_poll(): duration 100.063ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 145952 messages into destq with 154875 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 2482us, 0.0083us/msg 3: [
/ 2.136s] Too many tests running (5 >= 5): postponing 0046_rkt_cache start... 3: [0045_subscribe_update_mock / 0.000s] ================= Running test 0045_subscribe_update_mock ================= 3: [0045_subscribe_update_mock / 0.000s] ==== Stats written to file stats_0045_subscribe_update_mock_6019537686844855064.json ==== 3: [0045_subscribe_update_mock / 0.000s] [ do_test_regex_many_mock:378: range with 50 topics ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2508 3: %5|1669457749.559|CONFWARN|MOCK#producer-32| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 0.005s] Test config file test.conf not found 3: [0045_subscribe_update_mock / 0.005s] Setting test timeout to 300s * 2.7 3: [0045_subscribe_update_mock / 0.013s] Created kafka instance 0045_subscribe_update_mock#consumer-33 3: [0045_subscribe_update_mock / 0.017s] Creating topic topic_0 3: [0045_subscribe_update_mock / 0.018s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.143ms 3: [0045_subscribe_update_mock / 0.018s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 0.018s] TEST FAILURE 3: ### Test "0045_subscribe_update_mock (do_test_regex_many_mock:378: range with 50 topics)" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:4048:test_consumer_poll_no_msgs() at Sat Nov 26 10:15:49 2022: ### 3: ^topic_.* [0] error (offset -1001): Subscribed topic not available: ^topic_.*: Broker: Unknown topic or partition 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 154875 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 214004 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 145952 messages in 3us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 86 messages into destq with 199999 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 2us, 0.0000us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 199999 messages 3: RDUT: 
INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.5000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 5 messages into destq with 200001 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2us, 0.4000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4 messages into destq with 200006 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200010 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.5000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200012 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.3333us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 61 messages into destq with 200015 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.0164us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200076 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.5000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200078 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2us, 1.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200080 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: [0025_timers / 2.210s] rd_kafka_poll(): duration 100.967ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200082 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2us, 0.6667us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.1279us/msg over 86 messages in 11us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: msg: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmurmur2.c:166: unittest_murmur2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: murmurhash: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdfnv1a.c:112: 
unittest_fnv1a 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: fnv1a: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:468: ut_high_sigfig 3: [
/ 2.237s] Too many tests running (5 >= 5): postponing 0053_stats_timing start... 3: [0046_rkt_cache / 0.000s] ================= Running test 0046_rkt_cache ================= 3: [0046_rkt_cache / 0.000s] ==== Stats written to file stats_0046_rkt_cache_2291148029435838658.json ==== 3: [0046_rkt_cache / 0.000s] Using topic "rdkafkatest_0046_rkt_cache" 3: [0046_rkt_cache / 0.000s] Test config file test.conf not found 3: %5|1669457749.660|CONFWARN|0046_rkt_cache#producer-34| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0046_rkt_cache / 0.005s] Created kafka instance 0046_rkt_cache#producer-34 3: [0046_rkt_cache / 0.005s] Test config file test.conf not found 3: [0046_rkt_cache / 0.012s] 0046_rkt_cache: duration 11.918ms 3: [0046_rkt_cache / 0.012s] ================= Test 0046_rkt_cache PASSED ================= 3: [
/ 2.249s] Too many tests running (5 >= 5): postponing 0058_log start... 3: [0053_stats_timing / 0.000s] ================= Running test 0053_stats_timing ================= 3: [0053_stats_timing / 0.000s] ==== Stats written to file stats_0053_stats_timing_8452406878840087287.json ==== 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:495: ut_quantile 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:514: ut_mean 3: [0043_no_connection / 1.005s] 300 messages in queue 3: %4|1669457749.691|TERMINATE|0043_no_connection#producer-31| [thrd:app]: Producer terminating with 300 messages (30000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0043_no_connection / 1.009s] rd_kafka_destroy(): duration 3.586ms 3: [0043_no_connection / 1.009s] 0043_no_connection: duration 1008.667ms 3: [0043_no_connection / 1.009s] ================= Test 0043_no_connection PASSED ================= 3: [
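Note: the 0043_no_connection run above ends with a TERMINATE warning because the producer is destroyed while 300 messages are still queued, and the warning itself names the remedy ("use flush() to wait for outstanding message delivery"). A minimal sketch of that pattern follows; "localhost:9092" and "mytopic" are placeholder values, not taken from the log.

    /* Sketch only: flush() before destroy() so queued messages are not dropped. */
    #include <stdio.h>
    #include <string.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            /* Without bootstrap.servers librdkafka emits the CONFWARN seen in the log. */
            if (rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                                  errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                    fprintf(stderr, "%s\n", errstr);
                    return 1;
            }

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    fprintf(stderr, "%s\n", errstr);
                    return 1;
            }

            const char *payload = "hello";
            rd_kafka_producev(rk,
                              RD_KAFKA_V_TOPIC("mytopic"),
                              RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                              RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                              RD_KAFKA_V_END);

            /* Wait up to 10 s for delivery reports; avoids the TERMINATE warning. */
            rd_kafka_flush(rk, 10 * 1000);
            if (rd_kafka_outq_len(rk) > 0)
                    fprintf(stderr, "%d message(s) not delivered\n", rd_kafka_outq_len(rk));

            rd_kafka_destroy(rk);
            return 0;
    }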
/ 2.277s] Too many tests running (5 >= 5): postponing 0062_stats_event start... 3: [0058_log / 0.000s] ================= Running test 0058_log ================= 3: [0058_log / 0.000s] ==== Stats written to file stats_0058_log_7699935012866328981.json ==== 3: [0058_log / 0.000s] main.queue: Creating producer, not expecting any log messages 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:536: ut_stddev 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:555: ut_totalcount 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:573: ut_max 3: [0025_timers / 2.315s] rd_kafka_poll(): duration 100.990ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:590: ut_min 3: [0053_stats_timing / 0.100s] Stats (#0): { "name": "rdkafka#producer-35", "client_id": "rdkafka", "type": "producer", "ts":9411133359597, "time":1669457749, "age":100216, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:609: ut_reset 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:623: ut_nan 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:638: ut_sigfigs 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:654: ut_minmax_trackable 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:664: ut_unitmagnitude_overflow 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:697: ut_subbucketmask_overflow 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdhdrhistogram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4311: unittest_conf: Safified client.software.name="aba.-va" 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4319: unittest_conf: Safified client.software.version="1.2.3.4.5----a" 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4323: unittest_conf 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: conf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_broker.c:2081: rd_ut_reconnect_backoff 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: broker: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4827: unittest_idempotent_producer: Verifying idempotent producer error handling 3: %5|1669457749.778|CONFWARN|rdkafka#producer-37| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: 
Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: [0025_timers / 2.408s] Call #3: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 2.408s] rd_kafka_poll(): duration 93.733ms 3: [0025_timers / 2.508s] rd_kafka_poll(): duration 100.073ms 3: [0025_timers / 2.608s] rd_kafka_poll(): duration 100.073ms 3: [0025_timers / 2.709s] rd_kafka_poll(): duration 100.038ms 3: [0025_timers / 2.809s] rd_kafka_poll(): duration 100.072ms 3: [0025_timers / 2.909s] rd_kafka_poll(): duration 100.129ms 3: [0025_timers / 3.008s] Call #4: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.008s] rd_kafka_poll(): duration 99.524ms 3: [0025_timers / 3.108s] rd_kafka_poll(): duration 100.073ms 3: [0025_timers / 3.208s] rd_kafka_poll(): duration 100.075ms 3: [0058_log / 1.000s] main.queue: Setting log queue 3: [0058_log / 1.000s] main.queue: Expecting at least one log message 3: [0058_log / 1.000s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-36 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 1.000s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 1.000s] main.queue: Saw 2 logs 3: [0058_log / 1.001s] local.queue: Creating producer, not expecting any log messages 3: [0025_timers / 3.309s] rd_kafka_poll(): duration 100.245ms 3: [0053_stats_timing / 1.101s] Stats (#10): { "name": "rdkafka#producer-35", "client_id": "rdkafka", "type": "producer", "ts":9411134359796, "time":1669457750, "age":1100415, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, 
"metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:5022: unittest_idempotent_producer 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: request: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1581: do_unittest_config_no_principal_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1610: do_unittest_config_empty_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1677: do_unittest_config_empty_value_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1715: do_unittest_config_value_with_quote_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1640: do_unittest_config_unrecognized_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1464: do_unittest_config_defaults 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1507: do_unittest_config_explicit_scope_and_life 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1551: do_unittest_config_all_explicit_values 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1754: do_unittest_config_extensions 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1773: do_unittest_illegal_extension_keys_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1806: do_unittest_odd_extension_size_should_fail 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sasl_oauthbearer: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msgset_reader.c:1781: unittest_aborted_txns 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: aborted_txns: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5705: unittest_consumer_group_metadata 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5776: unittest_set_intersect 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5825: unittest_set_subtract 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5852: unittest_map_to_list 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5882: unittest_list_to_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: cgrp: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:910: unittest_scram_nonce 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:949: unittest_scram_safe 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: scram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): range assignor 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #0 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNoTopic:2211: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2217: ut_testOneConsumerNoTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #0 ran for 0.026ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #1 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNonexistentTopic:2237: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2243: ut_testOneConsumerNonexistentTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #1 ran for 0.016ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #2 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerOneTopic:2269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2275: ut_testOneConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #2 ran for 0.025ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #3 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOnlyAssignsPartitionsFromSubscribedTopics:2300: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2306: ut_testOnlyAssignsPartitionsFromSubscribedTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #3 ran for 0.022ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #4 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerMultipleTopics:2329: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2335: ut_testOneConsumerMultipleTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #4 ran for 0.021ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #5 ] 3: RDUT: 
INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicOnePartition:2358: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2365: ut_testTwoConsumersOneTopicOnePartition 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #5 ran for 0.022ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #6 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicTwoPartitions:2389: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2396: ut_testTwoConsumersOneTopicTwoPartitions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #6 ran for 0.020ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #7 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMultipleConsumersMixedTopicSubscriptions:2424: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2432: ut_testMultipleConsumersMixedTopicSubscriptions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #7 ran for 0.039ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #8 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersTwoTopicsSixPartitions:2459: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2466: ut_testTwoConsumersTwoTopicsSixPartitions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #8 ran for 0.033ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #9 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2487: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2501: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2514: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2522: ut_testAddRemoveConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #9 ran for 0.065ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #10 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: 
ut_testPoorRoundRobinAssignmentScenario:2576: verifying assignment for 4 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2585: ut_testPoorRoundRobinAssignmentScenario 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #10 ran for 0.058ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #11 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2609: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2615: ut_testAddRemoveTopicTwoConsumers: Adding topic2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2630: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2638: ut_testAddRemoveTopicTwoConsumers: Removing topic1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2650: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2658: ut_testAddRemoveTopicTwoConsumers 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #11 ran for 0.079ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #12 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2706: verifying assignment for 19 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2721: verifying assignment for 18 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2728: ut_testReassignmentAfterOneConsumerLeaves 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #12 ran for 5.421ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #13 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2762: verifying assignment for 8 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2774: verifying assignment for 9 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2781: ut_testReassignmentAfterOneConsumerAdded 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #13 ran for 0.353ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #14 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2823: verifying assignment for 9 
member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2836: verifying assignment for 8 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2844: ut_testSameSubscriptions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #14 ran for 3.766ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #15 ] 3: [0025_timers / 3.410s] rd_kafka_poll(): duration 100.733ms 3: [0053_stats_timing / 1.201s] 12 (expected 12) stats callbacks received in 1200ms (expected 1200ms +-25%) 3: [0053_stats_timing / 1.201s] 0053_stats_timing: duration 1200.819ms 3: [0053_stats_timing / 1.201s] ================= Test 0053_stats_timing PASSED ================= 3: [
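Note: 0053_stats_timing above checks that statistics arrive on schedule (12 callbacks in ~1200 ms, i.e. a 100 ms interval). A sketch of the statistics-callback API follows; the 100 ms interval is assumed from the log's cadence and this is not necessarily how the test itself is wired.

    /* Sketch: periodic statistics via the stats callback. */
    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    static int stats_cb(rd_kafka_t *rk, char *json, size_t json_len, void *opaque) {
            (void)rk; (void)opaque;
            printf("Stats (%zu bytes): %.60s...\n", json_len, json);
            return 0; /* 0 = librdkafka frees the JSON buffer for us */
    }

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            rd_kafka_conf_set(conf, "statistics.interval.ms", "100", errstr, sizeof(errstr));
            rd_kafka_conf_set_stats_cb(conf, stats_cb);

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk)
                    return 1;

            /* Callbacks (including stats_cb) are served from rd_kafka_poll(). */
            for (int i = 0; i < 12; i++)
                    rd_kafka_poll(rk, 100);

            rd_kafka_destroy(rk);
            return 0;
    }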
/ 3.450s] Too many tests running (5 >= 5): postponing 0066_plugins start... 3: [0062_stats_event / 0.000s] ================= Running test 0062_stats_event ================= 3: [0062_stats_event / 0.000s] ==== Stats written to file stats_0062_stats_event_4019139421165376566.json ==== 3: [0062_stats_event / 0.000s] Test config file test.conf not found 3: [0062_stats_event / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1669457750.869|CONFWARN|0062_stats_event#producer-41| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0062_stats_event / 0.000s] Created kafka instance 0062_stats_event#producer-41 3: [0025_timers / 3.511s] rd_kafka_poll(): duration 100.971ms 3: [0062_stats_event / 0.100s] Stats event 3: [0062_stats_event / 0.100s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":9411134561239, "time":1669457750, "age":100102, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.100s] STATS_EVENT: duration 100.203ms 3: [0025_timers / 3.608s] Call #5: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.608s] rd_kafka_poll(): duration 97.789ms 3: [0062_stats_event / 0.200s] Stats event 3: [0062_stats_event / 0.200s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":9411134661259, "time":1669457751, "age":200122, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.200s] STATS_EVENT: duration 99.839ms 3: [0025_timers / 3.712s] rd_kafka_poll(): duration 103.380ms 3: [0062_stats_event / 0.300s] Stats event 3: [0062_stats_event / 0.300s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":9411134761277, "time":1669457751, "age":300140, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.300s] STATS_EVENT: duration 99.991ms 3: [0025_timers / 3.812s] rd_kafka_poll(): duration 100.769ms 3: [0062_stats_event / 0.400s] Stats event 3: [0062_stats_event / 0.400s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":9411134861295, "time":1669457751, "age":400158, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.400s] STATS_EVENT: duration 100.002ms 3: [0025_timers / 3.913s] rd_kafka_poll(): duration 100.982ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2895: verifying assignment for 200 member(s): 3: [0062_stats_event / 0.500s] Stats event 3: [0062_stats_event / 0.500s] Stats: { "name": 
"0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":9411134961311, "time":1669457751, "age":500174, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.500s] STATS_EVENT: duration 100.007ms 3: [0062_stats_event / 0.501s] 0062_stats_event: duration 500.606ms 3: [0062_stats_event / 0.501s] ================= Test 0062_stats_event PASSED ================= 3: [
/ 3.955s] Too many tests running (5 >= 5): postponing 0072_headers_ut start... 3: [0066_plugins / 0.000s] ================= Running test 0066_plugins ================= 3: [0066_plugins / 0.000s] ==== Stats written to file stats_0066_plugins_4818659869951661240.json ==== 3: [0066_plugins / 0.000s] Using topic "rdkafkatest_rnd56010cb37e721a61_0066_plugins" 3: [0066_plugins / 0.000s] running test from cwd /usr/src/RPM/BUILD/librdkafka-1.9.2/tests 3: [0066_plugins / 0.000s] set(session.timeout.ms, 6000) 3: [0066_plugins / 0.000s] set(plugin.library.paths, interceptor_test/interceptor_test) 3: [0066_plugins / 0.000s] conf_init(conf 0xf410cb80) called (setting opaque to 0xf5ea00ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0xf410cb80) for ici 0xf410d490 with ici->conf 0xf410d9c0 3: [0066_plugins / 0.000s] set(socket.timeout.ms, 12) 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410cb80, "socket.timeout.ms", "12"): 0xf410d490 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410cb80, "socket.timeout.ms", "12"): 0xf410d490 3: [0066_plugins / 0.000s] set(interceptor_test.config1, one) 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410cb80, "interceptor_test.config1", "one"): 0xf410d490 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410cb80, interceptor_test.config1, one): 0xf410d490 3: [0066_plugins / 0.000s] set(interceptor_test.config2, two) 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410cb80, "interceptor_test.config2", "two"): 0xf410d490 3: [0066_plugins / 0.000s] set(topic.metadata.refresh.interval.ms, 1234) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0xf410e460, old_conf 0xf410cb80, filter_cnt 0, ici 0xf410d490) 3: [0066_plugins / 0.000s] conf_init0(conf 0xf410e460) for ici 0xf410eae0 with ici->conf 0xf410eb00 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410e460, "socket.timeout.ms", "12"): 0xf410eae0 3: [0066_plugins / 0.000s] conf_init(conf 0xf410eb00) called (setting opaque to 0xf5ea00ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0xf410eb00) for ici 0xf410f510 with ici->conf 0xf410f530 3: [0066_plugins / 0.000s] conf_init(conf 0xf410e460) called (setting opaque to 0xf5ea00ea) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0xf410ffc0, old_conf 0xf410e460, filter_cnt 2, ici 0xf410eae0) 3: [0066_plugins / 0.000s] conf_init0(conf 0xf410ffc0) for ici 0xf4110640 with ici->conf 0xf4110660 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410ffc0, "socket.timeout.ms", "12"): 0xf4110640 3: [0066_plugins / 0.000s] conf_init0(conf 0xf410e460) for ici 0xf410ffa0 with ici->conf 0xf410ffc0 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410e460, "interceptor_test.config1", "one"): 0xf410eae0 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410e460, interceptor_test.config1, one): 0xf410eae0 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410e460, "interceptor_test.config2", "two"): 0xf410eae0 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410e460, "session.timeout.ms", "6000"): 0xf410eae0 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410eb00, "session.timeout.ms", "6000"): 0xf410f510 3: [0066_plugins / 0.000s] on_conf_set(conf 0xf410eb00, "session.timeout.ms", "6000"): 0xf410f510 3: [0066_plugins / 0.000s] on_new(rk 0xf41111c0, conf 0xf4111294, ici->conf 0xf410eb00): 0xf410eae0: #1 3: %4|1669457751.377|CONFWARN|rdkafka#producer-42| [thrd:app]: Configuration property session.timeout.ms is a consumer property and will be ignored by this producer instance 3: %5|1669457751.377|CONFWARN|rdkafka#producer-42| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 
3: [0066_plugins / 0.010s] 0066_plugins: duration 9.957ms 3: [0066_plugins / 0.010s] ================= Test 0066_plugins PASSED ================= 3: [
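Note: the 0066_plugins trace above comes from the interceptor_test plugin, which librdkafka loads through the plugin.library.paths configuration property; the plugin's conf_init()/on_conf_set()/on_new() hooks then log every configuration call. From application code, loading such a plugin is a single property set. The path below is the test fixture's own relative path as it appears in the log; a real application would point at its own shared library.

    /* Sketch: load an interceptor plugin via plugin.library.paths. */
    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            if (rd_kafka_conf_set(conf, "plugin.library.paths",
                                  "interceptor_test/interceptor_test",
                                  errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                    fprintf(stderr, "plugin load failed: %s\n", errstr);
                    return 1;
            }

            /* The plugin's conf_init() has already run at this point; its on_new()
             * interceptor fires when the client instance is created below. */
            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    fprintf(stderr, "%s\n", errstr);
                    return 1;
            }
            rd_kafka_destroy(rk);
            return 0;
    }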
/ 3.965s] Too many tests running (5 >= 5): postponing 0074_producev start... 3: [0072_headers_ut / 0.000s] ================= Running test 0072_headers_ut ================= 3: [0072_headers_ut / 0.000s] ==== Stats written to file stats_0072_headers_ut_7553239240891180896.json ==== 3: [0072_headers_ut / 0.000s] Using topic "rdkafkatest_0072_headers_ut" 3: %5|1669457751.384|CONFWARN|0072_headers_ut#producer-43| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0072_headers_ut / 0.000s] Created kafka instance 0072_headers_ut#producer-43 3: [0025_timers / 4.014s] rd_kafka_poll(): duration 100.643ms 3: [0025_timers / 4.114s] rd_kafka_poll(): duration 100.316ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2911: verifying assignment for 150 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2918: ut_testLargeAssignmentWithMultipleConsumersLeaving 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #15 ran for 807.914ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #16 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2957: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2963: ut_testNewSubscription: Adding topic1 to consumer1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2972: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2980: ut_testNewSubscription 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #16 ran for 0.176ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #17 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3006: verifying assignment for 4 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3027: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3063: ut_testMoveExistingAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #17 ran for 0.065ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #18 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness:3111: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3118: ut_testStickiness 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #18 ran for 0.043ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: 
rd_kafka_sticky_assignor_unittest: [ Test #19 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3144: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3154: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3180: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3192: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3201: ut_testStickiness2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #19 ran for 0.344ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #20 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAssignmentUpdatedForDeletedTopic:3223: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3233: ut_testAssignmentUpdatedForDeletedTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #20 ran for 0.553ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #21 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3255: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3275: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #21 ran for 0.023ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #22 ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3290: ut_testConflictingPreviousAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #22 ran for 0.005ms ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:1054: ut_assignors 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: assignors: PASS 3: [0000_unittests / 4.187s] 0000_unittests: duration 4187.044ms 3: [0000_unittests / 4.187s] 
================= Test 0000_unittests PASSED ================= 3: [
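Note: all of the RDUT lines above are emitted by 0000_unittests, which runs librdkafka's built-in unit-test suite; the library exposes the same suite to applications as rd_kafka_unittest(). A trivial driver, for reference only:

    /* Sketch: run librdkafka's built-in unit tests (the RDUT output above). */
    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
            int fails = rd_kafka_unittest(); /* returns the number of failed tests */
            printf("librdkafka unittests: %d failure(s)\n", fails);
            return fails ? 1 : 0;
    }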
/ 4.188s] Too many tests running (5 >= 5): postponing 0078_c_from_cpp start... 3: [0074_producev / 0.000s] ================= Running test 0074_producev ================= 3: [0074_producev / 0.000s] ==== Stats written to file stats_0074_producev_5594565809489003033.json ==== 3: %5|1669457751.606|CONFWARN|0074_producev#producer-44| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0074_producev / 0.000s] Created kafka instance 0074_producev#producer-44 3: [0074_producev / 0.000s] produceva() error (expected): Failed to produce message: Broker: Message size too large 3: [0074_producev / 0.002s] 0074_producev: duration 1.574ms 3: [0074_producev / 0.002s] ================= Test 0074_producev PASSED ================= 3: [0025_timers / 4.208s] Call #6: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.208s] rd_kafka_poll(): duration 93.819ms 3: [0058_log / 2.001s] local.queue: Setting log queue 3: [0058_log / 2.001s] local.queue: Expecting at least one log message 3: [0058_log / 2.001s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-38 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 2.001s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 2.001s] local.queue: Saw 2 logs 3: [0058_log / 2.001s] 0058_log: duration 2001.161ms 3: [0058_log / 2.001s] ================= Test 0058_log PASSED ================= 3: [0079_fork / 0.000s] WARN: SKIPPING TEST: Filtered due to negative test flags 3: [0078_c_from_cpp / 0.000s] ================= Running test 0078_c_from_cpp ================= 3: [0078_c_from_cpp / 0.000s] ==== Stats written to file stats_0078_c_from_cpp_3554790719444267279.json ==== 3: [
/ 4.280s] Too many tests running (5 >= 5): postponing 0084_destroy_flags_local start... 3: [0080_admin_ut / 0.000s] ================= Running test 0080_admin_ut ================= 3: [0080_admin_ut / 0.000s] ==== Stats written to file stats_0080_admin_ut_346501757211196716.json ==== 3: [0080_admin_ut / 0.000s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 0.000s] Test config file test.conf not found 3: %5|1669457751.699|CONFWARN|0080_admin_ut#producer-46| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1669457751.699|CONFWARN|myclient#producer-45| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0078_c_from_cpp / 0.001s] Compare C name myclient#producer-45 to C++ name myclient#producer-45 3: [0078_c_from_cpp / 0.001s] Compare C topic mytopic to C++ topic mytopic 3: [0078_c_from_cpp / 0.001s] 0078_c_from_cpp: duration 0.671ms 3: [0078_c_from_cpp / 0.001s] ================= Test 0078_c_from_cpp PASSED ================= 3: [
/ 4.281s] Too many tests running (5 >= 5): postponing 0086_purge_local start... 3: [0084_destroy_flags_local / 0.000s] ================= Running test 0084_destroy_flags_local ================= 3: [0084_destroy_flags_local / 0.000s] ==== Stats written to file stats_0084_destroy_flags_local_9196580530064185041.json ==== 3: [0084_destroy_flags_local / 0.000s] Using topic "rdkafkatest_rnd48d338a28da9cb9_destroy_flags" 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1669457751.699|CONFWARN|0084_destroy_flags_local#producer-47| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.000s] Created kafka instance 0084_destroy_flags_local#producer-47 3: [0084_destroy_flags_local / 0.000s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.000s] rd_kafka_destroy_flags(0x0): duration 0.117ms 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1669457751.700|CONFWARN|0084_destroy_flags_local#producer-48| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.001s] Created kafka instance 0084_destroy_flags_local#producer-48 3: [0084_destroy_flags_local / 0.001s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 0.001s] rd_kafka_destroy_flags(0x8): duration 0.194ms 3: [0084_destroy_flags_local / 0.001s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.001s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.001s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.001s] Setting test timeout to 20s * 2.7 3: %5|1669457751.700|CONFWARN|0084_destroy_flags_local#producer-49| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.001s] Created kafka instance 0084_destroy_flags_local#producer-49 3: [0084_destroy_flags_local / 0.001s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.001s] Produce to rdkafkatest_rnd48d338a28da9cb9_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.013s] SUM(POLL): duration 0.000ms 3: [0084_destroy_flags_local / 0.013s] PRODUCE: duration 12.164ms 3: [0084_destroy_flags_local / 0.013s] Calling rd_kafka_destroy_flags(0x0) 3: %4|1669457751.712|TERMINATE|0084_destroy_flags_local#producer-49| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.014s] rd_kafka_destroy_flags(0x0): duration 0.894ms 3: [0084_destroy_flags_local / 0.014s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: 
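Note: 0084_destroy_flags_local iterates over rd_kafka_destroy_flags() combinations; flag 0x0 is a plain destroy and 0x8 corresponds to RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE, which skips the implicit consumer close on teardown. A minimal sketch for the consumer case; broker list and group.id are placeholders.

    /* Sketch: destroy a consumer without the implicit close (flag 0x8 in the log). */
    #include <librdkafka/rdkafka.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092", errstr, sizeof(errstr));
            rd_kafka_conf_set(conf, "group.id", "example-group", errstr, sizeof(errstr));

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
            if (!rk)
                    return 1;

            /* ... subscribe / consume ... */

            /* Tear the handle down without the usual rd_kafka_consumer_close()
             * leave-group sequence. */
            rd_kafka_destroy_flags(rk, RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE);
            return 0;
    }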
[0084_destroy_flags_local / 0.014s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.014s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.014s] Setting test timeout to 20s * 2.7 3: %5|1669457751.713|CONFWARN|0084_destroy_flags_local#producer-50| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.014s] Created kafka instance 0084_destroy_flags_local#producer-50 3: [0084_destroy_flags_local / 0.014s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.014s] Produce to rdkafkatest_rnd48d338a28da9cb9_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.025s] SUM(POLL): duration 0.001ms 3: [0084_destroy_flags_local / 0.025s] PRODUCE: duration 10.380ms 3: [0084_destroy_flags_local / 0.025s] Calling rd_kafka_destroy_flags(0x8) 3: %4|1669457751.724|TERMINATE|0084_destroy_flags_local#producer-50| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.027s] rd_kafka_destroy_flags(0x8): duration 2.221ms 3: [0084_destroy_flags_local / 0.027s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.027s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.027s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.027s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.027s] Created kafka instance 0084_destroy_flags_local#consumer-51 3: [0025_timers / 4.309s] rd_kafka_poll(): duration 100.168ms 3: [0080_admin_ut / 0.100s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.100s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.101s] rd_kafka_destroy(): duration 0.388ms 3: [0080_admin_ut / 0.101s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.10s) ] 3: [0080_admin_ut / 0.101s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.101s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 0.101s] Test config file test.conf not found 3: %5|1669457751.799|CONFWARN|0080_admin_ut#producer-52| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 4.409s] rd_kafka_poll(): duration 100.070ms 3: [0080_admin_ut / 0.201s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.201s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.201s] rd_kafka_destroy(): duration 0.307ms 3: [0080_admin_ut / 0.201s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.10s) ] 3: [0080_admin_ut / 0.201s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.201s] Test config file test.conf not found 3: %5|1669457751.900|CONFWARN|0080_admin_ut#producer-53| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 0.202s] [ do_test_options:1588 ] 3: [0080_admin_ut / 0.202s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 0.202s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 
0.202s] Using topic "rdkafkatest_rnd7927db57a44366e_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd7480b03b527bfd15_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd14f783537247c813_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnda447e99789cab0a_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd2dc2082f3063850d_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd70a159ce424999c5_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.202s] CreateTopics: duration 0.145ms 3: [0025_timers / 4.509s] rd_kafka_poll(): duration 100.073ms 3: [0080_admin_ut / 0.302s] CreateTopics.queue_poll: duration 99.891ms 3: [0080_admin_ut / 0.302s] CreateTopics: got CreateTopicsResult in 99.891s 3: [0080_admin_ut / 0.302s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 0.302s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd6f1374302b1c18d7_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd3cb758260e3f5af0_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rndc0936130e6073_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd7facda7a4dce3745_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd288777335560b4fa_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd1a03bbb149e16535_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.302s] CreateTopics: duration 0.139ms 3: [0025_timers / 4.609s] rd_kafka_poll(): duration 100.071ms 3: [0080_admin_ut / 0.402s] CreateTopics.wait_background_event_cb: duration 99.917ms 3: [0080_admin_ut / 0.402s] CreateTopics: got CreateTopicsResult in 99.917s 3: [0080_admin_ut / 0.402s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 0.402s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd5f09b1d15ed8e889_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnda6ba8b4691d6722_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd20a949af16df6f51_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd143bbee10780c920_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd5dcf40c576943d50_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd483eea0a334b9576_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.403s] CreateTopics: duration 0.119ms 3: [0025_timers / 4.709s] rd_kafka_poll(): duration 100.071ms 3: [0084_destroy_flags_local / 0.529s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.530s] rd_kafka_destroy_flags(0x0): duration 0.658ms 3: [0084_destroy_flags_local / 0.530s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.530s] [ test destroy_flags 0x8 for 
client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.530s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.530s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.530s] Created kafka instance 0084_destroy_flags_local#consumer-54 3: [0025_timers / 4.808s] Call #7: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.808s] rd_kafka_poll(): duration 99.455ms 3: [0080_admin_ut / 0.603s] CreateTopics.queue_poll: duration 199.919ms 3: [0080_admin_ut / 0.603s] CreateTopics: got CreateTopicsResult in 199.919s 3: [0080_admin_ut / 0.603s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.603s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd6403baa233579eac_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd2867c86633047927_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd715909b55b8bf05a_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd348e8172758fac0c_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd4a3ff64454995ddd_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Using topic "rdkafkatest_rnd523933295f050691_do_test_CreateTopics" 3: [0080_admin_ut / 0.603s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.603s] CreateTopics: duration 0.109ms 3: [0025_timers / 4.908s] rd_kafka_poll(): duration 100.073ms 3: [0072_headers_ut / 1.001s] 0072_headers_ut: duration 1000.950ms 3: [0072_headers_ut / 1.001s] ================= Test 0072_headers_ut PASSED ================= 3: [0025_timers / 5.008s] rd_kafka_poll(): duration 100.039ms 3: [
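Note: 0072_headers_ut (PASSED above) exercises the message-header API, and 0074_producev (earlier in the log) the varargs produce call. A sketch combining the two follows; topic name and header contents are placeholders, not taken from the tests.

    /* Sketch: attach headers to a message produced with rd_kafka_producev(). */
    #include <string.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();
            rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092", errstr, sizeof(errstr));

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk)
                    return 1;

            rd_kafka_headers_t *hdrs = rd_kafka_headers_new(2);
            rd_kafka_header_add(hdrs, "trace-id", -1, "abc123", -1);   /* -1 = use strlen() */
            rd_kafka_header_add(hdrs, "origin", -1, "example", -1);

            const char *payload = "payload with headers";
            rd_kafka_resp_err_t err =
                    rd_kafka_producev(rk,
                                      RD_KAFKA_V_TOPIC("mytopic"),
                                      RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                                      RD_KAFKA_V_HEADERS(hdrs), /* owned by the message on success */
                                      RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                                      RD_KAFKA_V_END);
            if (err)
                    rd_kafka_headers_destroy(hdrs); /* still ours if produce failed */

            rd_kafka_flush(rk, 10 * 1000);
            rd_kafka_destroy(rk);
            return 0;
    }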
/ 5.066s] Too many tests running (5 >= 5): postponing 0095_all_brokers_down start... 3: [0086_purge_local / 0.000s] ================= Running test 0086_purge_local ================= 3: [0086_purge_local / 0.000s] ==== Stats written to file stats_0086_purge_local_5967017823324491841.json ==== 3: [0086_purge_local / 0.000s] Using topic "rdkafkatest_0086_purge" 3: [0086_purge_local / 0.000s] Test rd_kafka_purge(): local 3: [0086_purge_local / 0.000s] Test config file test.conf not found 3: [0086_purge_local / 0.000s] Setting test timeout to 20s * 2.7 3: %4|1669457752.485|CONFWARN|0086_purge_local#producer-55| [thrd:app]: Configuration property enable.gapless.guarantee is experimental: When set to `true`, any error that could result in a gap in the produced message series when a batch of messages fails, will raise a fatal error (ERR__GAPLESS_GUARANTEE) and stop the producer. Messages failing due to `message.timeout.ms` are not covered by this guarantee. Requires `enable.idempotence=true`. 3: %5|1669457752.485|CONFWARN|0086_purge_local#producer-55| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0086_purge_local / 0.000s] Created kafka instance 0086_purge_local#producer-55 3: [0086_purge_local / 0.000s] Producing 20 messages to topic rdkafkatest_0086_purge 3: [0086_purge_local / 0.000s] local:281: purge(0x2): expecting 20 messages to remain when done 3: [0086_purge_local / 0.000s] local:281: purge(0x2): duration 0.004ms 3: [0086_purge_local / 0.000s] local:285: purge(0x1): expecting 0 messages to remain when done 3: [0086_purge_local / 0.000s] local:285: purge(0x1): duration 0.004ms 3: [0086_purge_local / 0.000s] DeliveryReport for msg #0: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #1: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #2: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #3: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #4: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #5: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #6: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #7: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #8: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #9: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #10: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #11: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #12: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #13: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #14: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #15: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #16: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #17: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #18: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #19: _PURGE_QUEUE 3: [0086_purge_local / 0.008s] 0086_purge_local: duration 7.621ms 3: [0086_purge_local / 0.008s] ================= Test 0086_purge_local PASSED ================= 3: [
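Note: 0086_purge_local above shows rd_kafka_purge() on a broker-less producer: purge(0x2) (in-flight) leaves the 20 locally queued messages untouched, purge(0x1) (queue) drops them all, and each dropped message is reported with a _PURGE_QUEUE delivery report. In application code that looks roughly like the sketch below; the topic name is reused from the log purely as an example.

    /* Sketch: purge locally queued messages; purged messages are reported with
     * RD_KAFKA_RESP_ERR__PURGE_QUEUE in their delivery reports. */
    #include <stdio.h>
    #include <string.h>
    #include <librdkafka/rdkafka.h>

    static void dr_cb(rd_kafka_t *rk, const rd_kafka_message_t *msg, void *opaque) {
            (void)rk; (void)opaque;
            printf("DeliveryReport: %s\n", rd_kafka_err2name(msg->err));
    }

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();
            rd_kafka_conf_set_dr_msg_cb(conf, dr_cb);

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk)
                    return 1;

            const char *payload = "msg";
            for (int i = 0; i < 20; i++)
                    rd_kafka_producev(rk,
                                      RD_KAFKA_V_TOPIC("rdkafkatest_0086_purge"),
                                      RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                                      RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                                      RD_KAFKA_V_END);

            rd_kafka_purge(rk, RD_KAFKA_PURGE_F_INFLIGHT); /* 0x2: nothing in flight here */
            rd_kafka_purge(rk, RD_KAFKA_PURGE_F_QUEUE);    /* 0x1: drops all 20 queued messages */

            rd_kafka_poll(rk, 100); /* serve the _PURGE_QUEUE delivery reports */
            rd_kafka_destroy(rk);
            return 0;
    }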
/ 5.074s] Too many tests running (5 >= 5): postponing 0097_ssl_verify_local start... 3: [0095_all_brokers_down / 0.000s] ================= Running test 0095_all_brokers_down ================= 3: [0095_all_brokers_down / 0.000s] ==== Stats written to file stats_0095_all_brokers_down_833987738759366677.json ==== 3: [0095_all_brokers_down / 0.000s] Setting test timeout to 20s * 2.7 3: [0095_all_brokers_down / 0.000s] Test Producer 3: [
/ 5.075s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 0.001s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0080_admin_ut / 0.803s] CreateTopics.queue_poll: duration 199.925ms 3: [0080_admin_ut / 0.803s] CreateTopics: got CreateTopicsResult in 199.925s 3: [0080_admin_ut / 0.803s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.803s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 0.803s] Using topic "rdkafkatest_rnd13ea0f221313b48e_do_test_DeleteTopics" 3: [0080_admin_ut / 0.803s] Using topic "rdkafkatest_rnd770b8ed671b94fe7_do_test_DeleteTopics" 3: [0080_admin_ut / 0.803s] Using topic "rdkafkatest_rnd9a7f1de156aae6f_do_test_DeleteTopics" 3: [0080_admin_ut / 0.803s] Using topic "rdkafkatest_rnd39f839f13cf38754_do_test_DeleteTopics" 3: [0080_admin_ut / 0.803s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 0.803s] DeleteTopics: duration 0.009ms 3: [0025_timers / 5.109s] rd_kafka_poll(): duration 100.074ms 3: [0080_admin_ut / 0.903s] DeleteTopics.queue_poll: duration 100.026ms 3: [0080_admin_ut / 0.903s] DeleteTopics: got DeleteTopicsResult in 100.026s 3: [0080_admin_ut / 0.903s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 0.903s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.903s] Using topic "rdkafkatest_rnd420928f81dfbf493_do_test_DeleteTopics" 3: [0080_admin_ut / 0.903s] Using topic "rdkafkatest_rnd704b260101b603f4_do_test_DeleteTopics" 3: [0080_admin_ut / 0.903s] Using topic "rdkafkatest_rnd4663bcfa234f9f28_do_test_DeleteTopics" 3: [0080_admin_ut / 0.903s] Using topic "rdkafkatest_rndf31163637bcc6af_do_test_DeleteTopics" 3: [0080_admin_ut / 0.903s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 0.903s] DeleteTopics: duration 0.007ms 3: [0025_timers / 5.209s] rd_kafka_poll(): duration 100.074ms 3: [0084_destroy_flags_local / 1.031s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 1.031s] rd_kafka_destroy_flags(0x8): duration 0.214ms 3: [0084_destroy_flags_local / 1.031s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 1.031s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.031s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.031s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.032s] Created kafka instance 0084_destroy_flags_local#consumer-57 3: [0025_timers / 5.311s] rd_kafka_poll(): duration 101.787ms 3: [0080_admin_ut / 1.103s] DeleteTopics.queue_poll: duration 200.031ms 3: [0080_admin_ut / 1.103s] DeleteTopics: got DeleteTopicsResult in 200.031s 3: [0080_admin_ut / 1.103s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.103s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.103s] Using topic "rdkafkatest_rnd7edb8f82720cdd71_do_test_DeleteTopics" 3: [0080_admin_ut / 1.103s] Using topic "rdkafkatest_rnd6c4b4821746b3b8e_do_test_DeleteTopics" 3: [0080_admin_ut / 
1.103s] Using topic "rdkafkatest_rnd1eca09e2368b3e65_do_test_DeleteTopics" 3: [0080_admin_ut / 1.103s] Using topic "rdkafkatest_rnd4904996c2a601edc_do_test_DeleteTopics" 3: [0080_admin_ut / 1.103s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 1.103s] DeleteTopics: duration 0.008ms 3: [0025_timers / 5.408s] Call #8: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 5.408s] rd_kafka_poll(): duration 97.855ms 3: [0025_timers / 5.508s] rd_kafka_poll(): duration 100.075ms 3: [0080_admin_ut / 1.303s] DeleteTopics.queue_poll: duration 200.031ms 3: [0080_admin_ut / 1.303s] DeleteTopics: got DeleteTopicsResult in 200.031s 3: [0080_admin_ut / 1.303s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.303s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.303s] Using topic "rdkafkatest_rnd8c4718e28099ffd_do_test_DeleteGroups" 3: [0080_admin_ut / 1.303s] Using topic "rdkafkatest_rnd1f139af95b938ca6_do_test_DeleteGroups" 3: [0080_admin_ut / 1.303s] Using topic "rdkafkatest_rnd27b7f03e2aa68667_do_test_DeleteGroups" 3: [0080_admin_ut / 1.303s] Using topic "rdkafkatest_rnd5f5c14bb3ba1ff60_do_test_DeleteGroups" 3: [0080_admin_ut / 1.303s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 1.303s] DeleteGroups: duration 0.022ms 3: [0025_timers / 5.609s] rd_kafka_poll(): duration 100.074ms 3: [0080_admin_ut / 1.403s] DeleteGroups.queue_poll: duration 100.054ms 3: [0080_admin_ut / 1.403s] DeleteGroups: got DeleteGroupsResult in 100.054s 3: [0080_admin_ut / 1.403s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.403s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.403s] Using topic "rdkafkatest_rnd3dba3af55667a392_do_test_DeleteGroups" 3: [0080_admin_ut / 1.403s] Using topic "rdkafkatest_rnd2d5b4f4847622cd3_do_test_DeleteGroups" 3: [0080_admin_ut / 1.403s] Using topic "rdkafkatest_rnd6bd2520167538939_do_test_DeleteGroups" 3: [0080_admin_ut / 1.403s] Using topic "rdkafkatest_rnd455b4272ddb7af9_do_test_DeleteGroups" 3: [0080_admin_ut / 1.403s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.403s] DeleteGroups: duration 0.012ms 3: [0025_timers / 5.709s] rd_kafka_poll(): duration 100.075ms 3: [0084_destroy_flags_local / 1.538s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 1.538s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 1.539s] rd_kafka_destroy_flags(0x0): duration 1.328ms 3: [0084_destroy_flags_local / 1.539s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 1.539s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.539s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.539s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.539s] Created kafka instance 0084_destroy_flags_local#consumer-58 3: [0025_timers / 5.809s] rd_kafka_poll(): duration 100.800ms 3: [0080_admin_ut / 1.603s] DeleteGroups.queue_poll: duration 200.039ms 3: [0080_admin_ut / 1.603s] DeleteGroups: got DeleteGroupsResult in 200.039s 3: [0080_admin_ut / 1.603s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.603s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.604s] Using topic 
"rdkafkatest_rnd54f7dcd74a0da28_do_test_DeleteGroups" 3: [0080_admin_ut / 1.604s] Using topic "rdkafkatest_rnd2f917eed4bb33ac7_do_test_DeleteGroups" 3: [0080_admin_ut / 1.604s] Using topic "rdkafkatest_rnd17f079503ec29523_do_test_DeleteGroups" 3: [0080_admin_ut / 1.604s] Using topic "rdkafkatest_rnd370017616cc08d3_do_test_DeleteGroups" 3: [0080_admin_ut / 1.604s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.604s] DeleteGroups: duration 0.012ms 3: [0025_timers / 5.910s] rd_kafka_poll(): duration 100.072ms 3: [0025_timers / 6.008s] Call #9: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 6.008s] rd_kafka_poll(): duration 98.783ms 3: [0025_timers / 6.009s] All 10 intervals okay 3: [0025_timers / 6.009s] 0025_timers: duration 6008.973ms 3: [0025_timers / 6.009s] ================= Test 0025_timers PASSED ================= 3: [
/ 6.023s] Too many tests running (5 >= 5): postponing 0100_thread_interceptors start... 3: [0097_ssl_verify_local / 0.000s] ================= Running test 0097_ssl_verify_local ================= 3: [0097_ssl_verify_local / 0.000s] ==== Stats written to file stats_0097_ssl_verify_local_3517155820149885335.json ==== 3: [0097_ssl_verify_local / 0.000s] Feature "ssl" is built-in 3: %7|1669457753.442|OPENSSL|rdkafka#producer-59| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1669457753.442|SSL|rdkafka#producer-59| [thrd:app]: Loading CA certificate from string 3: [0097_ssl_verify_local / 0.000s] Failed to create producer with junk ssl.ca.pem (as expected): ssl.ca.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: %7|1669457753.442|OPENSSL|rdkafka#producer-60| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1669457753.453|SSL|rdkafka#producer-60| [thrd:app]: Loading private key from string 3: [0097_ssl_verify_local / 0.012s] Failed to create producer with junk ssl.key.pem (as expected): ssl.key.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: ANY PRIVATE KEY 3: %7|1669457753.454|OPENSSL|rdkafka#producer-61| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1669457753.463|SSL|rdkafka#producer-61| [thrd:app]: Loading public key from string 3: [0097_ssl_verify_local / 0.023s] Failed to create producer with junk ssl.certificate.pem (as expected): ssl.certificate.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: [0097_ssl_verify_local / 0.023s] 0097_ssl_verify_local: duration 22.767ms 3: [0097_ssl_verify_local / 0.023s] ================= Test 0097_ssl_verify_local PASSED ================= 3: [
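0097_ssl_verify_local above feeds junk strings to the inline PEM properties (ssl.ca.pem, ssl.key.pem, ssl.certificate.pem) and expects client creation to fail with the OpenSSL "no start line" error. A sketch of how such an inline CA certificate would be configured; the helper name and error handling are illustrative:

    #include <librdkafka/rdkafka.h>
    #include <stdio.h>

    /* Client creation fails at rd_kafka_new() if the inline PEM cannot be parsed. */
    static rd_kafka_t *producer_with_inline_ca(const char *ca_pem) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            if (rd_kafka_conf_set(conf, "security.protocol", "ssl",
                                  errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK ||
                rd_kafka_conf_set(conf, "ssl.ca.pem", ca_pem,
                                  errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                    fprintf(stderr, "conf failed: %s\n", errstr);
                    rd_kafka_conf_destroy(conf);
                    return NULL;
            }

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    /* e.g. "ssl.ca.pem failed: not in PEM format?" as seen in the log */
                    fprintf(stderr, "rd_kafka_new failed: %s\n", errstr);
                    rd_kafka_conf_destroy(conf);  /* conf is only consumed on success */
            }
            return rk;
    }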
/ 6.046s] Too many tests running (5 >= 5): postponing 0103_transactions_local start... 3: [0100_thread_interceptors / 0.000s] ================= Running test 0100_thread_interceptors ================= 3: [0100_thread_interceptors / 0.000s] ==== Stats written to file stats_0100_thread_interceptors_808189842874268791.json ==== 3: [0100_thread_interceptors / 0.000s] on_conf_dup() interceptor called 3: [0100_thread_interceptors / 0.000s] on_new() interceptor called 3: [
/ 6.049s] on_thread_start(2, :0/internal) called 3: [
/ 6.049s] Started thread: :0/internal 3: [
/ 6.050s] on_thread_start(0, main) called 3: [
/ 6.050s] Started thread: main 3: [
/ 6.051s] on_thread_start(2, 127.0.0.1:1/bootstrap) called 3: [
/ 6.051s] Started thread: 127.0.0.1:1/bootstrap 3: %3|1669457753.469|FAIL|rdkafka#producer-62| [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457753.469|ERROR|rdkafka#producer-62| [thrd:127.0.0.1:1/bootstrap]: 1/1 brokers are down 3: %3|1669457753.469|ERROR|rdkafka#producer-62| [thrd:app]: rdkafka#producer-62: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 6.054s] on_thread_exit(0, main) called 3: [
/ 6.054s] Exiting from thread: main 3: [
/ 6.055s] on_thread_exit(2, 127.0.0.1:1/bootstrap) called 3: [
/ 6.055s] Exiting from thread: 127.0.0.1:1/bootstrap 3: [
/ 6.057s] on_thread_exit(2, :0/internal) called 3: [
/ 6.057s] Exiting from thread: :0/internal 3: [0100_thread_interceptors / 0.011s] 3 thread start calls, 3 thread exit calls seen 3: [0100_thread_interceptors / 0.011s] 0100_thread_interceptors: duration 11.303ms 3: [0100_thread_interceptors / 0.011s] ================= Test 0100_thread_interceptors PASSED ================= 3: [
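0100_thread_interceptors above registers interceptors that count on_thread_start()/on_thread_exit() calls for the internal, main and broker threads. A rough sketch of registering such a callback; the registration call shown (rd_kafka_conf_interceptor_add_on_thread_start) is the public interceptor API as recalled from rdkafka.h, so verify the exact signature against the header before relying on this:

    #include <librdkafka/rdkafka.h>
    #include <stdio.h>

    /* Called from every librdkafka-managed thread as it starts. */
    static rd_kafka_resp_err_t on_thread_start(rd_kafka_t *rk,
                                               rd_kafka_thread_type_t thread_type,
                                               const char *thread_name,
                                               void *ic_opaque) {
            int *cnt = ic_opaque;
            (*cnt)++;
            fprintf(stderr, "on_thread_start(%d, %s) called\n",
                    (int)thread_type, thread_name);
            return RD_KAFKA_RESP_ERR_NO_ERROR;
    }

    static void register_thread_interceptor(rd_kafka_conf_t *conf, int *counter) {
            /* "my_ic" is just an interceptor name chosen for this example. */
            rd_kafka_conf_interceptor_add_on_thread_start(conf, "my_ic",
                                                          on_thread_start, counter);
    }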
/ 6.058s] Too many tests running (5 >= 5): postponing 0104_fetch_from_follower_mock start... 3: [0103_transactions_local / 0.000s] ================= Running test 0103_transactions_local ================= 3: [0103_transactions_local / 0.000s] ==== Stats written to file stats_0103_transactions_local_2758041339625725389.json ==== 3: [0103_transactions_local / 0.000s] [ do_test_txn_local:1168 ] 3: [0103_transactions_local / 0.000s] Test config file test.conf not found 3: %5|1669457753.481|CONFWARN|0103_transactions_local#producer-63| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0103_transactions_local / 0.003s] Created kafka instance 0103_transactions_local#producer-63 3: [0103_transactions_local / 0.008s] Test config file test.conf not found 3: [0103_transactions_local / 0.009s] Created kafka instance 0103_transactions_local#producer-64 3: [0103_transactions_local / 0.009s] Waiting for init_transactions() timeout 7000 ms 3: [0103_transactions_local / 0.009s] Setting test timeout to 9s * 2.7 3: [
/ 6.076s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.002s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.002s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 1.005s] Test KafkaConsumer 3: [0080_admin_ut / 1.804s] DeleteGroups.queue_poll: duration 200.031ms 3: [0080_admin_ut / 1.804s] DeleteGroups: got DeleteGroupsResult in 200.031s 3: [0080_admin_ut / 1.804s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.804s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.804s] Using topic "rdkafkatest_rnd79f99b542f0af98a_do_test_DeleteRecords" 3: [0080_admin_ut / 1.804s] Using topic "rdkafkatest_rnd7c457dcb190d364d_do_test_DeleteRecords" 3: [0080_admin_ut / 1.804s] Using topic "rdkafkatest_rnda9e863023fd6e09_do_test_DeleteRecords" 3: [0080_admin_ut / 1.804s] Using topic "rdkafkatest_rnd43b3bcb469fa9aec_do_test_DeleteRecords" 3: [0080_admin_ut / 1.804s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 1.804s] DeleteRecords: duration 0.028ms 3: [
/ 6.084s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.010s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0080_admin_ut / 1.904s] DeleteRecords.queue_poll: duration 100.019ms 3: [0080_admin_ut / 1.904s] DeleteRecords: got DeleteRecordsResult in 100.019s 3: [0080_admin_ut / 1.904s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 1.904s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.904s] Using topic "rdkafkatest_rnd5f9f6d6a016df7a9_do_test_DeleteRecords" 3: [0080_admin_ut / 1.904s] Using topic "rdkafkatest_rnd40623e7e0cfabcb2_do_test_DeleteRecords" 3: [0080_admin_ut / 1.904s] Using topic "rdkafkatest_rnd48d0247c2c34907f_do_test_DeleteRecords" 3: [0080_admin_ut / 1.904s] Using topic "rdkafkatest_rnd744e45eb4d25d8a3_do_test_DeleteRecords" 3: [0080_admin_ut / 1.904s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 1.904s] DeleteRecords: duration 0.023ms 3: [0084_destroy_flags_local / 2.040s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 2.041s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 2.041s] rd_kafka_destroy_flags(0x8): duration 0.698ms 3: [0084_destroy_flags_local / 2.041s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 2.041s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.041s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.041s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.041s] Created kafka instance 0084_destroy_flags_local#consumer-66 3: [0080_admin_ut / 2.104s] DeleteRecords.queue_poll: duration 200.023ms 3: [0080_admin_ut / 2.104s] DeleteRecords: got DeleteRecordsResult in 200.023s 3: [0080_admin_ut / 2.104s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.104s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.104s] Using topic "rdkafkatest_rnd5a100b78799dc3b8_do_test_DeleteRecords" 3: [0080_admin_ut / 2.104s] Using topic "rdkafkatest_rnd41c6b2cc09a18a65_do_test_DeleteRecords" 3: [0080_admin_ut / 2.104s] Using topic "rdkafkatest_rnd4550fe7f59b72c1c_do_test_DeleteRecords" 3: [0080_admin_ut / 2.104s] Using topic "rdkafkatest_rnd48641f8948c0fff5_do_test_DeleteRecords" 3: [0080_admin_ut / 2.104s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 2.104s] DeleteRecords: duration 0.020ms 3: [0034_offset_reset_mock / 6.242s] Bringing up the broker 3: [0080_admin_ut / 2.304s] DeleteRecords.queue_poll: duration 200.025ms 3: [0080_admin_ut / 2.304s] DeleteRecords: got DeleteRecordsResult in 200.025s 3: [0080_admin_ut / 2.304s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.304s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut 
/ 2.304s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 2.304s] DeleteConsumerGroupOffsets: duration 0.011ms 3: [0080_admin_ut / 2.404s] DeleteConsumerGroupOffsets.queue_poll: duration 100.026ms 3: [0080_admin_ut / 2.404s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.026s 3: [0080_admin_ut / 2.404s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.404s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.404s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.404s] DeleteConsumerGroupOffsets: duration 0.007ms 3: [0084_destroy_flags_local / 2.542s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 2.542s] rd_kafka_destroy_flags(0x0): duration 0.485ms 3: [0084_destroy_flags_local / 2.542s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 2.542s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.542s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.542s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.543s] Created kafka instance 0084_destroy_flags_local#consumer-67 3: [0080_admin_ut / 2.604s] DeleteConsumerGroupOffsets.queue_poll: duration 200.033ms 3: [0080_admin_ut / 2.604s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.033s 3: [0080_admin_ut / 2.604s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.604s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.604s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.604s] DeleteConsumerGroupOffsets: duration 0.008ms 3: [
/ 7.080s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 2.006s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 2.006s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 2.007s] 0095_all_brokers_down: duration 2006.633ms 3: [0095_all_brokers_down / 2.007s] ================= Test 0095_all_brokers_down PASSED ================= 3: [
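0095_all_brokers_down above points a client at 127.0.0.1:1 and 127.0.0.1:2, where nothing is listening, and checks that the error callback reports the per-broker _TRANSPORT failures followed by _ALL_BROKERS_DOWN. A minimal sketch of such an error callback; the broker list is deliberately unreachable, as in the test:

    #include <librdkafka/rdkafka.h>
    #include <stdio.h>

    static void error_cb(rd_kafka_t *rk, int err, const char *reason, void *opaque) {
            if (err == RD_KAFKA_RESP_ERR__ALL_BROKERS_DOWN)
                    fprintf(stderr, "Error: all broker connections are down: %s\n", reason);
            else
                    fprintf(stderr, "Error: %s: %s\n",
                            rd_kafka_err2name((rd_kafka_resp_err_t)err), reason);
    }

    static void all_brokers_down_example(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();
            /* Ports 1 and 2 are not listening, so only connection errors can follow. */
            rd_kafka_conf_set(conf, "bootstrap.servers", "127.0.0.1:1,127.0.0.1:2",
                              errstr, sizeof(errstr));
            rd_kafka_conf_set_error_cb(conf, error_cb);

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            rd_kafka_poll(rk, 2000);   /* serve error callbacks for ~2s */
            rd_kafka_destroy(rk);
    }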
/ 7.081s] Too many tests running (5 >= 5): postponing 0105_transactions_mock start... 3: [0104_fetch_from_follower_mock/ 0.000s] ================= Running test 0104_fetch_from_follower_mock ================= 3: [0104_fetch_from_follower_mock/ 0.000s] ==== Stats written to file stats_0104_fetch_from_follower_mock_8107381957332275742.json ==== 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.000s] [ Test FFF auto.offset.reset=earliest ] 3: %5|1669457754.500|CONFWARN|MOCK#producer-68| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.001s] Created kafka instance 0104_fetch_from_follower_mock#producer-69 3: [0104_fetch_from_follower_mock/ 0.001s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.001s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 0.003s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 0.003s] PRODUCE: duration 1.909ms 3: [0080_admin_ut / 2.804s] DeleteConsumerGroupOffsets.queue_poll: duration 200.303ms 3: [0080_admin_ut / 2.804s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.303s 3: [0080_admin_ut / 2.804s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.804s] Using topic "rdkafkatest_rnd387c498c7bba7951_do_test_AclBinding" 3: [0080_admin_ut / 2.804s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 2.805s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 2.805s] Using topic "rdkafkatest_rnd48cd0e955ec2d188_do_test_AclBindingFilter" 3: [0080_admin_ut / 2.805s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 2.805s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 2.805s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 2.805s] Using topic "rdkafkatest_rnd4ff6571e42c6a9e9_do_test_CreateAcls" 3: [0080_admin_ut / 2.805s] Using topic "rdkafkatest_rnddcdcb124c3bd4e9_do_test_CreateAcls" 3: [0080_admin_ut / 2.805s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 2.805s] CreateAcls: duration 0.011ms 3: [0104_fetch_from_follower_mock/ 0.050s] PRODUCE.DELIVERY.WAIT: duration 47.495ms 3: [0104_fetch_from_follower_mock/ 0.051s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.052s] Created kafka instance 0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 0.052s] ASSIGN.PARTITIONS: duration 0.144ms 3: [0104_fetch_from_follower_mock/ 0.052s] earliest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 0.052s] earliest: consume 1000 messages 3: [0034_offset_reset_mock / 6.867s] #0: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.867s] #0: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.867s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 2.905s] CreateAcls.queue_poll: duration 100.028ms 3: [0080_admin_ut / 2.905s] CreateAcls: got CreateAclsResult in 100.028s 3: [0080_admin_ut / 2.905s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.905s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, 
options, timeout 100ms ] 3: [0080_admin_ut / 2.905s] Using topic "rdkafkatest_rnd5bd3e036186c5143_do_test_CreateAcls" 3: [0080_admin_ut / 2.905s] Using topic "rdkafkatest_rnd703942f31f879cea_do_test_CreateAcls" 3: [0080_admin_ut / 2.905s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 2.905s] CreateAcls: duration 0.007ms 3: [0084_destroy_flags_local / 3.045s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 3.045s] rd_kafka_destroy_flags(0x8): duration 0.234ms 3: [0084_destroy_flags_local / 3.045s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 3.045s] 0084_destroy_flags_local: duration 3045.014ms 3: [0084_destroy_flags_local / 3.045s] ================= Test 0084_destroy_flags_local PASSED ================= 3: [
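0084_destroy_flags_local above tears consumers down with rd_kafka_destroy_flags() using flags 0x0 and 0x8, where 0x8 is RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE (skip the implicit consumer close on destroy). A sketch of that teardown path; group id and topic are placeholders:

    #include <librdkafka/rdkafka.h>

    static void destroy_flags_example(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();
            rd_kafka_conf_set(conf, "group.id", "example_group", errstr, sizeof(errstr));

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
            rd_kafka_poll_set_consumer(rk);

            rd_kafka_topic_partition_list_t *topics = rd_kafka_topic_partition_list_new(1);
            rd_kafka_topic_partition_list_add(topics, "example_topic", RD_KAFKA_PARTITION_UA);
            rd_kafka_subscribe(rk, topics);
            rd_kafka_topic_partition_list_destroy(topics);

            /* 0x8 in the log: skip the final consumer close during destroy. */
            rd_kafka_destroy_flags(rk, RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE);
    }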
/ 7.326s] Too many tests running (5 >= 5): postponing 0106_cgrp_sess_timeout start... 3: [0105_transactions_mock / 0.000s] ================= Running test 0105_transactions_mock ================= 3: [0105_transactions_mock / 0.000s] ==== Stats written to file stats_0105_transactions_mock_173085323647561821.json ==== 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] [ do_test_txn_recoverable_errors:194 ] 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] Setting test timeout to 60s * 2.7 3: %5|1669457754.745|MOCK|0105_transactions_mock#producer-71| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:34195,127.0.0.1:42617,127.0.0.1:33673 3: [0105_transactions_mock / 0.005s] Created kafka instance 0105_transactions_mock#producer-71 3: %4|1669457754.766|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator not available: retrying 3: [0080_admin_ut / 3.105s] CreateAcls.queue_poll: duration 200.033ms 3: [0080_admin_ut / 3.105s] CreateAcls: got CreateAclsResult in 200.033s 3: [0080_admin_ut / 3.105s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.105s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.105s] Using topic "rdkafkatest_rnd20f5949342c92aad_do_test_CreateAcls" 3: [0080_admin_ut / 3.105s] Using topic "rdkafkatest_rnd5cd36d0f69c5b90f_do_test_CreateAcls" 3: [0080_admin_ut / 3.105s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 3.105s] CreateAcls: duration 0.008ms 3: [0080_admin_ut / 3.305s] CreateAcls.queue_poll: duration 200.030ms 3: [0080_admin_ut / 3.305s] CreateAcls: got CreateAclsResult in 200.030s 3: [0080_admin_ut / 3.305s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.305s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.305s] Using topic "rdkafkatest_rnd6efdbb2c5121b2fa_do_test_DescribeAcls" 3: [0080_admin_ut / 3.305s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 3.305s] DescribeAcls: duration 0.010ms 3: [0034_offset_reset_mock / 7.370s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 7.370s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 7.370s] METADATA.WAIT: duration 502.910ms 3: [0080_admin_ut / 3.405s] DescribeAcls.queue_poll: duration 100.030ms 3: [0080_admin_ut / 3.405s] DescribeAcls: got DescribeAclsResult in 100.030s 3: [0080_admin_ut / 3.405s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.405s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.405s] Using topic "rdkafkatest_rnd36eb91b3490dc6a4_do_test_DescribeAcls" 3: [0080_admin_ut / 3.405s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.405s] DescribeAcls: duration 0.006ms 3: %4|1669457755.156|OFFSET|0104_fetch_from_follower_mock#consumer-70| [thrd:main]: test [0]: offset reset (at offset 10, broker 2) to cached BEGINNING offset 0: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: %4|1669457755.267|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Not coordinator: retrying 3: [0080_admin_ut / 3.605s] DescribeAcls.queue_poll: duration 200.028ms 3: [0080_admin_ut / 3.605s] DescribeAcls: got DescribeAclsResult in 200.028s 3: [0080_admin_ut / 3.605s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.605s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.605s] Using topic "rdkafkatest_rnd4abf76b378b2447f_do_test_DescribeAcls" 3: [0080_admin_ut / 3.605s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.605s] DescribeAcls: duration 0.008ms 3: [0080_admin_ut / 3.805s] DescribeAcls.queue_poll: duration 200.038ms 3: [0080_admin_ut / 3.805s] DescribeAcls: got DescribeAclsResult in 200.038s 3: [0080_admin_ut / 3.805s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.805s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.805s] Using topic "rdkafkatest_rnd52af510910107532_do_test_DeleteAcls" 3: [0080_admin_ut / 3.805s] Using topic "rdkafkatest_rnd5269709b1b137092_do_test_DeleteAcls" 3: [0080_admin_ut / 3.805s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 3.805s] DeleteAcls: duration 0.008ms 3: [0080_admin_ut / 3.905s] DeleteAcls.queue_poll: duration 100.028ms 3: [0080_admin_ut / 3.905s] DeleteAcls: got DeleteAclsResult in 100.028s 3: [0080_admin_ut / 3.905s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.905s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.905s] Using topic "rdkafkatest_rnd58d1752842eca58b_do_test_DeleteAcls" 3: [0080_admin_ut / 3.905s] Using topic "rdkafkatest_rnd144702b0114dbeb4_do_test_DeleteAcls" 3: [0080_admin_ut / 3.905s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 3.905s] DeleteAcls: duration 0.008ms 3: [0104_fetch_from_follower_mock/ 1.164s] CONSUME: duration 1111.786ms 3: [0104_fetch_from_follower_mock/ 1.164s] earliest: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 1.164s] Closing consumer 
0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 1.164s] CONSUMER.CLOSE: duration 0.240ms 3: [0104_fetch_from_follower_mock/ 1.166s] [ Test FFF auto.offset.reset=earliest PASSED ] 3: [0104_fetch_from_follower_mock/ 1.166s] [ Test FFF auto.offset.reset=latest ] 3: %5|1669457755.665|CONFWARN|MOCK#producer-72| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 1.166s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.166s] Created kafka instance 0104_fetch_from_follower_mock#producer-73 3: [0104_fetch_from_follower_mock/ 1.166s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.166s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 1.168s] SUM(POLL): duration 0.001ms 3: [0104_fetch_from_follower_mock/ 1.168s] PRODUCE: duration 1.371ms 3: [0104_fetch_from_follower_mock/ 1.220s] PRODUCE.DELIVERY.WAIT: duration 51.685ms 3: [0104_fetch_from_follower_mock/ 1.224s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.224s] Created kafka instance 0104_fetch_from_follower_mock#consumer-74 3: [0104_fetch_from_follower_mock/ 1.224s] ASSIGN.PARTITIONS: duration 0.182ms 3: [0104_fetch_from_follower_mock/ 1.224s] latest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 1.224s] latest: not expecting any messages for 5000ms 3: %4|1669457755.775|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator load in progress: retrying 3: [0080_admin_ut / 4.105s] DeleteAcls.queue_poll: duration 200.032ms 3: [0080_admin_ut / 4.105s] DeleteAcls: got DeleteAclsResult in 200.032s 3: [0080_admin_ut / 4.105s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.105s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 4.105s] Using topic "rdkafkatest_rnd3ea71edc5d141146_do_test_DeleteAcls" 3: [0080_admin_ut / 4.105s] Using topic "rdkafkatest_rnd7010903d0e9d75fa_do_test_DeleteAcls" 3: [0080_admin_ut / 4.105s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 4.105s] DeleteAcls: duration 0.008ms 3: [0080_admin_ut / 4.305s] DeleteAcls.queue_poll: duration 200.028ms 3: [0080_admin_ut / 4.305s] DeleteAcls: got DeleteAclsResult in 200.028s 3: [0080_admin_ut / 4.305s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.305s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 4.305s] Creating 2 topics 3: [0080_admin_ut / 4.306s] Deleting 1 topics 3: [0080_admin_ut / 4.306s] Creating 1 topics 3: [0080_admin_ut / 4.306s] Deleting 3 groups 3: [0080_admin_ut / 4.306s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 4.306s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 4.306s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 4.306s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 4.306s] Creating 2 topics 3: [0080_admin_ut / 4.306s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0034_offset_reset_mock / 8.370s] #1: injecting _TRANSPORT, expecting NO_ERROR 3: %4|1669457756.070|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:44345/bootstrap]: 
127.0.0.1:44345/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1509ms in state UP) 3: [0034_offset_reset_mock / 8.372s] ASSIGN.PARTITIONS: duration 1.670ms 3: [0034_offset_reset_mock / 8.372s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 8.372s] #1: Ignoring Error event: 127.0.0.1:44345/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1509ms in state UP) 3: [0080_admin_ut / 4.406s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 4.406s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: [0080_admin_ut / 4.406s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.406s] [ do_test_mix:1342: PASS (0.10s) ] 3: [0080_admin_ut / 4.406s] [ do_test_configs:1411 ] 3: [0105_transactions_mock / 1.532s] rd_kafka_init_transactions(rk, 5000): duration 1526.815ms 3: [0105_transactions_mock / 1.532s] rd_kafka_begin_transaction(rk): duration 0.139ms 3: [0105_transactions_mock / 1.532s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock / 2.003s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: %4|1669457756.839|OFFSET|0104_fetch_from_follower_mock#consumer-74| [thrd:main]: test [0]: offset reset (at offset 1000, broker 2) to END: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: [0105_transactions_mock / 2.106s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 0.447ms 3: [0080_admin_ut / 6.406s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 6.406s] Test config file test.conf not found 3: %5|1669457758.105|CONFWARN|0080_admin_ut#producer-75| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.406s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-75 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.406s] Using topic "rdkafkatest_rnd1fdabb2f7dde5b4f_do_test_DeleteRecords" 3: [0080_admin_ut / 6.406s] 
Using topic "rdkafkatest_rnd5ad94ae47bae9b66_do_test_DeleteRecords" 3: [0080_admin_ut / 6.406s] Using topic "rdkafkatest_rnd164aac924b128dd7_do_test_DeleteRecords" 3: [0080_admin_ut / 6.406s] Using topic "rdkafkatest_rnd1b36385018b198c1_do_test_DeleteRecords" 3: [0080_admin_ut / 6.406s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 6.406s] DeleteRecords: duration 0.020ms 3: [0080_admin_ut / 6.406s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-75 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 6.418s] Test config file test.conf not found 3: %5|1669457758.122|CONFWARN|0080_admin_ut#producer-76| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.423s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-76 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.423s] Using topic "rdkafkatest_rnd1aeb3e343c2bcce4_do_test_DeleteGroups" 3: [0080_admin_ut / 6.423s] Using topic "rdkafkatest_rnd5b7ac36e77beab43_do_test_DeleteGroups" 3: [0080_admin_ut / 6.423s] Using topic "rdkafkatest_rnd25f185f34a787e9a_do_test_DeleteGroups" 3: [0080_admin_ut / 6.423s] Using topic "rdkafkatest_rnd48e05e3d5cdd17a6_do_test_DeleteGroups" 3: [0080_admin_ut / 6.423s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 6.424s] DeleteGroups: duration 0.019ms 3: [0080_admin_ut / 6.424s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 6.434s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 6.435s] Test config file test.conf not found 3: %5|1669457758.133|CONFWARN|0080_admin_ut#consumer-77| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0105_transactions_mock / 3.410s] rd_kafka_commit_transaction(rk, 5000): duration 1304.662ms 3: [0105_transactions_mock / 3.411s] [ do_test_txn_recoverable_errors:194: PASS (3.41s) ] 3: [0105_transactions_mock / 3.411s] [ do_test_txn_fatal_idempo_errors:305 ] 3: [0105_transactions_mock / 3.411s] Test config file test.conf not found 3: [0105_transactions_mock / 3.411s] Setting test timeout to 60s * 2.7 3: %5|1669457758.156|MOCK|0105_transactions_mock#producer-78| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:32783,127.0.0.1:38041,127.0.0.1:45513 3: [0105_transactions_mock / 3.416s] Created kafka instance 0105_transactions_mock#producer-78 3: [0105_transactions_mock / 3.443s] rd_kafka_init_transactions(rk, 5000): duration 27.717ms 3: [0105_transactions_mock / 3.443s] rd_kafka_begin_transaction(rk): duration 0.012ms 3: [0105_transactions_mock / 3.443s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.012ms 3: [0105_transactions_mock / 3.443s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } 
RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0080_admin_ut / 6.535s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.535s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.539s] rd_kafka_destroy(): duration 4.043ms 3: [0080_admin_ut / 6.539s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.11s) ] 3: [0080_admin_ut / 6.540s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.540s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 6.540s] Test config file test.conf not found 3: %5|1669457758.238|CONFWARN|0080_admin_ut#consumer-79| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.640s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.640s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.640s] rd_kafka_destroy(): duration 0.182ms 3: [0080_admin_ut / 6.640s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.10s) ] 3: [0080_admin_ut / 6.640s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.640s] Test config file test.conf not found 3: %4|1669457758.339|CONFWARN|0080_admin_ut#consumer-80| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1669457758.339|CONFWARN|0080_admin_ut#consumer-80| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.640s] [ do_test_options:1588 ] 3: [0080_admin_ut / 6.640s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 6.640s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 6.640s] Using topic "rdkafkatest_rnd1386453e139fd4f0_do_test_CreateTopics" 3: [0080_admin_ut / 6.640s] Using topic "rdkafkatest_rnd6635964823b04a23_do_test_CreateTopics" 3: [0080_admin_ut / 6.640s] Using topic "rdkafkatest_rnd14906da7c81bf4b_do_test_CreateTopics" 3: [0080_admin_ut / 6.640s] Using topic "rdkafkatest_rnd1590098b0dcf7dff_do_test_CreateTopics" 3: [0080_admin_ut / 6.640s] Using topic "rdkafkatest_rnd72a41ad17de00e3c_do_test_CreateTopics" 3: [0080_admin_ut / 6.641s] Using topic "rdkafkatest_rnd127ed6007bbe698c_do_test_CreateTopics" 3: [0080_admin_ut / 6.641s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 6.641s] CreateTopics: duration 0.128ms 3: [0080_admin_ut / 6.741s] CreateTopics.queue_poll: duration 99.906ms 3: [0080_admin_ut / 6.741s] CreateTopics: got CreateTopicsResult in 99.906s 3: [0080_admin_ut / 6.741s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.741s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 6.741s] Using topic "rdkafkatest_rnde2d71661209161e_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Using topic "rdkafkatest_rnd2963a9b72abaaee0_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Using topic "rdkafkatest_rnd658f769b0635724e_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Using topic "rdkafkatest_rndb80fc8e50adf0e9_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Using topic 
"rdkafkatest_rnd685e143564343627_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Using topic "rdkafkatest_rnd3ded705a4a69cc6f_do_test_CreateTopics" 3: [0080_admin_ut / 6.741s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 6.741s] CreateTopics: duration 0.135ms 3: [0080_admin_ut / 6.841s] CreateTopics.wait_background_event_cb: duration 99.914ms 3: [0080_admin_ut / 6.841s] CreateTopics: got CreateTopicsResult in 99.914s 3: [0080_admin_ut / 6.841s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.841s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd65e63d1b4bb2d34a_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd50cbaf676142dcd5_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd7a58408f53e6f7a6_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd328247b26665cda6_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd458599b874933f0d_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Using topic "rdkafkatest_rnd239b79961df6e8c4_do_test_CreateTopics" 3: [0080_admin_ut / 6.841s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 6.841s] CreateTopics: duration 0.117ms 3: [0080_admin_ut / 7.041s] CreateTopics.queue_poll: duration 199.908ms 3: [0080_admin_ut / 7.041s] CreateTopics: got CreateTopicsResult in 199.908s 3: [0080_admin_ut / 7.041s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.041s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rnd1c9c97a703865f5f_do_test_CreateTopics" 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rndd5c60fc0f075bed_do_test_CreateTopics" 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rnd46fc888e77657022_do_test_CreateTopics" 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rnd143c85103552e07d_do_test_CreateTopics" 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rnd52ccbb61b391d98_do_test_CreateTopics" 3: [0080_admin_ut / 7.041s] Using topic "rdkafkatest_rnd729ed1a66c04cd00_do_test_CreateTopics" 3: [0080_admin_ut / 7.042s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 7.042s] CreateTopics: duration 0.140ms 3: [0080_admin_ut / 7.242s] CreateTopics.queue_poll: duration 199.895ms 3: [0080_admin_ut / 7.242s] CreateTopics: got CreateTopicsResult in 199.895s 3: [0080_admin_ut / 7.242s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-80 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.242s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-80 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.242s] Using topic "rdkafkatest_rnd6de05596665d0d8f_do_test_DeleteTopics" 3: [0080_admin_ut / 7.242s] Using topic "rdkafkatest_rnd3801ce4a6701e7c3_do_test_DeleteTopics" 3: [0080_admin_ut / 7.242s] Using topic "rdkafkatest_rnd18df55411e679bf1_do_test_DeleteTopics" 3: [0080_admin_ut / 7.242s] Using topic "rdkafkatest_rnd5be1e37b5e64eefa_do_test_DeleteTopics" 3: [0080_admin_ut / 7.242s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 7.242s] DeleteTopics: duration 0.011ms 3: [0080_admin_ut / 7.342s] 
DeleteTopics.queue_poll: duration 100.029ms 3: [0080_admin_ut / 7.342s] DeleteTopics: got DeleteTopicsResult in 100.029s 3: [0080_admin_ut / 7.342s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.342s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-80 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.342s] Using topic "rdkafkatest_rnd12fadafe62caf552_do_test_DeleteTopics" 3: [0080_admin_ut / 7.342s] Using topic "rdkafkatest_rnd200689030f1c3c2_do_test_DeleteTopics" 3: [0080_admin_ut / 7.342s] Using topic "rdkafkatest_rnd146eb6091e9d0037_do_test_DeleteTopics" 3: [0080_admin_ut / 7.342s] Using topic "rdkafkatest_rnd347823214c47e90f_do_test_DeleteTopics" 3: [0080_admin_ut / 7.342s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.342s] DeleteTopics: duration 0.007ms 3: %3|1669457759.158|TXNERR|0105_transactions_mock#producer-78| [thrd:127.0.0.1:38041/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 4.414s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 4.415s] rd_kafka_abort_transaction(rk, -1): duration 0.980ms 3: [0105_transactions_mock / 4.415s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: [0105_transactions_mock / 4.415s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 4.417s] rd_kafka_commit_transaction(rk, -1): duration 2.235ms 3: [0105_transactions_mock / 4.418s] [ do_test_txn_fatal_idempo_errors:305: PASS (1.01s) ] 3: [0105_transactions_mock / 4.418s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH ] 3: [0105_transactions_mock / 4.418s] Test config file test.conf not found 3: [0105_transactions_mock / 4.418s] Setting test timeout to 60s * 2.7 3: %5|1669457759.163|MOCK|0105_transactions_mock#producer-81| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40491,127.0.0.1:40225,127.0.0.1:43279 3: [0105_transactions_mock / 4.419s] Created kafka instance 0105_transactions_mock#producer-81 3: [0105_transactions_mock / 4.424s] rd_kafka_init_transactions(rk, -1): duration 5.469ms 3: [0105_transactions_mock / 4.424s] rd_kafka_begin_transaction(rk): duration 0.096ms 3: [0105_transactions_mock / 4.424s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock / 4.424s] 0105_transactions_mock#producer-81: Flushing 1 messages 3: [0080_admin_ut / 7.542s] DeleteTopics.queue_poll: duration 200.030ms 3: [0080_admin_ut / 7.542s] DeleteTopics: got DeleteTopicsResult in 200.030s 3: [0080_admin_ut / 7.542s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.542s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-80 
DeleteTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.542s] Using topic "rdkafkatest_rnd2bf96133437f7f0e_do_test_DeleteTopics" 3: [0080_admin_ut / 7.542s] Using topic "rdkafkatest_rnd54cf0cfd72f5e9c1_do_test_DeleteTopics" 3: [0080_admin_ut / 7.542s] Using topic "rdkafkatest_rnd3ae4ef31418a6713_do_test_DeleteTopics" 3: [0080_admin_ut / 7.542s] Using topic "rdkafkatest_rnd7326ed27037cfae_do_test_DeleteTopics" 3: [0080_admin_ut / 7.542s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.542s] DeleteTopics: duration 0.007ms 3: [0080_admin_ut / 7.742s] DeleteTopics.queue_poll: duration 200.027ms 3: [0080_admin_ut / 7.742s] DeleteTopics: got DeleteTopicsResult in 200.027s 3: [0080_admin_ut / 7.742s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.742s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-80 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.742s] Using topic "rdkafkatest_rnd78af8d990c5f3a88_do_test_DeleteGroups" 3: [0080_admin_ut / 7.742s] Using topic "rdkafkatest_rndb70ed467b878768_do_test_DeleteGroups" 3: [0080_admin_ut / 7.742s] Using topic "rdkafkatest_rnd7efe0c2e7775ba46_do_test_DeleteGroups" 3: [0080_admin_ut / 7.742s] Using topic "rdkafkatest_rnd5fa25e0d6cde61c5_do_test_DeleteGroups" 3: [0080_admin_ut / 7.742s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 7.742s] DeleteGroups: duration 0.016ms 3: [0080_admin_ut / 7.842s] DeleteGroups.queue_poll: duration 100.039ms 3: [0080_admin_ut / 7.842s] DeleteGroups: got DeleteGroupsResult in 100.039s 3: [0080_admin_ut / 7.842s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 7.842s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-80 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.842s] Using topic "rdkafkatest_rnd5dd2c7d617a42c57_do_test_DeleteGroups" 3: [0080_admin_ut / 7.842s] Using topic "rdkafkatest_rnd53e0498876b21d17_do_test_DeleteGroups" 3: [0080_admin_ut / 7.842s] Using topic "rdkafkatest_rnd360bc8482fc22d03_do_test_DeleteGroups" 3: [0080_admin_ut / 7.842s] Using topic "rdkafkatest_rnd55170c114906a346_do_test_DeleteGroups" 3: [0080_admin_ut / 7.842s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 7.842s] DeleteGroups: duration 0.013ms 3: [0080_admin_ut / 8.042s] DeleteGroups.queue_poll: duration 200.031ms 3: [0080_admin_ut / 8.042s] DeleteGroups: got DeleteGroupsResult in 200.031s 3: [0080_admin_ut / 8.042s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.042s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-80 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.042s] Using topic "rdkafkatest_rnd128d2256571774a1_do_test_DeleteGroups" 3: [0080_admin_ut / 8.042s] Using topic "rdkafkatest_rnd79f8670826fbd85f_do_test_DeleteGroups" 3: [0080_admin_ut / 8.042s] Using topic "rdkafkatest_rnd75b474d92e708a29_do_test_DeleteGroups" 3: [0080_admin_ut / 8.042s] Using topic "rdkafkatest_rnd7343c16e21add60c_do_test_DeleteGroups" 3: [0080_admin_ut / 8.042s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 8.042s] DeleteGroups: duration 0.013ms 3: [0080_admin_ut / 8.242s] DeleteGroups.queue_poll: duration 200.043ms 3: [0080_admin_ut / 8.242s] DeleteGroups: got DeleteGroupsResult in 200.043s 3: [0080_admin_ut / 8.242s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.242s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.242s] Using topic "rdkafkatest_rnd71f009384812ce6c_do_test_DeleteRecords" 3: 
[0080_admin_ut / 8.242s] Using topic "rdkafkatest_rnd14a3bfce2cd4f869_do_test_DeleteRecords" 3: [0080_admin_ut / 8.242s] Using topic "rdkafkatest_rnd99d357f1bd62ea0_do_test_DeleteRecords" 3: [0080_admin_ut / 8.242s] Using topic "rdkafkatest_rnd1d0cc817024cc318_do_test_DeleteRecords" 3: [0080_admin_ut / 8.242s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 8.242s] DeleteRecords: duration 0.029ms 3: [0034_offset_reset_mock / 12.273s] #1: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 12.273s] #1: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 12.273s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 8.342s] DeleteRecords.queue_poll: duration 100.030ms 3: [0080_admin_ut / 8.342s] DeleteRecords: got DeleteRecordsResult in 100.030s 3: [0080_admin_ut / 8.342s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.342s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.342s] Using topic "rdkafkatest_rnd28356928287db55d_do_test_DeleteRecords" 3: [0080_admin_ut / 8.342s] Using topic "rdkafkatest_rnd7dd44a8127337556_do_test_DeleteRecords" 3: [0080_admin_ut / 8.342s] Using topic "rdkafkatest_rnd1ff36fa45d76a88e_do_test_DeleteRecords" 3: [0080_admin_ut / 8.342s] Using topic "rdkafkatest_rnd1411d71b7dc6377a_do_test_DeleteRecords" 3: [0080_admin_ut / 8.342s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.343s] DeleteRecords: duration 0.024ms 3: [0105_transactions_mock / 5.421s] FLUSH: duration 996.584ms 3: [0105_transactions_mock / 5.421s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 5.421s] 0105_transactions_mock#producer-81: Flushing 1 messages 3: %3|1669457760.166|TXNERR|0105_transactions_mock#producer-81| [thrd:127.0.0.1:40225/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 5.421s] FLUSH: duration 0.215ms 3: %1|1669457760.167|TXNERR|0105_transactions_mock#producer-81| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch (_FENCED) 3: %0|1669457760.167|FATAL|0105_transactions_mock#producer-81| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.422s] 0105_transactions_mock#producer-81 Fatal error: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.422s] Ignoring allowed error: _FENCED: Fatal error: Local: This instance has been fenced by a newer instance: Producer 
fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.422s] 0105_transactions_mock#producer-81 rdkafka ignored FATAL error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.422s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.422s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.423s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH: PASS (1.01s) ] 3: [0105_transactions_mock / 5.423s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED ] 3: [0105_transactions_mock / 5.423s] Test config file test.conf not found 3: [0105_transactions_mock / 5.423s] Setting test timeout to 60s * 2.7 3: %5|1669457760.168|MOCK|0105_transactions_mock#producer-82| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:33265,127.0.0.1:44033,127.0.0.1:44853 3: [0105_transactions_mock / 5.424s] Created kafka instance 0105_transactions_mock#producer-82 3: [0105_transactions_mock / 5.430s] rd_kafka_init_transactions(rk, -1): duration 5.589ms 3: [0105_transactions_mock / 5.430s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock / 5.430s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.011ms 3: [0105_transactions_mock / 5.430s] 0105_transactions_mock#producer-82: Flushing 1 messages 3: [0080_admin_ut / 8.543s] DeleteRecords.queue_poll: duration 200.020ms 3: [0080_admin_ut / 8.543s] DeleteRecords: got DeleteRecordsResult in 200.020s 3: [0080_admin_ut / 8.543s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.543s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.543s] Using topic "rdkafkatest_rnd751ad4e567f220a3_do_test_DeleteRecords" 3: [0080_admin_ut / 8.543s] Using topic "rdkafkatest_rnd747854912b269d2e_do_test_DeleteRecords" 3: [0080_admin_ut / 8.543s] Using topic "rdkafkatest_rnd17b44da7498f60a3_do_test_DeleteRecords" 3: [0080_admin_ut / 8.543s] Using topic "rdkafkatest_rnd742d40742a416ffd_do_test_DeleteRecords" 3: [0080_admin_ut / 8.543s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.543s] DeleteRecords: duration 0.020ms 3: [0080_admin_ut / 8.743s] DeleteRecords.queue_poll: duration 200.024ms 3: [0080_admin_ut / 8.743s] DeleteRecords: got DeleteRecordsResult in 200.024s 3: [0080_admin_ut / 8.743s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-80 DeleteRecords with main queue, 
options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.743s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.743s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 8.743s] DeleteConsumerGroupOffsets: duration 0.008ms 3: [0034_offset_reset_mock / 12.776s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 12.776s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 12.776s] METADATA.WAIT: duration 502.406ms 3: [0103_transactions_local / 7.009s] init_transactions(): duration 7000.036ms 3: [0103_transactions_local / 7.009s] init_transactions() failed as expected: Failed to initialize Producer ID: Local: Timed out 3: [0103_transactions_local / 7.010s] [ do_test_txn_local:1168: PASS (7.01s) ] 3: [0103_transactions_local / 7.010s] 0103_transactions_local: duration 7009.681ms 3: [0103_transactions_local / 7.010s] ================= Test 0103_transactions_local PASSED ================= 3: [0080_admin_ut / 8.843s] DeleteConsumerGroupOffsets.queue_poll: duration 100.030ms 3: [0080_admin_ut / 8.843s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.030s 3: [0080_admin_ut / 8.843s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.843s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.843s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 8.843s] DeleteConsumerGroupOffsets: duration 0.009ms 3: [
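[editor's note] The DeleteConsumerGroupOffsets runs use the same queue/options pattern, but the request object takes a group id plus a partition list whose committed offsets should be removed. A hedged sketch, with placeholder group and topic names:

#include <librdkafka/rdkafka.h>

/* Sketch: request deletion of a group's committed offsets for one partition.
 * `rk` and `q` are an existing handle and result queue; result polling is
 * identical to the DeleteTopics example above. */
static void delete_group_offsets_example(rd_kafka_t *rk, rd_kafka_queue_t *q) {
        rd_kafka_topic_partition_list_t *parts =
                rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(parts, "mytopic", 0);

        rd_kafka_DeleteConsumerGroupOffsets_t *dcgo =
                rd_kafka_DeleteConsumerGroupOffsets_new("mygroup", parts);

        /* The API currently accepts exactly one request object per call;
         * the do_test_mix step further down provokes the error for more. */
        rd_kafka_DeleteConsumerGroupOffsets(rk, &dcgo, 1, NULL, q);

        rd_kafka_DeleteConsumerGroupOffsets_destroy(dcgo);
        rd_kafka_topic_partition_list_destroy(parts);
}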
/ 13.169s] Too many tests running (5 >= 5): postponing 0113_cooperative_rebalance_local start... 3: [0106_cgrp_sess_timeout / 0.000s] ================= Running test 0106_cgrp_sess_timeout ================= 3: [0106_cgrp_sess_timeout / 0.000s] ==== Stats written to file stats_0106_cgrp_sess_timeout_2352802345230116733.json ==== 3: [0106_cgrp_sess_timeout / 0.000s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.000s] [ do_test_session_timeout:152: Test session timeout with sync commit ] 3: %5|1669457760.588|CONFWARN|MOCK#producer-83| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 0.004s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.006s] Created kafka instance 0106_cgrp_sess_timeout#producer-84 3: [0106_cgrp_sess_timeout / 0.006s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.006s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 0.006s] SUM(POLL): duration 0.001ms 3: [0106_cgrp_sess_timeout / 0.006s] PRODUCE: duration 0.109ms 3: [0106_cgrp_sess_timeout / 0.047s] PRODUCE.DELIVERY.WAIT: duration 41.628ms 3: [0106_cgrp_sess_timeout / 0.048s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.048s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 0.048s] Created kafka instance 0106_cgrp_sess_timeout#consumer-85 3: [0106_cgrp_sess_timeout / 0.048s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0104_fetch_from_follower_mock/ 6.224s] CONSUME: duration 5000.064ms 3: [0104_fetch_from_follower_mock/ 6.224s] test_consumer_poll_no_msgs:4075: latest: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.224s] test_consumer_poll_no_msgs:4075: latest: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.224s] Closing consumer 0104_fetch_from_follower_mock#consumer-74 3: [0104_fetch_from_follower_mock/ 6.224s] CONSUMER.CLOSE: duration 0.051ms 3: [0104_fetch_from_follower_mock/ 6.225s] [ Test FFF auto.offset.reset=latest PASSED ] 3: [0104_fetch_from_follower_mock/ 6.225s] [ Test lagging FFF offset reset ] 3: %5|1669457760.724|CONFWARN|MOCK#producer-86| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 6.225s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.226s] Created kafka instance 0104_fetch_from_follower_mock#producer-87 3: [0104_fetch_from_follower_mock/ 6.226s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.226s] Produce to test [0]: messages #0..10 3: [0104_fetch_from_follower_mock/ 6.226s] SUM(POLL): duration 0.001ms 3: [0104_fetch_from_follower_mock/ 6.226s] PRODUCE: duration 0.023ms 3: [0080_admin_ut / 9.043s] DeleteConsumerGroupOffsets.queue_poll: duration 200.017ms 3: [0080_admin_ut / 9.043s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.017s 3: [0080_admin_ut / 9.043s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.043s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.043s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 9.043s] DeleteConsumerGroupOffsets: duration 
0.007ms 3: [0104_fetch_from_follower_mock/ 6.282s] PRODUCE.DELIVERY.WAIT: duration 56.895ms 3: [0104_fetch_from_follower_mock/ 6.287s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.288s] Created kafka instance 0104_fetch_from_follower_mock#consumer-88 3: [0104_fetch_from_follower_mock/ 6.288s] ASSIGN.PARTITIONS: duration 0.034ms 3: [0104_fetch_from_follower_mock/ 6.288s] lag: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 6.288s] up to wmark: consume 7 messages 3: [0104_fetch_from_follower_mock/ 6.390s] CONSUME: duration 101.937ms 3: [0104_fetch_from_follower_mock/ 6.390s] up to wmark: consumed 7/7 messages (0/0 EOFs) 3: [0104_fetch_from_follower_mock/ 6.390s] no msgs: not expecting any messages for 3000ms 3: [0080_admin_ut / 9.243s] DeleteConsumerGroupOffsets.queue_poll: duration 200.030ms 3: [0080_admin_ut / 9.243s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.030s 3: [0080_admin_ut / 9.243s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-80 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.243s] Using topic "rdkafkatest_rnd513d485c165b4a1d_do_test_AclBinding" 3: [0080_admin_ut / 9.243s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 9.243s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 9.243s] Using topic "rdkafkatest_rnd1c9631a6448109cb_do_test_AclBindingFilter" 3: [0080_admin_ut / 9.243s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 9.243s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 9.243s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.243s] Using topic "rdkafkatest_rnd3809202a0e863ade_do_test_CreateAcls" 3: [0080_admin_ut / 9.243s] Using topic "rdkafkatest_rndc93d8374cacdff8_do_test_CreateAcls" 3: [0080_admin_ut / 9.243s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 9.243s] CreateAcls: duration 0.011ms 3: [0080_admin_ut / 9.343s] CreateAcls.queue_poll: duration 100.028ms 3: [0080_admin_ut / 9.343s] CreateAcls: got CreateAclsResult in 100.028s 3: [0080_admin_ut / 9.343s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.343s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.343s] Using topic "rdkafkatest_rnd3b5b334716310db6_do_test_CreateAcls" 3: [0080_admin_ut / 9.343s] Using topic "rdkafkatest_rnd68830e985867fb5e_do_test_CreateAcls" 3: [0080_admin_ut / 9.343s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.343s] CreateAcls: duration 0.008ms 3: [0105_transactions_mock / 6.426s] FLUSH: duration 996.142ms 3: [0105_transactions_mock / 6.426s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.004ms 3: [0105_transactions_mock / 6.426s] 0105_transactions_mock#producer-82: Flushing 1 messages 3: %3|1669457761.170|TXNERR|0105_transactions_mock#producer-82| [thrd:127.0.0.1:44033/bootstrap]: Current transaction failed in state InTransaction: unknown producer id 
(UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 6.426s] FLUSH: duration 0.120ms 3: %1|1669457761.172|TXNERR|0105_transactions_mock#producer-82| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one (_FENCED) 3: %0|1669457761.172|FATAL|0105_transactions_mock#producer-82| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.427s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.427s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.428s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED: PASS (1.00s) ] 3: [0105_transactions_mock / 6.428s] [ do_test_txn_req_cnt:1071 ] 3: [0105_transactions_mock / 6.428s] Test config file test.conf not found 3: [0105_transactions_mock / 6.428s] Setting test timeout to 60s * 2.7 3: %5|1669457761.172|MOCK|0105_transactions_mock#producer-89| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37315,127.0.0.1:36087,127.0.0.1:40577 3: [0105_transactions_mock / 6.428s] Created kafka instance 0105_transactions_mock#producer-89 3: [0105_transactions_mock / 6.432s] rd_kafka_init_transactions(rk, 5000): duration 3.279ms 3: [0105_transactions_mock / 6.432s] rd_kafka_begin_transaction(rk): duration 0.016ms 3: [0080_admin_ut / 9.543s] CreateAcls.queue_poll: duration 200.032ms 3: [0080_admin_ut / 9.543s] CreateAcls: got CreateAclsResult in 200.032s 3: [0080_admin_ut / 9.543s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.543s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.543s] Using topic "rdkafkatest_rnd187dd0cf10b877c0_do_test_CreateAcls" 3: [0080_admin_ut / 9.543s] Using topic "rdkafkatest_rnde5b0bc16521b50_do_test_CreateAcls" 3: [0080_admin_ut / 9.543s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.543s] CreateAcls: duration 0.008ms 3: [0105_transactions_mock / 6.634s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 202.139ms 3: [0105_transactions_mock / 6.634s] rd_kafka_abort_transaction(rk, 5000): duration 0.211ms 3: [0105_transactions_mock / 6.635s] [ do_test_txn_req_cnt:1071: PASS (0.21s) ] 3: [0105_transactions_mock / 6.635s] [ do_test_txn_requires_abort_errors:1132 ] 3: [0105_transactions_mock / 6.635s] Test config file test.conf not found 3: [0105_transactions_mock / 6.635s] Setting test timeout to 60s * 2.7 3: %5|1669457761.379|MOCK|0105_transactions_mock#producer-90| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43827,127.0.0.1:33087,127.0.0.1:41369 3: 
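[editor's note] The 0105_transactions_mock entries show the error-handling contract the transactional producer tests verify: broker errors such as TOPIC_AUTHORIZATION_FAILED mark the current transaction abort-only, a bounded abort/commit can time out and be retried, and _FENCED is fatal and requires recreating the producer. A hedged sketch of that decision tree around commit (placeholder code, not the test's):

#include <librdkafka/rdkafka.h>
#include <stdio.h>
#include <stdlib.h>

/* Sketch: commit a transaction and react according to the error class,
 * mirroring the abortable/retriable/fatal split exercised above. */
static void commit_or_recover(rd_kafka_t *rk) {
        rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1);
        if (!error)
                return;                      /* committed */

        if (rd_kafka_error_txn_requires_abort(error)) {
                /* e.g. Broker: Topic authorization failed */
                fprintf(stderr, "abortable: %s\n", rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, -1);
        } else if (rd_kafka_error_is_retriable(error)) {
                /* e.g. a timeout from a bounded commit/abort: call it again */
                fprintf(stderr, "retriable: %s\n", rd_kafka_error_string(error));
        } else if (rd_kafka_error_is_fatal(error)) {
                /* e.g. _FENCED: this producer instance is unusable and
                 * must be destroyed and recreated. */
                fprintf(stderr, "fatal: %s\n", rd_kafka_error_string(error));
                exit(1);
        }
        if (error)
                rd_kafka_error_destroy(error);
}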
[0105_transactions_mock / 6.635s] Created kafka instance 0105_transactions_mock#producer-90 3: [0105_transactions_mock / 6.644s] rd_kafka_init_transactions(rk, 5000): duration 8.531ms 3: [0105_transactions_mock / 6.644s] rd_kafka_begin_transaction(rk): duration 0.012ms 3: [0105_transactions_mock / 6.644s] 1. Fail on produce 3: [0105_transactions_mock / 6.644s] 0105_transactions_mock#producer-90: Flushing 1 messages 3: [0080_admin_ut / 9.743s] CreateAcls.queue_poll: duration 200.182ms 3: [0080_admin_ut / 9.743s] CreateAcls: got CreateAclsResult in 200.182s 3: [0080_admin_ut / 9.743s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-80 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.743s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.743s] Using topic "rdkafkatest_rnd37ebed1620d92060_do_test_DescribeAcls" 3: [0080_admin_ut / 9.744s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 9.744s] DescribeAcls: duration 0.008ms 3: [0034_offset_reset_mock / 13.776s] #2: injecting TOPIC_AUTHORIZATION_FAILED, expecting TOPIC_AUTHORIZATION_FAILED 3: [0034_offset_reset_mock / 13.776s] ASSIGN.PARTITIONS: duration 0.232ms 3: [0034_offset_reset_mock / 13.776s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 13.819s] #2: injected TOPIC_AUTHORIZATION_FAILED, got error _AUTO_OFFSET_RESET: failed to query logical offset: Broker: Topic authorization failed (broker 1) 3: [0034_offset_reset_mock / 13.819s] Waiting for up to 5000ms for metadata update 3: [0034_offset_reset_mock / 13.819s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 13.819s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 13.819s] METADATA.WAIT: duration 0.133ms 3: [0080_admin_ut / 9.844s] DescribeAcls.queue_poll: duration 100.032ms 3: [0080_admin_ut / 9.844s] DescribeAcls: got DescribeAclsResult in 100.032s 3: [0080_admin_ut / 9.844s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.844s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.844s] Using topic "rdkafkatest_rnd73c8c3de4bfdc432_do_test_DescribeAcls" 3: [0080_admin_ut / 9.844s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 9.844s] DescribeAcls: duration 0.006ms 3: [0080_admin_ut / 10.044s] DescribeAcls.queue_poll: duration 200.030ms 3: [0080_admin_ut / 10.044s] DescribeAcls: got DescribeAclsResult in 200.030s 3: [0080_admin_ut / 10.044s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.044s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.044s] Using topic "rdkafkatest_rnd1e9f57da68e398c3_do_test_DescribeAcls" 3: [0080_admin_ut / 10.044s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 10.044s] DescribeAcls: duration 0.007ms 3: [0080_admin_ut / 10.244s] DescribeAcls.queue_poll: duration 200.030ms 3: [0080_admin_ut / 10.244s] DescribeAcls: got DescribeAclsResult in 200.030s 3: [0080_admin_ut / 10.244s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-80 DescribeAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.244s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 10.244s] Using topic "rdkafkatest_rnd33efe4d51317ac6b_do_test_DeleteAcls" 3: [0080_admin_ut / 10.244s] Using topic "rdkafkatest_rnd140a35f14ba4327c_do_test_DeleteAcls" 3: [0080_admin_ut / 10.244s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 10.244s] DeleteAcls: duration 0.009ms 3: [0080_admin_ut / 10.344s] DeleteAcls.queue_poll: duration 100.027ms 3: [0080_admin_ut / 10.344s] DeleteAcls: got DeleteAclsResult in 100.027s 3: [0080_admin_ut / 10.344s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 10.344s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.344s] Using topic "rdkafkatest_rnd5ca70d0e08377666_do_test_DeleteAcls" 3: [0080_admin_ut / 10.344s] Using topic "rdkafkatest_rnd75e5a2797d4de253_do_test_DeleteAcls" 3: [0080_admin_ut / 10.344s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.344s] DeleteAcls: duration 0.008ms 3: [0080_admin_ut / 10.544s] DeleteAcls.queue_poll: duration 200.030ms 3: [0080_admin_ut / 10.544s] DeleteAcls: got DeleteAclsResult in 200.030s 3: [0080_admin_ut / 10.544s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.544s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.544s] Using topic "rdkafkatest_rnd765d1de34722ead6_do_test_DeleteAcls" 3: [0080_admin_ut / 10.544s] Using topic "rdkafkatest_rnd13a92c7012f34f89_do_test_DeleteAcls" 3: [0080_admin_ut / 10.544s] Call 
DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.544s] DeleteAcls: duration 0.008ms 3: %3|1669457762.381|TXNERR|0105_transactions_mock#producer-90| [thrd:127.0.0.1:33087/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 1 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:634947000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457762.381|PARTCNT|0105_transactions_mock#producer-90| [thrd:127.0.0.1:33087/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 7.637s] FLUSH: duration 992.903ms 3: [0105_transactions_mock / 7.637s] Error TOPIC_AUTHORIZATION_FAILED: ProduceRequest for mytopic [1] with 1 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:634947000,Epoch:0}, base seq 0): current transaction must be aborted 3: [0105_transactions_mock / 7.637s] rd_kafka_abort_transaction(rk, -1): duration 0.720ms 3: [0105_transactions_mock / 7.638s] 2. Fail on AddPartitionsToTxn 3: [0105_transactions_mock / 7.638s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: %3|1669457762.383|ADDPARTS|0105_transactions_mock#producer-90| [thrd:main]: TxnCoordinator/1: Failed to add partition "mytopic" [1] to transaction: Broker: Topic authorization failed 3: %3|1669457762.383|TXNERR|0105_transactions_mock#producer-90| [thrd:main]: Current transaction failed in state BeginCommit: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) (TOPIC_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.639s] commit_transaction() error TOPIC_AUTHORIZATION_FAILED: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) 3: [0105_transactions_mock / 7.639s] rd_kafka_abort_transaction(rk, -1): duration 0.182ms 3: [0105_transactions_mock / 7.639s] 3. 
Fail on AddOffsetsToTxn 3: [0105_transactions_mock / 7.639s] rd_kafka_begin_transaction(rk): duration 0.086ms 3: %3|1669457762.385|ADDOFFSETS|0105_transactions_mock#producer-90| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed 3: %3|1669457762.385|TXNERR|0105_transactions_mock#producer-90| [thrd:main]: Current transaction failed in state InTransaction: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed (after 0ms) (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.641s] rd_kafka_abort_transaction(rk, -1): duration 0.078ms 3: [0105_transactions_mock / 7.642s] [ do_test_txn_requires_abort_errors:1132: PASS (1.01s) ] 3: [0105_transactions_mock / 7.642s] [ do_test_txn_slow_reinit:390: without sleep ] 3: [0105_transactions_mock / 7.642s] Test config file test.conf not found 3: [0105_transactions_mock / 7.642s] Setting test timeout to 60s * 2.7 3: %5|1669457762.387|MOCK|0105_transactions_mock#producer-91| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:33869,127.0.0.1:44999,127.0.0.1:44207 3: [0105_transactions_mock / 7.643s] Created kafka instance 0105_transactions_mock#producer-91 3: [0105_transactions_mock / 7.645s] rd_kafka_init_transactions(rk, -1): duration 1.404ms 3: [0105_transactions_mock / 7.645s] rd_kafka_begin_transaction(rk): duration 0.082ms 3: [0105_transactions_mock / 7.645s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0105_transactions_mock / 7.645s] 0105_transactions_mock#producer-91: Flushing 1 messages 3: [0080_admin_ut / 10.744s] DeleteAcls.queue_poll: duration 200.032ms 3: [0080_admin_ut / 10.744s] DeleteAcls: got DeleteAclsResult in 200.032s 3: [0080_admin_ut / 10.744s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-80 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.744s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 10.744s] Creating 2 topics 3: [0080_admin_ut / 10.744s] Deleting 1 topics 3: [0080_admin_ut / 10.744s] Creating 1 topics 3: [0080_admin_ut / 10.744s] Deleting 3 groups 3: [0080_admin_ut / 10.744s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 10.744s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 10.744s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 10.744s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 10.744s] Creating 2 topics 3: [0080_admin_ut / 10.744s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0034_offset_reset_mock / 14.819s] #3: injecting NO_ERROR, expecting _NO_OFFSET 3: [0034_offset_reset_mock / 14.819s] ASSIGN.PARTITIONS: duration 0.217ms 3: [0034_offset_reset_mock / 14.819s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 14.819s] #3: Ignoring Error event: Failed to query logical offset TAIL(10): Broker: Topic authorization failed 3: [0034_offset_reset_mock / 14.819s] #3: injected NO_ERROR, got error _AUTO_OFFSET_RESET: no previously committed offset 
available: Local: No offset stored 3: [0080_admin_ut / 10.857s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.857s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 10.857s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 10.857s] [ do_test_mix:1342: PASS (0.11s) ] 3: [0080_admin_ut / 10.857s] [ do_test_configs:1411 ] 3: [0034_offset_reset_mock / 14.855s] [ offset_reset_errors:201: PASS (14.85s) ] 3: [0034_offset_reset_mock / 14.855s] 0034_offset_reset_mock: duration 14854.956ms 3: [0034_offset_reset_mock / 14.855s] ================= Test 0034_offset_reset_mock PASSED ================= 3: [
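[editor's note] In the 0034_offset_reset_mock entries, failures while resolving the configured auto.offset.reset position reach the application as consumer errors with code RD_KAFKA_RESP_ERR__AUTO_OFFSET_RESET (after an injected TOPIC_AUTHORIZATION_FAILED, or when no committed offset exists). A minimal poll loop that separates these from real messages might look like the following sketch (not the test code):

#include <librdkafka/rdkafka.h>
#include <inttypes.h>
#include <stdio.h>

/* Sketch: consume and report auto.offset.reset failures separately. */
static void poll_once(rd_kafka_t *consumer) {
        rd_kafka_message_t *m = rd_kafka_consumer_poll(consumer, 1000);
        if (!m)
                return;                               /* nothing within 1s */

        if (m->err == RD_KAFKA_RESP_ERR_NO_ERROR) {
                printf("message at offset %" PRId64 "\n", m->offset);
        } else if (m->err == RD_KAFKA_RESP_ERR__AUTO_OFFSET_RESET) {
                /* Offset reset could not be performed (authorization error,
                 * no previously committed offset, ...): decide whether to
                 * reassign, seek explicitly, or give up. */
                fprintf(stderr, "offset reset failed: %s\n",
                        rd_kafka_message_errstr(m));
        } else if (m->err != RD_KAFKA_RESP_ERR__PARTITION_EOF) {
                fprintf(stderr, "consume error: %s\n", rd_kafka_message_errstr(m));
        }
        rd_kafka_message_destroy(m);
}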
/ 15.137s] Too many tests running (5 >= 5): postponing 0116_kafkaconsumer_close start... 3: [0113_cooperative_rebalance_local/ 0.000s] ================= Running test 0113_cooperative_rebalance_local ================= 3: [0113_cooperative_rebalance_local/ 0.000s] ==== Stats written to file stats_0113_cooperative_rebalance_local_838782929217932442.json ==== 3: [0113_cooperative_rebalance_local/ 0.000s] [ a_assign_rapid:674 ] 3: %5|1669457762.556|CONFWARN|MOCK#producer-92| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 0.004s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 8.645s] FLUSH: duration 1000.237ms 3: [0105_transactions_mock / 8.645s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.004ms 3: %3|1669457763.390|TXNERR|0105_transactions_mock#producer-91| [thrd:127.0.0.1:44207/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 8.646s] commit_transaction(-1): duration 1.088ms 3: [0105_transactions_mock / 8.646s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 8.746s] abort_transaction(100): duration 100.141ms 3: [0105_transactions_mock / 8.746s] First abort_transaction() failed: Transactional operation timed out 3: [0105_transactions_mock / 8.746s] Retrying abort 3: [0113_cooperative_rebalance_local/ 1.051s] Setting test timeout to 20s * 2.7 3: [0113_cooperative_rebalance_local/ 1.054s] pre-commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.054s] topic1[0] offset 11: Success 3: [0113_cooperative_rebalance_local/ 1.054s] topic2[0] offset 22: Success 3: [0106_cgrp_sess_timeout / 3.070s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.071s] ASSIGN.PARTITIONS: duration 0.343ms 3: [0106_cgrp_sess_timeout / 3.071s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.071s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 3.177s] CONSUME: duration 106.432ms 3: [0106_cgrp_sess_timeout / 3.177s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 3.177s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0104_fetch_from_follower_mock/ 9.390s] CONSUME: duration 3000.071ms 3: [0104_fetch_from_follower_mock/ 9.390s] test_consumer_poll_no_msgs:4075: no msgs: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.390s] test_consumer_poll_no_msgs:4075: no msgs: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.390s] remaining: consume 3 messages 3: [0104_fetch_from_follower_mock/ 9.402s] CONSUME: duration 11.991ms 3: [0104_fetch_from_follower_mock/ 9.402s] remaining: consumed 3/3 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 9.402s] Closing consumer 0104_fetch_from_follower_mock#consumer-88 3: [0104_fetch_from_follower_mock/ 9.402s] CONSUMER.CLOSE: duration 0.470ms 3: [0104_fetch_from_follower_mock/ 9.404s] [ Test lagging FFF offset reset PASSED ] 3: 
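[editor's note] The 0113_cooperative_rebalance_local entries ("incremental assign of 2 partition(s)", "incremental unassign of 1 partition(s)", and later "Partitions were lost") come from a rebalance callback that must use the incremental assign/unassign calls when the cooperative protocol is active, and that can detect whether a revocation is actually a loss. A hedged sketch of such a callback, assuming it is registered with rd_kafka_conf_set_rebalance_cb and a cooperative-sticky assignor:

#include <librdkafka/rdkafka.h>
#include <stdio.h>
#include <string.h>

/* Sketch: rebalance callback supporting both EAGER and COOPERATIVE protocols. */
static void rebalance_cb(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *parts, void *opaque) {
        rd_kafka_error_t *error = NULL;

        if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS) {
                if (!strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE"))
                        error = rd_kafka_incremental_assign(rk, parts);
                else
                        rd_kafka_assign(rk, parts);
        } else if (err == RD_KAFKA_RESP_ERR__REVOKE_PARTITIONS) {
                if (rd_kafka_assignment_lost(rk))
                        fprintf(stderr, "Partitions were lost\n");
                if (!strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE"))
                        error = rd_kafka_incremental_unassign(rk, parts);
                else
                        rd_kafka_assign(rk, NULL);
        }
        if (error) {
                fprintf(stderr, "incremental (un)assign failed: %s\n",
                        rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
        }
}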
[0104_fetch_from_follower_mock/ 9.404s] [ Test unknown follower ] 3: %5|1669457763.904|CONFWARN|MOCK#producer-95| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 9.405s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.405s] Created kafka instance 0104_fetch_from_follower_mock#producer-96 3: [0104_fetch_from_follower_mock/ 9.405s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.405s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 9.406s] SUM(POLL): duration 0.001ms 3: [0104_fetch_from_follower_mock/ 9.406s] PRODUCE: duration 1.392ms 3: [0104_fetch_from_follower_mock/ 9.454s] PRODUCE.DELIVERY.WAIT: duration 47.649ms 3: [0104_fetch_from_follower_mock/ 9.456s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.457s] Created kafka instance 0104_fetch_from_follower_mock#consumer-97 3: [0104_fetch_from_follower_mock/ 9.457s] ASSIGN.PARTITIONS: duration 0.035ms 3: [0104_fetch_from_follower_mock/ 9.457s] unknown follower: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 9.457s] unknown follower: not expecting any messages for 5000ms 3: %5|1669457764.072|FETCH|0104_fetch_from_follower_mock#consumer-97| [thrd:127.0.0.1:44365/bootstrap]: 127.0.0.1:44365/1: test [0]: preferred replica (19) is unknown: refreshing metadata 3: [0080_admin_ut / 12.857s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 12.857s] Test config file test.conf not found 3: %4|1669457764.556|CONFWARN|0080_admin_ut#consumer-98| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1669457764.556|CONFWARN|0080_admin_ut#consumer-98| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.858s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-98 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.858s] Using topic "rdkafkatest_rnd21798a681837ccd8_do_test_DeleteRecords" 3: [0080_admin_ut / 12.858s] Using topic "rdkafkatest_rnd185f2c925cd4bdaf_do_test_DeleteRecords" 3: [0080_admin_ut / 12.858s] Using topic "rdkafkatest_rnd2e68da8e00e23b2a_do_test_DeleteRecords" 3: [0080_admin_ut / 12.858s] Using topic "rdkafkatest_rnd353cb90e46e6ab5d_do_test_DeleteRecords" 3: [0080_admin_ut / 12.858s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 12.858s] DeleteRecords: duration 0.020ms 3: [0080_admin_ut / 12.858s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-98 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 12.862s] Test config file test.conf not found 3: %4|1669457764.560|CONFWARN|0080_admin_ut#consumer-99| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1669457764.560|CONFWARN|0080_admin_ut#consumer-99| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.862s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-99 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.862s] Using topic "rdkafkatest_rnd119ab2ea362269ca_do_test_DeleteGroups" 3: [0080_admin_ut / 12.862s] Using topic 
"rdkafkatest_rnd5d38c6ad4986a001_do_test_DeleteGroups" 3: [0080_admin_ut / 12.862s] Using topic "rdkafkatest_rnd56fb8a2a51018a8b_do_test_DeleteGroups" 3: [0080_admin_ut / 12.862s] Using topic "rdkafkatest_rnd15846433759ae204_do_test_DeleteGroups" 3: [0080_admin_ut / 12.862s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 12.862s] DeleteGroups: duration 0.016ms 3: [0080_admin_ut / 12.862s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 12.862s] 0080_admin_ut: duration 12862.337ms 3: [0080_admin_ut / 12.862s] ================= Test 0080_admin_ut PASSED ================= 3: [
/ 17.143s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0116_kafkaconsumer_close / 0.000s] ================= Running test 0116_kafkaconsumer_close ================= 3: [0116_kafkaconsumer_close / 0.000s] ==== Stats written to file stats_0116_kafkaconsumer_close_4171779453270378760.json ==== 3: [0116_kafkaconsumer_close / 0.000s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0 ] 3: %5|1669457764.565|CONFWARN|MOCK#producer-100| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 0.003s] Setting test timeout to 10s * 2.7 3: %5|1669457764.575|FETCH|0104_fetch_from_follower_mock#consumer-97| [thrd:127.0.0.1:44365/bootstrap]: 127.0.0.1:44365/1: test [0]: preferred replica (19) lease changing too quickly (0s < 60s): possibly due to unavailable replica or stale cluster state: backing off next fetch 3: [0113_cooperative_rebalance_local/ 2.060s] a_assign_rapid#consumer-94: incremental assign of 2 partition(s) 3: [0113_cooperative_rebalance_local/ 2.060s] incremental_assign(): 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 2.060s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 2.060s] topic2[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 2.060s] a_assign_rapid#consumer-94: incremental unassign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 2.060s] incremental_unassign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 2.060s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 2.060s] commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 2.060s] topic1[0] offset 55: Success 3: [0113_cooperative_rebalance_local/ 2.060s] topic2[0] offset 33: Success 3: [0113_cooperative_rebalance_local/ 2.060s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 2.061s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 2.061s] topic3[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 2.061s] Clearing rtt 3: [0113_cooperative_rebalance_local/ 2.061s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 2.061s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 2.061s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 4.064s] [ a_assign_rapid:674: PASS (4.06s) ] 3: [0113_cooperative_rebalance_local/ 4.064s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695 ] 3: %5|1669457766.620|CONFWARN|MOCK#producer-103| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 4.066s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 4.066s] Created kafka instance 0113_cooperative_rebalance_local#producer-104 3: [0113_cooperative_rebalance_local/ 4.066s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 4.066s] Produce to test [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 4.066s] SUM(POLL): duration 0.001ms 3: [0113_cooperative_rebalance_local/ 4.066s] PRODUCE: duration 0.118ms 3: [0113_cooperative_rebalance_local/ 4.138s] PRODUCE.DELIVERY.WAIT: duration 72.258ms 3: [0113_cooperative_rebalance_local/ 4.142s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 4.142s] Setting test timeout to 30s * 2.7 3: 
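[editor's note] Nearly every test in this run talks to librdkafka's built-in mock cluster; the "%5|MOCK| ... Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:..." lines are emitted when it is switched on. One way to enable it from a client, as far as I can tell from the public configuration, is the `test.mock.num.brokers` property; a hedged sketch:

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: create a producer against the built-in mock cluster
 * (3 brokers), without any real Kafka deployment. */
static rd_kafka_t *new_mock_producer(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Enables the mock cluster; any bootstrap.servers value is ignored. */
        if (rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                fprintf(stderr, "%s\n", errstr);
                rd_kafka_conf_destroy(conf);
                return NULL;
        }
        return rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
}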
[0113_cooperative_rebalance_local/ 4.146s] Created kafka instance 0113_cooperative_rebalance_local#consumer-105 3: [0113_cooperative_rebalance_local/ 4.158s] p_lost_partitions_heartbeat_illegal_generation_test:2720: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0104_fetch_from_follower_mock/ 14.457s] CONSUME: duration 5000.071ms 3: [0104_fetch_from_follower_mock/ 14.457s] test_consumer_poll_no_msgs:4075: unknown follower: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.457s] test_consumer_poll_no_msgs:4075: unknown follower: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.457s] proper follower: consume 1000 messages 3: [0116_kafkaconsumer_close / 5.031s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0: PASS (5.03s) ] 3: [0116_kafkaconsumer_close / 5.031s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0 ] 3: %5|1669457769.593|CONFWARN|MOCK#producer-106| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 5.031s] Setting test timeout to 10s * 2.7 3: [0104_fetch_from_follower_mock/ 15.098s] CONSUME: duration 641.576ms 3: [0104_fetch_from_follower_mock/ 15.098s] proper follower: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 15.098s] do_test_unknown_follower:223: broker_id: Verifying 1000 received messages (flags 0x80000): expecting msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.098s] do_test_unknown_follower:223: broker_id: Verification of 1000 received messages succeeded: expected msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.098s] Closing consumer 0104_fetch_from_follower_mock#consumer-97 3: [0104_fetch_from_follower_mock/ 15.107s] CONSUMER.CLOSE: duration 8.253ms 3: [0104_fetch_from_follower_mock/ 15.123s] [ Test unknown follower PASSED ] 3: [0104_fetch_from_follower_mock/ 15.123s] [ Test REPLICA_NOT_AVAIALBLE ] 3: %5|1669457769.626|CONFWARN|MOCK#producer-109| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 15.127s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.127s] Created kafka instance 0104_fetch_from_follower_mock#producer-110 3: [0104_fetch_from_follower_mock/ 15.127s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.127s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 15.129s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 15.129s] PRODUCE: duration 1.397ms 3: [0104_fetch_from_follower_mock/ 15.184s] PRODUCE.DELIVERY.WAIT: duration 55.599ms 3: [0104_fetch_from_follower_mock/ 15.192s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.193s] Created kafka instance 0104_fetch_from_follower_mock#consumer-111 3: [0104_fetch_from_follower_mock/ 15.193s] ASSIGN.PARTITIONS: duration 0.033ms 3: [0104_fetch_from_follower_mock/ 15.193s] REPLICA_NOT_AVAIALBLE: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 15.193s] Wait initial metadata: not expecting any messages for 2000ms 3: [0113_cooperative_rebalance_local/ 7.176s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 7.176s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 7.176s] test [1] offset -1001 3: 
[0113_cooperative_rebalance_local/ 7.176s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 7.176s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 7.176s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.256ms 3: [0113_cooperative_rebalance_local/ 7.176s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 7.277s] p_lost_partitions_heartbeat_illegal_generation_test:2732: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: %4|1669457770.663|SESSTMOUT|0106_cgrp_sess_timeout#consumer-85| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0113_cooperative_rebalance_local/ 8.176s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 8.176s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 8.176s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 8.176s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 8.176s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 8.176s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 8.176s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.051ms 3: [0113_cooperative_rebalance_local/ 8.176s] unassign: incremental unassign of 4 partition(s) done 3: [0106_cgrp_sess_timeout / 10.179s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 10.179s] Performing sync commit 3: [0113_cooperative_rebalance_local/ 8.277s] p_lost_partitions_heartbeat_illegal_generation_test:2737: Waiting for rejoin after lost (_ASSIGN_PARTITIONS) for 12s 3: [0104_fetch_from_follower_mock/ 17.193s] CONSUME: duration 2000.081ms 3: [0104_fetch_from_follower_mock/ 17.193s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.193s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.193s] Consume: consume 1000 messages 3: [0106_cgrp_sess_timeout / 11.180s] UNASSIGN.PARTITIONS: duration 0.188ms 3: [0106_cgrp_sess_timeout / 11.180s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 11.180s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: [0116_kafkaconsumer_close / 8.182s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0: PASS (3.15s) ] 3: [0116_kafkaconsumer_close / 8.182s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0 ] 3: %5|1669457772.745|CONFWARN|MOCK#producer-112| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 8.183s] Setting test timeout to 10s * 2.7 3: [0104_fetch_from_follower_mock/ 18.330s] CONSUME: duration 1136.788ms 3: [0104_fetch_from_follower_mock/ 18.330s] Consume: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 18.330s] Closing consumer 0104_fetch_from_follower_mock#consumer-111 3: [0104_fetch_from_follower_mock/ 18.340s] CONSUMER.CLOSE: duration 9.814ms 3: [0104_fetch_from_follower_mock/ 18.361s] [ Test REPLICA_NOT_AVAIALBLE PASSED ] 3: [0104_fetch_from_follower_mock/ 18.361s] 0104_fetch_from_follower_mock: duration 18360.558ms 3: [0104_fetch_from_follower_mock/ 18.361s] 
================= Test 0104_fetch_from_follower_mock PASSED ================= 3: [
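[editor's note] The 0104_fetch_from_follower_mock test that just passed exercises KIP-392 follower fetching: the broker designates a preferred read replica and the client either fetches from it or backs off (see the "preferred replica (19) is unknown: refreshing metadata" entry above). On the client side the relevant knob is the `client.rack` property; whether a follower is actually chosen is decided by the broker's replica selector. A hedged configuration sketch with a placeholder rack id:

#include <librdkafka/rdkafka.h>

/* Sketch: consumer configured with a rack id so brokers may direct
 * fetches to a nearby follower replica (KIP-392). */
static void set_rack(rd_kafka_conf_t *conf) {
        char errstr[512];
        /* Should match one of the broker.rack values in the cluster. */
        rd_kafka_conf_set(conf, "client.rack", "rack-a", errstr, sizeof(errstr));
}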
/ 25.443s] Too many tests running (5 >= 5): postponing 0120_asymmetric_subscription start... 3: [0117_mock_errors / 0.000s] ================= Running test 0117_mock_errors ================= 3: [0117_mock_errors / 0.000s] ==== Stats written to file stats_0117_mock_errors_626719906857965888.json ==== 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] [ do_test_producer_storage_error:53: ] 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1669457772.862|MOCK|0117_mock_errors#producer-115| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40207,127.0.0.1:35033,127.0.0.1:45595 3: [0117_mock_errors / 0.004s] Created kafka instance 0117_mock_errors#producer-115 3: [0117_mock_errors / 0.004s] 0117_mock_errors#producer-115: Flushing 1 messages 3: [0105_transactions_mock / 18.650s] rd_kafka_abort_transaction(rk, -1): duration 9904.152ms 3: [0105_transactions_mock / 18.651s] abort_transaction(-1): duration 9904.174ms 3: [0105_transactions_mock / 18.651s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 18.651s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 18.653s] rd_kafka_commit_transaction(rk, -1): duration 2.294ms 3: [0105_transactions_mock / 18.654s] [ do_test_txn_slow_reinit:390: without sleep: PASS (11.01s) ] 3: [0105_transactions_mock / 18.654s] [ do_test_txn_slow_reinit:390: with sleep ] 3: [0105_transactions_mock / 18.654s] Test config file test.conf not found 3: [0105_transactions_mock / 18.654s] Setting test timeout to 60s * 2.7 3: %5|1669457773.398|MOCK|0105_transactions_mock#producer-116| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44013,127.0.0.1:43395,127.0.0.1:38783 3: [0105_transactions_mock / 18.654s] Created kafka instance 0105_transactions_mock#producer-116 3: [0105_transactions_mock / 18.687s] rd_kafka_init_transactions(rk, -1): duration 29.904ms 3: [0105_transactions_mock / 18.687s] rd_kafka_begin_transaction(rk): duration 0.101ms 3: [0105_transactions_mock / 18.687s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.012ms 3: [0105_transactions_mock / 18.687s] 0105_transactions_mock#producer-116: Flushing 1 messages 3: [0117_mock_errors / 1.523s] FLUSH: duration 1518.694ms 3: [0117_mock_errors / 1.523s] [ do_test_producer_storage_error:53: : PASS (1.52s) ] 3: [0117_mock_errors / 1.523s] [ do_test_producer_storage_error:53: with too few retries ] 3: [0117_mock_errors / 1.523s] Test config file test.conf not found 3: [0117_mock_errors / 1.523s] Setting test timeout to 10s * 2.7 3: 
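[editor's note] The 0117_mock_errors producer runs ("Flushing 1 messages", "FLUSH: duration ...", and the "with too few retries" case) rely on per-message delivery reports to learn whether a message eventually succeeded after retries, with a deliberately low retry count letting the injected errors reach the application. A hedged sketch of that pattern; the topic name and retry count are illustrative:

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: per-message delivery reports plus a bounded retry count. */
static void dr_cb(rd_kafka_t *rk, const rd_kafka_message_t *m, void *opaque) {
        if (m->err)
                fprintf(stderr, "delivery failed: %s\n", rd_kafka_err2str(m->err));
}

static void produce_one(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "retries", "1", errstr, sizeof(errstr));
        rd_kafka_conf_set_dr_msg_cb(conf, dr_cb);

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk)
                return;

        rd_kafka_producev(rk,
                          RD_KAFKA_V_TOPIC("mytopic"),
                          RD_KAFKA_V_VALUE("hi", 2),
                          RD_KAFKA_V_END);

        rd_kafka_flush(rk, 10 * 1000);   /* serve dr_cb until queue drains */
        rd_kafka_destroy(rk);
}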
%5|1669457774.385|MOCK|0117_mock_errors#producer-117| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44217,127.0.0.1:45183,127.0.0.1:41241 3: [0117_mock_errors / 1.524s] Created kafka instance 0117_mock_errors#producer-117 3: [0117_mock_errors / 1.524s] 0117_mock_errors#producer-117: Flushing 1 messages 3: [0105_transactions_mock / 19.660s] FLUSH: duration 972.519ms 3: [0105_transactions_mock / 19.660s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.005ms 3: %3|1669457774.405|TXNERR|0105_transactions_mock#producer-116| [thrd:127.0.0.1:38783/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 19.661s] commit_transaction(-1): duration 1.193ms 3: [0105_transactions_mock / 19.661s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 19.761s] abort_transaction(100): duration 100.172ms 3: [0105_transactions_mock / 19.761s] First abort_transaction() failed: Transactional operation timed out 3: [0117_mock_errors / 2.029s] FLUSH: duration 505.152ms 3: [0117_mock_errors / 2.030s] [ do_test_producer_storage_error:53: with too few retries: PASS (0.51s) ] 3: [0117_mock_errors / 2.030s] [ do_test_offset_commit_error_during_rebalance:109 ] 3: [0117_mock_errors / 2.030s] Test config file test.conf not found 3: [0117_mock_errors / 2.030s] Setting test timeout to 60s * 2.7 3: %5|1669457774.892|CONFWARN|MOCK#producer-118| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 2.034s] Test config file test.conf not found 3: [0117_mock_errors / 2.034s] Created kafka instance 0117_mock_errors#producer-119 3: [0117_mock_errors / 2.034s] Test config file test.conf not found 3: [0117_mock_errors / 2.034s] Produce to test [-1]: messages #0..100 3: [0117_mock_errors / 2.034s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 2.034s] PRODUCE: duration 0.108ms 3: [0117_mock_errors / 2.089s] PRODUCE.DELIVERY.WAIT: duration 54.693ms 3: [0117_mock_errors / 2.090s] Created kafka instance 0117_mock_errors#consumer-120 3: [0117_mock_errors / 2.095s] Created kafka instance 0117_mock_errors#consumer-121 3: [0117_mock_errors / 2.102s] C1.PRE: consume 1 messages 3: [0106_cgrp_sess_timeout / 15.089s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.089s] ASSIGN.PARTITIONS: duration 0.089ms 3: [0106_cgrp_sess_timeout / 15.089s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.089s] Closing consumer 0106_cgrp_sess_timeout#consumer-85 3: [0106_cgrp_sess_timeout / 15.089s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.089s] Performing sync commit 3: [0113_cooperative_rebalance_local/ 13.181s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 13.181s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [3] 
offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.100ms 3: [0113_cooperative_rebalance_local/ 13.181s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 13.181s] Closing consumer 3: [0113_cooperative_rebalance_local/ 13.181s] Closing consumer 0113_cooperative_rebalance_local#consumer-105 3: [0113_cooperative_rebalance_local/ 13.181s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 13.181s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 13.181s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.052ms 3: [0113_cooperative_rebalance_local/ 13.181s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 13.181s] CONSUMER.CLOSE: duration 0.275ms 3: [0113_cooperative_rebalance_local/ 13.181s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 13.182s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 13.182s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695: PASS (9.12s) ] 3: [0113_cooperative_rebalance_local/ 13.182s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0 ] 3: %5|1669457775.738|CONFWARN|MOCK#producer-122| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 13.187s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 13.189s] Created kafka instance 0113_cooperative_rebalance_local#producer-123 3: [0113_cooperative_rebalance_local/ 13.189s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 13.189s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 13.190s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 13.190s] PRODUCE: duration 0.118ms 3: [0113_cooperative_rebalance_local/ 13.255s] PRODUCE.DELIVERY.WAIT: duration 65.817ms 3: [0113_cooperative_rebalance_local/ 13.259s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 13.263s] Created kafka instance 0113_cooperative_rebalance_local#producer-124 3: [0113_cooperative_rebalance_local/ 13.263s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 13.263s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 13.263s] SUM(POLL): duration 0.001ms 3: [0113_cooperative_rebalance_local/ 13.263s] PRODUCE: duration 0.118ms 3: [0113_cooperative_rebalance_local/ 13.321s] PRODUCE.DELIVERY.WAIT: duration 57.867ms 3: [0113_cooperative_rebalance_local/ 13.322s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 13.322s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 13.322s] Created kafka instance 0113_cooperative_rebalance_local#consumer-125 3: [0113_cooperative_rebalance_local/ 13.322s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0106_cgrp_sess_timeout / 16.089s] UNASSIGN.PARTITIONS: duration 0.030ms 3: [0106_cgrp_sess_timeout / 16.089s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 16.089s] CONSUMER.CLOSE: duration 1000.573ms 3: [0106_cgrp_sess_timeout / 16.090s] [ do_test_session_timeout:152: Test session timeout with sync commit: PASS (16.09s) 
] 3: [0106_cgrp_sess_timeout / 16.090s] [ do_test_session_timeout:152: Test session timeout with async commit ] 3: %5|1669457776.678|CONFWARN|MOCK#producer-126| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 16.094s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.095s] Created kafka instance 0106_cgrp_sess_timeout#producer-127 3: [0106_cgrp_sess_timeout / 16.095s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.095s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 16.095s] SUM(POLL): duration 0.001ms 3: [0106_cgrp_sess_timeout / 16.095s] PRODUCE: duration 0.114ms 3: [0106_cgrp_sess_timeout / 16.146s] PRODUCE.DELIVERY.WAIT: duration 50.781ms 3: [0106_cgrp_sess_timeout / 16.154s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.154s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 16.162s] Created kafka instance 0106_cgrp_sess_timeout#consumer-128 3: [0106_cgrp_sess_timeout / 16.166s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0116_kafkaconsumer_close / 13.236s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0: PASS (5.05s) ] 3: [0116_kafkaconsumer_close / 13.236s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0 ] 3: %5|1669457777.798|CONFWARN|MOCK#producer-129| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 13.236s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 5.120s] 0117_mock_errors#consumer-120: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.121s] ASSIGN.PARTITIONS: duration 0.236ms 3: [0117_mock_errors / 5.121s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.221s] CONSUME: duration 3119.249ms 3: [0117_mock_errors / 5.221s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.221s] C2.PRE: consume 1 messages 3: [0117_mock_errors / 5.221s] 0117_mock_errors#consumer-121: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.221s] ASSIGN.PARTITIONS: duration 0.091ms 3: [0117_mock_errors / 5.221s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.322s] CONSUME: duration 100.957ms 3: [0117_mock_errors / 5.322s] C2.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.322s] Closing consumer 0117_mock_errors#consumer-121 3: [0117_mock_errors / 5.322s] 0117_mock_errors#consumer-121: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.322s] UNASSIGN.PARTITIONS: duration 0.038ms 3: [0117_mock_errors / 5.322s] unassign: unassigned current partitions 3: [0117_mock_errors / 5.323s] CONSUMER.CLOSE: duration 0.426ms 3: [0117_mock_errors / 5.323s] Committing (should fail) 3: [0117_mock_errors / 5.323s] Commit returned REBALANCE_IN_PROGRESS 3: [0117_mock_errors / 5.323s] C1.PRE: consume 100 messages 3: [0113_cooperative_rebalance_local/ 16.353s] Rebalance #5: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 16.353s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 16.353s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 16.353s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 16.353s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 16.353s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.074ms 3: [0113_cooperative_rebalance_local/ 16.353s] 
assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 16.454s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0106_cgrp_sess_timeout / 19.190s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.190s] ASSIGN.PARTITIONS: duration 0.113ms 3: [0106_cgrp_sess_timeout / 19.190s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.190s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 19.291s] CONSUME: duration 100.865ms 3: [0106_cgrp_sess_timeout / 19.291s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 19.291s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0116_kafkaconsumer_close / 16.388s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0: PASS (3.15s) ] 3: [0116_kafkaconsumer_close / 16.388s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0 ] 3: %5|1669457780.950|CONFWARN|MOCK#producer-132| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 16.388s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 8.127s] 0117_mock_errors#consumer-120: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 8.127s] UNASSIGN.PARTITIONS: duration 0.120ms 3: [0117_mock_errors / 8.127s] unassign: unassigned current partitions 3: [
/ 35.457s] Too many tests running (5 >= 5): postponing 0120_asymmetric_subscription start... 3: [0117_mock_errors / 10.324s] 0117_mock_errors#consumer-120: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 10.324s] ASSIGN.PARTITIONS: duration 0.115ms 3: [0117_mock_errors / 10.324s] assign: assigned 4 partition(s) 3: [0117_mock_errors / 10.425s] CONSUME: duration 5101.896ms 3: [0117_mock_errors / 10.425s] C1.PRE: consumed 100/100 messages (0/-1 EOFs) 3: [0117_mock_errors / 10.425s] 0117_mock_errors#consumer-120: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 10.425s] UNASSIGN.PARTITIONS: duration 0.082ms 3: [0117_mock_errors / 10.425s] unassign: unassigned current partitions 3: [0117_mock_errors / 10.426s] [ do_test_offset_commit_error_during_rebalance:109: PASS (8.40s) ] 3: [0117_mock_errors / 10.426s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true ] 3: [0117_mock_errors / 10.426s] Test config file test.conf not found 3: [0117_mock_errors / 10.426s] Setting test timeout to 60s * 2.7 3: %5|1669457783.288|CONFWARN|MOCK#producer-135| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 10.427s] Test config file test.conf not found 3: [0117_mock_errors / 10.427s] Created kafka instance 0117_mock_errors#producer-136 3: [0117_mock_errors / 10.427s] Test config file test.conf not found 3: [0117_mock_errors / 10.427s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 10.427s] SUM(POLL): duration 0.001ms 3: [0117_mock_errors / 10.427s] PRODUCE: duration 0.009ms 3: [0117_mock_errors / 10.431s] PRODUCE.DELIVERY.WAIT: duration 4.063ms 3: [0117_mock_errors / 10.435s] Created kafka instance 0117_mock_errors#consumer-137 3: [0117_mock_errors / 10.439s] C1.PRE: consume 1 messages 3: [0113_cooperative_rebalance_local/ 21.959s] Rebalance #6: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 21.959s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 21.959s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 21.959s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 21.959s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 21.959s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 21.959s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.043ms 3: [0113_cooperative_rebalance_local/ 21.959s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 22.456s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0116_kafkaconsumer_close / 21.440s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0: PASS (5.05s) ] 3: [0116_kafkaconsumer_close / 21.440s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0 ] 3: %5|1669457786.003|CONFWARN|MOCK#producer-138| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 21.441s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 13.561s] CONSUME: duration 3121.981ms 3: [0117_mock_errors / 13.561s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 13.561s] Closing consumer 0117_mock_errors#consumer-137 3: [0105_transactions_mock / 31.762s] Retrying abort 3: [0105_transactions_mock / 31.762s] rd_kafka_abort_transaction(rk, -1): duration 0.172ms 3: [0105_transactions_mock / 31.762s] 
abort_transaction(-1): duration 0.180ms 3: [0105_transactions_mock / 31.762s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: [0105_transactions_mock / 31.762s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 31.764s] rd_kafka_commit_transaction(rk, -1): duration 2.466ms 3: [0105_transactions_mock / 31.765s] [ do_test_txn_slow_reinit:390: with sleep: PASS (13.11s) ] 3: [0105_transactions_mock / 31.765s] [ do_test_txn_endtxn_errors:705 ] 3: [0105_transactions_mock / 31.765s] Testing scenario #0 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 31.765s] Test config file test.conf not found 3: [0105_transactions_mock / 31.765s] Setting test timeout to 60s * 2.7 3: %5|1669457786.510|MOCK|0105_transactions_mock#producer-141| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44941,127.0.0.1:42879,127.0.0.1:45577 3: [0105_transactions_mock / 31.766s] Created kafka instance 0105_transactions_mock#producer-141 3: [0105_transactions_mock / 31.800s] rd_kafka_init_transactions(rk, 5000): duration 34.578ms 3: [0105_transactions_mock / 31.801s] rd_kafka_begin_transaction(rk): duration 0.240ms 3: [0105_transactions_mock / 31.820s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 19.707ms 3: [0117_mock_errors / 13.763s] CONSUMER.CLOSE: duration 202.066ms 3: [0117_mock_errors / 13.768s] Created kafka instance 0117_mock_errors#consumer-142 3: [0117_mock_errors / 13.801s] rd_kafka_committed(c2, partitions, 10 * 1000): duration 32.980ms 3: [0117_mock_errors / 13.821s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true: PASS (3.39s) ] 3: [0117_mock_errors / 13.821s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false ] 3: [0117_mock_errors / 13.821s] Test config file test.conf not found 3: [0117_mock_errors / 13.821s] Setting test timeout to 60s * 2.7 3: %5|1669457786.683|CONFWARN|MOCK#producer-143| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 13.825s] Test config file test.conf not found 3: [0117_mock_errors / 13.829s] Created kafka instance 0117_mock_errors#producer-144 3: [0117_mock_errors / 13.829s] Test config file test.conf not found 3: [0117_mock_errors / 13.829s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 13.829s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 13.829s] PRODUCE: duration 0.010ms 3: [0117_mock_errors / 13.833s] PRODUCE.DELIVERY.WAIT: duration 4.025ms 3: [0117_mock_errors / 13.839s] Created kafka instance 0117_mock_errors#consumer-145 3: [0117_mock_errors / 13.839s] C1.PRE: consume 1 messages 3: %4|1669457786.778|SESSTMOUT|0106_cgrp_sess_timeout#consumer-128| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0106_cgrp_sess_timeout / 26.291s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 26.291s] 
Performing async commit 3: [0105_transactions_mock / 32.872s] commit: duration 1051.868ms 3: [0105_transactions_mock / 32.872s] Scenario #0 commit succeeded 3: [0105_transactions_mock / 32.872s] Testing scenario #0 commit&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 32.872s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 32.872s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 32.874s] FLUSH: duration 1.300ms 3: [0105_transactions_mock / 32.875s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.954ms 3: [0106_cgrp_sess_timeout / 27.292s] UNASSIGN.PARTITIONS: duration 0.791ms 3: [0106_cgrp_sess_timeout / 27.292s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 27.292s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: %4|1669457787.880|COMMITFAIL|0106_cgrp_sess_timeout#consumer-128| [thrd:main]: Offset commit (manual) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [0105_transactions_mock / 33.873s] commit&flush: duration 998.576ms 3: [0105_transactions_mock / 33.873s] Scenario #0 commit&flush succeeded 3: [0105_transactions_mock / 33.873s] Testing scenario #0 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 33.873s] rd_kafka_begin_transaction(rk): duration 0.067ms 3: [0105_transactions_mock / 33.875s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.165ms 3: [0113_cooperative_rebalance_local/ 26.962s] Rebalance #7: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.962s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.962s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.079ms 3: [0113_cooperative_rebalance_local/ 26.962s] assign: incremental assign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 26.962s] Closing consumer 3: [0113_cooperative_rebalance_local/ 26.962s] Closing consumer 0113_cooperative_rebalance_local#consumer-125 3: [0113_cooperative_rebalance_local/ 26.964s] Rebalance #8: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.964s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.964s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.049ms 3: [0113_cooperative_rebalance_local/ 26.964s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 26.964s] CONSUMER.CLOSE: duration 1.281ms 3: 
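The 0113_cooperative_rebalance_local entries above run under the cooperative-sticky assignor, so each rebalance delivers only the partitions being added or removed and the application must answer with the incremental calls; that is what the INCREMENTAL.ASSIGN.PARTITIONS / INCREMENTAL.UNASSIGN.PARTITIONS and "Partitions were lost" lines reflect. A minimal sketch of such a rebalance callback, written for illustration and not taken from the test suite (the name on_rebalance is invented), installed with rd_kafka_conf_set_rebalance_cb() before the consumer is created:

#include <stdio.h>
#include <string.h>
#include <librdkafka/rdkafka.h>

/* Hypothetical rebalance callback: with the cooperative protocol the
 * callback gets only the partition delta and must use the incremental
 * calls; with the classic eager protocol it falls back to full assign. */
static void on_rebalance(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *parts, void *opaque) {
        rd_kafka_error_t *error = NULL;
        (void)opaque;

        if (!strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE")) {
                if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS)
                        error = rd_kafka_incremental_assign(rk, parts);
                else    /* _REVOKE_PARTITIONS; rd_kafka_assignment_lost(rk)
                         * distinguishes a lost assignment ("Partitions were
                         * lost" above) from a normal revoke. */
                        error = rd_kafka_incremental_unassign(rk, parts);
        } else {
                if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS)
                        rd_kafka_assign(rk, parts);
                else
                        rd_kafka_assign(rk, NULL); /* eager: drop everything */
        }

        if (error) {
                fprintf(stderr, "rebalance: %s\n", rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
        }
}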
[0113_cooperative_rebalance_local/ 26.964s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 26.964s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 26.965s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0: PASS (13.78s) ] 3: [0113_cooperative_rebalance_local/ 26.965s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1 ] 3: %5|1669457789.520|CONFWARN|MOCK#producer-146| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 26.968s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.969s] Created kafka instance 0113_cooperative_rebalance_local#producer-147 3: [0113_cooperative_rebalance_local/ 26.969s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.969s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 26.970s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 26.970s] PRODUCE: duration 0.127ms 3: [0113_cooperative_rebalance_local/ 27.018s] PRODUCE.DELIVERY.WAIT: duration 48.846ms 3: [0113_cooperative_rebalance_local/ 27.023s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 27.027s] Created kafka instance 0113_cooperative_rebalance_local#producer-148 3: [0113_cooperative_rebalance_local/ 27.027s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 27.027s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 27.028s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 27.028s] PRODUCE: duration 0.127ms 3: [0105_transactions_mock / 34.886s] abort: duration 1010.765ms 3: [0105_transactions_mock / 34.886s] Scenario #0 abort succeeded 3: [0105_transactions_mock / 34.886s] Testing scenario #0 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0113_cooperative_rebalance_local/ 27.075s] PRODUCE.DELIVERY.WAIT: duration 47.847ms 3: [0113_cooperative_rebalance_local/ 27.076s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 27.076s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 27.077s] Created kafka instance 0113_cooperative_rebalance_local#consumer-149 3: [0113_cooperative_rebalance_local/ 27.077s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 34.889s] rd_kafka_begin_transaction(rk): duration 3.016ms 3: [0105_transactions_mock / 34.889s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 34.900s] FLUSH: duration 10.932ms 3: [0105_transactions_mock / 34.901s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.168ms 3: [0116_kafkaconsumer_close / 25.118s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0: PASS (3.68s) ] 3: [0116_kafkaconsumer_close / 25.118s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0 ] 3: %5|1669457789.684|CONFWARN|MOCK#producer-150| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 25.122s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 16.993s] CONSUME: duration 3153.538ms 3: [0117_mock_errors / 16.993s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 17.194s] rd_kafka_commit(c1, ((void *)0), 0 ): duration 201.411ms 3: 
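The 0117_mock_errors entries just above commit the current offsets synchronously with rd_kafka_commit() (the ~201 ms duration covers the retried request) and then read them back with rd_kafka_committed() to check that the commit survived the injected timeout. A stripped-down version of that verify step, with invented variable names and the topic/partition hard-coded to test [0]:

#include <stdio.h>
#include <inttypes.h>
#include <librdkafka/rdkafka.h>

/* Commit the consumer's current offsets synchronously, then read the
 * committed offset of test [0] back, roughly what the 0117 check does. */
static void commit_and_verify(rd_kafka_t *consumer) {
        rd_kafka_resp_err_t err;
        rd_kafka_topic_partition_list_t *parts;

        err = rd_kafka_commit(consumer, NULL, 0 /* 0 = synchronous */);
        if (err)
                fprintf(stderr, "commit failed: %s\n", rd_kafka_err2str(err));

        parts = rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(parts, "test", 0);
        err = rd_kafka_committed(consumer, parts, 10 * 1000 /* ms */);
        if (!err)
                printf("test [0] committed offset %" PRId64 "\n",
                       parts->elems[0].offset);
        rd_kafka_topic_partition_list_destroy(parts);
}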
[0117_mock_errors / 17.194s] Closing consumer 0117_mock_errors#consumer-145 3: [0117_mock_errors / 17.194s] CONSUMER.CLOSE: duration 0.272ms 3: [0117_mock_errors / 17.195s] Created kafka instance 0117_mock_errors#consumer-153 3: [0117_mock_errors / 17.216s] rd_kafka_committed(c2, partitions, 10 * 1000): duration 20.777ms 3: [0117_mock_errors / 17.225s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false: PASS (3.40s) ] 3: [0117_mock_errors / 17.225s] 0117_mock_errors: duration 17225.144ms 3: [0117_mock_errors / 17.225s] ================= Test 0117_mock_errors PASSED ================= 3: [
/ 42.669s] Too many tests running (5 >= 5): postponing 0121_clusterid start... 3: [0120_asymmetric_subscription/ 0.000s] ================= Running test 0120_asymmetric_subscription ================= 3: [0120_asymmetric_subscription/ 0.000s] ==== Stats written to file stats_0120_asymmetric_subscription_1520100687098780542.json ==== 3: [0120_asymmetric_subscription/ 0.000s] Test config file test.conf not found 3: %5|1669457790.087|CONFWARN|MOCK#producer-154| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0120_asymmetric_subscription/ 0.019s] [ do_test_asymmetric:71: roundrobin assignor ] 3: [0120_asymmetric_subscription/ 0.019s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 0.019s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 0.020s] Created kafka instance c0#consumer-155 3: [0120_asymmetric_subscription/ 0.025s] rd_kafka_subscribe(c[i], tlist): duration 4.782ms 3: [0120_asymmetric_subscription/ 0.025s] Created kafka instance c1#consumer-156 3: [0120_asymmetric_subscription/ 0.025s] rd_kafka_subscribe(c[i], tlist): duration 0.017ms 3: [0120_asymmetric_subscription/ 0.026s] Created kafka instance c2#consumer-157 3: [0120_asymmetric_subscription/ 0.026s] rd_kafka_subscribe(c[i], tlist): duration 0.021ms 3: [0105_transactions_mock / 35.977s] abort&flush: duration 1075.983ms 3: [0105_transactions_mock / 35.978s] Scenario #0 abort&flush succeeded 3: [0105_transactions_mock / 35.978s] Testing scenario #1 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 35.978s] rd_kafka_begin_transaction(rk): duration 0.050ms 3: [0105_transactions_mock / 35.980s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.204ms 3: [0106_cgrp_sess_timeout / 31.204s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.204s] ASSIGN.PARTITIONS: duration 0.122ms 3: [0106_cgrp_sess_timeout / 31.204s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.204s] Closing consumer 0106_cgrp_sess_timeout#consumer-128 3: [0106_cgrp_sess_timeout / 31.204s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.204s] Performing async commit 3: [0105_transactions_mock / 37.184s] commit: duration 1204.685ms 3: [0105_transactions_mock / 37.185s] Scenario #1 commit succeeded 3: [0105_transactions_mock / 37.185s] Testing scenario #1 commit&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 37.185s] rd_kafka_begin_transaction(rk): duration 0.041ms 3: [0105_transactions_mock / 37.185s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 37.187s] FLUSH: duration 2.173ms 3: [0105_transactions_mock / 37.187s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.145ms 3: [0113_cooperative_rebalance_local/ 30.087s] Rebalance #9: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 30.087s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 30.087s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 30.087s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 30.087s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 30.087s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.090ms 3: [0113_cooperative_rebalance_local/ 30.087s] assign: incremental assign of 4 partition(s) done 3: [0106_cgrp_sess_timeout / 32.204s] UNASSIGN.PARTITIONS: duration 0.050ms 3: [0106_cgrp_sess_timeout / 32.204s] 
unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 32.204s] CONSUMER.CLOSE: duration 1000.511ms 3: [0106_cgrp_sess_timeout / 32.205s] [ do_test_session_timeout:152: Test session timeout with async commit: PASS (16.11s) ] 3: [0106_cgrp_sess_timeout / 32.205s] [ do_test_session_timeout:152: Test session timeout with auto commit ] 3: %5|1669457792.793|CONFWARN|MOCK#producer-158| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 32.210s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.211s] Created kafka instance 0106_cgrp_sess_timeout#producer-159 3: [0106_cgrp_sess_timeout / 32.211s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.211s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 32.211s] SUM(POLL): duration 0.001ms 3: [0106_cgrp_sess_timeout / 32.211s] PRODUCE: duration 0.121ms 3: [0106_cgrp_sess_timeout / 32.264s] PRODUCE.DELIVERY.WAIT: duration 52.770ms 3: [0106_cgrp_sess_timeout / 32.268s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.268s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 32.271s] Created kafka instance 0106_cgrp_sess_timeout#consumer-160 3: [0106_cgrp_sess_timeout / 32.271s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 38.392s] commit&flush: duration 1204.579ms 3: [0105_transactions_mock / 38.392s] Scenario #1 commit&flush succeeded 3: [0105_transactions_mock / 38.392s] Testing scenario #1 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 38.392s] rd_kafka_begin_transaction(rk): duration 0.232ms 3: [0105_transactions_mock / 38.394s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.209ms 3: [0113_cooperative_rebalance_local/ 30.691s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 30.691s] Rebalance #10: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 30.691s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 30.691s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 30.691s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 30.691s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 30.691s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 30.691s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.084ms 3: [0113_cooperative_rebalance_local/ 30.691s] unassign: incremental unassign of 4 partition(s) done 3: [0120_asymmetric_subscription/ 4.033s] c0#consumer-155: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[1], t2[3] 3: [0120_asymmetric_subscription/ 4.033s] c1#consumer-156: Assignment (6 partition(s)): t2[0], t2[2], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 4.033s] c2#consumer-157: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 4.033s] rd_kafka_assignment(c[i], &assignment): duration 0.051ms 3: [0120_asymmetric_subscription/ 4.033s] rd_kafka_assignment(c[i], &assignment): duration 0.052ms 3: [0120_asymmetric_subscription/ 4.034s] rd_kafka_assignment(c[i], &assignment): duration 0.622ms 3: [0120_asymmetric_subscription/ 4.034s] Closing consumer c0#consumer-155 3: [0120_asymmetric_subscription/ 4.034s] CONSUMER.CLOSE: duration 0.325ms 3: [0120_asymmetric_subscription/ 4.036s] Closing consumer c2#consumer-157 3: [0120_asymmetric_subscription/ 
4.036s] CONSUMER.CLOSE: duration 0.277ms 3: [0120_asymmetric_subscription/ 4.037s] [ do_test_asymmetric:71: roundrobin assignor: PASS (4.02s) ] 3: [0120_asymmetric_subscription/ 4.037s] [ do_test_asymmetric:71: range assignor ] 3: [0120_asymmetric_subscription/ 4.037s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 4.037s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 4.037s] Created kafka instance c0#consumer-161 3: [0120_asymmetric_subscription/ 4.037s] rd_kafka_subscribe(c[i], tlist): duration 0.107ms 3: [0120_asymmetric_subscription/ 4.038s] Created kafka instance c1#consumer-162 3: [0120_asymmetric_subscription/ 4.038s] rd_kafka_subscribe(c[i], tlist): duration 0.085ms 3: [0120_asymmetric_subscription/ 4.038s] Created kafka instance c2#consumer-163 3: [0120_asymmetric_subscription/ 4.038s] rd_kafka_subscribe(c[i], tlist): duration 0.074ms 3: [0113_cooperative_rebalance_local/ 31.691s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0105_transactions_mock / 39.604s] abort: duration 1209.486ms 3: [0105_transactions_mock / 39.604s] Scenario #1 abort succeeded 3: [0105_transactions_mock / 39.604s] Testing scenario #1 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 39.604s] rd_kafka_begin_transaction(rk): duration 0.059ms 3: [0105_transactions_mock / 39.604s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 39.606s] FLUSH: duration 2.172ms 3: [0105_transactions_mock / 39.606s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.148ms 3: [0116_kafkaconsumer_close / 30.161s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 30.161s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0 ] 3: %5|1669457794.724|CONFWARN|MOCK#producer-164| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 30.162s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 40.605s] abort&flush: duration 999.168ms 3: [0105_transactions_mock / 40.605s] Scenario #1 abort&flush succeeded 3: [0105_transactions_mock / 40.605s] Testing scenario #2 commit with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.605s] rd_kafka_begin_transaction(rk): duration 0.070ms 3: [0105_transactions_mock / 40.607s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.206ms 3: [0105_transactions_mock / 40.708s] commit: duration 100.973ms 3: [0105_transactions_mock / 40.708s] Scenario #2 commit succeeded 3: [0105_transactions_mock / 40.709s] Testing scenario #2 commit&flush with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.709s] rd_kafka_begin_transaction(rk): duration 0.060ms 3: [0105_transactions_mock / 40.709s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 40.711s] FLUSH: duration 2.173ms 3: [0105_transactions_mock / 40.711s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.154ms 3: [0105_transactions_mock / 40.812s] commit&flush: duration 100.566ms 3: [0105_transactions_mock / 40.812s] Scenario #2 commit&flush succeeded 3: [0105_transactions_mock / 40.812s] Testing scenario #2 abort with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.812s] 
rd_kafka_begin_transaction(rk): duration 0.056ms 3: [0105_transactions_mock / 40.814s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.205ms 3: [0105_transactions_mock / 40.915s] abort: duration 100.669ms 3: [0105_transactions_mock / 40.915s] Scenario #2 abort succeeded 3: [0105_transactions_mock / 40.915s] Testing scenario #2 abort&flush with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.915s] rd_kafka_begin_transaction(rk): duration 0.057ms 3: [0105_transactions_mock / 40.915s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 40.917s] FLUSH: duration 2.177ms 3: [0105_transactions_mock / 40.917s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.151ms 3: [0105_transactions_mock / 41.018s] abort&flush: duration 100.551ms 3: [0105_transactions_mock / 41.018s] Scenario #2 abort&flush succeeded 3: [0105_transactions_mock / 41.018s] Testing scenario #3 commit with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.018s] rd_kafka_begin_transaction(rk): duration 0.058ms 3: [0105_transactions_mock / 41.020s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.209ms 3: [0106_cgrp_sess_timeout / 35.315s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.315s] ASSIGN.PARTITIONS: duration 0.092ms 3: [0106_cgrp_sess_timeout / 35.315s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.315s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 35.416s] CONSUME: duration 100.753ms 3: [0106_cgrp_sess_timeout / 35.416s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 35.416s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0105_transactions_mock / 41.322s] commit: duration 302.194ms 3: [0105_transactions_mock / 41.322s] Scenario #3 commit succeeded 3: [0105_transactions_mock / 41.322s] Testing scenario #3 commit&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.322s] rd_kafka_begin_transaction(rk): duration 0.061ms 3: [0105_transactions_mock / 41.322s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 41.324s] FLUSH: duration 2.202ms 3: [0105_transactions_mock / 41.325s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.144ms 3: [0105_transactions_mock / 41.627s] commit&flush: duration 301.943ms 3: [0105_transactions_mock / 41.627s] Scenario #3 commit&flush succeeded 3: [0105_transactions_mock / 41.627s] Testing scenario #3 abort with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.627s] rd_kafka_begin_transaction(rk): duration 0.062ms 3: [0105_transactions_mock / 41.629s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.220ms 3: [0105_transactions_mock / 41.930s] abort: duration 301.637ms 3: [0105_transactions_mock / 41.931s] Scenario #3 abort succeeded 3: [0105_transactions_mock / 41.931s] Testing scenario #3 abort&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.931s] rd_kafka_begin_transaction(rk): duration 0.054ms 3: [0105_transactions_mock / 41.931s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 41.933s] FLUSH: duration 2.173ms 3: [0105_transactions_mock / 41.933s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.142ms 3: [0105_transactions_mock / 42.235s] abort&flush: duration 302.096ms 3: 
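The rd_kafka_producev() call that 0105_transactions_mock logs further up appears in macro-expanded form: the `({ if (0) ... })` fragments are the type-check blocks of the RD_KAFKA_V_TOPIC / RD_KAFKA_V_PARTITION / RD_KAFKA_V_VALUE varargs macros. The commit path the scenarios keep repeating, reduced to a sketch (the function name run_txn and the hard-coded topic and value are made up; the offsets list and group metadata would come from a consumer):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* One transaction round on a freshly created transactional producer:
 * init, begin, produce one message, attach consumer offsets, commit. */
static rd_kafka_error_t *
run_txn(rd_kafka_t *rk, rd_kafka_topic_partition_list_t *offsets,
        const rd_kafka_consumer_group_metadata_t *cgmetadata) {
        rd_kafka_error_t *error;
        rd_kafka_resp_err_t err;

        if ((error = rd_kafka_init_transactions(rk, 5000)))
                return error;
        if ((error = rd_kafka_begin_transaction(rk)))
                return error;

        /* Source form of the macro-expanded producev() call in the log. */
        err = rd_kafka_producev(rk,
                                RD_KAFKA_V_TOPIC("mytopic"),
                                RD_KAFKA_V_PARTITION(0),
                                RD_KAFKA_V_VALUE("hi", 2),
                                RD_KAFKA_V_END);
        if (err)
                fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));

        if ((error = rd_kafka_send_offsets_to_transaction(rk, offsets,
                                                          cgmetadata, -1)))
                return error;
        return rd_kafka_commit_transaction(rk, -1); /* NULL on success */
}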
[0105_transactions_mock / 42.235s] Scenario #3 abort&flush succeeded 3: [0105_transactions_mock / 42.235s] Testing scenario #4 commit with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.235s] rd_kafka_begin_transaction(rk): duration 0.057ms 3: [0105_transactions_mock / 42.237s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.219ms 3: %3|1669457796.982|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.238s] commit: duration 0.207ms 3: [0105_transactions_mock / 42.238s] Scenario #4 commit failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.238s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.238s] rd_kafka_abort_transaction(rk, -1): duration 0.162ms 3: [0105_transactions_mock / 42.238s] Testing scenario #4 commit&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.238s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock / 42.238s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 42.240s] FLUSH: duration 2.150ms 3: [0105_transactions_mock / 42.240s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.135ms 3: %3|1669457796.985|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.240s] commit&flush: duration 0.150ms 3: [0105_transactions_mock / 42.240s] Scenario #4 commit&flush failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.240s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.240s] rd_kafka_abort_transaction(rk, -1): duration 0.156ms 3: [0105_transactions_mock / 42.240s] Testing scenario #4 abort with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.241s] rd_kafka_begin_transaction(rk): duration 0.030ms 3: [0105_transactions_mock / 42.243s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.180ms 3: %3|1669457796.987|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.243s] abort: duration 0.172ms 3: [0105_transactions_mock / 42.243s] Scenario #4 abort failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.243s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.243s] rd_kafka_abort_transaction(rk, -1): duration 0.154ms 3: [0105_transactions_mock / 42.243s] Testing scenario #4 abort&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.243s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock / 42.243s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 42.245s] FLUSH: duration 2.156ms 3: [0105_transactions_mock / 42.245s] rd_kafka_send_offsets_to_transaction( rk, 
offsets, cgmetadata, -1): duration 0.135ms 3: %3|1669457796.990|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.246s] abort&flush: duration 0.169ms 3: [0105_transactions_mock / 42.246s] Scenario #4 abort&flush failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.246s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.246s] rd_kafka_abort_transaction(rk, -1): duration 0.154ms 3: [0105_transactions_mock / 42.246s] Testing scenario #5 commit with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.246s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: [0105_transactions_mock / 42.248s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.182ms 3: %3|1669457796.993|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.248s] commit: duration 0.204ms 3: [0105_transactions_mock / 42.248s] Scenario #5 commit failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.248s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.248s] rd_kafka_abort_transaction(rk, -1): duration 0.170ms 3: [0105_transactions_mock / 42.248s] Testing scenario #5 commit&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.248s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 42.248s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 42.251s] FLUSH: duration 2.138ms 3: [0105_transactions_mock / 42.251s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.141ms 3: %3|1669457796.995|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.251s] commit&flush: duration 0.138ms 3: [0105_transactions_mock / 42.251s] Scenario #5 commit&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.251s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.251s] rd_kafka_abort_transaction(rk, -1): duration 0.155ms 3: [0105_transactions_mock / 42.251s] Testing scenario #5 abort with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.251s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 42.253s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.186ms 3: %3|1669457796.998|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction 
failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.253s] abort: duration 0.172ms 3: [0105_transactions_mock / 42.254s] Scenario #5 abort failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.254s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.254s] rd_kafka_abort_transaction(rk, -1): duration 0.155ms 3: [0105_transactions_mock / 42.254s] Testing scenario #5 abort&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.254s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 42.254s] 0105_transactions_mock#producer-141: Flushing 1 messages 3: [0105_transactions_mock / 42.259s] FLUSH: duration 5.757ms 3: [0105_transactions_mock / 42.260s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.149ms 3: %3|1669457797.004|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.260s] abort&flush: duration 0.176ms 3: [0105_transactions_mock / 42.260s] Scenario #5 abort&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.260s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.260s] rd_kafka_abort_transaction(rk, -1): duration 0.155ms 3: [0105_transactions_mock / 42.260s] Testing scenario #6 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.260s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock / 42.262s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.201ms 3: %1|1669457797.007|TXNERR|0105_transactions_mock#producer-141| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457797.007|FATAL|0105_transactions_mock#producer-141| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 42.262s] commit: duration 0.202ms 3: [0105_transactions_mock / 42.262s] Scenario #6 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 42.262s] Fatal error, destroying producer 3: [0105_transactions_mock / 42.263s] Testing scenario #6 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.263s] Test config file test.conf not found 3: [0105_transactions_mock / 42.263s] Setting test timeout to 60s * 2.7 3: %5|1669457797.008|MOCK|0105_transactions_mock#producer-167| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37077,127.0.0.1:40663,127.0.0.1:33241 
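Scenario #4 onward shows how an EndTxn failure is classified by the returned rd_kafka_error_t: UNKNOWN_PRODUCER_ID and INVALID_PRODUCER_ID_MAPPING are abortable ("Abortable error, aborting transaction"), while _FENCED and, further down, TRANSACTIONAL_ID_AUTHORIZATION_FAILED are fatal ("Fatal error, destroying producer"). A schematic handler for the result of rd_kafka_commit_transaction(), under the invented name handle_commit_result:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Classify a failed commit_transaction() the way the test output does:
 * abort the transaction, destroy the producer, or retry. */
static void handle_commit_result(rd_kafka_t *rk, rd_kafka_error_t *error) {
        if (!error)
                return;                               /* commit succeeded */

        fprintf(stderr, "EndTxn failed: %s\n", rd_kafka_error_string(error));

        if (rd_kafka_error_txn_requires_abort(error)) {
                /* "Abortable error, aborting transaction" */
                rd_kafka_error_destroy(error);
                if ((error = rd_kafka_abort_transaction(rk, -1)))
                        rd_kafka_error_destroy(error);
        } else if (rd_kafka_error_is_fatal(error)) {
                /* "Fatal error, destroying producer" */
                rd_kafka_error_destroy(error);
                rd_kafka_destroy(rk);
        } else {
                /* retriable (rd_kafka_error_is_retriable()): commit again */
                rd_kafka_error_destroy(error);
        }
}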
3: [0105_transactions_mock / 42.264s] Created kafka instance 0105_transactions_mock#producer-167 3: [0105_transactions_mock / 42.301s] rd_kafka_init_transactions(rk, 5000): duration 36.899ms 3: [0105_transactions_mock / 42.301s] rd_kafka_begin_transaction(rk): duration 0.081ms 3: [0105_transactions_mock / 42.301s] 0105_transactions_mock#producer-167: Flushing 1 messages 3: [0116_kafkaconsumer_close / 33.334s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0: PASS (3.17s) ] 3: [0116_kafkaconsumer_close / 33.334s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1 ] 3: %5|1669457797.896|CONFWARN|MOCK#producer-168| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 33.334s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 43.266s] FLUSH: duration 964.627ms 3: [0105_transactions_mock / 43.267s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.619ms 3: %1|1669457798.015|TXNERR|0105_transactions_mock#producer-167| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457798.015|FATAL|0105_transactions_mock#producer-167| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.270s] commit&flush: duration 3.656ms 3: [0105_transactions_mock / 43.270s] Scenario #6 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.270s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.271s] Testing scenario #6 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.271s] Test config file test.conf not found 3: [0105_transactions_mock / 43.271s] Setting test timeout to 60s * 2.7 3: %5|1669457798.016|MOCK|0105_transactions_mock#producer-171| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41001,127.0.0.1:36077,127.0.0.1:41851 3: [0105_transactions_mock / 43.272s] Created kafka instance 0105_transactions_mock#producer-171 3: [0105_transactions_mock / 43.308s] rd_kafka_init_transactions(rk, 5000): duration 35.960ms 3: [0105_transactions_mock / 43.308s] rd_kafka_begin_transaction(rk): duration 0.017ms 3: [0105_transactions_mock / 43.338s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 29.949ms 3: %1|1669457798.095|TXNERR|0105_transactions_mock#producer-171| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457798.095|FATAL|0105_transactions_mock#producer-171| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.350s] abort: duration 11.972ms 3: [0105_transactions_mock / 43.350s] Scenario #6 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.350s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.358s] Testing scenario #6 
abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.358s] Test config file test.conf not found 3: [0105_transactions_mock / 43.358s] Setting test timeout to 60s * 2.7 3: %5|1669457798.103|MOCK|0105_transactions_mock#producer-172| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39289,127.0.0.1:44165,127.0.0.1:39749 3: [0105_transactions_mock / 43.362s] Created kafka instance 0105_transactions_mock#producer-172 3: [0105_transactions_mock / 43.404s] rd_kafka_init_transactions(rk, 5000): duration 41.301ms 3: [0105_transactions_mock / 43.404s] rd_kafka_begin_transaction(rk): duration 0.096ms 3: [0105_transactions_mock / 43.404s] 0105_transactions_mock#producer-172: Flushing 1 messages 3: [0105_transactions_mock / 44.362s] FLUSH: duration 958.483ms 3: [0105_transactions_mock / 44.363s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.580ms 3: %1|1669457799.108|TXNERR|0105_transactions_mock#producer-172| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457799.108|FATAL|0105_transactions_mock#producer-172| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 44.363s] abort&flush: duration 0.242ms 3: [0105_transactions_mock / 44.363s] Scenario #6 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.363s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.364s] Testing scenario #7 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.364s] Test config file test.conf not found 3: [0105_transactions_mock / 44.364s] Setting test timeout to 60s * 2.7 3: %5|1669457799.109|MOCK|0105_transactions_mock#producer-173| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40053,127.0.0.1:34513,127.0.0.1:41055 3: [0105_transactions_mock / 44.367s] Created kafka instance 0105_transactions_mock#producer-173 3: [0105_transactions_mock / 44.381s] rd_kafka_init_transactions(rk, 5000): duration 13.209ms 3: [0105_transactions_mock / 44.381s] rd_kafka_begin_transaction(rk): duration 0.116ms 3: [0105_transactions_mock / 44.384s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 3.515ms 3: %1|1669457799.130|TXNERR|0105_transactions_mock#producer-173| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457799.130|FATAL|0105_transactions_mock#producer-173| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 44.385s] commit: duration 1.012ms 3: [0105_transactions_mock / 44.385s] Scenario #7 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.385s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.388s] Testing scenario #7 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.388s] Test config 
file test.conf not found 3: [0105_transactions_mock / 44.388s] Setting test timeout to 60s * 2.7 3: %5|1669457799.133|MOCK|0105_transactions_mock#producer-174| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35401,127.0.0.1:46543,127.0.0.1:33299 3: [0105_transactions_mock / 44.389s] Created kafka instance 0105_transactions_mock#producer-174 3: [0105_transactions_mock / 44.419s] rd_kafka_init_transactions(rk, 5000): duration 30.321ms 3: [0105_transactions_mock / 44.419s] rd_kafka_begin_transaction(rk): duration 0.106ms 3: [0105_transactions_mock / 44.419s] 0105_transactions_mock#producer-174: Flushing 1 messages 3: [
/ 52.687s] Too many tests running (5 >= 5): postponing 0121_clusterid start... 3: [0105_transactions_mock / 45.391s] FLUSH: duration 971.873ms 3: [0105_transactions_mock / 45.392s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.512ms 3: %1|1669457800.137|TXNERR|0105_transactions_mock#producer-174| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457800.137|FATAL|0105_transactions_mock#producer-174| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 45.392s] commit&flush: duration 0.274ms 3: [0105_transactions_mock / 45.392s] Scenario #7 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 45.392s] Fatal error, destroying producer 3: [0105_transactions_mock / 45.393s] Testing scenario #7 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 45.393s] Test config file test.conf not found 3: [0105_transactions_mock / 45.393s] Setting test timeout to 60s * 2.7 3: %5|1669457800.138|MOCK|0105_transactions_mock#producer-175| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42365,127.0.0.1:46861,127.0.0.1:46437 3: [0105_transactions_mock / 45.397s] Created kafka instance 0105_transactions_mock#producer-175 3: [0105_transactions_mock / 45.423s] rd_kafka_init_transactions(rk, 5000): duration 25.915ms 3: [0105_transactions_mock / 45.423s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: [0105_transactions_mock / 45.446s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 22.881ms 3: %1|1669457800.203|TXNERR|0105_transactions_mock#producer-175| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457800.203|FATAL|0105_transactions_mock#producer-175| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 45.458s] abort: duration 12.080ms 3: [0105_transactions_mock / 45.458s] Scenario #7 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 45.458s] Fatal error, destroying producer 3: [0105_transactions_mock / 45.463s] Testing scenario #7 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 45.463s] Test config file test.conf not found 3: [0105_transactions_mock / 45.463s] Setting test timeout to 60s * 2.7 3: %5|1669457800.208|MOCK|0105_transactions_mock#producer-176| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38519,127.0.0.1:35715,127.0.0.1:46453 3: [0105_transactions_mock / 45.470s] Created kafka instance 0105_transactions_mock#producer-176 3: [0105_transactions_mock / 45.519s] rd_kafka_init_transactions(rk, 5000): duration 48.987ms 3: [0105_transactions_mock / 45.523s] rd_kafka_begin_transaction(rk): duration 3.948ms 3: [0105_transactions_mock / 45.523s] 0105_transactions_mock#producer-176: Flushing 1 messages 3: 
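The %5 MOCK notices ("Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:...") come from librdkafka's built-in mock broker, which the test suite enables through configuration so that no real Kafka cluster is needed; the broker-side error injection in these scenarios then goes through the mock APIs declared in rdkafka_mock.h. Enabling the mock cluster itself is just a config property; roughly (broker count and buffer size are arbitrary choices here):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Spin up an internal 3-broker mock cluster; bootstrap.servers is
         * overridden automatically, which is what the MOCK notices report. */
        if (rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                fprintf(stderr, "%s\n", errstr);
                rd_kafka_conf_destroy(conf);
                return 1;
        }

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
                fprintf(stderr, "%s\n", errstr);
                rd_kafka_conf_destroy(conf);
                return 1;
        }
        /* ... exercise the client against the mock cluster ... */
        rd_kafka_destroy(rk);
        return 0;
}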
[0113_cooperative_rebalance_local/ 38.082s] Rebalance #11: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 38.082s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 38.082s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.081ms 3: [0113_cooperative_rebalance_local/ 38.082s] assign: incremental assign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 38.082s] Closing consumer 3: [0113_cooperative_rebalance_local/ 38.082s] Closing consumer 0113_cooperative_rebalance_local#consumer-149 3: [0113_cooperative_rebalance_local/ 38.083s] Rebalance #12: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 38.083s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 38.083s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.032ms 3: [0113_cooperative_rebalance_local/ 38.083s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 38.083s] CONSUMER.CLOSE: duration 0.688ms 3: [0113_cooperative_rebalance_local/ 38.083s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 38.083s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 38.084s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1: PASS (11.12s) ] 3: [0113_cooperative_rebalance_local/ 38.084s] [ r_lost_partitions_commit_illegal_generation_test_local:2860 ] 3: %5|1669457800.640|CONFWARN|MOCK#producer-177| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 38.084s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 38.089s] Created kafka instance 0113_cooperative_rebalance_local#producer-178 3: [0113_cooperative_rebalance_local/ 38.089s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 38.089s] Produce to test [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 38.089s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 38.089s] PRODUCE: duration 0.118ms 3: [0113_cooperative_rebalance_local/ 38.140s] PRODUCE.DELIVERY.WAIT: duration 51.617ms 3: [0113_cooperative_rebalance_local/ 38.146s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 38.146s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 38.153s] Created kafka instance 0113_cooperative_rebalance_local#consumer-179 3: [0113_cooperative_rebalance_local/ 38.156s] r_lost_partitions_commit_illegal_generation_test_local:2883: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: 
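The "Waiting for initial assignment (_ASSIGN_PARTITIONS)" lines, and the rd_kafka_subscribe() / rd_kafka_assignment() pairs in the 0120_asymmetric_subscription entries above, follow one pattern: subscribe, keep polling so the group protocol can run, and inspect the current assignment. A minimal sketch, not test-suite code (the helper name, the 100 ms poll interval and the roughly 10 s cap are arbitrary):

#include <librdkafka/rdkafka.h>

/* Subscribe an existing consumer to `topic` and poll until the group
 * hands it at least one partition (or ~10 s pass). Returns 1 if assigned. */
static int wait_for_assignment(rd_kafka_t *rk, const char *topic) {
        rd_kafka_topic_partition_list_t *topics, *assignment;
        int assigned = 0, i;

        topics = rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(topics, topic, RD_KAFKA_PARTITION_UA);
        rd_kafka_subscribe(rk, topics);
        rd_kafka_topic_partition_list_destroy(topics);

        for (i = 0; i < 100 && !assigned; i++) {
                rd_kafka_message_t *msg = rd_kafka_consumer_poll(rk, 100);
                if (msg)
                        rd_kafka_message_destroy(msg);
                if (!rd_kafka_assignment(rk, &assignment)) {
                        assigned = assignment->cnt > 0;
                        rd_kafka_topic_partition_list_destroy(assignment);
                }
        }
        return assigned;
}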
[0105_transactions_mock / 46.469s] FLUSH: duration 945.552ms 3: [0105_transactions_mock / 46.470s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.776ms 3: %1|1669457801.214|TXNERR|0105_transactions_mock#producer-176| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457801.214|FATAL|0105_transactions_mock#producer-176| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 46.470s] abort&flush: duration 0.241ms 3: [0105_transactions_mock / 46.470s] Scenario #7 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 46.470s] Fatal error, destroying producer 3: [0105_transactions_mock / 46.471s] Testing scenario #8 commit with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 46.471s] Test config file test.conf not found 3: [0105_transactions_mock / 46.471s] Setting test timeout to 60s * 2.7 3: %5|1669457801.215|MOCK|0105_transactions_mock#producer-180| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46317,127.0.0.1:40693,127.0.0.1:41431 3: [0105_transactions_mock / 46.472s] Created kafka instance 0105_transactions_mock#producer-180 3: [0105_transactions_mock / 46.483s] rd_kafka_init_transactions(rk, 5000): duration 11.456ms 3: [0105_transactions_mock / 46.483s] rd_kafka_begin_transaction(rk): duration 0.148ms 3: [0105_transactions_mock / 46.488s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 4.261ms 3: %1|1669457801.235|TXNERR|0105_transactions_mock#producer-180| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1669457801.235|FATAL|0105_transactions_mock#producer-180| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 46.491s] commit: duration 2.763ms 3: [0105_transactions_mock / 46.491s] Scenario #8 commit failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 46.491s] Fatal error, destroying producer 3: [0105_transactions_mock / 46.503s] Testing scenario #8 commit&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 46.503s] Test config file test.conf not found 3: [0105_transactions_mock / 46.503s] Setting test timeout to 60s * 2.7 3: %5|1669457801.248|MOCK|0105_transactions_mock#producer-181| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38095,127.0.0.1:36591,127.0.0.1:35129 3: [0105_transactions_mock / 46.504s] Created kafka instance 0105_transactions_mock#producer-181 3: [0105_transactions_mock / 46.517s] rd_kafka_init_transactions(rk, 5000): duration 12.669ms 3: [0105_transactions_mock / 46.517s] rd_kafka_begin_transaction(rk): duration 0.072ms 3: [0105_transactions_mock / 46.517s] 0105_transactions_mock#producer-181: Flushing 1 messages 3: [0105_transactions_mock / 
47.507s] FLUSH: duration 989.868ms 3: [0105_transactions_mock / 47.507s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.591ms 3: %1|1669457802.252|TXNERR|0105_transactions_mock#producer-181| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1669457802.252|FATAL|0105_transactions_mock#producer-181| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 47.508s] commit&flush: duration 0.262ms 3: [0105_transactions_mock / 47.508s] Scenario #8 commit&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 47.508s] Fatal error, destroying producer 3: [0105_transactions_mock / 47.508s] Testing scenario #8 abort with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 47.508s] Test config file test.conf not found 3: [0105_transactions_mock / 47.508s] Setting test timeout to 60s * 2.7 3: %5|1669457802.253|MOCK|0105_transactions_mock#producer-182| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40095,127.0.0.1:34283,127.0.0.1:41217 3: [0105_transactions_mock / 47.512s] Created kafka instance 0105_transactions_mock#producer-182 3: [0105_transactions_mock / 47.532s] rd_kafka_init_transactions(rk, 5000): duration 20.032ms 3: [0105_transactions_mock / 47.533s] rd_kafka_begin_transaction(rk): duration 0.269ms 3: [0105_transactions_mock / 47.546s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 13.643ms 3: %1|1669457802.302|TXNERR|0105_transactions_mock#producer-182| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1669457802.302|FATAL|0105_transactions_mock#producer-182| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 47.560s] abort: duration 13.922ms 3: [0105_transactions_mock / 47.560s] Scenario #8 abort failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 47.560s] Fatal error, destroying producer 3: [0105_transactions_mock / 47.571s] Testing scenario #8 abort&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 47.571s] Test config file test.conf not found 3: [0105_transactions_mock / 47.571s] Setting test timeout to 60s * 2.7 3: %5|1669457802.316|MOCK|0105_transactions_mock#producer-183| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39379,127.0.0.1:34795,127.0.0.1:41713 3: [0105_transactions_mock / 47.572s] Created kafka instance 0105_transactions_mock#producer-183 3: [0105_transactions_mock / 47.603s] rd_kafka_init_transactions(rk, 5000): duration 31.014ms 3: [0105_transactions_mock / 47.603s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock / 47.603s] 0105_transactions_mock#producer-183: Flushing 1 messages 3: 
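Each "Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:..." notice comes from librdkafka's built-in mock cluster, which these tests run against instead of real brokers; the listener ports are picked at random, which is why they differ for every producer instance. A minimal sketch of getting the same setup in application code via the test.mock.num.brokers configuration property (error handling shortened):

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            /* Spin up 3 in-process mock brokers; any bootstrap.servers or
             * security.protocol settings are ignored and replaced, which is
             * exactly the MOCK/CONFWARN notice seen in the log. */
            if (rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                                  errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                    fprintf(stderr, "%s\n", errstr);
                    return 1;
            }

            rd_kafka_t *rk =
                rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    fprintf(stderr, "%s\n", errstr);
                    return 1;
            }

            /* ... produce or run transactions against the mock cluster ... */

            rd_kafka_destroy(rk);
            return 0;
    }
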
%4|1669457802.903|SESSTMOUT|0106_cgrp_sess_timeout#consumer-160| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0116_kafkaconsumer_close / 38.354s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 38.354s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1 ] 3: %5|1669457802.916|CONFWARN|MOCK#producer-184| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 38.354s] Setting test timeout to 10s * 2.7 3: [0106_cgrp_sess_timeout / 42.417s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: %4|1669457803.006|COMMITFAIL|0106_cgrp_sess_timeout#consumer-160| [thrd:main]: Offset commit (unassigned partitions) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [0106_cgrp_sess_timeout / 42.420s] UNASSIGN.PARTITIONS: duration 2.972ms 3: [0106_cgrp_sess_timeout / 42.420s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 42.420s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: [0120_asymmetric_subscription/ 13.041s] c0#consumer-161: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[2], t2[3] 3: [0120_asymmetric_subscription/ 13.042s] c1#consumer-162: Assignment (6 partition(s)): t2[0], t2[1], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 13.042s] c2#consumer-163: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 13.042s] rd_kafka_assignment(c[i], &assignment): duration 0.014ms 3: [0120_asymmetric_subscription/ 13.042s] rd_kafka_assignment(c[i], &assignment): duration 0.022ms 3: [0120_asymmetric_subscription/ 13.042s] rd_kafka_assignment(c[i], &assignment): duration 0.027ms 3: [0120_asymmetric_subscription/ 13.044s] [ do_test_asymmetric:71: range assignor: PASS (9.01s) ] 3: [0120_asymmetric_subscription/ 13.044s] [ do_test_asymmetric:71: cooperative-sticky assignor ] 3: [0120_asymmetric_subscription/ 13.044s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 13.044s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 13.044s] Created kafka instance c0#consumer-187 3: [0120_asymmetric_subscription/ 13.044s] rd_kafka_subscribe(c[i], tlist): duration 0.232ms 3: [0120_asymmetric_subscription/ 13.045s] Created kafka instance c1#consumer-188 3: [0120_asymmetric_subscription/ 13.045s] rd_kafka_subscribe(c[i], tlist): duration 0.064ms 3: [0120_asymmetric_subscription/ 13.045s] Created kafka instance c2#consumer-189 3: [0120_asymmetric_subscription/ 13.045s] rd_kafka_subscribe(c[i], tlist): duration 0.060ms 3: [0105_transactions_mock / 48.575s] FLUSH: duration 971.453ms 3: [0105_transactions_mock / 48.575s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.218ms 3: %1|1669457803.320|TXNERR|0105_transactions_mock#producer-183| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1669457803.320|FATAL|0105_transactions_mock#producer-183| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id 
authorization failed 3: [0105_transactions_mock / 48.575s] abort&flush: duration 0.183ms 3: [0105_transactions_mock / 48.575s] Scenario #8 abort&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 48.575s] Fatal error, destroying producer 3: [0105_transactions_mock / 48.576s] Testing scenario #9 commit with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.576s] Test config file test.conf not found 3: [0105_transactions_mock / 48.576s] Setting test timeout to 60s * 2.7 3: %5|1669457803.321|MOCK|0105_transactions_mock#producer-190| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:33513,127.0.0.1:38455,127.0.0.1:46349 3: [0105_transactions_mock / 48.576s] Created kafka instance 0105_transactions_mock#producer-190 3: [0105_transactions_mock / 48.616s] rd_kafka_init_transactions(rk, 5000): duration 39.797ms 3: [0105_transactions_mock / 48.619s] rd_kafka_begin_transaction(rk): duration 2.991ms 3: [0105_transactions_mock / 48.634s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 14.893ms 3: [0113_cooperative_rebalance_local/ 41.198s] Rebalance #13: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 41.198s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 41.198s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 41.198s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 41.198s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 41.198s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.063ms 3: [0113_cooperative_rebalance_local/ 41.198s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 41.299s] consume: consume 50 messages 3: [0113_cooperative_rebalance_local/ 41.299s] CONSUME: duration 0.371ms 3: [0113_cooperative_rebalance_local/ 41.299s] consume: consumed 50/50 messages (0/-1 EOFs) 3: [0113_cooperative_rebalance_local/ 41.299s] r_lost_partitions_commit_illegal_generation_test_local:2901: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 41.299s] Rebalance #14: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 41.299s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 41.299s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 41.299s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 41.299s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 41.299s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 41.299s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.026ms 3: [0113_cooperative_rebalance_local/ 41.299s] unassign: incremental unassign of 4 partition(s) done 3: %3|1669457804.324|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 49.579s] commit: duration 944.826ms 3: [0105_transactions_mock / 49.579s] Scenario #9 commit failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.579s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.579s] rd_kafka_abort_transaction(rk, -1): duration 0.177ms 3: 
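The rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1) calls traced throughout these scenarios commit consumer offsets inside the producer's transaction (the consume-transform-produce pattern). A sketch of how the two arguments are typically built follows; the helper name, the separate consumer handle, and the topic/partition/offset values ("test", 0, 100) are illustrative, while the list and group-metadata calls are standard librdkafka API:

    #include <librdkafka/rdkafka.h>

    /* Commit the consumer's progress inside the producer's transaction.
     * 'consumer' is the handle the offsets were read with, 'rk' is the
     * transactional producer with a transaction already begun. */
    static rd_kafka_error_t *send_offsets(rd_kafka_t *rk, rd_kafka_t *consumer) {
            rd_kafka_topic_partition_list_t *offsets =
                rd_kafka_topic_partition_list_new(1);
            /* Hypothetical value: next offset to consume from test[0] is 100. */
            rd_kafka_topic_partition_list_add(offsets, "test", 0)->offset = 100;

            /* Group metadata ties the offsets to the consumer's generation, so a
             * stale member cannot commit (cf. the illegal-generation cases in
             * 0113_cooperative_rebalance_local). */
            rd_kafka_consumer_group_metadata_t *cgmd =
                rd_kafka_consumer_group_metadata(consumer);

            rd_kafka_error_t *error =
                rd_kafka_send_offsets_to_transaction(rk, offsets, cgmd, -1);

            rd_kafka_consumer_group_metadata_destroy(cgmd);
            rd_kafka_topic_partition_list_destroy(offsets);
            return error;
    }
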
[0105_transactions_mock / 49.579s] Testing scenario #9 commit&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 49.579s] rd_kafka_begin_transaction(rk): duration 0.035ms 3: [0105_transactions_mock / 49.579s] 0105_transactions_mock#producer-190: Flushing 1 messages 3: [0105_transactions_mock / 49.582s] FLUSH: duration 2.181ms 3: [0105_transactions_mock / 49.582s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.173ms 3: %3|1669457804.326|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 49.582s] commit&flush: duration 0.155ms 3: [0105_transactions_mock / 49.582s] Scenario #9 commit&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.582s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.582s] rd_kafka_abort_transaction(rk, -1): duration 0.162ms 3: [0105_transactions_mock / 49.582s] Testing scenario #9 abort with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 49.582s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock / 49.584s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.132ms 3: %3|1669457804.329|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 49.585s] abort: duration 0.204ms 3: [0105_transactions_mock / 49.585s] Scenario #9 abort failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.585s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.585s] rd_kafka_abort_transaction(rk, -1): duration 0.152ms 3: [0105_transactions_mock / 49.585s] Testing scenario #9 abort&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 49.585s] rd_kafka_begin_transaction(rk): duration 0.035ms 3: [0105_transactions_mock / 49.585s] 0105_transactions_mock#producer-190: Flushing 1 messages 3: [0105_transactions_mock / 49.586s] FLUSH: duration 1.244ms 3: [0105_transactions_mock / 49.586s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.158ms 3: %3|1669457804.331|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 49.587s] abort&flush: duration 0.665ms 3: [0105_transactions_mock / 49.587s] Scenario #9 abort&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.587s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.590s] rd_kafka_abort_transaction(rk, -1): duration 3.500ms 3: [0105_transactions_mock / 49.590s] Testing scenario #10 commit with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 49.590s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: [0105_transactions_mock / 49.591s] 
rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.854ms 3: %3|1669457804.336|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 49.592s] commit: duration 0.387ms 3: [0105_transactions_mock / 49.592s] Scenario #10 commit failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.592s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.592s] rd_kafka_abort_transaction(rk, -1): duration 0.157ms 3: [0105_transactions_mock / 49.592s] Testing scenario #10 commit&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 49.592s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock / 49.592s] 0105_transactions_mock#producer-190: Flushing 1 messages 3: [0105_transactions_mock / 49.593s] FLUSH: duration 0.736ms 3: [0105_transactions_mock / 49.593s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.144ms 3: %3|1669457804.337|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 49.593s] commit&flush: duration 0.137ms 3: [0105_transactions_mock / 49.593s] Scenario #10 commit&flush failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.593s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.593s] rd_kafka_abort_transaction(rk, -1): duration 0.159ms 3: [0105_transactions_mock / 49.593s] Testing scenario #10 abort with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 49.593s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock / 49.593s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.142ms 3: %3|1669457804.342|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 49.598s] abort: duration 4.236ms 3: [0105_transactions_mock / 49.598s] Scenario #10 abort failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.598s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.598s] rd_kafka_abort_transaction(rk, -1): duration 0.154ms 3: [0105_transactions_mock / 49.598s] Testing scenario #10 abort&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 49.598s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 49.598s] 0105_transactions_mock#producer-190: Flushing 1 messages 3: [0105_transactions_mock / 49.600s] FLUSH: duration 2.082ms 3: [0105_transactions_mock / 49.600s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.154ms 3: %3|1669457804.345|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 49.600s] abort&flush: duration 0.171ms 3: 
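The "with 1 injected errors" wording in these scenarios refers to errors pushed into the mock cluster so that the next EndTxn request fails with a chosen error code. A hedged sketch of such an injection is shown below; it assumes the rd_kafka_mock_push_request_errors() and rd_kafka_handle_mock_cluster() helpers from rdkafka_mock.h and uses the numeric EndTxn ApiKey (26) directly, since the test suite's own ApiKey symbols are internal:

    #include <librdkafka/rdkafka.h>
    #include <librdkafka/rdkafka_mock.h>

    /* Make the next EndTxn request fail with INVALID_MSG_SIZE, as in
     * scenario #10 above.  'rk' must have been created with
     * test.mock.num.brokers so a mock cluster is attached to it. */
    static void inject_endtxn_error(rd_kafka_t *rk) {
            rd_kafka_mock_cluster_t *mcluster = rd_kafka_handle_mock_cluster(rk);

            rd_kafka_mock_push_request_errors(
                mcluster, 26 /* EndTxn ApiKey */, 1 /* one error */,
                RD_KAFKA_RESP_ERR_INVALID_MSG_SIZE);
    }
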
[0105_transactions_mock / 49.600s] Scenario #10 abort&flush failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 49.600s] Abortable error, aborting transaction 3: [0105_transactions_mock / 49.600s] rd_kafka_abort_transaction(rk, -1): duration 0.171ms 3: [0105_transactions_mock / 49.600s] Testing scenario #11 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 49.600s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock / 49.601s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.817ms 3: %1|1669457804.350|TXNERR|0105_transactions_mock#producer-190| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457804.350|FATAL|0105_transactions_mock#producer-190| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 49.605s] commit: duration 4.201ms 3: [0105_transactions_mock / 49.605s] Scenario #11 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 49.606s] Fatal error, destroying producer 3: [0105_transactions_mock / 49.606s] Testing scenario #11 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 49.606s] Test config file test.conf not found 3: [0105_transactions_mock / 49.606s] Setting test timeout to 60s * 2.7 3: %5|1669457804.351|MOCK|0105_transactions_mock#producer-191| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46677,127.0.0.1:33853,127.0.0.1:37133 3: [0105_transactions_mock / 49.609s] Created kafka instance 0105_transactions_mock#producer-191 3: [0105_transactions_mock / 49.611s] rd_kafka_init_transactions(rk, 5000): duration 1.537ms 3: [0105_transactions_mock / 49.611s] rd_kafka_begin_transaction(rk): duration 0.066ms 3: [0105_transactions_mock / 49.611s] 0105_transactions_mock#producer-191: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 42.300s] r_lost_partitions_commit_illegal_generation_test_local:2904: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 22s 3: [0105_transactions_mock / 50.612s] FLUSH: duration 1000.712ms 3: [0105_transactions_mock / 50.612s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.693ms 3: %1|1669457805.357|TXNERR|0105_transactions_mock#producer-191| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457805.357|FATAL|0105_transactions_mock#producer-191| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 50.613s] commit&flush: duration 0.257ms 3: [0105_transactions_mock / 50.613s] Scenario #11 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 50.613s] Fatal error, destroying producer 3: [0105_transactions_mock / 50.613s] Testing scenario #11 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 
50.613s] Test config file test.conf not found 3: [0105_transactions_mock / 50.613s] Setting test timeout to 60s * 2.7 3: %5|1669457805.358|MOCK|0105_transactions_mock#producer-192| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42487,127.0.0.1:39419,127.0.0.1:45503 3: [0105_transactions_mock / 50.617s] Created kafka instance 0105_transactions_mock#producer-192 3: [0105_transactions_mock / 50.628s] rd_kafka_init_transactions(rk, 5000): duration 10.584ms 3: [0105_transactions_mock / 50.628s] rd_kafka_begin_transaction(rk): duration 0.162ms 3: [0105_transactions_mock / 50.634s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 6.189ms 3: %1|1669457805.383|TXNERR|0105_transactions_mock#producer-192| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457805.383|FATAL|0105_transactions_mock#producer-192| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 50.639s] abort: duration 4.272ms 3: [0105_transactions_mock / 50.639s] Scenario #11 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 50.639s] Fatal error, destroying producer 3: [0105_transactions_mock / 50.641s] Testing scenario #11 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 50.642s] Test config file test.conf not found 3: [0105_transactions_mock / 50.642s] Setting test timeout to 60s * 2.7 3: %5|1669457805.386|MOCK|0105_transactions_mock#producer-193| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44459,127.0.0.1:35531,127.0.0.1:45841 3: [0105_transactions_mock / 50.643s] Created kafka instance 0105_transactions_mock#producer-193 3: [0105_transactions_mock / 50.671s] rd_kafka_init_transactions(rk, 5000): duration 27.865ms 3: [0105_transactions_mock / 50.671s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock / 50.671s] 0105_transactions_mock#producer-193: Flushing 1 messages 3: [0105_transactions_mock / 51.646s] FLUSH: duration 974.337ms 3: [0105_transactions_mock / 51.646s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.572ms 3: %1|1669457806.391|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1669457806.391|FATAL|0105_transactions_mock#producer-193| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 51.647s] abort&flush: duration 0.185ms 3: [0105_transactions_mock / 51.647s] Scenario #11 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 51.647s] Fatal error, destroying producer 3: [0105_transactions_mock / 51.647s] [ do_test_txn_endtxn_errors:705: PASS (19.88s) ] 3: [0105_transactions_mock / 51.647s] [ do_test_txn_endtxn_infinite:901 ] 3: [0105_transactions_mock / 51.647s] Test config file test.conf not found 3: 
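The do_test_txn_endtxn_infinite case that starts here produces one message and then lets the commit/abort call block (timeout -1) while the coordinator keeps answering with retriable errors; further down the log the calls return success after roughly 5.5 seconds. The long rd_kafka_producev( rk, ({ if (0) ... }) ...) text in the next record is just the preprocessor expansion of the RD_KAFKA_V_* convenience macros; written normally, the same produce-and-commit step looks roughly like this sketch (producer creation and the producev return code are omitted, and the helper name is illustrative):

    #include <librdkafka/rdkafka.h>

    /* One transaction: produce a single message to "mytopic" and commit,
     * blocking for as long as it takes (-1 = infinite timeout), which is what
     * lets the infinite-retry case simply wait out the injected errors. */
    static rd_kafka_error_t *produce_one_txn(rd_kafka_t *rk) {
            rd_kafka_error_t *error;

            if ((error = rd_kafka_begin_transaction(rk)))
                    return error;

            rd_kafka_producev(rk,
                              RD_KAFKA_V_TOPIC("mytopic"),
                              RD_KAFKA_V_VALUE("hi", 2),
                              RD_KAFKA_V_END);

            return rd_kafka_commit_transaction(rk, -1);
    }
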
[0105_transactions_mock / 51.647s] Setting test timeout to 60s * 2.7 3: %5|1669457806.392|MOCK|0105_transactions_mock#producer-194| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43993,127.0.0.1:32899,127.0.0.1:34799 3: [0105_transactions_mock / 51.648s] Created kafka instance 0105_transactions_mock#producer-194 3: [0105_transactions_mock / 51.652s] rd_kafka_init_transactions(rk, 5000): duration 3.809ms 3: [0105_transactions_mock / 51.652s] rd_kafka_begin_transaction(rk): duration 0.075ms 3: [0105_transactions_mock / 51.652s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0116_kafkaconsumer_close / 42.366s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1: PASS (4.01s) ] 3: [0116_kafkaconsumer_close / 42.366s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1 ] 3: %5|1669457806.928|CONFWARN|MOCK#producer-195| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 42.366s] Setting test timeout to 10s * 2.7 3: [0120_asymmetric_subscription/ 17.046s] c0#consumer-187: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[0], t2[2] 3: [0120_asymmetric_subscription/ 17.046s] c1#consumer-188: Assignment (6 partition(s)): t2[1], t2[3], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 17.046s] c2#consumer-189: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 17.046s] rd_kafka_assignment(c[i], &assignment): duration 0.012ms 3: [0120_asymmetric_subscription/ 17.046s] rd_kafka_assignment(c[i], &assignment): duration 0.023ms 3: [0120_asymmetric_subscription/ 17.046s] rd_kafka_assignment(c[i], &assignment): duration 0.020ms 3: [0120_asymmetric_subscription/ 17.046s] Closing consumer c0#consumer-187 3: [0120_asymmetric_subscription/ 17.046s] CONSUMER.CLOSE: duration 0.182ms 3: [0120_asymmetric_subscription/ 17.047s] Closing consumer c2#consumer-189 3: [0120_asymmetric_subscription/ 17.047s] CONSUMER.CLOSE: duration 0.077ms 3: [0120_asymmetric_subscription/ 17.048s] [ do_test_asymmetric:71: cooperative-sticky assignor: PASS (4.00s) ] 3: [0120_asymmetric_subscription/ 17.048s] 0120_asymmetric_subscription: duration 17048.005ms 3: [0120_asymmetric_subscription/ 17.048s] ================= Test 0120_asymmetric_subscription PASSED ================= 3: [
/ 59.817s] Too many tests running (5 >= 5): postponing 0124_openssl_invalid_engine start... 3: [0121_clusterid / 0.000s] ================= Running test 0121_clusterid ================= 3: [0121_clusterid / 0.000s] ==== Stats written to file stats_0121_clusterid_6207877448432754174.json ==== 3: [0121_clusterid / 0.000s] Test config file test.conf not found 3: %5|1669457807.236|CONFWARN|MOCK#producer-198| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1669457807.240|CONFWARN|MOCK#producer-199| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0121_clusterid / 0.008s] Test config file test.conf not found 3: [0121_clusterid / 0.008s] Setting test timeout to 10s * 2.7 3: [
/ 59.826s] Log: 0121_clusterid#producer-200 level 3 fac FAIL: [thrd:127.0.0.1:34525/bootstrap]: 127.0.0.1:34525/bootstrap: Connect to ipv4#127.0.0.1:34525 failed: Connection refused (after 0ms in state CONNECT) 3: [0121_clusterid / 0.012s] Created kafka instance 0121_clusterid#producer-200 3: [0106_cgrp_sess_timeout / 47.331s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.331s] ASSIGN.PARTITIONS: duration 0.063ms 3: [0106_cgrp_sess_timeout / 47.331s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.331s] Closing consumer 0106_cgrp_sess_timeout#consumer-160 3: [0106_cgrp_sess_timeout / 47.331s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.331s] UNASSIGN.PARTITIONS: duration 0.109ms 3: [0106_cgrp_sess_timeout / 47.331s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 47.331s] CONSUMER.CLOSE: duration 0.377ms 3: [0106_cgrp_sess_timeout / 47.332s] [ do_test_session_timeout:152: Test session timeout with auto commit: PASS (15.13s) ] 3: [0106_cgrp_sess_timeout / 47.332s] [ do_test_commit_on_lost:231 ] 3: %5|1669457807.920|CONFWARN|MOCK#producer-201| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 47.333s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.333s] Created kafka instance 0106_cgrp_sess_timeout#producer-202 3: [0106_cgrp_sess_timeout / 47.333s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.333s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 47.333s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 47.333s] PRODUCE: duration 0.134ms 3: [0106_cgrp_sess_timeout / 47.375s] PRODUCE.DELIVERY.WAIT: duration 42.134ms 3: [0106_cgrp_sess_timeout / 47.379s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.379s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 47.380s] Created kafka instance 0106_cgrp_sess_timeout#consumer-203 3: [0106_cgrp_sess_timeout / 47.380s] consume: consume 10 messages 3: [0113_cooperative_rebalance_local/ 46.301s] Rebalance #15: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 46.301s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 46.301s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 46.301s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 46.301s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 46.301s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.046ms 3: [0113_cooperative_rebalance_local/ 46.301s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 46.302s] Closing consumer 3: [0113_cooperative_rebalance_local/ 46.302s] Closing consumer 0113_cooperative_rebalance_local#consumer-179 3: [0113_cooperative_rebalance_local/ 46.302s] Rebalance #16: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 46.302s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 46.302s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 46.302s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 46.302s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 46.302s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.037ms 3: [0113_cooperative_rebalance_local/ 46.302s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 46.302s] CONSUMER.CLOSE: duration 0.303ms 3: [0113_cooperative_rebalance_local/ 46.302s] 
Destroying consumer 3: [0113_cooperative_rebalance_local/ 46.302s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 46.303s] 0113_cooperative_rebalance_local: duration 46303.202ms 3: [0113_cooperative_rebalance_local/ 46.303s] ================= Test 0113_cooperative_rebalance_local PASSED ================= 3: [
/ 61.441s] Too many tests running (5 >= 5): postponing 0128_sasl_callback_queue start... 3: [0124_openssl_invalid_engine / 0.000s] ================= Running test 0124_openssl_invalid_engine ================= 3: [0124_openssl_invalid_engine / 0.000s] ==== Stats written to file stats_0124_openssl_invalid_engine_7108788874712968585.json ==== 3: [0124_openssl_invalid_engine / 0.000s] Test config file test.conf not found 3: [0124_openssl_invalid_engine / 0.000s] Setting test timeout to 30s * 2.7 3: %3|1669457808.859|SSL|0124_openssl_invalid_engine#producer-204| [thrd:app]: error:25066067:DSO support routines:dlfcn_load:could not load the shared library: filename(libinvalid_path.so): libinvalid_path.so: cannot open shared object file: No such file or directory 3: %3|1669457808.859|SSL|0124_openssl_invalid_engine#producer-204| [thrd:app]: error:25070067:DSO support routines:DSO_load:could not load the shared library 3: [0124_openssl_invalid_engine / 0.000s] rd_kafka_new() failed (as expected): OpenSSL engine initialization failed in ENGINE_ctrl_cmd_string LOAD: error:260B6084:engine routines:dynamic_load:dso not found 3: [0124_openssl_invalid_engine / 0.000s] 0124_openssl_invalid_engine: duration 0.273ms 3: [0124_openssl_invalid_engine / 0.000s] ================= Test 0124_openssl_invalid_engine PASSED ================= 3: [
/ 61.541s] Too many tests running (5 >= 5): postponing 0131_connect_timeout start... 3: [0128_sasl_callback_queue / 0.000s] ================= Running test 0128_sasl_callback_queue ================= 3: [0128_sasl_callback_queue / 0.000s] ==== Stats written to file stats_0128_sasl_callback_queue_5918021093952629313.json ==== 3: [0128_sasl_callback_queue / 0.000s] Feature "sasl_oauthbearer" is built-in 3: [0128_sasl_callback_queue / 0.000s] [ do_test:64: Use background queue = yes ] 3: %5|1669457808.960|CONFWARN|rdkafka#producer-205| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1669457808.961|ERROR|rdkafka#producer-205| [thrd:background]: Failed to acquire SASL OAUTHBEARER token: Not implemented by this test, but that's okay 3: [
/ 61.543s] Callback called! 3: [0106_cgrp_sess_timeout / 50.411s] 0106_cgrp_sess_timeout#consumer-203: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 50.411s] ASSIGN.PARTITIONS: duration 0.175ms 3: [0106_cgrp_sess_timeout / 50.411s] assign: assigned 4 partition(s) 3: [
/ 63.828s] Log: 0121_clusterid#producer-200 level 6 fac FAIL: [thrd:127.0.0.1:39641/bootstrap]: 127.0.0.1:39641/1: Disconnected (after 3001ms in state UP) 3: [
/ 63.829s] Log: 0121_clusterid#producer-200 level 3 fac FAIL: [thrd:127.0.0.1:39641/bootstrap]: 127.0.0.1:39641/1: Connect to ipv4#127.0.0.1:39641 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 51.014s] CONSUME: duration 3633.248ms 3: [0106_cgrp_sess_timeout / 51.014s] consume: consumed 10/10 messages (0/-1 EOFs) 3: %6|1669457811.601|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:40659: Disconnected (after 3616ms in state UP) 3: %6|1669457811.601|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:127.0.0.1:40659/bootstrap]: 127.0.0.1:40659/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 502ms in state UP) 3: [0106_cgrp_sess_timeout / 51.014s] Waiting for assignment to be lost... 3: %3|1669457811.602|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:127.0.0.1:40659/bootstrap]: 127.0.0.1:40659/1: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457811.602|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:40659: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 64.372s] Log: 0121_clusterid#producer-200 level 3 fac FAIL: [thrd:127.0.0.1:39641/bootstrap]: 127.0.0.1:39641/1: Connect to ipv4#127.0.0.1:39641 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: %3|1669457811.859|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:40659: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0116_kafkaconsumer_close / 47.410s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 47.410s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1 ] 3: %5|1669457811.973|CONFWARN|MOCK#producer-206| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 47.411s] Setting test timeout to 10s * 2.7 3: %3|1669457811.977|FAIL|0106_cgrp_sess_timeout#consumer-203| [thrd:127.0.0.1:40659/bootstrap]: 127.0.0.1:40659/1: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/ 64.827s] Log: 0121_clusterid#producer-200 level 4 fac CLUSTERID: [thrd:main]: Broker 127.0.0.1:34525/bootstrap reports different ClusterId "mockClusterfd54145c" than previously known "mockClusterfd54109c": a client must not be simultaneously connected to multiple clusters 3: [0105_transactions_mock / 58.172s] commit_transaction(): duration 5519.448ms 3: [0105_transactions_mock / 58.172s] commit returned success 3: [0105_transactions_mock / 58.172s] rd_kafka_begin_transaction(rk): duration 0.046ms 3: [0105_transactions_mock / 58.172s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0121_clusterid / 6.013s] 0121_clusterid: duration 6012.877ms 3: [0121_clusterid / 6.013s] ================= Test 0121_clusterid PASSED ================= 3: [
/ 65.830s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0131_connect_timeout / 0.000s] ================= Running test 0131_connect_timeout ================= 3: [0131_connect_timeout / 0.000s] ==== Stats written to file stats_0131_connect_timeout_6879034109218585973.json ==== 3: [0131_connect_timeout / 0.000s] Test config file test.conf not found 3: [0131_connect_timeout / 0.000s] Setting test timeout to 20s * 2.7 3: [0131_connect_timeout / 0.007s] Created kafka instance 0131_connect_timeout#producer-209 3: [0128_sasl_callback_queue / 5.001s] [ do_test:64: Use background queue = yes: PASS (5.00s) ] 3: [0128_sasl_callback_queue / 5.001s] [ do_test:64: Use background queue = no ] 3: %5|1669457813.960|CONFWARN|rdkafka#producer-210| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [
/ 66.830s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 67.831s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0116_kafkaconsumer_close / 51.468s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1: PASS (4.06s) ] 3: [0116_kafkaconsumer_close / 51.468s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1 ] 3: %5|1669457816.030|CONFWARN|MOCK#producer-211| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 51.468s] Setting test timeout to 10s * 2.7 3: [
/ 68.831s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: %4|1669457816.999|SESSTMOUT|0106_cgrp_sess_timeout#consumer-203| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Success): revoking assignment and rejoining group 3: [
/ 69.831s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0106_cgrp_sess_timeout / 57.014s] Assignment is lost, committing 3: [0106_cgrp_sess_timeout / 57.014s] commit() returned: _ASSIGNMENT_LOST 3: [0106_cgrp_sess_timeout / 57.014s] Closing consumer 0106_cgrp_sess_timeout#consumer-203 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:40659: Disconnected (after 3616ms in state UP) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:40659/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 502ms in state UP) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:40659/1: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:40659: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:40659: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:40659/1: Connect to ipv4#127.0.0.1:40659 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0106_cgrp_sess_timeout / 57.014s] 0106_cgrp_sess_timeout#consumer-203: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 57.014s] UNASSIGN.PARTITIONS: duration 0.029ms 3: [0106_cgrp_sess_timeout / 57.014s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 57.014s] CONSUMER.CLOSE: duration 0.136ms 3: [0106_cgrp_sess_timeout / 57.015s] [ do_test_commit_on_lost:231: PASS (9.68s) ] 3: [0106_cgrp_sess_timeout / 57.015s] 0106_cgrp_sess_timeout: duration 57015.159ms 3: [0106_cgrp_sess_timeout / 57.015s] ================= Test 0106_cgrp_sess_timeout PASSED ================= 3: [
/ 70.831s] 4 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0128_sasl_callback_queue / 10.001s] [ do_test:64: Use background queue = no: PASS (5.00s) ] 3: [0128_sasl_callback_queue / 10.001s] 0128_sasl_callback_queue: duration 10001.451ms 3: [0128_sasl_callback_queue / 10.001s] ================= Test 0128_sasl_callback_queue PASSED ================= 3: [
/ 71.831s] 3 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0131_connect_timeout 3: [
/ 71.867s] Log: 0131_connect_timeout#producer-209 level 7 fac FAIL: [thrd:127.0.0.1:44745/bootstrap]: 127.0.0.1:44745/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 71.867s] Log: 0131_connect_timeout#producer-209 level 4 fac FAIL: [thrd:127.0.0.1:44745/bootstrap]: 127.0.0.1:44745/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: [0105_transactions_mock / 64.691s] abort_transaction(): duration 5519.808ms 3: [0105_transactions_mock / 64.692s] abort returned success 3: [0105_transactions_mock / 64.692s] [ do_test_txn_endtxn_infinite:901: PASS (13.05s) ] 3: [0105_transactions_mock / 64.692s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down ] 3: [0105_transactions_mock / 64.692s] Test config file test.conf not found 3: [0105_transactions_mock / 64.692s] Setting test timeout to 60s * 2.7 3: %5|1669457819.437|MOCK|0105_transactions_mock#producer-214| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39199,127.0.0.1:40271,127.0.0.1:34955 3: [0105_transactions_mock / 64.697s] Created kafka instance 0105_transactions_mock#producer-214 3: [0105_transactions_mock / 64.697s] Starting transaction 3: [0105_transactions_mock / 64.706s] rd_kafka_init_transactions(rk, 5000): duration 9.191ms 3: [0105_transactions_mock / 64.706s] rd_kafka_begin_transaction(rk): duration 0.011ms 3: [0105_transactions_mock / 64.706s] Test config file test.conf not found 3: [0105_transactions_mock / 64.706s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 64.706s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 64.706s] PRODUCE: duration 0.527ms 3: [0105_transactions_mock / 64.706s] Bringing down coordinator 1 3: %6|1669457819.451|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:39199: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock / 64.707s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:39199: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock / 64.707s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:39199: Disconnected (after 0ms in state UP) 3: %3|1669457819.564|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 64.820s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 64.820s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457819.936|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 65.191s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 65.191s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:39199: Connect to ipv4#127.0.0.1:39199 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/ 72.831s] 3 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0131_connect_timeout 3: [
/ 72.868s] Log: 0131_connect_timeout#producer-209 level 7 fac FAIL: [thrd:127.0.0.1:39639/bootstrap]: 127.0.0.1:39639/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 72.868s] Log: 0131_connect_timeout#producer-209 level 4 fac FAIL: [thrd:127.0.0.1:39639/bootstrap]: 127.0.0.1:39639/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: [
/ 73.831s] 3 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0131_connect_timeout 3: [0131_connect_timeout / 8.008s] 0131_connect_timeout: duration 8007.932ms 3: [0131_connect_timeout / 8.008s] ================= Test 0131_connect_timeout PASSED ================= 3: [0116_kafkaconsumer_close / 57.488s] Closing with queue 3: [0116_kafkaconsumer_close / 57.488s] Attempting second close 3: [0116_kafkaconsumer_close / 57.489s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1: PASS (6.02s) ] 3: [0116_kafkaconsumer_close / 57.489s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1 ] 3: %5|1669457822.051|CONFWARN|MOCK#producer-215| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 57.489s] Setting test timeout to 10s * 2.7 3: [
/ 74.831s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 67.707s] Test config file test.conf not found 3: [0105_transactions_mock / 67.707s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 67.708s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock / 67.708s] PRODUCE: duration 0.750ms 3: [
/ 75.831s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 76.832s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 69.708s] Bringing up coordinator 1 3: [0116_kafkaconsumer_close / 60.654s] Closing with queue 3: [0116_kafkaconsumer_close / 60.654s] Attempting second close 3: [0116_kafkaconsumer_close / 60.655s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1: PASS (3.17s) ] 3: [0116_kafkaconsumer_close / 60.655s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1 ] 3: %5|1669457825.217|CONFWARN|MOCK#producer-218| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 60.655s] Setting test timeout to 10s * 2.7 3: [
/ 77.832s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 71.194s] rd_kafka_commit_transaction(rk, -1): duration 1486.275ms 3: [0105_transactions_mock / 71.195s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down: PASS (6.50s) ] 3: [0105_transactions_mock / 71.195s] [ do_test_txn_broker_down_in_txn:1280: Test leader down ] 3: [0105_transactions_mock / 71.195s] Test config file test.conf not found 3: [0105_transactions_mock / 71.195s] Setting test timeout to 60s * 2.7 3: %5|1669457825.940|MOCK|0105_transactions_mock#producer-221| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42439,127.0.0.1:44835,127.0.0.1:36205 3: [0105_transactions_mock / 71.196s] Created kafka instance 0105_transactions_mock#producer-221 3: [0105_transactions_mock / 71.196s] Starting transaction 3: [0105_transactions_mock / 71.225s] rd_kafka_init_transactions(rk, 5000): duration 29.441ms 3: [0105_transactions_mock / 71.225s] rd_kafka_begin_transaction(rk): duration 0.107ms 3: [0105_transactions_mock / 71.225s] Test config file test.conf not found 3: [0105_transactions_mock / 71.225s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 71.226s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock / 71.226s] PRODUCE: duration 0.526ms 3: [0105_transactions_mock / 71.226s] Bringing down leader 2 3: %6|1669457825.971|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:44835/bootstrap]: 127.0.0.1:44835/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 21ms in state UP) 3: [0105_transactions_mock / 71.226s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44835/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 21ms in state UP) 3: [0105_transactions_mock / 71.226s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44835/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 21ms in state UP) 3: %3|1669457826.025|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:44835/bootstrap]: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 71.281s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 71.281s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 78.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: %3|1669457826.449|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:44835/bootstrap]: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 71.705s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 71.705s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44835/2: Connect to ipv4#127.0.0.1:44835 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/ 79.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 80.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 74.227s] Test config file test.conf not found 3: [0105_transactions_mock / 74.227s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 74.227s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 74.227s] PRODUCE: duration 0.735ms 3: [
/ 81.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 82.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0116_kafkaconsumer_close / 65.699s] Closing with queue 3: [0116_kafkaconsumer_close / 65.699s] Attempting second close 3: [0116_kafkaconsumer_close / 65.700s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 65.700s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1 ] 3: %5|1669457830.262|CONFWARN|MOCK#producer-222| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 65.700s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 76.227s] Bringing up leader 2 3: [
/ 83.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 84.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 77.692s] rd_kafka_commit_transaction(rk, -1): duration 1464.968ms 3: [0105_transactions_mock / 77.693s] [ do_test_txn_broker_down_in_txn:1280: Test leader down: PASS (6.50s) ] 3: [0105_transactions_mock / 77.693s] [ do_test_txns_not_supported:1492 ] 3: [0105_transactions_mock / 77.693s] Test config file test.conf not found 3: [0105_transactions_mock / 77.693s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 77.693s] Created kafka instance 0105_transactions_mock#producer-225 3: %1|1669457832.440|TXNERR|0105_transactions_mock#producer-225| [thrd:main]: Fatal transaction error: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 (_UNSUPPORTED_FEATURE) 3: %0|1669457832.440|FATAL|0105_transactions_mock#producer-225| [thrd:main]: Fatal error: Local: Required feature not supported by broker: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: [0105_transactions_mock / 77.695s] init_transactions() returned _UNSUPPORTED_FEATURE: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: %6|1669457832.440|FAIL|0105_transactions_mock#producer-225| [thrd:127.0.0.1:37411/bootstrap]: 127.0.0.1:37411/3: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 0ms in state UP) 3: [0105_transactions_mock / 77.696s] [ do_test_txns_not_supported:1492: PASS (0.00s) ] 3: [0105_transactions_mock / 77.696s] [ do_test_txns_send_offsets_concurrent_is_retried:1551 ] 3: [0105_transactions_mock / 77.696s] Test config file test.conf not found 3: [0105_transactions_mock / 77.696s] Setting test timeout to 60s * 2.7 3: %5|1669457832.440|MOCK|0105_transactions_mock#producer-226| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37281,127.0.0.1:34447,127.0.0.1:39235 3: [0105_transactions_mock / 77.696s] Created kafka instance 0105_transactions_mock#producer-226 3: [0105_transactions_mock / 77.698s] rd_kafka_init_transactions(rk, 5000): duration 2.042ms 3: [0105_transactions_mock / 77.698s] rd_kafka_begin_transaction(rk): duration 0.045ms 3: [0105_transactions_mock / 77.698s] 0105_transactions_mock#producer-226: Flushing 1 messages 3: [
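do_test_txns_not_supported above shows rd_kafka_init_transactions() returning a fatal _UNSUPPORTED_FEATURE error when no connected broker supports transactions, and every case first spins up an internal mock cluster (the MOCK lines). A hedged sketch of both pieces follows; treating test.mock.num.brokers as the property behind those MOCK lines is an assumption here, and the remaining values are placeholders.

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: create a transactional producer against librdkafka's built-in
 * mock cluster and classify the result of rd_kafka_init_transactions().
 * The mock-cluster property and all values are illustrative assumptions. */
static void init_txn_with_mock_cluster(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();

    rd_kafka_conf_set(conf, "transactional.id", "mock-txn",
                      errstr, sizeof(errstr));
    /* Assumed to enable the internal mock cluster; the client then logs
     * that bootstrap.servers and security.protocol are ignored. */
    rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                      errstr, sizeof(errstr));

    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    if (!rk)
        return;

    rd_kafka_error_t *error = rd_kafka_init_transactions(rk, 5000);
    if (error) {
        if (rd_kafka_error_is_fatal(error))
            /* e.g. _UNSUPPORTED_FEATURE: this producer instance is
             * unusable and must be destroyed and recreated. */
            fprintf(stderr, "fatal: %s\n", rd_kafka_error_string(error));
        rd_kafka_error_destroy(error);
    }
    rd_kafka_destroy(rk);
}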
/ 85.833s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0116_kafkaconsumer_close / 68.818s] Closing with queue 3: [0116_kafkaconsumer_close / 68.818s] Attempting second close 3: [0116_kafkaconsumer_close / 68.819s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1: PASS (3.12s) ] 3: [0116_kafkaconsumer_close / 68.819s] 0116_kafkaconsumer_close: duration 68818.613ms 3: [0116_kafkaconsumer_close / 68.819s] ================= Test 0116_kafkaconsumer_close PASSED ================= 3: [0105_transactions_mock / 78.697s] FLUSH: duration 998.958ms 3: [0105_transactions_mock / 79.301s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 604.148ms 3: [0105_transactions_mock / 79.302s] rd_kafka_commit_transaction(rk, 5000): duration 0.211ms 3: [0105_transactions_mock / 79.302s] [ do_test_txns_send_offsets_concurrent_is_retried:1551: PASS (1.61s) ] 3: [0105_transactions_mock / 79.302s] [ do_test_txn_coord_req_destroy:1881 ] 3: [0105_transactions_mock / 79.302s] Test config file test.conf not found 3: [0105_transactions_mock / 79.302s] Setting test timeout to 60s * 2.7 3: %5|1669457834.047|MOCK|0105_transactions_mock#producer-227| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42711,127.0.0.1:33663,127.0.0.1:44845 3: [0105_transactions_mock / 79.303s] Created kafka instance 0105_transactions_mock#producer-227 3: [0105_transactions_mock / 79.304s] rd_kafka_init_transactions(rk, 5000): duration 1.059ms 3: [0105_transactions_mock / 79.304s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 79.304s] rd_kafka_begin_transaction(rk): duration 0.048ms 3: [0105_transactions_mock / 79.405s] send_offsets_to_transaction() #0: 3: [
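do_test_txns_send_offsets_concurrent_is_retried passes because rd_kafka_send_offsets_to_transaction() retries transparently on transient coordinator errors. A sketch of how offsets and consumer group metadata are handed to that call is below; the producer and consumer handles, topic, partition and offset are placeholders.

#include <librdkafka/rdkafka.h>

/* Sketch: commit a consumer's progress inside the producer's open
 * transaction. `producer` and `consumer` are assumed to be already
 * configured handles; topic/partition/offset values are placeholders. */
static rd_kafka_error_t *send_offsets_sketch(rd_kafka_t *producer,
                                             rd_kafka_t *consumer) {
    rd_kafka_topic_partition_list_t *offsets =
        rd_kafka_topic_partition_list_new(1);
    /* The offset to commit is the next message to consume (last + 1). */
    rd_kafka_topic_partition_list_add(offsets, "mytopic", 0)->offset = 42;

    rd_kafka_consumer_group_metadata_t *cgmd =
        rd_kafka_consumer_group_metadata(consumer);

    /* Retries internally on transient coordinator errors, which is what
     * the "concurrent is retried" case above relies on. */
    rd_kafka_error_t *error =
        rd_kafka_send_offsets_to_transaction(producer, offsets, cgmd, -1);

    rd_kafka_consumer_group_metadata_destroy(cgmd);
    rd_kafka_topic_partition_list_destroy(offsets);
    return error; /* NULL on success; caller owns and destroys otherwise */
}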
/ 86.833s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457834.393|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:260898000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457834.393|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 87.833s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 81.406s] rd_kafka_abort_transaction(rk, 5000): duration 0.322ms 3: [0105_transactions_mock / 81.406s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 81.406s] rd_kafka_begin_transaction(rk): duration 0.048ms 3: %3|1669457836.193|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:260898000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457836.193|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 88.834s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 81.508s] send_offsets_to_transaction() #1: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 81.508s] send_offsets_to_transaction() #1 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 89.834s] 1 test(s) running: 0105_transactions_mock 3: [
/ 90.834s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 83.508s] rd_kafka_abort_transaction(rk, 5000): duration 0.335ms 3: [0105_transactions_mock / 83.508s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 83.508s] rd_kafka_begin_transaction(rk): duration 0.100ms 3: [0105_transactions_mock / 83.610s] send_offsets_to_transaction() #2: 3: %3|1669457838.495|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:260898000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457838.495|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 91.834s] 1 test(s) running: 0105_transactions_mock 3: [
/ 92.834s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 85.611s] rd_kafka_abort_transaction(rk, 5000): duration 0.477ms 3: [0105_transactions_mock / 85.611s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 85.611s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: %3|1669457840.397|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:260898000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457840.397|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 85.713s] send_offsets_to_transaction() #3: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 85.713s] send_offsets_to_transaction() #3 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 93.834s] 1 test(s) running: 0105_transactions_mock 3: [
/ 94.834s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 87.714s] rd_kafka_abort_transaction(rk, 5000): duration 0.448ms 3: [0105_transactions_mock / 87.714s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 87.714s] rd_kafka_begin_transaction(rk): duration 0.030ms 3: [0105_transactions_mock / 87.818s] send_offsets_to_transaction() #4: 3: %3|1669457842.702|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457842.702|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 95.834s] 1 test(s) running: 0105_transactions_mock 3: [
/ 96.834s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 89.818s] rd_kafka_abort_transaction(rk, 5000): duration 0.472ms 3: [0105_transactions_mock / 89.818s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 89.818s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: %3|1669457844.605|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457844.605|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 89.921s] send_offsets_to_transaction() #5: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 89.921s] send_offsets_to_transaction() #5 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
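The repeated TXNERR / abort cycle above comes from do_test_txn_coord_req_destroy injecting TOPIC_AUTHORIZATION_FAILED so that every attempt becomes abortable and the transaction must be aborted before the next begin. The recovery pattern the log reflects, sketched with the public error-classification helpers (names and timeout are illustrative):

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: classify an error returned by a transactional API call.
 * Abortable errors (such as the TOPIC_AUTHORIZATION_FAILED produce
 * failures above) poison the current transaction; only fatal errors
 * require recreating the producer. */
static void handle_txn_error(rd_kafka_t *rk, rd_kafka_error_t *error) {
    if (!error)
        return;

    if (rd_kafka_error_txn_requires_abort(error)) {
        /* Abort, then rd_kafka_begin_transaction() again and re-produce,
         * exactly the cycle visible in the log above. */
        rd_kafka_error_t *abort_err = rd_kafka_abort_transaction(rk, 5000);
        if (abort_err)
            rd_kafka_error_destroy(abort_err);
    } else if (rd_kafka_error_is_retriable(error)) {
        /* Transient: the same call may simply be retried. */
    } else if (rd_kafka_error_is_fatal(error)) {
        /* Unrecoverable for this producer instance. */
        fprintf(stderr, "fatal: %s\n", rd_kafka_error_string(error));
    }
    rd_kafka_error_destroy(error);
}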
/ 97.835s] 1 test(s) running: 0105_transactions_mock 3: [
/ 98.835s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 91.921s] rd_kafka_abort_transaction(rk, 5000): duration 0.637ms 3: [0105_transactions_mock / 91.921s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 91.921s] rd_kafka_begin_transaction(rk): duration 0.055ms 3: [0105_transactions_mock / 92.024s] send_offsets_to_transaction() #6: 3: %3|1669457846.911|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457846.911|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 99.835s] 1 test(s) running: 0105_transactions_mock 3: [
/100.835s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 94.025s] rd_kafka_abort_transaction(rk, 5000): duration 0.466ms 3: [0105_transactions_mock / 94.025s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 94.025s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: %3|1669457848.812|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 6): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457848.812|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 94.127s] send_offsets_to_transaction() #7: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 94.127s] send_offsets_to_transaction() #7 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/101.835s] 1 test(s) running: 0105_transactions_mock 3: [
/102.835s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 96.127s] rd_kafka_abort_transaction(rk, 5000): duration 0.422ms 3: [0105_transactions_mock / 96.127s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 96.127s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock / 96.231s] send_offsets_to_transaction() #8: 3: %3|1669457851.115|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 8): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457851.115|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/103.835s] 1 test(s) running: 0105_transactions_mock 3: [
/104.835s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 98.231s] rd_kafka_abort_transaction(rk, 5000): duration 0.473ms 3: [0105_transactions_mock / 98.231s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 98.231s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: %3|1669457853.018|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 10): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457853.018|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 98.334s] send_offsets_to_transaction() #9: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 98.334s] send_offsets_to_transaction() #9 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/105.836s] 1 test(s) running: 0105_transactions_mock 3: [
/106.836s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /100.334s] rd_kafka_abort_transaction(rk, 5000): duration 0.445ms 3: [0105_transactions_mock /100.334s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /100.334s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock /100.437s] send_offsets_to_transaction() #10: 3: [
/107.836s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457855.323|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:260898000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457855.323|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/108.836s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /102.437s] rd_kafka_abort_transaction(rk, 5000): duration 0.427ms 3: [0105_transactions_mock /102.437s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /102.437s] rd_kafka_begin_transaction(rk): duration 0.096ms 3: %3|1669457857.225|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:260898000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457857.225|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:42711/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/109.836s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /102.540s] send_offsets_to_transaction() #11: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /102.540s] send_offsets_to_transaction() #11 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/110.836s] 1 test(s) running: 0105_transactions_mock 3: [
/111.836s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /104.541s] rd_kafka_abort_transaction(rk, 5000): duration 0.396ms 3: [0105_transactions_mock /104.541s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /104.541s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: [0105_transactions_mock /104.643s] send_offsets_to_transaction() #12: 3: %3|1669457859.528|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 12): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457859.528|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/112.836s] 1 test(s) running: 0105_transactions_mock 3: [
/113.837s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /106.643s] rd_kafka_abort_transaction(rk, 5000): duration 0.471ms 3: [0105_transactions_mock /106.643s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /106.643s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: %3|1669457861.430|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:260898000,Epoch:0}, base seq 14): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457861.430|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:44845/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock /106.746s] send_offsets_to_transaction() #13: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /106.746s] send_offsets_to_transaction() #13 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/114.837s] 1 test(s) running: 0105_transactions_mock 3: [
/115.837s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /108.747s] rd_kafka_abort_transaction(rk, 5000): duration 1.032ms 3: [0105_transactions_mock /108.747s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /108.747s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0105_transactions_mock /108.850s] send_offsets_to_transaction() #14: 3: %3|1669457863.735|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:260898000,Epoch:0}, base seq 6): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1669457863.735|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:33663/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/116.837s] 1 test(s) running: 0105_transactions_mock 3: [
/117.837s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /110.850s] rd_kafka_abort_transaction(rk, 5000): duration 0.469ms 3: [0105_transactions_mock /110.851s] [ do_test_txn_coord_req_multi_find:2064 ] 3: [0105_transactions_mock /110.851s] Test config file test.conf not found 3: [0105_transactions_mock /110.851s] Setting test timeout to 60s * 2.7 3: %5|1669457865.596|MOCK|0105_transactions_mock#producer-228| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44763,127.0.0.1:34587,127.0.0.1:35825 3: [0105_transactions_mock /110.852s] Created kafka instance 0105_transactions_mock#producer-228 3: [0105_transactions_mock /110.869s] rd_kafka_init_transactions(rk, 5000): duration 17.192ms 3: [0105_transactions_mock /110.869s] rd_kafka_begin_transaction(rk): duration 0.026ms 3: [0105_transactions_mock /110.869s] 0105_transactions_mock#producer-228: Flushing 3 messages 3: [
/118.837s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /111.854s] FLUSH: duration 984.362ms 3: [
/119.837s] 1 test(s) running: 0105_transactions_mock 3: [
/120.837s] 1 test(s) running: 0105_transactions_mock 3: [
/121.837s] 1 test(s) running: 0105_transactions_mock 3: [
/122.838s] 1 test(s) running: 0105_transactions_mock 3: [
/123.185s] on_response_received_cb: 0105_transactions_mock#producer-228: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.32ms, not done yet: NO_ERROR 3: %6|1669457870.603|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:34587/bootstrap]: 127.0.0.1:34587/2: Disconnected (after 5006ms in state UP) 3: %3|1669457870.603|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:34587/bootstrap]: 127.0.0.1:34587/2: Connect to ipv4#127.0.0.1:34587 failed: Connection refused (after 0ms in state CONNECT) 3: [
/123.185s] on_response_received_cb: 0105_transactions_mock#producer-228: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.32ms, not done yet: NO_ERROR 3: %6|1669457870.604|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:35825/bootstrap]: 127.0.0.1:35825/3: Disconnected (after 5006ms in state UP) 3: [
/123.838s] 1 test(s) running: 0105_transactions_mock 3: [
/124.838s] 1 test(s) running: 0105_transactions_mock 3: [
/125.838s] 1 test(s) running: 0105_transactions_mock 3: [
/126.838s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /119.902s] send_offsets_to_transaction() 3: [
/127.838s] 1 test(s) running: 0105_transactions_mock 3: [
/128.838s] 1 test(s) running: 0105_transactions_mock 3: [
/129.838s] 1 test(s) running: 0105_transactions_mock 3: [
/130.838s] 1 test(s) running: 0105_transactions_mock 3: [
/131.839s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /124.902s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:34587/2: Disconnected (after 5006ms in state UP) 3: [0105_transactions_mock /124.902s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34587/2: Disconnected (after 5006ms in state UP) 3: [0105_transactions_mock /124.902s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:34587/2: Connect to ipv4#127.0.0.1:34587 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /124.902s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34587/2: Connect to ipv4#127.0.0.1:34587 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /124.902s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:35825/3: Disconnected (after 5006ms in state UP) 3: [0105_transactions_mock /124.902s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:35825/3: Disconnected (after 5006ms in state UP) 3: [0105_transactions_mock /124.902s] rd_kafka_commit_transaction(rk, 5000): duration 0.333ms 3: [0105_transactions_mock /124.904s] [ do_test_txn_coord_req_multi_find:2064: PASS (14.05s) ] 3: [0105_transactions_mock /124.904s] [ do_test_txn_addparts_req_multi:2209 ] 3: [0105_transactions_mock /124.904s] Test config file test.conf not found 3: [0105_transactions_mock /124.904s] Setting test timeout to 60s * 2.7 3: %5|1669457879.649|MOCK|0105_transactions_mock#producer-229| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45645,127.0.0.1:39705,127.0.0.1:46727 3: [0105_transactions_mock /124.907s] Created kafka instance 0105_transactions_mock#producer-229 3: [0105_transactions_mock /124.913s] rd_kafka_init_transactions(rk, 5000): duration 5.134ms 3: [0105_transactions_mock /124.913s] Running seed transaction 3: [0105_transactions_mock /124.913s] rd_kafka_begin_transaction(rk): duration 0.018ms 3: [0105_transactions_mock /124.913s] rd_kafka_producev(rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { void * __t __attribute__((unused)) = ("seed"); size_t __t2 __attribute__((unused)) = (4); } RD_KAFKA_VTYPE_VALUE; }), (void *)"seed", (size_t)4, RD_KAFKA_VTYPE_END): duration 0.012ms 3: [
/132.839s] 1 test(s) running: 0105_transactions_mock 3: [
/133.232s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.03ms, count 0: NO_ERROR 3: [
/133.232s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.03ms, count 1: NO_ERROR 3: [0105_transactions_mock /125.907s] rd_kafka_commit_transaction(rk, 5000): duration 994.009ms 3: [0105_transactions_mock /125.907s] Running test transaction 3: [0105_transactions_mock /125.907s] rd_kafka_begin_transaction(rk): duration 0.035ms 3: [0105_transactions_mock /125.907s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0105_transactions_mock /126.407s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (1); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)1, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.036ms 3: [0105_transactions_mock /126.407s] Waiting for two AddPartitionsToTxnResponse 3: [
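The long rd_kafka_producev(...) lines above are the preprocessor expansion of librdkafka's RD_KAFKA_V_... varargs macros as captured by the test's call wrapper. In source form the same calls read roughly as follows (rk and topic are whatever the test passes in):

#include <librdkafka/rdkafka.h>

/* The unexpanded form of the producev() calls logged above: each
 * RD_KAFKA_V_* macro type-checks its argument and emits the matching
 * RD_KAFKA_VTYPE_* tag into the varargs list. */
static void producev_examples(rd_kafka_t *rk, const char *topic) {
    /* Seed message, any partition: */
    rd_kafka_producev(rk,
                      RD_KAFKA_V_TOPIC(topic),
                      RD_KAFKA_V_VALUE("seed", 4),
                      RD_KAFKA_V_END);

    /* Fixed-partition message, as in the AddPartitionsToTxn case: */
    rd_kafka_producev(rk,
                      RD_KAFKA_V_TOPIC(topic),
                      RD_KAFKA_V_PARTITION(0),
                      RD_KAFKA_V_VALUE("hi", 2),
                      RD_KAFKA_V_END);
}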
/133.839s] 1 test(s) running: 0105_transactions_mock 3: [
/134.239s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.54ms, count 0: NO_ERROR 3: [
/134.239s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.54ms, count 1: NO_ERROR 3: [0105_transactions_mock /126.920s] 2 AddPartitionsToTxnResponses seen 3: [0105_transactions_mock /126.920s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)2, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.016ms 3: [
/134.839s] 1 test(s) running: 0105_transactions_mock 3: [
/135.245s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.81ms, count 2: NO_ERROR 3: [
/135.245s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.81ms, count 3: NO_ERROR 3: [
/135.839s] 1 test(s) running: 0105_transactions_mock 3: [
/136.251s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.72ms, count 4: NO_ERROR 3: [
/136.251s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.72ms, count 5: NO_ERROR 3: [
/136.839s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /129.930s] rd_kafka_commit_transaction(rk, 10 * 1000): duration 2010.007ms 3: [0105_transactions_mock /129.931s] [ do_test_txn_addparts_req_multi:2209: PASS (5.03s) ] 3: [0105_transactions_mock /129.931s] [ do_test_txns_no_timeout_crash:1615 ] 3: [0105_transactions_mock /129.931s] Test config file test.conf not found 3: [0105_transactions_mock /129.931s] Setting test timeout to 60s * 2.7 3: %5|1669457884.676|MOCK|0105_transactions_mock#producer-230| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43497,127.0.0.1:40339,127.0.0.1:39155 3: [0105_transactions_mock /129.932s] Created kafka instance 0105_transactions_mock#producer-230 3: [0105_transactions_mock /129.934s] rd_kafka_init_transactions(rk, 5000): duration 2.212ms 3: [0105_transactions_mock /129.934s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock /129.934s] 0105_transactions_mock#producer-230: Flushing 1 messages 3: [
/137.839s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /130.934s] FLUSH: duration 999.847ms 3: [
/138.839s] 1 test(s) running: 0105_transactions_mock 3: %5|1669457886.683|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out AddOffsetsToTxnRequest in flight (after 1004ms, timeout #0) 3: %4|1669457886.683|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1669457886.683|FAIL|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43497: 1 request(s) timed out: disconnect (after 2004ms in state UP) 3: [
/139.839s] 1 test(s) running: 0105_transactions_mock 3: %5|1669457887.688|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1669457887.688|FAIL|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43497: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1669457887.688|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1669457887.688|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: Timed out MetadataRequest in flight (after 1004ms, timeout #0) 3: %4|1669457887.688|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1669457887.688|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: 1 request(s) timed out: disconnect (after 2010ms in state UP) 3: [
/140.840s] 1 test(s) running: 0105_transactions_mock 3: %5|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1669457888.693|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43497/bootstrap]: 127.0.0.1:43497/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out FindCoordinatorRequest in flight (after 1959ms, timeout #0) 3: %5|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out MetadataRequest in flight (after 1004ms, timeout #1) 3: %5|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out MetadataRequest in flight (after 1004ms, timeout #2) 3: %4|1669457888.693|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out 3 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1669457888.693|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: 3 request(s) timed out: disconnect (after 4015ms in state UP) 3: [
/141.840s] 1 test(s) running: 0105_transactions_mock 3: %5|1669457889.698|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1669457889.698|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1669457889.698|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:39155/bootstrap]: 127.0.0.1:39155/3: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1669457889.698|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:40339/bootstrap]: 127.0.0.1:40339/2: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1669457889.698|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:40339/bootstrap]: 127.0.0.1:40339/2: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1669457889.698|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:40339/bootstrap]: 127.0.0.1:40339/2: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: [
/142.840s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /135.934s] send_offsets..() failed with retriable error: Transactional operation timed out 3: [0105_transactions_mock /135.934s] Retrying send_offsets..() 3: %3|1669457890.694|ADDOFFSETS|0105_transactions_mock#producer-230| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Local: Outdated 3: [0105_transactions_mock /136.050s] [ do_test_txns_no_timeout_crash:1615: PASS (6.12s) ] 3: [0105_transactions_mock /136.051s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /136.051s] Test config file test.conf not found 3: [0105_transactions_mock /136.051s] Setting test timeout to 60s * 2.7 3: %5|1669457890.795|MOCK|0105_transactions_mock#producer-231| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42727,127.0.0.1:32917,127.0.0.1:41363 3: [0105_transactions_mock /136.051s] Created kafka instance 0105_transactions_mock#producer-231 3: %1|1669457890.796|TXNERR|0105_transactions_mock#producer-231| [thrd:main]: Fatal transaction error: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1669457890.796|FATAL|0105_transactions_mock#producer-231| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.052s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.052s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.00s) ] 3: [0105_transactions_mock /136.052s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /136.052s] Test config file test.conf not found 3: [0105_transactions_mock /136.052s] Setting test timeout to 60s * 2.7 3: %5|1669457890.797|MOCK|0105_transactions_mock#producer-232| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41873,127.0.0.1:42387,127.0.0.1:38775 3: [0105_transactions_mock /136.053s] Created kafka instance 0105_transactions_mock#producer-232 3: %1|1669457890.798|TXNERR|0105_transactions_mock#producer-232| [thrd:main]: Fatal transaction error: Failed to find transaction coordinator: 127.0.0.1:41873/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1669457890.798|FATAL|0105_transactions_mock#producer-232| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to find transaction coordinator: 127.0.0.1:41873/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.054s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to find transaction coordinator: 127.0.0.1:41873/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.055s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.00s) ] 3: [0105_transactions_mock /136.055s] [ do_test_txn_flush_timeout:1737 ] 3: [0105_transactions_mock /136.055s] Test config 
file test.conf not found 3: [0105_transactions_mock /136.055s] Setting test timeout to 60s * 2.7 3: %5|1669457890.799|MOCK|0105_transactions_mock#producer-233| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35491,127.0.0.1:36903,127.0.0.1:42279 3: [0105_transactions_mock /136.055s] Created kafka instance 0105_transactions_mock#producer-233 3: [0105_transactions_mock /136.056s] rd_kafka_init_transactions(rk, 5000): duration 0.605ms 3: [0105_transactions_mock /136.056s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock /136.056s] Test config file test.conf not found 3: [0105_transactions_mock /136.056s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.056s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /136.056s] PRODUCE: duration 0.113ms 3: [0105_transactions_mock /136.056s] Test config file test.conf not found 3: [0105_transactions_mock /136.056s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.056s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /136.056s] PRODUCE: duration 0.103ms 3: [0105_transactions_mock /136.056s] Test config file test.conf not found 3: [0105_transactions_mock /136.056s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.056s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /136.056s] PRODUCE: duration 0.120ms 3: [0105_transactions_mock /136.057s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 0.644ms 3: [
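do_test_txn_auth_failure shows CLUSTER_AUTHORIZATION_FAILED being promoted to a fatal producer error (the FATAL lines above). Once latched, the fatal condition can also be read back from the handle; a small hedged sketch:

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: after a fatal event such as CLUSTER_AUTHORIZATION_FAILED, the
 * error is latched on the handle and can be queried at any point. */
static void report_fatal(rd_kafka_t *rk) {
    char errstr[512];
    rd_kafka_resp_err_t err = rd_kafka_fatal_error(rk, errstr, sizeof(errstr));

    if (err != RD_KAFKA_RESP_ERR_NO_ERROR)
        fprintf(stderr, "producer unusable: %s: %s\n",
                rd_kafka_err2str(err), errstr);
    /* Recovery means destroying this instance and creating a new
     * transactional producer, which is what each test case above does
     * when it tears its instance down. */
}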
/143.840s] 1 test(s) running: 0105_transactions_mock 3: [
/144.840s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /138.057s] Disconnecting transaction coordinator 2 3: %6|1669457892.802|FAIL|0105_transactions_mock#producer-233| [thrd:127.0.0.1:36903/bootstrap]: 127.0.0.1:36903/2: Disconnected (after 2001ms in state UP) 3: %6|1669457892.802|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36903: Disconnected (after 2001ms in state UP) 3: %3|1669457892.802|FAIL|0105_transactions_mock#producer-233| [thrd:127.0.0.1:36903/bootstrap]: 127.0.0.1:36903/2: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457892.802|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.057s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36903/2: Disconnected (after 2001ms in state UP) 3: [0105_transactions_mock /138.057s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36903/2: Disconnected (after 2001ms in state UP) 3: [0105_transactions_mock /138.057s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36903: Disconnected (after 2001ms in state UP) 3: [0105_transactions_mock /138.057s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36903: Disconnected (after 2001ms in state UP) 3: [0105_transactions_mock /138.057s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36903/2: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.057s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36903/2: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.057s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.057s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457893.076|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /138.332s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /138.332s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36903: Connect to ipv4#127.0.0.1:36903 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/145.840s] 1 test(s) running: 0105_transactions_mock 3: [
/146.840s] 1 test(s) running: 0105_transactions_mock 3: [
/147.840s] 1 test(s) running: 0105_transactions_mock 3: [
/148.840s] 1 test(s) running: 0105_transactions_mock 3: [
/149.841s] 1 test(s) running: 0105_transactions_mock 3: [
/150.841s] 1 test(s) running: 0105_transactions_mock 3: [
/151.841s] 1 test(s) running: 0105_transactions_mock 3: [
/152.841s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457900.801|TXNERR|0105_transactions_mock#producer-233| [thrd:main]: Current transaction failed in state BeginCommit: 300 message(s) failed delivery (see individual delivery reports) (_INCONSISTENT) 3: [0105_transactions_mock /146.057s] commit_transaction() failed (expectedly): 300 message(s) failed delivery (see individual delivery reports) 3: [
/153.841s] 1 test(s) running: 0105_transactions_mock 3: [
/154.841s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /148.057s] Aborting and retrying 3: [0105_transactions_mock /148.057s] rd_kafka_abort_transaction(rk, 60000): duration 0.236ms 3: [0105_transactions_mock /148.057s] rd_kafka_begin_transaction(rk): duration 0.032ms 3: [0105_transactions_mock /148.057s] Test config file test.conf not found 3: [0105_transactions_mock /148.057s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.057s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /148.057s] PRODUCE: duration 0.143ms 3: [0105_transactions_mock /148.057s] Test config file test.conf not found 3: [0105_transactions_mock /148.057s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.057s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /148.057s] PRODUCE: duration 0.166ms 3: [0105_transactions_mock /148.057s] Test config file test.conf not found 3: [0105_transactions_mock /148.057s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.057s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /148.057s] PRODUCE: duration 0.163ms 3: [0105_transactions_mock /148.058s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 0.259ms 3: [
/155.841s] 1 test(s) running: 0105_transactions_mock 3: [
/156.841s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /150.059s] [ do_test_txn_flush_timeout:1737: PASS (14.00s) ] 3: [0105_transactions_mock /150.059s] [ do_test_unstable_offset_commit:2320 ] 3: [0105_transactions_mock /150.059s] Test config file test.conf not found 3: [0105_transactions_mock /150.059s] Setting test timeout to 60s * 2.7 3: %5|1669457904.804|MOCK|0105_transactions_mock#producer-234| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39733,127.0.0.1:43833,127.0.0.1:43341 3: [0105_transactions_mock /150.059s] Created kafka instance 0105_transactions_mock#producer-234 3: [0105_transactions_mock /150.059s] Test config file test.conf not found 3: [0105_transactions_mock /150.060s] Created kafka instance 0105_transactions_mock#consumer-235 3: [0105_transactions_mock /150.061s] rd_kafka_init_transactions(rk, -1): duration 1.527ms 3: [0105_transactions_mock /150.061s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: [0105_transactions_mock /150.061s] Test config file test.conf not found 3: [0105_transactions_mock /150.061s] Produce to mytopic [0]: messages #0..100 3: [0105_transactions_mock /150.062s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /150.062s] PRODUCE: duration 0.115ms 3: [0105_transactions_mock /150.063s] rd_kafka_commit_transaction(rk, -1): duration 1.620ms 3: [0105_transactions_mock /150.064s] rd_kafka_commit(c, offsets, 0 ): duration 0.799ms 3: [
/157.841s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /150.668s] #0: committed() returned NO_ERROR (expected NO_ERROR) 3: [0105_transactions_mock /150.868s] #1: committed() returned _TIMED_OUT (expected _TIMED_OUT) 3: [0105_transactions_mock /150.868s] Phase 2: OffsetFetch lookup through assignment 3: [0105_transactions_mock /150.868s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.124ms 3: [0105_transactions_mock /150.868s] assign: incremental assign of 1 partition(s) done 3: [0105_transactions_mock /150.868s] consume: consume exactly 50 messages 3: [
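do_test_unstable_offset_commit pairs the transactional producer with a plain consumer, committing offsets with rd_kafka_commit() and reading them back with rd_kafka_committed(); the _TIMED_OUT result above is the expected outcome while the broker still reports the offsets as unstable. Roughly, on the consumer side (handle, topic and offset are placeholders):

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: synchronous offset commit plus read-back, mirroring the
 * rd_kafka_commit(c, offsets, 0) / committed() calls logged above. */
static void commit_and_check(rd_kafka_t *consumer) {
    rd_kafka_topic_partition_list_t *offsets =
        rd_kafka_topic_partition_list_new(1);
    rd_kafka_topic_partition_list_add(offsets, "mytopic", 0)->offset = 50;

    /* async = 0: block until the OffsetCommit round-trip completes. */
    rd_kafka_resp_err_t err = rd_kafka_commit(consumer, offsets, 0);
    printf("commit: %s\n", rd_kafka_err2str(err));

    /* Read the committed offsets back; with a short timeout this can
     * return RD_KAFKA_RESP_ERR__TIMED_OUT while offsets are unstable
     * (an open transaction), which the test treats as expected. */
    err = rd_kafka_committed(consumer, offsets, 200);
    printf("committed: %s\n", rd_kafka_err2str(err));

    rd_kafka_topic_partition_list_destroy(offsets);
}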
/158.841s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /152.276s] mytopic [0] reached EOF at offset 100 3: [0105_transactions_mock /152.276s] CONSUME: duration 1408.566ms 3: [0105_transactions_mock /152.276s] consume: consumed 50/50 messages (1/1 EOFs) 3: [0105_transactions_mock /152.278s] [ do_test_unstable_offset_commit:2320: PASS (2.22s) ] 3: [0105_transactions_mock /152.278s] [ do_test_commit_after_msg_timeout:2447 ] 3: [0105_transactions_mock /152.278s] Test config file test.conf not found 3: [0105_transactions_mock /152.278s] Setting test timeout to 60s * 2.7 3: %5|1669457907.023|MOCK|0105_transactions_mock#producer-236| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:33213,127.0.0.1:34213,127.0.0.1:36669 3: [0105_transactions_mock /152.279s] Created kafka instance 0105_transactions_mock#producer-236 3: [0105_transactions_mock /152.279s] Starting transaction 3: [0105_transactions_mock /152.279s] rd_kafka_init_transactions(rk, -1): duration 0.739ms 3: [0105_transactions_mock /152.279s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock /152.279s] Bringing down 2 3: [0105_transactions_mock /152.280s] Test config file test.conf not found 3: %6|1669457907.024|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:33213: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock /152.280s] Produce to test [0]: messages #0..1 3: %6|1669457907.024|FAIL|0105_transactions_mock#producer-236| [thrd:127.0.0.1:34213/bootstrap]: 127.0.0.1:34213/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 0ms in state UP) 3: [0105_transactions_mock /152.280s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:33213: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock /152.280s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:33213: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock /152.280s] SUM(POLL): duration 0.082ms 3: [0105_transactions_mock /152.280s] PRODUCE: duration 0.091ms 3: [0105_transactions_mock /152.280s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:34213/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 0ms in state UP) 3: [0105_transactions_mock /152.280s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34213/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 0ms in state UP) 3: %3|1669457907.106|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.362s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.362s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457907.115|FAIL|0105_transactions_mock#producer-236| [thrd:127.0.0.1:34213/bootstrap]: 127.0.0.1:34213/2: Connect to 
ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.371s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:34213/2: Connect to ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.371s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34213/2: Connect to ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT) 3: [
/159.842s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457907.524|FAIL|0105_transactions_mock#producer-236| [thrd:127.0.0.1:33213/bootstrap]: 127.0.0.1:33213/1: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.779s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:33213/1: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.779s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:33213/1: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457907.605|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /152.861s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /152.861s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:33213: Connect to ipv4#127.0.0.1:33213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: %3|1669457907.608|FAIL|0105_transactions_mock#producer-236| [thrd:127.0.0.1:34213/bootstrap]: 127.0.0.1:34213/2: Connect to ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /152.863s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:34213/2: Connect to ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /152.863s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34213/2: Connect to ipv4#127.0.0.1:34213 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/160.842s] 1 test(s) running: 0105_transactions_mock 3: [
/161.842s] 1 test(s) running: 0105_transactions_mock 3: [
/162.842s] 1 test(s) running: 0105_transactions_mock 3: [
/163.842s] 1 test(s) running: 0105_transactions_mock 3: [
/164.842s] 1 test(s) running: 0105_transactions_mock 3: [
/165.842s] 1 test(s) running: 0105_transactions_mock 3: [
/166.842s] 1 test(s) running: 0105_transactions_mock 3: [
/167.842s] 1 test(s) running: 0105_transactions_mock 3: [
/168.842s] 1 test(s) running: 0105_transactions_mock 3: [
/169.843s] 1 test(s) running: 0105_transactions_mock 3: [
/170.843s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457918.295|TXNERR|0105_transactions_mock#producer-236| [thrd:127.0.0.1:34213/bootstrap]: Current transaction failed in state BeginCommit: 1 message(s) timed out on test [0] (_TIMED_OUT, requires epoch bump) 3: [0105_transactions_mock /163.551s] commit_transaction() failed (as expected): 1 message(s) timed out on test [0] 3: [0105_transactions_mock /163.551s] Aborting transaction 3: [0105_transactions_mock /164.254s] rd_kafka_abort_transaction(rk, -1): duration 702.765ms 3: [0105_transactions_mock /164.254s] Attempting second transaction, which should succeed 3: [0105_transactions_mock /164.254s] rd_kafka_begin_transaction(rk): duration 0.213ms 3: [0105_transactions_mock /164.254s] Test config file test.conf not found 3: [0105_transactions_mock /164.254s] Produce to test [0]: messages #0..1 3: [0105_transactions_mock /164.254s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /164.254s] PRODUCE: duration 0.025ms 3: [
/171.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /165.301s] rd_kafka_commit_transaction(rk, -1): duration 1046.693ms 3: [0105_transactions_mock /165.301s] [ do_test_commit_after_msg_timeout:2447: PASS (13.02s) ] 3: [0105_transactions_mock /165.301s] Setting test timeout to 200s * 2.7 3: [0105_transactions_mock /165.301s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators ] 3: [0105_transactions_mock /165.301s] Test config file test.conf not found 3: [0105_transactions_mock /165.301s] Setting test timeout to 60s * 2.7 3: %5|1669457920.046|MOCK|0105_transactions_mock#producer-237| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41805,127.0.0.1:33249,127.0.0.1:45287,127.0.0.1:42259,127.0.0.1:40869 3: [0105_transactions_mock /165.302s] Created kafka instance 0105_transactions_mock#producer-237 3: [0105_transactions_mock /165.302s] Starting transaction 3: [0105_transactions_mock /165.304s] rd_kafka_init_transactions(rk, 5000): duration 1.436ms 3: [0105_transactions_mock /165.304s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /165.304s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock /165.304s] Test config file test.conf not found 3: [0105_transactions_mock /165.304s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /165.304s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /165.304s] PRODUCE: duration 0.077ms 3: [
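do_test_txn_flush_timeout and do_test_commit_after_msg_timeout both end the same way in the log: commit_transaction() reports the failed or timed-out deliveries, the transaction is aborted, and a fresh transaction then succeeds. A compact sketch of that abort-and-retry shape (topic, payload and return convention are illustrative):

#include <librdkafka/rdkafka.h>

/* Sketch: run one transaction; if the commit fails with an abortable
 * error (for example messages that hit message.timeout.ms), abort so the
 * caller can start over with a fresh transaction, as the tests do. */
static int produce_one_txn(rd_kafka_t *rk, const char *topic) {
    rd_kafka_error_t *error = rd_kafka_begin_transaction(rk);
    if (error) {
        rd_kafka_error_destroy(error);
        return -1;
    }

    /* Delivery failures surface later through commit_transaction(). */
    rd_kafka_producev(rk,
                      RD_KAFKA_V_TOPIC(topic),
                      RD_KAFKA_V_PARTITION(0),
                      RD_KAFKA_V_VALUE("payload", 7),
                      RD_KAFKA_V_END);

    error = rd_kafka_commit_transaction(rk, -1);
    if (!error)
        return 0;

    if (rd_kafka_error_txn_requires_abort(error)) {
        rd_kafka_error_t *abort_err = rd_kafka_abort_transaction(rk, -1);
        if (abort_err)
            rd_kafka_error_destroy(abort_err);
    }
    rd_kafka_error_destroy(error);
    return -1; /* caller may begin a new transaction and retry */
}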
/172.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /165.806s] PRODUCE.DELIVERY.WAIT: duration 501.926ms 3: [0105_transactions_mock /165.806s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /165.806s] Test config file test.conf not found 3: [0105_transactions_mock /165.806s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /165.806s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /165.806s] PRODUCE: duration 0.086ms 3: [0105_transactions_mock /165.806s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /166.405s] rd_kafka_commit_transaction(rk, -1): duration 599.374ms 3: [0105_transactions_mock /166.405s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /166.406s] rd_kafka_begin_transaction(rk): duration 0.073ms 3: [0105_transactions_mock /166.406s] Test config file test.conf not found 3: [0105_transactions_mock /166.406s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /166.406s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /166.406s] PRODUCE: duration 0.110ms 3: [
/173.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /166.909s] PRODUCE.DELIVERY.WAIT: duration 503.105ms 3: [0105_transactions_mock /166.909s] Test config file test.conf not found 3: [0105_transactions_mock /166.909s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /166.909s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /166.909s] PRODUCE: duration 0.109ms 3: [0105_transactions_mock /166.909s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /167.512s] rd_kafka_abort_transaction(rk, -1): duration 602.501ms 3: [0105_transactions_mock /167.512s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /167.512s] rd_kafka_begin_transaction(rk): duration 0.081ms 3: [0105_transactions_mock /167.512s] Test config file test.conf not found 3: [0105_transactions_mock /167.512s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /167.512s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /167.512s] PRODUCE: duration 0.104ms 3: [
/174.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /168.016s] PRODUCE.DELIVERY.WAIT: duration 503.596ms 3: [0105_transactions_mock /168.016s] Test config file test.conf not found 3: [0105_transactions_mock /168.016s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /168.016s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.016s] PRODUCE: duration 0.090ms 3: [0105_transactions_mock /168.016s] rd_kafka_abort_transaction(rk, -1): duration 0.275ms 3: [0105_transactions_mock /168.016s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /168.017s] rd_kafka_begin_transaction(rk): duration 0.046ms 3: [0105_transactions_mock /168.017s] Test config file test.conf not found 3: [0105_transactions_mock /168.017s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /168.017s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /168.017s] PRODUCE: duration 0.094ms 3: [
/175.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /168.518s] PRODUCE.DELIVERY.WAIT: duration 501.313ms 3: [0105_transactions_mock /168.518s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /168.518s] Test config file test.conf not found 3: [0105_transactions_mock /168.518s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /168.518s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.518s] PRODUCE: duration 0.118ms 3: [0105_transactions_mock /168.518s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /169.115s] rd_kafka_abort_transaction(rk, -1): duration 596.587ms 3: [0105_transactions_mock /169.115s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /169.115s] rd_kafka_begin_transaction(rk): duration 0.054ms 3: [0105_transactions_mock /169.115s] Test config file test.conf not found 3: [0105_transactions_mock /169.115s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /169.115s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.115s] PRODUCE: duration 0.117ms 3: [
/176.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /169.617s] PRODUCE.DELIVERY.WAIT: duration 502.212ms 3: [0105_transactions_mock /169.617s] Test config file test.conf not found 3: [0105_transactions_mock /169.617s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /169.618s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /169.618s] PRODUCE: duration 0.139ms 3: [0105_transactions_mock /169.618s] rd_kafka_abort_transaction(rk, -1): duration 0.225ms 3: [0105_transactions_mock /169.618s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /169.618s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock /169.618s] Test config file test.conf not found 3: [0105_transactions_mock /169.618s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /169.618s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.618s] PRODUCE: duration 0.095ms 3: [0105_transactions_mock /170.120s] PRODUCE.DELIVERY.WAIT: duration 502.203ms 3: [0105_transactions_mock /170.120s] Test config file test.conf not found 3: [0105_transactions_mock /170.120s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /170.120s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /170.120s] PRODUCE: duration 0.107ms 3: [0105_transactions_mock /170.120s] Changing transaction coordinator from 2 to 3 3: [
/177.843s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /170.615s] rd_kafka_commit_transaction(rk, -1): duration 494.928ms 3: [0105_transactions_mock /170.615s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /170.615s] rd_kafka_begin_transaction(rk): duration 0.042ms 3: [0105_transactions_mock /170.615s] Test config file test.conf not found 3: [0105_transactions_mock /170.616s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /170.616s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /170.616s] PRODUCE: duration 0.112ms 3: [0105_transactions_mock /171.117s] PRODUCE.DELIVERY.WAIT: duration 501.432ms 3: [0105_transactions_mock /171.117s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /171.117s] Test config file test.conf not found 3: [0105_transactions_mock /171.117s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /171.117s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /171.117s] PRODUCE: duration 0.098ms 3: [
/178.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /171.616s] rd_kafka_abort_transaction(rk, -1): duration 498.260ms 3: [0105_transactions_mock /171.616s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /171.616s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock /171.616s] Test config file test.conf not found 3: [0105_transactions_mock /171.616s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /171.616s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /171.616s] PRODUCE: duration 0.105ms 3: [0105_transactions_mock /172.117s] PRODUCE.DELIVERY.WAIT: duration 501.363ms 3: [0105_transactions_mock /172.117s] Test config file test.conf not found 3: [0105_transactions_mock /172.117s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /172.117s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /172.117s] PRODUCE: duration 0.094ms 3: [0105_transactions_mock /172.117s] Changing transaction coordinator from 1 to 2 3: [
/179.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /172.616s] rd_kafka_abort_transaction(rk, -1): duration 498.344ms 3: [0105_transactions_mock /172.616s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /172.616s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock /172.616s] Test config file test.conf not found 3: [0105_transactions_mock /172.616s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /172.616s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /172.616s] PRODUCE: duration 0.107ms 3: [0105_transactions_mock /173.117s] PRODUCE.DELIVERY.WAIT: duration 501.362ms 3: [0105_transactions_mock /173.117s] Test config file test.conf not found 3: [0105_transactions_mock /173.117s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /173.117s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /173.117s] PRODUCE: duration 0.091ms 3: [0105_transactions_mock /173.117s] Changing transaction coordinator from 3 to 4 3: [
/180.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /173.616s] rd_kafka_abort_transaction(rk, -1): duration 498.625ms 3: [0105_transactions_mock /173.616s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /173.616s] rd_kafka_begin_transaction(rk): duration 0.041ms 3: [0105_transactions_mock /173.616s] Test config file test.conf not found 3: [0105_transactions_mock /173.616s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /173.616s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /173.616s] PRODUCE: duration 0.090ms 3: [0105_transactions_mock /174.119s] PRODUCE.DELIVERY.WAIT: duration 502.183ms 3: [0105_transactions_mock /174.119s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /174.119s] Test config file test.conf not found 3: [0105_transactions_mock /174.119s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /174.119s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /174.119s] PRODUCE: duration 0.096ms 3: [0105_transactions_mock /174.119s] Changing transaction coordinator from 1 to 2 3: [
/181.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /174.616s] rd_kafka_abort_transaction(rk, -1): duration 497.622ms 3: [0105_transactions_mock /174.616s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /174.616s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock /174.616s] Test config file test.conf not found 3: [0105_transactions_mock /174.616s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /174.617s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /174.617s] PRODUCE: duration 0.098ms 3: [0105_transactions_mock /175.118s] PRODUCE.DELIVERY.WAIT: duration 501.415ms 3: [0105_transactions_mock /175.118s] Test config file test.conf not found 3: [0105_transactions_mock /175.118s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /175.118s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /175.118s] PRODUCE: duration 0.092ms 3: [0105_transactions_mock /175.119s] rd_kafka_commit_transaction(rk, -1): duration 0.406ms 3: [0105_transactions_mock /175.119s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /175.119s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock /175.119s] Test config file test.conf not found 3: [0105_transactions_mock /175.119s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /175.119s] SUM(POLL): duration 0.006ms 3: [0105_transactions_mock /175.119s] PRODUCE: duration 0.114ms 3: [
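
Editorial note (not part of the build log): the coordinator-switch test repeats the same producer cycle over and over: init_transactions() once, then begin -> produce -> commit or abort per transaction. A minimal sketch of that cycle, assuming rk is a producer configured with a transactional.id as in the test:

#include <librdkafka/rdkafka.h>

/* Run n transactions of one message each on the given topic. */
static rd_kafka_error_t *txn_cycles(rd_kafka_t *rk, const char *topic, int n) {
        rd_kafka_error_t *error;
        int i;

        /* Once per producer instance, as at the start of the test. */
        error = rd_kafka_init_transactions(rk, 5000);
        if (error)
                return error;

        for (i = 0; i < n; i++) {
                error = rd_kafka_begin_transaction(rk);
                if (error)
                        return error;

                /* Partition -1 (unassigned): the partitioner picks one,
                 * which is what "Produce to test [-1]" in the log means. */
                rd_kafka_producev(rk, RD_KAFKA_V_TOPIC(topic),
                                  RD_KAFKA_V_PARTITION(RD_KAFKA_PARTITION_UA),
                                  RD_KAFKA_V_VALUE("hi", 2), RD_KAFKA_V_END);

                /* The test alternates commit and abort; commit shown here. */
                error = rd_kafka_commit_transaction(rk, -1);
                if (error)
                        return error;
        }
        return NULL;
}
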
/182.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /175.620s] PRODUCE.DELIVERY.WAIT: duration 501.228ms 3: [0105_transactions_mock /175.620s] Test config file test.conf not found 3: [0105_transactions_mock /175.620s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /175.620s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /175.620s] PRODUCE: duration 0.124ms 3: [0105_transactions_mock /175.620s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /176.117s] rd_kafka_abort_transaction(rk, -1): duration 496.995ms 3: [0105_transactions_mock /176.117s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /176.117s] rd_kafka_begin_transaction(rk): duration 0.059ms 3: [0105_transactions_mock /176.117s] Test config file test.conf not found 3: [0105_transactions_mock /176.117s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /176.117s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /176.117s] PRODUCE: duration 0.138ms 3: [
/183.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /176.619s] PRODUCE.DELIVERY.WAIT: duration 501.949ms 3: [0105_transactions_mock /176.619s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /176.619s] Test config file test.conf not found 3: [0105_transactions_mock /176.620s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /176.620s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /176.620s] PRODUCE: duration 0.117ms 3: [0105_transactions_mock /177.117s] rd_kafka_abort_transaction(rk, -1): duration 497.811ms 3: [0105_transactions_mock /177.117s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /177.118s] rd_kafka_begin_transaction(rk): duration 0.106ms 3: [0105_transactions_mock /177.118s] Test config file test.conf not found 3: [0105_transactions_mock /177.118s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /177.118s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /177.118s] PRODUCE: duration 0.165ms 3: [
/184.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /177.620s] PRODUCE.DELIVERY.WAIT: duration 502.181ms 3: [0105_transactions_mock /177.620s] Test config file test.conf not found 3: [0105_transactions_mock /177.620s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /177.620s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /177.620s] PRODUCE: duration 0.123ms 3: [0105_transactions_mock /177.620s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /178.118s] rd_kafka_abort_transaction(rk, -1): duration 497.731ms 3: [0105_transactions_mock /178.118s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /178.118s] rd_kafka_begin_transaction(rk): duration 0.058ms 3: [0105_transactions_mock /178.118s] Test config file test.conf not found 3: [0105_transactions_mock /178.118s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /178.118s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /178.118s] PRODUCE: duration 0.112ms 3: [
/185.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /178.620s] PRODUCE.DELIVERY.WAIT: duration 501.365ms 3: [0105_transactions_mock /178.620s] Test config file test.conf not found 3: [0105_transactions_mock /178.620s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /178.620s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /178.620s] PRODUCE: duration 0.090ms 3: [0105_transactions_mock /178.620s] rd_kafka_abort_transaction(rk, -1): duration 0.267ms 3: [0105_transactions_mock /178.620s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /178.620s] rd_kafka_begin_transaction(rk): duration 0.032ms 3: [0105_transactions_mock /178.620s] Test config file test.conf not found 3: [0105_transactions_mock /178.620s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /178.620s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /178.620s] PRODUCE: duration 0.082ms 3: [0105_transactions_mock /179.121s] PRODUCE.DELIVERY.WAIT: duration 500.570ms 3: [0105_transactions_mock /179.121s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /179.121s] Test config file test.conf not found 3: [0105_transactions_mock /179.121s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /179.121s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /179.121s] PRODUCE: duration 0.095ms 3: [0105_transactions_mock /179.121s] Changing transaction coordinator from 2 to 3 3: [
/186.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /179.618s] rd_kafka_commit_transaction(rk, -1): duration 497.229ms 3: [0105_transactions_mock /179.618s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /179.618s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: [0105_transactions_mock /179.618s] Test config file test.conf not found 3: [0105_transactions_mock /179.618s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /179.618s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /179.618s] PRODUCE: duration 0.097ms 3: [0105_transactions_mock /180.120s] PRODUCE.DELIVERY.WAIT: duration 501.341ms 3: [0105_transactions_mock /180.120s] Test config file test.conf not found 3: [0105_transactions_mock /180.120s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /180.120s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /180.120s] PRODUCE: duration 0.118ms 3: [0105_transactions_mock /180.120s] Changing transaction coordinator from 4 to 5 3: [
/187.844s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /180.618s] rd_kafka_abort_transaction(rk, -1): duration 498.480ms 3: [0105_transactions_mock /180.618s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /180.618s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock /180.618s] Test config file test.conf not found 3: [0105_transactions_mock /180.618s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /180.619s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /180.619s] PRODUCE: duration 0.092ms 3: [0105_transactions_mock /181.120s] PRODUCE.DELIVERY.WAIT: duration 501.314ms 3: [0105_transactions_mock /181.120s] Test config file test.conf not found 3: [0105_transactions_mock /181.120s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /181.120s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /181.120s] PRODUCE: duration 0.094ms 3: [0105_transactions_mock /181.120s] Changing transaction coordinator from 1 to 2 3: [
/188.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /181.619s] rd_kafka_abort_transaction(rk, -1): duration 498.515ms 3: [0105_transactions_mock /181.619s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /181.619s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: [0105_transactions_mock /181.619s] Test config file test.conf not found 3: [0105_transactions_mock /181.619s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /181.619s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /181.619s] PRODUCE: duration 0.095ms 3: [0105_transactions_mock /182.120s] PRODUCE.DELIVERY.WAIT: duration 501.330ms 3: [0105_transactions_mock /182.120s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /182.120s] Test config file test.conf not found 3: [0105_transactions_mock /182.120s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /182.120s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /182.120s] PRODUCE: duration 0.117ms 3: [
/189.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /182.619s] rd_kafka_abort_transaction(rk, -1): duration 498.479ms 3: [0105_transactions_mock /182.619s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /182.619s] rd_kafka_begin_transaction(rk): duration 0.051ms 3: [0105_transactions_mock /182.619s] Test config file test.conf not found 3: [0105_transactions_mock /182.619s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /182.619s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /182.619s] PRODUCE: duration 0.091ms 3: [0105_transactions_mock /183.120s] PRODUCE.DELIVERY.WAIT: duration 501.362ms 3: [0105_transactions_mock /183.120s] Test config file test.conf not found 3: [0105_transactions_mock /183.120s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /183.120s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /183.120s] PRODUCE: duration 0.088ms 3: [0105_transactions_mock /183.120s] Changing transaction coordinator from 5 to 1 3: [
/190.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /183.619s] rd_kafka_abort_transaction(rk, -1): duration 498.593ms 3: [0105_transactions_mock /183.620s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators: PASS (18.32s) ] 3: [0105_transactions_mock /183.620s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh) ] 3: [0105_transactions_mock /183.620s] Test config file test.conf not found 3: [0105_transactions_mock /183.620s] Setting test timeout to 60s * 2.7 3: %5|1669457938.364|MOCK|0105_transactions_mock#producer-238| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39507,127.0.0.1:35725,127.0.0.1:44833 3: [0105_transactions_mock /183.620s] Created kafka instance 0105_transactions_mock#producer-238 3: [0105_transactions_mock /183.620s] Starting transaction 3: [0105_transactions_mock /183.621s] rd_kafka_init_transactions(rk, 5000): duration 0.798ms 3: [0105_transactions_mock /183.621s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock /183.621s] Switching to coordinator 2 3: [0105_transactions_mock /184.125s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, 20 * 1000): duration 503.391ms 3: [0105_transactions_mock /184.125s] Test config file test.conf not found 3: [0105_transactions_mock /184.125s] Produce to test [-1]: messages #0..10 3: [0105_transactions_mock /184.125s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /184.125s] PRODUCE: duration 0.022ms 3: [0105_transactions_mock /184.130s] PRODUCE.DELIVERY.WAIT: duration 5.565ms 3: [0105_transactions_mock /184.131s] rd_kafka_commit_transaction(rk, -1): duration 0.210ms 3: [0105_transactions_mock /184.131s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh): PASS (0.51s) ] 3: [0105_transactions_mock /184.131s] [ do_test_out_of_order_seq:2532 ] 3: [0105_transactions_mock /184.131s] Test config file test.conf not found 3: [0105_transactions_mock /184.131s] Setting test timeout to 60s * 2.7 3: %5|1669457938.876|MOCK|0105_transactions_mock#producer-239| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43415,127.0.0.1:45915,127.0.0.1:33257 3: [0105_transactions_mock /184.132s] Created kafka instance 0105_transactions_mock#producer-239 3: [0105_transactions_mock /184.132s] rd_kafka_init_transactions(rk, -1): duration 0.595ms 3: [0105_transactions_mock /184.132s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: [0105_transactions_mock /184.132s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock /184.132s] 0105_transactions_mock#producer-239: Flushing 1 messages 3: [
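
Editorial note (not part of the build log): the coordinator-refresh test above times a rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, 20 * 1000) call. A minimal sketch of how such a call is assembled; the consumer handle, topic, partition and offset values are illustrative assumptions, only the API calls themselves appear in the log.

#include <librdkafka/rdkafka.h>

/* Attach consumed offsets (and the consumer's group metadata) to the
 * producer's current transaction. */
static rd_kafka_error_t *send_offsets(rd_kafka_t *rk, rd_kafka_t *consumer) {
        rd_kafka_topic_partition_list_t *offsets;
        rd_kafka_consumer_group_metadata_t *cgmetadata;
        rd_kafka_error_t *error;

        offsets = rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(offsets, "srctopic", 0)->offset = 42;

        cgmetadata = rd_kafka_consumer_group_metadata(consumer);

        error = rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata,
                                                     20 * 1000);

        rd_kafka_consumer_group_metadata_destroy(cgmetadata);
        rd_kafka_topic_partition_list_destroy(offsets);
        return error;
}
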
/191.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /185.133s] FLUSH: duration 1000.197ms 3: [0105_transactions_mock /185.133s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock /185.133s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /185.133s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /185.133s] Sleeping.. 3: [
/192.845s] 1 test(s) running: 0105_transactions_mock 3: [
/193.845s] 1 test(s) running: 0105_transactions_mock 3: %3|1669457941.923|TXNERR|0105_transactions_mock#producer-239| [thrd:127.0.0.1:45915/bootstrap]: Current transaction failed in state InTransaction: skipped sequence numbers (OUT_OF_ORDER_SEQUENCE_NUMBER, requires epoch bump) 3: [
/194.845s] 1 test(s) running: 0105_transactions_mock 3: [
/195.845s] 1 test(s) running: 0105_transactions_mock 3: [
/196.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /190.133s] produce() failed as expected: Local: Erroneous state 3: [0105_transactions_mock /190.133s] commit_transaction(-1): duration 0.064ms 3: [0105_transactions_mock /190.133s] commit_transaction() failed (expectedly): skipped sequence numbers 3: [0105_transactions_mock /190.133s] rd_kafka_abort_transaction(rk, -1): duration 0.146ms 3: [0105_transactions_mock /190.133s] rd_kafka_begin_transaction(rk): duration 0.047ms 3: [0105_transactions_mock /190.133s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.008ms 3: [0105_transactions_mock /190.175s] rd_kafka_commit_transaction(rk, -1): duration 42.176ms 3: [0105_transactions_mock /190.176s] [ do_test_out_of_order_seq:2532: PASS (6.04s) ] 3: [0105_transactions_mock /190.176s] [ do_test_topic_disappears_for_awhile:2666 ] 3: [0105_transactions_mock /190.176s] Test config file test.conf not found 3: [0105_transactions_mock /190.176s] Setting test timeout to 60s * 2.7 3: %5|1669457944.921|MOCK|0105_transactions_mock#producer-240| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40369 3: [0105_transactions_mock /190.176s] Created kafka instance 0105_transactions_mock#producer-240 3: [0105_transactions_mock /190.177s] rd_kafka_init_transactions(rk, -1): duration 0.687ms 3: [0105_transactions_mock /190.177s] rd_kafka_begin_transaction(rk): duration 0.018ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t 
__attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } 
RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % 
partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t 
__attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 
__attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; 
}), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): 
duration 0.000ms
3: [0105_transactions_mock /190.177s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms
3: [0105_transactions_mock /190.177s] (the same rd_kafka_producev() call is logged once per produced message; the remaining repetitions, 0.000-0.001ms each, are identical)
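The expanded call above is what librdkafka's RD_KAFKA_V_TOPIC()/RD_KAFKA_V_PARTITION()/RD_KAFKA_V_VALUE() convenience macros expand to with GCC (the `({ if (0) ... })` blocks are compile-time type checks only). In source form, the produce step behind these log lines is roughly the sketch below; it is reconstructed from the log, not the test's literal code, and rk, topic, cnt and partition_cnt stand for the test's own variables:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Produce one 2-byte "hi" message to a round-robin partition,
     * mirroring the logged rd_kafka_producev() calls (sketch). */
    static void produce_one(rd_kafka_t *rk, const char *topic,
                            int cnt, int partition_cnt) {
            rd_kafka_resp_err_t err = rd_kafka_producev(
                    rk,
                    RD_KAFKA_V_TOPIC(topic),
                    RD_KAFKA_V_PARTITION(cnt % partition_cnt),
                    RD_KAFKA_V_VALUE("hi", 2),
                    RD_KAFKA_V_END);
            if (err)
                    fprintf(stderr, "producev failed: %s\n",
                            rd_kafka_err2str(err));
    }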
3: [ /197.845s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /191.218s] rd_kafka_commit_transaction(rk, -1): duration 1041.039ms
3: [0105_transactions_mock /191.218s] commit_transaction(-1): duration 1041.054ms
3: [0105_transactions_mock /191.218s] Marking topic as non-existent
3: %5|1669457945.963|PARTCNT|0105_transactions_mock#producer-240| [thrd:main]: Topic mytopic partition count changed from 10 to 0
3: [0105_transactions_mock /191.219s] rd_kafka_metadata(rk, 0, ((void *)0), &md, tmout_multip(5000)): duration 0.094ms
3: [
/198.845s] 1 test(s) running: 0105_transactions_mock 3: [
/199.845s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /193.219s] Bringing topic back to life
3: [0105_transactions_mock /193.219s] rd_kafka_begin_transaction(rk): duration 0.027ms
3: [0105_transactions_mock /193.219s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.002ms
3: [0105_transactions_mock /193.219s] (the same rd_kafka_producev() call is logged once per produced message; the remaining repetitions, 0.000-0.001ms each, are identical)
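The pattern being exercised here, and timed by the begin_transaction/commit_transaction entries around it, is the usual transactional produce batch: begin a transaction, produce the messages, then commit. A minimal sketch of that flow is shown below; it assumes an already-initialized transactional producer rk, the msg_cnt and partition_cnt values are illustrative, and error handling is reduced to printing and exiting:

    #include <stdio.h>
    #include <stdlib.h>
    #include <librdkafka/rdkafka.h>

    /* Produce msg_cnt small messages inside one transaction (sketch). */
    static void produce_batch_in_txn(rd_kafka_t *rk, const char *topic,
                                     int msg_cnt, int partition_cnt) {
            rd_kafka_error_t *error;
            int i;

            error = rd_kafka_begin_transaction(rk);
            if (error) {
                    fprintf(stderr, "begin_transaction: %s\n",
                            rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
                    exit(1);
            }

            for (i = 0; i < msg_cnt; i++) {
                    rd_kafka_resp_err_t err = rd_kafka_producev(
                            rk,
                            RD_KAFKA_V_TOPIC(topic),
                            RD_KAFKA_V_PARTITION(i % partition_cnt),
                            RD_KAFKA_V_VALUE("hi", 2),
                            RD_KAFKA_V_END);
                    if (err)
                            fprintf(stderr, "produce #%d: %s\n", i,
                                    rd_kafka_err2str(err));
            }

            /* Blocks until all messages are delivered or the timeout
             * (here -1, i.e. the remaining transaction timeout) expires. */
            error = rd_kafka_commit_transaction(rk, -1);
            if (error) {
                    fprintf(stderr, "commit_transaction: %s\n",
                            rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
                    exit(1);
            }
    }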
3: [ /200.845s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /194.219s] rd_kafka_commit_transaction(rk, -1): duration 1000.434ms
3: [0105_transactions_mock /194.219s] commit_transaction(-1): duration 1000.445ms
3: [0105_transactions_mock /194.219s] Verifying messages by consumption
3: [0105_transactions_mock /194.219s] Test config file test.conf not found
3: [0105_transactions_mock /194.220s] Created kafka instance 0105_transactions_mock#consumer-241
3: [0105_transactions_mock /194.220s] consume: consume exactly 122 messages
3: [
/201.845s] 1 test(s) running: 0105_transactions_mock 3: [
/202.845s] 1 test(s) running: 0105_transactions_mock 3: [
/203.845s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /197.368s] mytopic [7] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [0] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [2] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [3] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [4] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [9] reached EOF at offset 13 3: [0105_transactions_mock /197.368s] mytopic [8] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [5] reached EOF at offset 12 3: [0105_transactions_mock /197.368s] mytopic [6] reached EOF at offset 12 3: [
/204.846s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /197.871s] mytopic [1] reached EOF at offset 13
3: [0105_transactions_mock /197.871s] CONSUME: duration 3650.999ms
3: [0105_transactions_mock /197.871s] consume: consumed 122/122 messages (10/10 EOFs)
3: [0105_transactions_mock /197.873s] [ do_test_topic_disappears_for_awhile:2666: PASS (7.70s) ]
3: [0105_transactions_mock /197.873s] [ do_test_disconnected_group_coord:2802: switch_coord=false ]
3: [0105_transactions_mock /197.873s] Test config file test.conf not found
3: [0105_transactions_mock /197.873s] Setting test timeout to 60s * 2.7
3: %5|1669457952.617|MOCK|0105_transactions_mock#producer-242| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40985,127.0.0.1:33541,127.0.0.1:44961
3: [0105_transactions_mock /197.873s] Created kafka instance 0105_transactions_mock#producer-242
3: [0105_transactions_mock /197.874s] rd_kafka_init_transactions(rk, -1): duration 0.945ms
3: [0105_transactions_mock /197.875s] rd_kafka_begin_transaction(rk): duration 0.076ms
3: [0105_transactions_mock /197.875s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms
3: [0105_transactions_mock /197.875s] 0105_transactions_mock#producer-242: Flushing 1 messages
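The "Mock cluster enabled" line comes from librdkafka's built-in mock broker, which lets these transaction tests run without a real Kafka cluster. A minimal setup along those lines is sketched below; this is not the test framework's code, and the broker count and transactional.id value are arbitrary placeholders:

    #include <stdio.h>
    #include <stdlib.h>
    #include <librdkafka/rdkafka.h>

    /* Create a transactional producer backed by librdkafka's
     * in-process mock cluster (sketch). */
    static rd_kafka_t *create_mock_txn_producer(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();
            rd_kafka_t *rk;
            rd_kafka_error_t *error;

            /* Spin up a mock cluster with 3 brokers; bootstrap.servers
             * is then ignored, as the MOCK log line above states. */
            rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                              errstr, sizeof(errstr));
            rd_kafka_conf_set(conf, "transactional.id", "mytxnid",
                              errstr, sizeof(errstr));

            rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    fprintf(stderr, "rd_kafka_new: %s\n", errstr);
                    exit(1);
            }

            /* Corresponds to the rd_kafka_init_transactions() log entry. */
            error = rd_kafka_init_transactions(rk, -1);
            if (error) {
                    fprintf(stderr, "init_transactions: %s\n",
                            rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
                    exit(1);
            }
            return rk;
    }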
3: [ /205.846s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /198.874s] FLUSH: duration 999.597ms
3: [
/206.846s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /199.874s] Calling send_offsets_to_transaction() 3: %3|1669457954.619|FAIL|0105_transactions_mock#producer-242| [thrd:127.0.0.1:33541/bootstrap]: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457954.831|FAIL|0105_transactions_mock#producer-242| [thrd:127.0.0.1:33541/bootstrap]: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/207.846s] 1 test(s) running: 0105_transactions_mock 3: [
/208.846s] 1 test(s) running: 0105_transactions_mock 3: [
/209.846s] 1 test(s) running: 0105_transactions_mock 3: [
/210.201s] Bringing up group coordinator 2.. 3: [0105_transactions_mock /202.980s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 3105.142ms 3: [0105_transactions_mock /202.980s] send_offsets_to_transaction(-1): duration 3105.157ms 3: [0105_transactions_mock /202.980s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /202.980s] 0105_transactions_mock#producer-242 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /202.980s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /202.980s] 0105_transactions_mock#producer-242 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:33541/2: Connect to ipv4#127.0.0.1:33541 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /202.980s] rd_kafka_commit_transaction(rk, -1): duration 0.248ms 3: [0105_transactions_mock /202.980s] commit_transaction(-1): duration 0.258ms 3: [0105_transactions_mock /202.980s] [ do_test_disconnected_group_coord:2802: switch_coord=false: PASS (5.11s) ] 3: [0105_transactions_mock /202.980s] [ do_test_disconnected_group_coord:2802: switch_coord=true ] 3: [0105_transactions_mock /202.981s] Test config file test.conf not found 3: [0105_transactions_mock /202.981s] Setting test timeout to 60s * 2.7 3: %5|1669457957.725|MOCK|0105_transactions_mock#producer-243| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37827,127.0.0.1:41397,127.0.0.1:41509 3: [0105_transactions_mock /202.981s] Created kafka instance 0105_transactions_mock#producer-243 3: [0105_transactions_mock /202.982s] rd_kafka_init_transactions(rk, -1): duration 0.836ms 3: [0105_transactions_mock /202.982s] rd_kafka_begin_transaction(rk): duration 0.069ms 3: [0105_transactions_mock /202.982s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.014ms 3: [0105_transactions_mock /202.982s] 0105_transactions_mock#producer-243: Flushing 1 messages 3: [
/210.846s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /203.982s] FLUSH: duration 999.846ms 3: [
/211.846s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /204.982s] Calling send_offsets_to_transaction() 3: %3|1669457959.727|FAIL|0105_transactions_mock#producer-243| [thrd:127.0.0.1:41397/bootstrap]: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1669457959.988|FAIL|0105_transactions_mock#producer-243| [thrd:127.0.0.1:41397/bootstrap]: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/212.846s] 1 test(s) running: 0105_transactions_mock 3: [
/213.846s] 1 test(s) running: 0105_transactions_mock 3: [
/214.846s] 1 test(s) running: 0105_transactions_mock 3: [
/215.309s] Switching group coordinator to 3 3: [
/215.846s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /209.494s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 4511.413ms
3: [0105_transactions_mock /209.494s] send_offsets_to_transaction(-1): duration 4511.429ms
3: [0105_transactions_mock /209.494s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT)
3: [0105_transactions_mock /209.494s] 0105_transactions_mock#producer-243 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT)
3: [0105_transactions_mock /209.494s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
3: [0105_transactions_mock /209.494s] 0105_transactions_mock#producer-243 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:41397/2: Connect to ipv4#127.0.0.1:41397 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed)
3: [0105_transactions_mock /209.494s] rd_kafka_commit_transaction(rk, -1): duration 0.191ms
3: [0105_transactions_mock /209.494s] commit_transaction(-1): duration 0.203ms
3: [0105_transactions_mock /209.495s] [ do_test_disconnected_group_coord:2802: switch_coord=true: PASS (6.51s) ]
3: [0105_transactions_mock /209.495s] 0105_transactions_mock: duration 209494.704ms
3: [0105_transactions_mock /209.495s] ================= Test 0105_transactions_mock PASSED =================
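rd_kafka_send_offsets_to_transaction(), timed in the entries above, is the exactly-once step that commits consumer offsets atomically with the produced messages. Its call shape is roughly the sketch below; the topic, partition, offset and group id are placeholders, and a real consume-transform-produce application would take the group metadata from its live consumer with rd_kafka_consumer_group_metadata() rather than fabricating it:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Attach consumed offsets to the current transaction (sketch). */
    static void send_offsets(rd_kafka_t *rk) {
            rd_kafka_topic_partition_list_t *offsets;
            rd_kafka_consumer_group_metadata_t *cgmetadata;
            rd_kafka_error_t *error;

            /* Offsets to commit: next offset to consume, per partition. */
            offsets = rd_kafka_topic_partition_list_new(1);
            rd_kafka_topic_partition_list_add(offsets, "srctopic", 0)->offset = 4;

            /* Placeholder group metadata for illustration only. */
            cgmetadata = rd_kafka_consumer_group_metadata_new("mygroup");

            error = rd_kafka_send_offsets_to_transaction(rk, offsets,
                                                         cgmetadata, -1);
            if (error) {
                    fprintf(stderr, "send_offsets_to_transaction: %s\n",
                            rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
            }

            rd_kafka_consumer_group_metadata_destroy(cgmetadata);
            rd_kafka_topic_partition_list_destroy(offsets);
    }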
3: [ /216.847s] ALL-TESTS: duration 216846.474ms
3: [ /216.847s] 10 thread(s) in use by librdkafka, waiting...
3: [ /217.847s] 10 thread(s) in use by librdkafka
3: [ /217.847s] TEST FAILURE
3: ### Test "
" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:1581:test_wait_exit() at Sat Nov 26 10:19:25 2022: ###
3: 10 thread(s) still active in librdkafka
3: test-runner: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:6629: test_fail0: Assertion `0' failed.
1/1 Test #3: RdKafkaTestBrokerLess ............Subprocess aborted***Exception: 217.85 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) = 217.87 sec

The following tests FAILED:
          3 - RdKafkaTestBrokerLess (Subprocess aborted)
Errors while running CTest
Output from these tests are in: /usr/src/RPM/BUILD/librdkafka-1.9.2/Testing/Temporary/LastTest.log
Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely.
error: Bad exit status from /usr/src/tmp/rpm-tmp.38630 (%check)

RPM build errors:
    Bad exit status from /usr/src/tmp/rpm-tmp.38630 (%check)
Command exited with non-zero status 1
180.97user 14.98system 4:45.23elapsed 68%CPU (0avgtext+0avgdata 572264maxresident)k
0inputs+0outputs (0major+4134244minor)pagefaults 0swaps
hsh-rebuild: rebuild of `librdkafka-1.9.2-alt1.src.rpm' failed.
Command exited with non-zero status 1
2.64user 1.72system 4:55.94elapsed 1%CPU (0avgtext+0avgdata 106772maxresident)k
88inputs+0outputs (31494major+162174minor)pagefaults 0swaps