<86>Feb 7 02:29:51 userdel[2055318]: delete user 'rooter'
<86>Feb 7 02:29:51 userdel[2055318]: removed group 'rooter' owned by 'rooter'
<86>Feb 7 02:29:51 userdel[2055318]: removed shadow group 'rooter' owned by 'rooter'
<86>Feb 7 02:29:51 groupadd[2055323]: group added to /etc/group: name=rooter, GID=1332
<86>Feb 7 02:29:51 groupadd[2055323]: group added to /etc/gshadow: name=rooter
<86>Feb 7 02:29:51 groupadd[2055323]: new group: name=rooter, GID=1332
<86>Feb 7 02:29:51 useradd[2055327]: new user: name=rooter, UID=1332, GID=1332, home=/root, shell=/bin/bash
<86>Feb 7 02:29:51 userdel[2055333]: delete user 'builder'
<86>Feb 7 02:29:51 userdel[2055333]: removed group 'builder' owned by 'builder'
<86>Feb 7 02:29:51 userdel[2055333]: removed shadow group 'builder' owned by 'builder'
<86>Feb 7 02:29:51 groupadd[2055338]: group added to /etc/group: name=builder, GID=1333
<86>Feb 7 02:29:51 groupadd[2055338]: group added to /etc/gshadow: name=builder
<86>Feb 7 02:29:51 groupadd[2055338]: new group: name=builder, GID=1333
<86>Feb 7 02:29:51 useradd[2055342]: new user: name=builder, UID=1333, GID=1333, home=/usr/src, shell=/bin/bash
warning: Macro %cmake_insource not found
<13>Feb 7 02:29:52 rpmi: libuv-1.44.2-alt1 sisyphus+303845.100.1.1 1658053885 installed
<13>Feb 7 02:29:52 rpmi: libjsoncpp24-1.9.4-alt2 sisyphus+286441.100.1.1 1633444232 installed
<13>Feb 7 02:29:52 rpmi: libexpat-2.5.0-alt1 sisyphus+309227.100.1.1 1667075764 installed
<13>Feb 7 02:29:52 rpmi: libidn2-2.3.4-alt1 sisyphus+309023.100.1.1 1666791084 installed
<13>Feb 7 02:29:52 rpmi: libxxhash-0.8.0-alt2 sisyphus+277476.100.2.1 1625621312 installed
<13>Feb 7 02:29:52 rpmi: liblz4-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667412981 installed
<13>Feb 7 02:29:52 rpmi: gcc-c++-common-1.4.27-alt1 sisyphus+278099.1300.1.1 1626028636 installed
<13>Feb 7 02:29:53 rpmi: libstdc++12-devel-12.1.1-alt2 sisyphus+307182.100.1.1 1663781909 installed
<13>Feb 7 02:29:53 rpmi: gcc12-c++-12.1.1-alt2 sisyphus+307182.100.1.1 1663781909 installed
<13>Feb 7 02:29:53 rpmi: rpm-macros-cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Feb 7 02:29:53 rpmi: cmake-modules-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Feb 7 02:29:53 rpmi: librhash-1.3.5-alt3 sisyphus+286141.40.2.1 1632982456 installed
<13>Feb 7 02:29:53 rpmi: publicsuffix-list-dafsa-20221208-alt1 sisyphus+313597.100.1.1 1673961759 installed
<13>Feb 7 02:29:53 rpmi: libpsl-0.21.2-alt1 sisyphus+312536.100.1.1 1672131178 installed
<13>Feb 7 02:29:53 rpmi: libnghttp2-1.51.0-alt1 sisyphus+310565.100.1.1 1669296590 installed
<13>Feb 7 02:29:53 rpmi: openldap-common-2.6.3-alt1 sisyphus+306372.60.8.1 1663095223 installed
<13>Feb 7 02:29:53 rpmi: libverto-0.3.2-alt1_1 sisyphus+279289.100.1.3 1626493868 installed
<13>Feb 7 02:29:53 rpmi: liblmdb-0.9.29-alt1.1 sisyphus+306630.100.1.1 1663072360 installed
<13>Feb 7 02:29:53 rpmi: libkeyutils-1.6.3-alt1 sisyphus+266061.100.1.1 1612919566 installed
<13>Feb 7 02:29:53 rpmi: libcom_err-1.46.4.0.5.4cda-alt1 sisyphus+283826.100.1.1 1629975345 installed
<13>Feb 7 02:29:53 rpmi: libbrotlicommon-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Feb 7 02:29:53 rpmi: libbrotlidec-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Feb 7 02:29:53 rpmi: libp11-kit-0.24.1-alt1 sisyphus+293720.100.1.1 1642535264 installed
<13>Feb 7 02:29:53 rpmi: libtasn1-4.19.0-alt1 sisyphus+305700.100.1.1 1661359624 installed
<13>Feb 7 02:29:53 rpmi: rpm-macros-alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Feb 7 02:29:53 rpmi: alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Feb 7 02:29:53 rpmi: ca-certificates-2022.12.14-alt1 sisyphus+311754.200.1.1 1671046143 installed
<13>Feb 7 02:29:53 rpmi: ca-trust-0.1.4-alt1 sisyphus+308690.100.1.1 1666182992 installed
<13>Feb 7 02:29:53 rpmi: p11-kit-trust-0.24.1-alt1 sisyphus+293720.100.1.1 1642535264 installed
<13>Feb 7 02:29:53 rpmi: libcrypto1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<13>Feb 7 02:29:53 rpmi: libssl1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<86>Feb 7 02:29:53 groupadd[2063104]: group added to /etc/group: name=_keytab, GID=499
<86>Feb 7 02:29:53 groupadd[2063104]: group added to /etc/gshadow: name=_keytab
<86>Feb 7 02:29:53 groupadd[2063104]: new group: name=_keytab, GID=499
<13>Feb 7 02:29:53 rpmi: libkrb5-1.19.4-alt1 sisyphus+310092.100.2.1 1668703482 installed
<86>Feb 7 02:29:53 groupadd[2063228]: group added to /etc/group: name=sasl, GID=498
<86>Feb 7 02:29:53 groupadd[2063228]: group added to /etc/gshadow: name=sasl
<86>Feb 7 02:29:53 groupadd[2063228]: new group: name=sasl, GID=498
<13>Feb 7 02:29:53 rpmi: libsasl2-3-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097224 installed
<13>Feb 7 02:29:53 rpmi: libldap2-2.6.3-alt1 sisyphus+306372.60.8.1 1663095223 installed
<13>Feb 7 02:29:53 rpmi: libcurl-7.87.0-alt1 sisyphus+312113.100.1.1 1671611216 installed
<13>Feb 7 02:29:53 rpmi: libarchive13-3.6.1-alt2 sisyphus+311213.100.1.1 1670244620 installed
<13>Feb 7 02:29:54 rpmi: cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Feb 7 02:29:54 rpmi: ctest-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Feb 7 02:29:54 rpmi: libsasl2-devel-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097224 installed
<13>Feb 7 02:29:54 rpmi: libssl-devel-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<13>Feb 7 02:29:54 rpmi: gcc-c++-12-alt1 sisyphus+300988.300.1.1 1654033053 installed
<13>Feb 7 02:29:54 rpmi: liblz4-devel-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667412981 installed
<13>Feb 7 02:29:54 rpmi: libxxhash-devel-0.8.0-alt2 sisyphus+277476.100.2.1 1625621312 installed
Building target platforms: x86_64
Building for target x86_64
Wrote: /usr/src/in/nosrpm/librdkafka-1.9.2-alt1.nosrc.rpm (w1.gzdio)
Installing librdkafka-1.9.2-alt1.src.rpm
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.59130
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf librdkafka-1.9.2
+ echo 'Source #0 (librdkafka-1.9.2.tar):'
Source #0 (librdkafka-1.9.2.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/librdkafka-1.9.2.tar
+ cd librdkafka-1.9.2
+ /bin/chmod -c -Rf u+rwX,go-w .
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.59130
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd librdkafka-1.9.2
+ mkdir -p .
+ cmake -DCMAKE_SKIP_INSTALL_RPATH:BOOL=yes '-DCMAKE_C_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' '-DCMAKE_CXX_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' '-DCMAKE_Fortran_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' -DCMAKE_INSTALL_PREFIX=/usr -DINCLUDE_INSTALL_DIR:PATH=/usr/include -DLIB_INSTALL_DIR:PATH=/usr/lib64 -DSYSCONF_INSTALL_DIR:PATH=/etc -DSHARE_INSTALL_PREFIX:PATH=/usr/share -DLIB_DESTINATION=lib64 -DLIB_SUFFIX=64 -S . -B .
-- The C compiler identification is GNU 12.1.1
-- The CXX compiler identification is GNU 12.1.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pow in m
-- Looking for pow in m - found
-- Checking for module 'libsasl2'
-- Found libsasl2, version 2.1.27
-- Found LZ4: /usr/lib64/liblz4.so (found version "1.9.4")
-- Found OpenSSL: /usr/lib64/libcrypto.so (found version "1.1.1q")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Configuring done
-- Generating done
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_Fortran_FLAGS
    INCLUDE_INSTALL_DIR
    LIB_DESTINATION
    LIB_INSTALL_DIR
    LIB_SUFFIX
    SHARE_INSTALL_PREFIX
    SYSCONF_INSTALL_DIR

-- Build files have been written to: /usr/src/RPM/BUILD/librdkafka-1.9.2
+ make -j16
[ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdcrc32.c.o
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdaddr.c.o
[ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdfnv1a.c.o
[ 1%] Building C object src/CMakeFiles/rdkafka.dir/crc32c.c.o
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdbuf.c.o
[ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdavl.c.o
[ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_feature.c.o
[ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_lz4.c.o
[ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_event.c.o
[ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_buf.c.o
[ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignor.c.o
[ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_metadata_cache.c.o
[ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_writer.c.o
[ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_metadata.c.o
[ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_conf.c.o
[ 8%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msg.c.o
[ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_op.c.o
[ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_pattern.c.o
[ 11%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_range_assignor.c.o
[ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_offset.c.o
[ 8%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_reader.c.o
[ 11%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_queue.c.o
[ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_roundrobin_assignor.c.o
[ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka.c.o
[ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_plain.c.o
[ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl.c.o
[ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_timer.c.o
[ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cgrp.c.o
[ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_partition.c.o
[ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_subscription.c.o
[ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_broker.c.o
[ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_interceptor.c.o
[ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_header.c.o
[ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignment.c.o
[ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_transport.c.o
[ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_topic.c.o
[ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_aux.c.o
[ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_background.c.o
[ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sticky_assignor.c.o
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdports.c.o
[ 22%] Building C object src/CMakeFiles/rdkafka.dir/rdmurmur2.c.o
[ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_idempotence.c.o
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdlist.c.o
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdregex.c.o
[ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdrand.c.o
[ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_error.c.o
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cert.c.o
[ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_request.c.o
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_coord.c.o
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_cgrp.c.o
[ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdstring.c.o
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdvarint.c.o
[ 22%] Building C object src/CMakeFiles/rdkafka.dir/rdlog.c.o
[ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread.c.o
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdmap.c.o
[ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread_extra.c.o
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rddl.c.o
[ 19%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock.c.o
[ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdunittest.c.o
[ 27%] Building C object src/CMakeFiles/rdkafka.dir/rdxxhash.c.o
[ 27%] Building C object src/CMakeFiles/rdkafka.dir/cJSON.c.o
[ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_txnmgr.c.o
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rdhdrhistogram.c.o
[ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_plugin.c.o
[ 25%] Building C object src/CMakeFiles/rdkafka.dir/snappy.c.o
[ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_cyrus.c.o
[ 28%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_ssl.c.o
[ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_scram.c.o
[ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_admin.c.o
[ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_oauthbearer.c.o
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_handlers.c.o
[ 30%] Linking C shared library librdkafka.so
[ 30%] Built target rdkafka
[ 30%] Building C object examples/CMakeFiles/producer.dir/producer.c.o
[ 30%] Building C object examples/CMakeFiles/consumer.dir/consumer.c.o
[ 31%] Building C object examples/CMakeFiles/misc.dir/misc.c.o
[ 33%] Building C object tests/interceptor_test/CMakeFiles/interceptor_test.dir/interceptor_test.c.o
[ 33%] Building C object examples/CMakeFiles/rdkafka_complex_consumer_example.dir/rdkafka_complex_consumer_example.c.o
[ 31%] Building C object examples/CMakeFiles/rdkafka_example.dir/rdkafka_example.c.o
[ 31%] Building C object examples/CMakeFiles/rdkafka_performance.dir/rdkafka_performance.c.o
[ 36%] Linking C executable consumer
[ 39%] Built target consumer
[ 37%] Linking C executable producer
[ 38%] Linking C executable misc
[ 40%] Built target producer
[ 40%] Built target misc
[ 38%] Linking C shared library interceptor_test.so
[ 41%] Built target interceptor_test
[ 39%] Linking C executable rdkafka_complex_consumer_example
[ 41%] Built target rdkafka_complex_consumer_example
[ 39%] Linking C executable rdkafka_example
[ 41%] Built target rdkafka_example
[ 33%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MessageImpl.cpp.o
[ 35%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HeadersImpl.cpp.o
[ 32%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConfImpl.cpp.o
[ 32%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConsumerImpl.cpp.o
[ 39%] Linking C executable rdkafka_performance
[ 34%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ProducerImpl.cpp.o
[ 36%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HandleImpl.cpp.o
[ 41%] Built target rdkafka_performance
[ 33%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MetadataImpl.cpp.o
[ 36%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/QueueImpl.cpp.o
[ 36%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/KafkaConsumerImpl.cpp.o
[ 41%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicPartitionImpl.cpp.o
[ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/RdKafka.cpp.o
[ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicImpl.cpp.o
[ 41%] Linking CXX shared library librdkafka++.so
[ 41%] Built target rdkafka++
[ 45%] Building C object tests/CMakeFiles/test-runner.dir/0003-msgmaxsize.c.o
[ 44%] Building C object tests/CMakeFiles/test-runner.dir/0000-unittests.c.o
[ 46%] Building C object tests/CMakeFiles/test-runner.dir/0006-symbols.c.o
[ 44%] Building C object tests/CMakeFiles/test-runner.dir/0001-multiobj.c.o
[ 44%] Building C object tests/CMakeFiles/test-runner.dir/0002-unkpart.c.o
[ 46%] Building C object tests/CMakeFiles/test-runner.dir/0005-order.c.o
[ 46%] Building C object tests/CMakeFiles/test-runner.dir/0007-autotopic.c.o
[ 47%] Building C object tests/CMakeFiles/test-runner.dir/0008-reqacks.c.o
[ 47%] Building C object tests/CMakeFiles/test-runner.dir/0009-mock_cluster.c.o
[ 46%] Building C object tests/CMakeFiles/test-runner.dir/0004-conf.c.o
[ 48%] Building C object tests/CMakeFiles/test-runner.dir/0011-produce_batch.c.o
[ 50%] Building C object tests/CMakeFiles/test-runner.dir/0016-client_swname.c.o
[ 52%] Building C object tests/CMakeFiles/test-runner.dir/0021-rkt_destroy.c.o
[ 49%] Building C object tests/CMakeFiles/test-runner.dir/0013-null-msgs.c.o
[ 49%] Building C object tests/CMakeFiles/test-runner.dir/0014-reconsume-191.c.o
[ 49%] Building C object tests/CMakeFiles/test-runner.dir/0017-compression.c.o
[ 49%] Building C object tests/CMakeFiles/test-runner.dir/0012-produce_consume.c.o
[ 49%] Building C object tests/CMakeFiles/test-runner.dir/0015-offset_seeks.c.o
[ 51%] Building C object tests/CMakeFiles/test-runner.dir/0020-destroy_hang.c.o
[ 51%] Building C object tests/CMakeFiles/test-runner.dir/0018-cgrp_term.c.o
[ 51%] Building C object tests/CMakeFiles/test-runner.dir/0019-list_groups.c.o
[ 52%] Building C object tests/CMakeFiles/test-runner.dir/0022-consume_batch.c.o
[ 54%] Building C object tests/CMakeFiles/test-runner.dir/0028-long_topicnames.c.o
[ 53%] Building C object tests/CMakeFiles/test-runner.dir/0025-timers.c.o
[ 54%] Building C object tests/CMakeFiles/test-runner.dir/0029-assign_offset.c.o
[ 55%] Building C object tests/CMakeFiles/test-runner.dir/0031-get_offsets.c.o
[ 56%] Building C object tests/CMakeFiles/test-runner.dir/0035-api_version.c.o
[ 56%] Building C object tests/CMakeFiles/test-runner.dir/0036-partial_fetch.c.o
[ 57%] Building C object tests/CMakeFiles/test-runner.dir/0037-destroy_hang_local.c.o
[ 56%] Building C object tests/CMakeFiles/test-runner.dir/0034-offset_reset.c.o
[ 53%] Building C object tests/CMakeFiles/test-runner.dir/0026-consume_pause.c.o
[ 54%] Building C object tests/CMakeFiles/test-runner.dir/0030-offset_commit.c.o
[ 55%] Building C object tests/CMakeFiles/test-runner.dir/0033-regex_subscribe.c.o
[ 57%] Building C object tests/CMakeFiles/test-runner.dir/0038-performance.c.o
[ 58%] Building C object tests/CMakeFiles/test-runner.dir/0040-io_event.c.o
[ 58%] Building C object tests/CMakeFiles/test-runner.dir/0039-event.c.o
[ 61%] Building C object tests/CMakeFiles/test-runner.dir/0046-rkt_cache.c.o
[ 60%] Building C object tests/CMakeFiles/test-runner.dir/0044-partition_cnt.c.o
[ 59%] Building C object tests/CMakeFiles/test-runner.dir/0043-no_connection.c.o
[ 59%] Building C object tests/CMakeFiles/test-runner.dir/0041-fetch_max_bytes.c.o
[ 59%] Building C object tests/CMakeFiles/test-runner.dir/0042-many_topics.c.o
[ 61%] Building C object tests/CMakeFiles/test-runner.dir/0047-partial_buf_tmout.c.o
[ 60%] Building C object tests/CMakeFiles/test-runner.dir/0045-subscribe_update.c.o
[ 62%] Building C object tests/CMakeFiles/test-runner.dir/0050-subscribe_adds.c.o
[ 62%] Building C object tests/CMakeFiles/test-runner.dir/0048-partitioner.c.o
[ 62%] Building C object tests/CMakeFiles/test-runner.dir/0049-consume_conn_close.c.o
[ 63%] Building C object tests/CMakeFiles/test-runner.dir/0051-assign_adds.c.o
[ 63%] Building C object tests/CMakeFiles/test-runner.dir/0052-msg_timestamps.c.o
[ 65%] Building C object tests/CMakeFiles/test-runner.dir/0056-balanced_group_mt.c.o
[ 42%] Building CXX object examples/CMakeFiles/producer_cpp.dir/producer.cpp.o
[ 42%] Building CXX object examples/CMakeFiles/openssl_engine_example_cpp.dir/openssl_engine_example.cpp.o
[ 64%] Building C object tests/CMakeFiles/test-runner.dir/0055-producer_latency.c.o
[ 67%] Building C object tests/CMakeFiles/test-runner.dir/0062-stats_event.c.o
[ 43%] Building CXX object examples/CMakeFiles/rdkafka_complex_consumer_example_cpp.dir/rdkafka_complex_consumer_example.cpp.o
[ 68%] Building C object tests/CMakeFiles/test-runner.dir/0064-interceptors.c.o
[ 42%] Building CXX object examples/CMakeFiles/rdkafka_example_cpp.dir/rdkafka_example.cpp.o
[ 68%] Linking CXX executable producer_cpp
[ 71%] Built target producer_cpp
[ 69%] Linking CXX executable openssl_engine_example_cpp
[ 72%] Built target openssl_engine_example_cpp
[ 72%] Building C object tests/CMakeFiles/test-runner.dir/0068-produce_timeout.c.o
[ 72%] Building C object tests/CMakeFiles/test-runner.dir/0069-consumer_add_parts.c.o
[ 65%] Building CXX object tests/CMakeFiles/test-runner.dir/0057-invalid_topic.cpp.o
[ 64%] Building CXX object tests/CMakeFiles/test-runner.dir/0053-stats_cb.cpp.o
[ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0058-log.cpp.o
[ 43%] Building CXX object examples/CMakeFiles/kafkatest_verifiable_client.dir/kafkatest_verifiable_client.cpp.o
[ 73%] Building C object tests/CMakeFiles/test-runner.dir/0072-headers_ut.c.o
[ 64%] Building CXX object tests/CMakeFiles/test-runner.dir/0054-offset_time.cpp.o
[ 67%] Building CXX object tests/CMakeFiles/test-runner.dir/0060-op_prio.cpp.o
[ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0059-bsearch.cpp.o
[ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0063-clusterid.cpp.o
[ 74%] Building C object tests/CMakeFiles/test-runner.dir/0074-producev.c.o
[ 74%] Building C object tests/CMakeFiles/test-runner.dir/0075-retry.c.o
[ 74%] Building C object tests/CMakeFiles/test-runner.dir/0073-headers.c.o
[ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0061-consumer_lag.cpp.o
[ 70%] Building CXX object tests/CMakeFiles/test-runner.dir/0065-yield.cpp.o
[ 75%] Building C object tests/CMakeFiles/test-runner.dir/0076-produce_retry.c.o
[ 70%] Building CXX object tests/CMakeFiles/test-runner.dir/0066-plugins.cpp.o
[ 75%] Building C object tests/CMakeFiles/test-runner.dir/0077-compaction.c.o
[ 76%] Building C object tests/CMakeFiles/test-runner.dir/0079-fork.c.o
[ 71%] Linking CXX executable rdkafka_complex_consumer_example_cpp
[ 80%] Built target rdkafka_complex_consumer_example_cpp
[ 78%] Building C object tests/CMakeFiles/test-runner.dir/0083-cb_event.c.o
[ 78%] Building C object tests/CMakeFiles/test-runner.dir/0084-destroy_flags.c.o
[ 79%] Building C object tests/CMakeFiles/test-runner.dir/0086-purge.c.o
[ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0067-empty_topic.cpp.o
[ 79%] Building C object tests/CMakeFiles/test-runner.dir/0088-produce_metadata_timeout.c.o
[ 80%] Building C object tests/CMakeFiles/test-runner.dir/0089-max_poll_interval.c.o
[ 81%] Building C object tests/CMakeFiles/test-runner.dir/0092-mixed_msgver.c.o
[ 80%] Building C object tests/CMakeFiles/test-runner.dir/0090-idempotence.c.o
[ 81%] Building C object tests/CMakeFiles/test-runner.dir/0091-max_poll_interval_timeout.c.o
[ 76%] Building C object tests/CMakeFiles/test-runner.dir/0080-admin_ut.c.o
[ 82%] Building C object tests/CMakeFiles/test-runner.dir/0094-idempotence_msg_timeout.c.o
[ 82%] Building C object tests/CMakeFiles/test-runner.dir/0093-holb.c.o
[ 71%] Linking CXX executable rdkafka_example_cpp
[ 86%] Built target rdkafka_example_cpp
[ 77%] Building C object tests/CMakeFiles/test-runner.dir/0081-admin.c.o
[ 84%] Building C object tests/CMakeFiles/test-runner.dir/0099-commit_metadata.c.o
[ 85%] Building C object tests/CMakeFiles/test-runner.dir/0102-static_group_rebalance.c.o
[ 86%] Building C object tests/CMakeFiles/test-runner.dir/0104-fetch_from_follower_mock.c.o
[ 87%] Building C object tests/CMakeFiles/test-runner.dir/0106-cgrp_sess_timeout.c.o
[ 87%] Building C object tests/CMakeFiles/test-runner.dir/0107-topic_recreate.c.o
[ 73%] Building CXX object tests/CMakeFiles/test-runner.dir/0070-null_empty.cpp.o
[ 89%] Building C object tests/CMakeFiles/test-runner.dir/0112-assign_unknown_part.c.o
[ 85%] Building C object tests/CMakeFiles/test-runner.dir/0103-transactions.c.o
[ 76%] Building CXX object tests/CMakeFiles/test-runner.dir/0078-c_from_cpp.cpp.o
[ 77%] Building CXX object tests/CMakeFiles/test-runner.dir/0082-fetch_max_bytes.cpp.o
[ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0101-fetch-from-follower.cpp.o
[ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0098-consumer-txn.cpp.o
[ 91%] Building C object tests/CMakeFiles/test-runner.dir/0117-mock_errors.c.o
[ 79%] Building CXX object tests/CMakeFiles/test-runner.dir/0085-headers.cpp.o
[ 86%] Building C object tests/CMakeFiles/test-runner.dir/0105-transactions_mock.c.o
[ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0095-all_brokers_down.cpp.o
[ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0100-thread_interceptors.cpp.o
[ 92%] Building C object tests/CMakeFiles/test-runner.dir/0121-clusterid.c.o
[ 91%] Building C object tests/CMakeFiles/test-runner.dir/0120-asymmetric_subscription.c.o
[ 91%] Building C object tests/CMakeFiles/test-runner.dir/0118-commit_rebalance.c.o
[ 93%] Building C object tests/CMakeFiles/test-runner.dir/0122-buffer_cleaning_after_rebalance.c.o
[ 93%] Building C object tests/CMakeFiles/test-runner.dir/0123-connections_max_idle.c.o
[ 94%] Building C object tests/CMakeFiles/test-runner.dir/0124-openssl_invalid_engine.c.o
[ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0097-ssl_verify.cpp.o
[ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0110-batch_size.cpp.o
[ 94%] Building C object tests/CMakeFiles/test-runner.dir/0125-immediate_flush.c.o
[ 94%] Building C object tests/CMakeFiles/test-runner.dir/0126-oauthbearer_oidc.c.o
[ 95%] Building C object tests/CMakeFiles/test-runner.dir/0129-fetch_aborted_msgs.c.o
[ 96%] Building C object tests/CMakeFiles/test-runner.dir/0130-store_offsets.c.o
[ 96%] Building C object tests/CMakeFiles/test-runner.dir/0131-connect_timeout.c.o
[ 97%] Building C object tests/CMakeFiles/test-runner.dir/rusage.c.o
[ 97%] Building C object tests/CMakeFiles/test-runner.dir/0132-strategy_ordering.c.o
[ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem.c.o
[ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0111-delay_create_topics.cpp.o
[ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem_ctrl.c.o
[ 87%] Building CXX object tests/CMakeFiles/test-runner.dir/0109-auto_create_topics.cpp.o
[ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0114-sticky_partitioning.cpp.o
[ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0115-producer_auth.cpp.o
[ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0116-kafkaconsumer_close.cpp.o
[ 95%] Building CXX object tests/CMakeFiles/test-runner.dir/0128-sasl_callback_queue.cpp.o
[ 97%] Building CXX object tests/CMakeFiles/test-runner.dir/8000-idle.cpp.o
[ 92%] Building CXX object tests/CMakeFiles/test-runner.dir/0119-consumer_auth.cpp.o
[ 74%] Linking CXX executable kafkatest_verifiable_client
[ 99%] Built target kafkatest_verifiable_client
[ 97%] Building C object tests/CMakeFiles/test-runner.dir/test.c.o
[ 98%] Building CXX object tests/CMakeFiles/test-runner.dir/testcpp.cpp.o
[ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0113-cooperative_rebalance.cpp.o
[100%] Linking CXX executable test-runner
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target test-runner make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + exit 0 Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.60349 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + /bin/chmod -Rf u+rwX -- /usr/src/tmp/librdkafka-buildroot + : + /bin/rm -rf -- /usr/src/tmp/librdkafka-buildroot + PATH=/usr/libexec/rpm-build:/usr/src/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/games + cd librdkafka-1.9.2 + make 'INSTALL=/usr/libexec/rpm-build/install -p' install DESTDIR=/usr/src/tmp/librdkafka-buildroot make: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Built target rdkafka make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka++ make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Built target rdkafka++ make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target consumer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target consumer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_performance make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target rdkafka_performance make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target rdkafka_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_complex_consumer_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Built target rdkafka_complex_consumer_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target openssl_engine_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Built target openssl_engine_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target misc make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target misc make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target rdkafka_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_complex_consumer_example make[2]: 
Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Built target rdkafka_complex_consumer_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target kafkatest_verifiable_client make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Built target kafkatest_verifiable_client make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target test-runner make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Built target test-runner make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target interceptor_test make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target interceptor_test make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Install the project... -- Install configuration: "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaConfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaConfigVersion.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/FindLZ4.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaTargets.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaTargets-noconfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka.so.1 -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka_mock.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka++.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so.1 -- Set runtime path of "/usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so.1" to "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafkacpp.h make: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + mkdir -p /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig + cp /usr/src/RPM/SOURCES/rdkafka.pc /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/ + /usr/bin/subst 's|@VERSION@|1.9.2|g' /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka++.pc /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka.pc + rm -f '/usr/src/tmp/librdkafka-buildroot/usr/lib64/*.a' + rm -f /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt + /usr/lib/rpm/brp-alt Cleaning files in /usr/src/tmp/librdkafka-buildroot (auto) mode of './usr/lib64/librdkafka++.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) mode of './usr/lib64/librdkafka.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) Verifying and fixing files in /usr/src/tmp/librdkafka-buildroot (binconfig,pkgconfig,libtool,desktop,gnuconfig) /usr/lib64/pkgconfig/rdkafka++.pc: Cflags: '-I${includedir}' --> '' Checking contents of files in /usr/src/tmp/librdkafka-buildroot/ (default) Compressing files in /usr/src/tmp/librdkafka-buildroot (auto) Adjusting library links in /usr/src/tmp/librdkafka-buildroot ./usr/lib64: (from :0) librdkafka.so.1 -> 
librdkafka.so.1 librdkafka++.so.1 -> librdkafka++.so.1 Verifying ELF objects in /usr/src/tmp/librdkafka-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal) Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.74583 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd librdkafka-1.9.2 + ctest -VV -R RdKafkaTestBrokerLess UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl Test project /usr/src/RPM/BUILD/librdkafka-1.9.2 Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 3 Start 3: RdKafkaTestBrokerLess 3: Test command: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test-runner "-p5" "-l" 3: Test timeout computed to be: 10000000 3: [
<MAIN> / 0.000s] Test config file test.conf not found 3: [
<MAIN> / 0.000s] Setting test timeout to 10s * 1.0 3: [
<MAIN> / 0.000s] Git version: HEAD 3: [
<MAIN> / 0.000s] Broker version: 2.4.0.0 (2.4.0.0) 3: [
<MAIN> / 0.000s] Tests to run : all 3: [
<MAIN> / 0.000s] Test mode : bare 3: [
<MAIN> / 0.000s] Test scenario: default 3: [
<MAIN> / 0.000s] Test filter : local tests only 3: [
<MAIN> / 0.000s] Test timeout multiplier: 2.7 3: [
<MAIN> / 0.000s] Action on test failure: continue other tests 3: [
<MAIN> / 0.000s] Current directory: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests 3: [<MAIN>
/ 0.000s] Setting test timeout to 30s * 2.7 3: [0000_unittests / 0.000s] ================= Running test 0000_unittests ================= 3: [0000_unittests / 0.000s] ==== Stats written to file stats_0000_unittests_4925468426772218644.json ==== 3: [0000_unittests / 0.000s] [0004_conf / 0.000s] ================= Running test 0004_conf ================= 3: [0004_conf / 0.000s] ==== Stats written to file stats_0004_conf_4589811330312329356.json ==== 3: builtin.features = snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer 3: [0006_symbols / 0.000s] ================= Running test 0006_symbols ================= 3: [0006_symbols / 0.000s] ==== Stats written to file stats_0006_symbols_6346019213006973005.json ==== 3: [0006_symbols / 0.000s] 0006_symbols: duration 0.000ms 3: [0006_symbols / 0.000s] ================= Test 0006_symbols PASSED ================= 3: [0004_conf / 0.000s] Test config file test.conf not found 3: [0004_conf / 0.000s] Setting test timeout to 10s * 2.7 3: [0004_conf / 0.000s] Using topic "rdkafkatest_0004" 3: [0009_mock_cluster / 0.000s] ================= Running test 0009_mock_cluster ================= 3: [0009_mock_cluster / 0.000s] ==== Stats written to file stats_0009_mock_cluster_9208797995054553155.json ==== 3: [0009_mock_cluster / 0.000s] Using topic "rdkafkatest_rnd636de1bf2665bdf0_0009_mock_cluster" 3: [0022_consume_batch_local / 0.000s] ================= Running test 0022_consume_batch_local ================= 3: [0022_consume_batch_local / 0.000s] ==== Stats written to file stats_0022_consume_batch_local_6703573673530408916.json ==== 3: [0022_consume_batch_local / 0.000s] [ do_test_consume_batch_oauthbearer_cb:170 ] 3: [0009_mock_cluster / 0.000s] Test config file test.conf not found 3: [
/ 0.000s] Too many tests running (5 >= 5): postponing 0033_regex_subscribe_local start... 3: [0025_timers / 0.000s] ================= Running test 0025_timers ================= 3: [0025_timers / 0.000s] ==== Stats written to file stats_0025_timers_838913993661541161.json ==== 3: [0025_timers / 0.000s] Test config file test.conf not found 3: [0025_timers / 0.000s] Setting test timeout to 200s * 2.7 3: [0004_conf / 0.002s] : on_new() called 3: %7|1675737011.825|OPENSSL|rdkafka#producer-1| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1675737011.825|INIT|my id#producer-5| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-5 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x80c) 3: %4|1675737011.825|CONFWARN|my id#producer-5| [thrd:app]: Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1675737011.825|CONFWARN|MOCK#producer-3| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1675737011.825|CONFWARN|my id#producer-5| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.002s] Created kafka instance my id#producer-5 3: %7|1675737011.825|TOPIC|my id#producer-5| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1675737011.825|TOPPARNEW|my id#producer-5| [thrd:app]: NEW rdkafkatest_0004 [-1] 0x7fa468000de0 refcnt 0x7fa468000e70 (at rd_kafka_topic_new0:468) 3: %7|1675737011.825|METADATA|my id#producer-5| [thrd:app]: Hinted cache of 1/1 topic(s) being queried 3: %7|1675737011.825|METADATA|my id#producer-5| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1675737011.825|DESTROY|my id#producer-5| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %5|1675737011.825|CONFWARN|0025_timers#consumer-4| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 0.002s] Created kafka instance 0025_timers#consumer-4 3: [0025_timers / 0.002s] rd_kafka_new(): duration 1.958ms 3: [0025_timers / 0.002s] %5|1675737011.825|CONFWARN|0022_consume_batch_local#consumer-2| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0022_consume_batch_local / 0.002s] Created kafka instance 0022_consume_batch_local#consumer-2 3: Starting wait loop for 10 expected stats_cb calls with an interval of 600ms 3: [0022_consume_batch_local / 0.002s] Refresh callback called 3: %3|1675737011.825|ERROR|0022_consume_batch_local#consumer-2| [thrd:app]: Failed to acquire SASL OAUTHBEARER token: Refresh called 3: %7|1675737011.825|DESTROY|my id#producer-5| [thrd:main]: Destroy internal 3: %7|1675737011.825|DESTROY|my id#producer-5| [thrd:main]: Removing all topics 3: %7|1675737011.825|TOPPARREMOVE|my id#producer-5| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0x7fa468000de0 3: %7|1675737011.825|DESTROY|my id#producer-5| [thrd:main]: rdkafkatest_0004 [-1]: 0x7fa468000de0 DESTROY_FINAL 3: [0009_mock_cluster / 0.002s] Test config file test.conf not found 3: [0009_mock_cluster / 0.002s] Setting test timeout to 30s * 2.7 3: %7|1675737011.825|INIT|my id#producer-6| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-6 
initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x80c) 3: %4|1675737011.825|CONFWARN|my id#producer-6| [thrd:app]: Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1675737011.825|CONFWARN|my id#producer-6| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.002s] Created kafka instance my id#producer-6 3: %7|1675737011.825|TOPIC|my id#producer-6| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1675737011.825|TOPPARNEW|my id#producer-6| [thrd:app]: NEW rdkafkatest_0004 [-1] 0x7fa468000de0 refcnt 0x7fa468000e70 (at rd_kafka_topic_new0:468) 3: %7|1675737011.825|METADATA|my id#producer-6| [thrd:app]: Hinted cache of 1/1 topic(s) being queried 3: %7|1675737011.825|METADATA|my id#producer-6| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1675737011.825|DESTROY|my id#producer-6| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1675737011.825|DESTROY|my id#producer-6| [thrd:main]: Destroy internal 3: %7|1675737011.825|DESTROY|my id#producer-6| [thrd:main]: Removing all topics 3: %7|1675737011.825|TOPPARREMOVE|my id#producer-6| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0x7fa468000de0 3: %7|1675737011.825|DESTROY|my id#producer-6| [thrd:main]: rdkafkatest_0004 [-1]: 0x7fa468000de0 DESTROY_FINAL 3: [0004_conf / 0.002s] Incremental S2F tests 3: [0004_conf / 0.002s] Set: generic,broker,queue,cgrp 3: [0004_conf / 0.002s] Now: generic,broker,queue,cgrp 3: [0004_conf / 0.002s] Set: -broker,+queue,topic 3: [0004_conf / 0.002s] Now: generic,topic,queue,cgrp 3: [0004_conf / 0.002s] Set: -all,security,-fetch,+metadata 3: [0004_conf / 0.002s] Now: metadata,security 3: [0004_conf / 0.002s] Error reporting for S2F properties 3: [0004_conf / 0.002s] Ok: Invalid value "invalid-value" for configuration property "debug" 3: [0004_conf / 0.003s] Verifying that ssl.ca.location is not overwritten (#3566) 3: [0009_mock_cluster / 0.002s] Created kafka instance 0009_mock_cluster#producer-7 3: %3|1675737011.826|SSL|rdkafka#producer-8| [thrd:app]: error:02001002:system library:fopen:No such file or directory: fopen('/?/does/!/not/exist!','r') 3: %3|1675737011.826|SSL|rdkafka#producer-8| [thrd:app]: error:2006D080:BIO routines:BIO_new_file:no such file 3: [0004_conf / 0.003s] rd_kafka_new() failed as expected: ssl.ca.location failed: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib 3: [0004_conf / 0.003s] Canonical tests 3: [0004_conf / 0.003s] Set: request.required.acks=0 expect 0 (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=-1 expect -1 (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=1 expect 1 (topic) 3: [0004_conf / 0.003s] Set: acks=3 expect 3 (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=393 expect 393 (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=bad expect (null) (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=all expect -1 (topic) 3: [0004_conf / 0.003s] Set: request.required.acks=all expect -1 (global) 3: [0004_conf / 0.003s] Set: acks=0 expect 0 (topic) 3: [0004_conf / 0.003s] Set: sasl.mechanisms=GSSAPI expect GSSAPI (global) 3: [0004_conf / 0.003s] Set: sasl.mechanisms=PLAIN expect PLAIN (global) 3: [0004_conf / 
0.003s] Set: sasl.mechanisms=GSSAPI,PLAIN expect (null) (global) 3: [0004_conf / 0.003s] Set: sasl.mechanisms= expect (null) (global) 3: [0004_conf / 0.003s] Set: linger.ms=12555.3 expect 12555.3 (global) 3: [0004_conf / 0.003s] Set: linger.ms=1500.000 expect 1500 (global) 3: [0004_conf / 0.003s] Set: linger.ms=0.0001 expect 0.0001 (global) 3: %4|1675737011.826|CONFWARN|0009_mock_cluster#consumer-9| [thrd:app]: Configuration property dr_msg_cb is a producer property and will be ignored by this consumer instance 3: [0009_mock_cluster / 0.003s] Created kafka instance 0009_mock_cluster#consumer-9 3: [0009_mock_cluster / 0.003s] Test config file test.conf not found 3: [0009_mock_cluster / 0.003s] Produce to rdkafkatest_rnd636de1bf2665bdf0_0009_mock_cluster [-1]: messages #0..100 3: %5|1675737011.826|CONFWARN|rdkafka#producer-10| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0009_mock_cluster / 0.003s] SUM(POLL): duration 0.000ms 3: [0009_mock_cluster / 0.003s] PRODUCE: duration 0.157ms 3: %4|1675737011.826|CONFWARN|rdkafka#producer-12| [thrd:app]: Configuration property partition.assignment.strategy is a consumer property and will be ignored by this producer instance 3: %5|1675737011.826|CONFWARN|rdkafka#producer-12| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.003s] Ok: `acks` must be set to `all` when `enable.idempotence` is true 3: [0004_conf / 0.003s] Ok: Java TrustStores are not supported, use `ssl.ca.location` and a certificate file instead. See https://github.com/edenhill/librdkafka/wiki/Using-SSL-with-librdkafka for more information. 3: [0004_conf / 0.003s] Ok: Java JAAS configuration is not supported, see https://github.com/edenhill/librdkafka/wiki/Using-SASL-with-librdkafka for more information. 
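Note: the 0004_conf output above exercises librdkafka's configuration validation: invalid property values are rejected directly by rd_kafka_conf_set(), while cross-property constraints such as `enable.idempotence` requiring `acks=all` (or `message.timeout.ms` having to exceed `linger.ms`) are only reported when the client instance is created. The following is a minimal illustrative sketch of that flow, not part of the build log; the file name and compile command are assumptions based on the rdkafka.pc installed above.

/* Minimal sketch of the configuration checks exercised by test 0004_conf:
 * per-property validation via rd_kafka_conf_set() and instance-creation
 * checks via rd_kafka_new().
 * Build (assuming the packaged rdkafka.pc), e.g.:
 *   cc conf_check.c $(pkg-config --cflags --libs rdkafka)
 */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

int main(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();

    /* Per-property validation: an unknown debug flag is rejected here,
     * matching the "Invalid value \"invalid-value\"" line in the log. */
    if (rd_kafka_conf_set(conf, "debug", "invalid-value",
                          errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK)
        fprintf(stderr, "expected failure: %s\n", errstr);

    /* Idempotence requires acks=all; with acks=1 the constraint is
     * reported at instance creation time. */
    rd_kafka_conf_set(conf, "enable.idempotence", "true",
                      errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "acks", "1", errstr, sizeof(errstr));

    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                  errstr, sizeof(errstr));
    if (!rk) {
        fprintf(stderr, "rd_kafka_new() failed: %s\n", errstr);
        rd_kafka_conf_destroy(conf); /* conf still owned by caller on failure */
        return 1;
    }
    rd_kafka_destroy(rk); /* on success, conf ownership passed to the instance */
    return 0;
}

On failure rd_kafka_new() leaves ownership of the conf object with the caller, which is why the sketch destroys it explicitly in the error path.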
3: [0004_conf / 0.003s] Ok: Internal property "interceptors" not settable 3: %5|1675737011.826|CONFWARN|rdkafka#producer-13| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1675737011.826|TOPICCONF|rdkafka#producer-13| [thrd:app]: Incompatible configuration settings for topic "mytopic": `acks` must be set to `all` when `enable.idempotence` is true 3: %5|1675737011.826|CONFWARN|rdkafka#producer-14| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1675737011.827|CONFWARN|rdkafka#producer-15| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1675737011.827|CONFWARN|rdkafka#producer-16| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1675737011.827|TOPICCONF|rdkafka#producer-16| [thrd:app]: Incompatible configuration settings for topic "mytopic": `queuing.strategy` must be set to `fifo` when `enable.idempotence` is true 3: %4|1675737011.827|CONFWARN|rdkafka#consumer-17| [thrd:app]: Configuration property queue.buffering.max.ms is a producer property and will be ignored by this consumer instance 3: [0004_conf / 0.004s] Instance config linger.ms=123 3: [0004_conf / 0.004s] Instance config group.id=test1 3: [0004_conf / 0.004s] Instance config enable.auto.commit=false 3: [0004_conf / 0.004s] [ do_test_default_topic_conf:381 ] 3: [0004_conf / 0.004s] [ do_test_default_topic_conf:381: PASS (0.00s) ] 3: [0004_conf / 0.004s] [ do_message_timeout_linger_checks:447 ] 3: %5|1675737011.827|CONFWARN|rdkafka#producer-18| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.004s] #0 "default and L and M": rd_kafka_new() succeeded 3: %5|1675737011.827|CONFWARN|rdkafka#producer-19| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.004s] #1 "set L such that L=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: %5|1675737011.828|CONFWARN|rdkafka#producer-22| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.005s] #5 "set M such that L>=M": rd_kafka_new() succeeded 3: [0004_conf / 0.005s] #6 "set L and M such that L>=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: [0004_conf / 0.005s] [ do_message_timeout_linger_checks:447: PASS (0.00s) ] 3: [0004_conf / 0.005s] 0004_conf: duration 4.740ms 3: [0004_conf / 0.005s] ================= Test 0004_conf PASSED ================= 3: %7|1675737011.830|INIT|rdkafka#producer-1| [thrd:app]: librdkafka v1.9.2 (0x10902ff) rdkafka#producer-1 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x201) 3: %5|1675737011.830|CONFWARN|rdkafka#producer-1| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %7|1675737011.830|DESTROY|rdkafka#producer-1| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:app]: Interrupting timers 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:app]: Sending TERMINATE to internal main thread 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| 
[thrd:app]: Joining internal main thread 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread terminating 3: %7|1675737011.830|DESTROY|rdkafka#producer-1| [thrd:main]: Destroy internal 3: %7|1675737011.830|BROADCAST|rdkafka#producer-1| [thrd:main]: Broadcasting state change 3: %7|1675737011.830|DESTROY|rdkafka#producer-1| [thrd:main]: Removing all topics 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:main]: Purging reply queue 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:main]: Decommissioning internal broker 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:main]: Join 1 broker thread(s) 3: %7|1675737011.830|BROADCAST|rdkafka#producer-1| [thrd::0/internal]: Broadcasting state change 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread termination done 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying op queues 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying SSL CTX 3: %7|1675737011.830|TERMINATE|rdkafka#producer-1| [thrd:app]: Termination done: freeing resources 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: empty tqh[0] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 2,1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,0,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 2,0,1 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:345: unittest_sysqueue 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sysqueue: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:393: ut_strcasestr: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:409: ut_strcasestr 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:590: ut_string_split: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:616: ut_string_split 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: string: PASS 3: [0009_mock_cluster / 0.008s] PRODUCE.DELIVERY.WAIT: duration 5.322ms 3: [0009_mock_cluster / 0.008s] Produce to rdkafkatest_rnd636de1bf2665bdf0_0009_mock_cluster [-1]: messages #0..100 3: [0009_mock_cluster / 0.008s] SUM(POLL): duration 0.000ms 3: [0009_mock_cluster / 0.008s] PRODUCE: duration 0.048ms 3: [0009_mock_cluster / 0.014s] PRODUCE.DELIVERY.WAIT: duration 5.098ms 3: [0009_mock_cluster / 0.014s] ASSIGN.PARTITIONS: duration 0.040ms 3: [0009_mock_cluster / 0.014s] CONSUME: assigned 4 partition(s) 3: [0009_mock_cluster / 0.014s] CONSUME: consume 100 messages 3: [0025_timers / 0.102s] rd_kafka_poll(): duration 100.077ms 3: [0009_mock_cluster / 0.115s] CONSUME: duration 100.931ms 3: [0009_mock_cluster / 0.115s] CONSUME: consumed 100/100 
messages (0/-1 EOFs) 3: [<MAIN>
/ 0.135s] Too many tests running (5 >= 5): postponing 0034_offset_reset_mock start... 3: [0033_regex_subscribe_local / 0.000s] ================= Running test 0033_regex_subscribe_local ================= 3: [0033_regex_subscribe_local / 0.000s] ==== Stats written to file stats_0033_regex_subscribe_local_1353545965596080485.json ==== 3: [0009_mock_cluster / 0.136s] 0009_mock_cluster: duration 136.211ms 3: [0009_mock_cluster / 0.136s] ================= Test 0009_mock_cluster PASSED ================= 3: [0033_regex_subscribe_local / 0.028s] 0033_regex_subscribe_local: duration 27.654ms 3: [0033_regex_subscribe_local / 0.028s] ================= Test 0033_regex_subscribe_local PASSED ================= 3: [
/ 0.163s] [0034_offset_reset_mock / 0.000s] ================= Running test 0034_offset_reset_mock ================= 3: [0034_offset_reset_mock / 0.000s] Too many tests running (5 >= 5): postponing 0039_event_log start... 3: [0037_destroy_hang_local / 0.000s] ==== Stats written to file stats_0034_offset_reset_mock_1522539265972781696.json ==== 3: ================= Running test 0037_destroy_hang_local ================= 3: [0037_destroy_hang_local / 0.000s] [0034_offset_reset_mock / 0.000s] [ offset_reset_errors:201 ] 3: ==== Stats written to file stats_0037_destroy_hang_local_5097040422620652190.json ==== 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: [0037_destroy_hang_local / 0.000s] Setting test timeout to 30s * 2.7 3: [0037_destroy_hang_local / 0.000s] Using topic "rdkafkatest_legacy_consumer_early_destroy" 3: [0037_destroy_hang_local / 0.000s] legacy_consumer_early_destroy: pass #0 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: %5|1675737011.986|CONFWARN|0037_destroy_hang_local#consumer-24| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.000s] Created kafka instance 0037_destroy_hang_local#consumer-24 3: [0037_destroy_hang_local / 0.000s] legacy_consumer_early_destroy: pass #1 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: %5|1675737011.986|CONFWARN|0037_destroy_hang_local#consumer-25| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.000s] Created kafka instance 0037_destroy_hang_local#consumer-25 3: %5|1675737011.986|CONFWARN|MOCK#producer-26| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0034_offset_reset_mock / 0.001s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.001s] Created kafka instance 0034_offset_reset_mock#producer-27 3: [0034_offset_reset_mock / 0.001s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.001s] Produce to topic [0]: messages #0..10 3: [0034_offset_reset_mock / 0.001s] SUM(POLL): duration 0.000ms 3: [0034_offset_reset_mock / 0.001s] PRODUCE: duration 0.012ms 3: [0034_offset_reset_mock / 0.006s] PRODUCE.DELIVERY.WAIT: duration 5.261ms 3: [0034_offset_reset_mock / 0.006s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.006s] Setting test timeout to 300s * 2.7 3: [0034_offset_reset_mock / 0.006s] Created kafka instance 0034_offset_reset_mock#consumer-28 3: [0034_offset_reset_mock / 0.006s] Waiting for up to 5000ms for metadata update 3: [0034_offset_reset_mock / 0.006s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 0.006s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 0.006s] METADATA.WAIT: duration 0.235ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:457: unittest_untyped_map: 500000 map_get iterations took 125.667ms = 0us/get 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:474: unittest_untyped_map: Total time over 100000 entries took 165.178ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:477: unittest_untyped_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 2 person Hedvig Lindahl 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 1 person Roy McPhearsome 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:323: unittest_typed_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: map: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1353: do_unittest_write_read 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1518: do_unittest_write_split_seek 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1608: do_unittest_write_read_payload_correctness 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1676: do_unittest_write_iov 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1866: do_unittest_erase 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdbuf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdvarint: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:411: unittest_rd_crc32c: Calculate CRC32C using hardware (SSE42) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:422: unittest_rd_crc32c: Calculate CRC32C using software 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:429: unittest_rd_crc32c 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: crc32c: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:1985: unittest_msgq_order: FIFO: testing in FIFO mode 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2172: unittest_msg_seq_wrap 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 2 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 0us, 0.0000us/msg 3: RDUT: PASS: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 1.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 3 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.5000us/msg over 2 messages in 1us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 0us, 0.0000us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 6 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 7 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 8 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 4 messages in 0us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: many messages 3: [0025_timers / 0.202s] rd_kafka_poll(): duration 100.060ms 3: [0025_timers / 0.302s] rd_kafka_poll(): 
duration 100.079ms 3: [0025_timers / 0.402s] rd_kafka_poll(): duration 100.077ms 3: [0025_timers / 0.502s] rd_kafka_poll(): duration 100.077ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4315956 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 4238us, 0.0009us/msg 3: [0025_timers / 0.602s] Call #0: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 0.602s] rd_kafka_poll(): duration 99.476ms 3: [0025_timers / 0.702s] rd_kafka_poll(): duration 100.082ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: many messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 100001 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 20us, 0.0002us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 50001 messages into destq with 265289 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1217us, 0.0243us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 20001 messages into destq with 315290 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1460us, 0.0730us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 335291 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1522us, 0.0257us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 394420 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 4196us, 0.0483us/msg 3: [0025_timers / 0.802s] rd_kafka_poll(): duration 100.092ms 3: [0025_timers / 0.902s] rd_kafka_poll(): duration 100.078ms 3: [0022_consume_batch_local / 1.002s] refresh_called = 1 3: [0022_consume_batch_local / 1.002s] 0022_consume_batch_local: duration 1002.278ms 3: [0022_consume_batch_local / 1.002s] ================= Test 0022_consume_batch_local PASSED ================= 3: [0025_timers / 1.002s] rd_kafka_poll(): duration 100.077ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4000001 messages into destq with 481243 messages 3: [0025_timers / 1.102s] rd_kafka_poll(): duration 100.078ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 6008us, 0.0015us/msg 3: [
/ 1.121s] Too many tests running (5 >= 5): postponing 0039_event start... 3: [0039_event_log / 0.000s] ================= Running test 0039_event_log ================= 3: [0039_event_log / 0.000s] ==== Stats written to file stats_0039_event_log_7550344312917808344.json ==== 3: [0039_event_log / 0.001s] Created kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.001s] rd_kafka_set_log_queue(rk, eventq): duration 0.002ms 3: [0039_event_log / 0.001s] Got log event: level: 7 ctx: queue fac: WAKEUPFD: msg: [thrd:app]: 0:65534/bootstrap: Enabled low-latency ops queue wake-ups 3: [0039_event_log / 0.001s] Destroying kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.001s] 0039_event_log: duration 1.461ms 3: [0039_event_log / 0.001s] ================= Test 0039_event_log PASSED ================= 3: [0037_destroy_hang_local / 1.001s] 0037_destroy_hang_local: duration 1000.556ms 3: [0037_destroy_hang_local / 1.001s] ================= Test 0037_destroy_hang_local PASSED ================= 3: [0039_event / 0.000s] ================= Running test 0039_event ================= 3: [0039_event / 0.000s] ==== Stats written to file stats_0039_event_62509117939316802.json ==== 3: [
/ 1.164s] Too many tests running (5 >= 5): postponing 0045_subscribe_update_mock start... 3: [0043_no_connection / 0.000s] ================= Running test 0043_no_connection ================= 3: [0043_no_connection / 0.000s] ==== Stats written to file stats_0043_no_connection_2605267453207227375.json ==== 3: [0043_no_connection / 0.000s] Test config file test.conf not found 3: [0043_no_connection / 0.000s] Setting test timeout to 20s * 2.7 3: [0039_event / 0.000s] Created kafka instance 0039_event#producer-30 3: %5|1675737012.987|CONFWARN|0043_no_connection#producer-31| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0043_no_connection / 0.000s] Created kafka instance 0043_no_connection#producer-31 3: [0043_no_connection / 0.000s] Test config file test.conf not found 3: [0043_no_connection / 0.000s] Produce to test_producer_no_connection [-1]: messages #0..100 3: [0043_no_connection / 0.000s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.000s] PRODUCE: duration 0.062ms 3: [0043_no_connection / 0.000s] Produce to test_producer_no_connection [0]: messages #0..100 3: [0043_no_connection / 0.000s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.000s] PRODUCE: duration 0.061ms 3: [0043_no_connection / 0.000s] Produce to test_producer_no_connection [1]: messages #0..100 3: %3|1675737012.987|FAIL|0039_event#producer-30| [thrd:0:65534/bootstrap]: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0043_no_connection / 0.000s] SUM(POLL): duration 0.001ms 3: [0043_no_connection / 0.000s] PRODUCE: duration 0.080ms 3: [0039_event / 0.000s] Got Error event: _TRANSPORT: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0039_event / 0.000s] Destroying kafka instance 0039_event#producer-30 3: [0039_event / 0.001s] 0039_event: duration 0.580ms 3: [0039_event / 0.001s] ================= Test 0039_event PASSED ================= 3: [
/ 1.164s] Too many tests running (5 >= 5): postponing 0046_rkt_cache start... 3: [0045_subscribe_update_mock / 0.000s] ================= Running test 0045_subscribe_update_mock ================= 3: [0045_subscribe_update_mock / 0.000s] ==== Stats written to file stats_0045_subscribe_update_mock_486035464731550471.json ==== 3: [0045_subscribe_update_mock / 0.000s] [ do_test_regex_many_mock:378: range with 50 topics ] 3: %5|1675737012.987|CONFWARN|MOCK#producer-32| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 0.000s] Test config file test.conf not found 3: [0045_subscribe_update_mock / 0.000s] Setting test timeout to 300s * 2.7 3: [0045_subscribe_update_mock / 0.001s] Created kafka instance 0045_subscribe_update_mock#consumer-33 3: [0045_subscribe_update_mock / 0.001s] Creating topic topic_0 3: [0045_subscribe_update_mock / 0.001s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.022ms 3: [0045_subscribe_update_mock / 0.001s] POLL: not expecting any messages for 300ms 3: [0034_offset_reset_mock / 1.007s] #0: injecting _TRANSPORT, expecting NO_ERROR 3: [0034_offset_reset_mock / 1.007s] Bringing down the broker 3: %6|1675737012.992|FAIL|0034_offset_reset_mock#consumer-28| [thrd:127.0.0.1:39707/bootstrap]: 127.0.0.1:39707/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1000ms in state UP) 3: [0034_offset_reset_mock / 1.007s] ASSIGN.PARTITIONS: duration 0.024ms 3: [0034_offset_reset_mock / 1.007s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 1.007s] #0: Ignoring Error event: 127.0.0.1:39707/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1000ms in state UP) 3: %6|1675737012.993|FAIL|0034_offset_reset_mock#consumer-28| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:39707: Disconnected (after 959ms in state UP) 3: [0034_offset_reset_mock / 1.007s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:39707: Disconnected (after 959ms in state UP) 3: [0034_offset_reset_mock / 1.007s] #0: Ignoring Error event: 2/2 brokers are down 3: %3|1675737012.993|FAIL|0034_offset_reset_mock#consumer-28| [thrd:127.0.0.1:39707/bootstrap]: 127.0.0.1:39707/1: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.007s] #0: Ignoring Error event: 127.0.0.1:39707/1: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737012.993|FAIL|0034_offset_reset_mock#consumer-28| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:39707: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.007s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:39707: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT) 3: [0025_timers / 1.202s] Call #1: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.202s] rd_kafka_poll(): duration 99.533ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0033us/msg over 4315956 messages in 14423us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: 
issue #2508 3: [0025_timers / 1.302s] rd_kafka_poll(): duration 100.081ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 145952 messages into destq with 154875 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 2263us, 0.0075us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2508 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 154875 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.0000us/msg 3: %3|1675737013.163|FAIL|0034_offset_reset_mock#consumer-28| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:39707: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0034_offset_reset_mock / 1.177s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:39707: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: %3|1675737013.163|FAIL|0034_offset_reset_mock#consumer-28| [thrd:127.0.0.1:39707/bootstrap]: 127.0.0.1:39707/1: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0034_offset_reset_mock / 1.177s] #0: Ignoring Error event: 127.0.0.1:39707/1: Connect to ipv4#127.0.0.1:39707 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 214004 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 145952 messages in 1us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 86 messages into destq with 199999 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 0us, 0.0000us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 199999 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 5 messages into destq with 200001 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4 messages into destq with 200006 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200010 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200012 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 61 messages into destq with 200015 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200076 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200078 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200080 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200082 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 86 messages in 0us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: msg: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmurmur2.c:166: unittest_murmur2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: murmurhash: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdfnv1a.c:112: unittest_fnv1a 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: fnv1a: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:468: ut_high_sigfig 
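Note: the unittest_murmur2 and unittest_fnv1a checks above cover the hash implementations behind librdkafka's murmur2 and fnv1a partitioners. As an illustration only (the broker address and topic name below are placeholders, not taken from this log), an application selects one of them through the topic-level `partitioner` configuration property:

/* Illustrative sketch: choosing a keyed partitioner backed by the hashes
 * unit-tested above. Placeholders: localhost:9092, "example-topic". */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

int main(void) {
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();

    /* Topic-level property set on the default topic config;
     * other documented values include "fnv1a" and "fnv1a_random". */
    rd_kafka_conf_set(conf, "partitioner", "murmur2_random",
                      errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                      errstr, sizeof(errstr));

    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                  errstr, sizeof(errstr));
    if (!rk) {
        fprintf(stderr, "rd_kafka_new() failed: %s\n", errstr);
        return 1;
    }

    /* Messages with the same key hash to the same partition. */
    rd_kafka_resp_err_t err = rd_kafka_producev(
            rk,
            RD_KAFKA_V_TOPIC("example-topic"),
            RD_KAFKA_V_KEY("user-42", 7),
            RD_KAFKA_V_VALUE("payload", 7),
            RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
            RD_KAFKA_V_END);
    if (err)
        fprintf(stderr, "producev failed: %s\n", rd_kafka_err2str(err));

    rd_kafka_flush(rk, 5000);
    rd_kafka_destroy(rk);
    return 0;
}

"murmur2_random" matches the Java client's default partitioning for keyed messages; librdkafka's own default remains "consistent_random".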
3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:495: ut_quantile 3: [0025_timers / 1.402s] rd_kafka_poll(): duration 100.090ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:514: ut_mean 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:536: ut_stddev 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:555: ut_totalcount 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:573: ut_max 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:590: ut_min 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:609: ut_reset 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:623: ut_nan 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:638: ut_sigfigs 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:654: ut_minmax_trackable 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:664: ut_unitmagnitude_overflow 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:697: ut_subbucketmask_overflow 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdhdrhistogram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4311: unittest_conf: Safified client.software.name="aba.-va" 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4319: unittest_conf: Safified client.software.version="1.2.3.4.5----a" 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4323: unittest_conf 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: conf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_broker.c:2081: rd_ut_reconnect_backoff 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: broker: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4827: unittest_idempotent_producer: Verifying idempotent producer error handling 3: %5|1675737013.245|CONFWARN|rdkafka#producer-34| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: 
unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: [0045_subscribe_update_mock / 0.301s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 0.301s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 0.301s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 1.502s] rd_kafka_poll(): duration 100.084ms 3: [0045_subscribe_update_mock / 0.401s] Creating topic topic_1 3: [0045_subscribe_update_mock / 0.401s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.024ms 3: [0045_subscribe_update_mock / 0.401s] POLL: not expecting any messages for 300ms 3: [0025_timers / 1.602s] rd_kafka_poll(): duration 100.084ms 3: [0025_timers / 1.702s] rd_kafka_poll(): duration 100.086ms 3: [0025_timers / 1.802s] Call #2: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.802s] rd_kafka_poll(): duration 99.544ms 3: [0045_subscribe_update_mock / 0.701s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 0.701s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 0.701s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 1.902s] rd_kafka_poll(): duration 100.086ms 3: [0045_subscribe_update_mock / 0.801s] Creating topic topic_2 3: [0045_subscribe_update_mock / 0.801s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.034ms 3: [0045_subscribe_update_mock / 0.801s] POLL: not expecting any messages for 300ms 3: [0025_timers / 2.002s] rd_kafka_poll(): duration 100.074ms 3: [0025_timers / 2.102s] rd_kafka_poll(): duration 100.086ms 3: [0043_no_connection / 1.000s] 300 messages in queue 3: %4|1675737013.987|TERMINATE|0043_no_connection#producer-31| [thrd:app]: Producer terminating with 300 messages (30000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0043_no_connection / 1.001s] rd_kafka_destroy(): duration 0.128ms 3: [0043_no_connection / 1.001s] 0043_no_connection: duration 1000.603ms 3: [0043_no_connection / 1.001s] ================= Test 0043_no_connection PASSED ================= 3: [
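[editor's note] The TERMINATE warning above ("use flush() to wait for outstanding message delivery") is emitted when a producer is destroyed while messages are still queued; test 0043 triggers it deliberately. A minimal sketch of the shutdown pattern the warning recommends, assuming an already-created producer handle rk:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: drain the producer before destroying it so queued
     * messages are delivered and delivery reports are served. */
    static void shutdown_producer(rd_kafka_t *rk) {
        /* Wait up to 10 s for outstanding deliveries. */
        if (rd_kafka_flush(rk, 10 * 1000) == RD_KAFKA_RESP_ERR__TIMED_OUT)
            fprintf(stderr, "%d message(s) were not delivered\n",
                    rd_kafka_outq_len(rk));
        rd_kafka_destroy(rk);
    }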
/ 2.164s] Too many tests running (5 >= 5): postponing 0053_stats_timing start... 3: [0046_rkt_cache / 0.000s] ================= Running test 0046_rkt_cache ================= 3: [0046_rkt_cache / 0.000s] ==== Stats written to file stats_0046_rkt_cache_5070941984084152476.json ==== 3: [0046_rkt_cache / 0.000s] Using topic "rdkafkatest_0046_rkt_cache" 3: [0046_rkt_cache / 0.000s] Test config file test.conf not found 3: %5|1675737013.987|CONFWARN|0046_rkt_cache#producer-35| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0046_rkt_cache / 0.000s] Created kafka instance 0046_rkt_cache#producer-35 3: [0046_rkt_cache / 0.000s] Test config file test.conf not found 3: [0046_rkt_cache / 0.000s] 0046_rkt_cache: duration 0.287ms 3: [0046_rkt_cache / 0.000s] ================= Test 0046_rkt_cache PASSED ================= 3: [0025_timers / 2.202s] rd_kafka_poll(): duration 100.068ms 3: [
/ 2.265s] Too many tests running (5 >= 5): postponing 0058_log start... 3: [0053_stats_timing / 0.000s] ================= Running test 0053_stats_timing ================= 3: [0053_stats_timing / 0.000s] ==== Stats written to file stats_0053_stats_timing_2772605462781280119.json ==== 3: [0045_subscribe_update_mock / 1.101s] CONSUME: duration 300.068ms 3: [0045_subscribe_update_mock / 1.101s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 1.101s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 2.302s] rd_kafka_poll(): duration 100.074ms 3: [0053_stats_timing / 0.100s] Stats (#0): { "name": "rdkafka#producer-36", "client_id": "rdkafka", "type": "producer", "ts":2447666477569, "time":1675737014, "age":100059, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0045_subscribe_update_mock / 1.201s] Creating topic topic_3 3: [0045_subscribe_update_mock / 1.201s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.020ms 3: [0045_subscribe_update_mock / 1.201s] POLL: not expecting any messages for 300ms 3: [0025_timers / 2.402s] Call #3: after 599ms, -0% outside interval 600 >-60 <+120 3: [0025_timers / 2.402s] rd_kafka_poll(): duration 99.537ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:5022: unittest_idempotent_producer 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: request: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1581: do_unittest_config_no_principal_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1610: do_unittest_config_empty_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1677: do_unittest_config_empty_value_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1715: do_unittest_config_value_with_quote_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1640: do_unittest_config_unrecognized_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1464: do_unittest_config_defaults 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1507: do_unittest_config_explicit_scope_and_life 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1551: do_unittest_config_all_explicit_values 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1754: do_unittest_config_extensions 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1773: do_unittest_illegal_extension_keys_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1806: do_unittest_odd_extension_size_should_fail 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sasl_oauthbearer: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msgset_reader.c:1781: unittest_aborted_txns 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: aborted_txns: PASS 3: RDUT: PASS: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5705: unittest_consumer_group_metadata 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5776: unittest_set_intersect 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5825: unittest_set_subtract 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5852: unittest_map_to_list 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5882: unittest_list_to_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: cgrp: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:910: unittest_scram_nonce 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:949: unittest_scram_safe 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: scram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #0 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNoTopic:2211: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2217: ut_testOneConsumerNoTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #0 ran for 0.023ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #1 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNonexistentTopic:2237: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2243: ut_testOneConsumerNonexistentTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #1 ran for 0.011ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #2 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerOneTopic:2269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2275: ut_testOneConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #2 ran for 0.014ms ] 3: 
RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #3 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOnlyAssignsPartitionsFromSubscribedTopics:2300: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2306: ut_testOnlyAssignsPartitionsFromSubscribedTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #3 ran for 0.012ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #4 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerMultipleTopics:2329: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2335: ut_testOneConsumerMultipleTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #4 ran for 0.013ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #5 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicOnePartition:2358: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2365: ut_testTwoConsumersOneTopicOnePartition 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #5 ran for 0.014ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #6 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicTwoPartitions:2389: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2396: ut_testTwoConsumersOneTopicTwoPartitions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #6 ran for 0.013ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #7 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMultipleConsumersMixedTopicSubscriptions:2424: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2432: ut_testMultipleConsumersMixedTopicSubscriptions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #7 ran for 0.022ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #8 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersTwoTopicsSixPartitions:2459: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2466: ut_testTwoConsumersTwoTopicsSixPartitions 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #8 ran for 0.020ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #9 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2487: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2501: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2514: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2522: ut_testAddRemoveConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #9 ran for 0.036ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #10 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testPoorRoundRobinAssignmentScenario:2576: verifying assignment for 4 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2585: ut_testPoorRoundRobinAssignmentScenario 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #10 ran for 0.033ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #11 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2609: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2615: ut_testAddRemoveTopicTwoConsumers: Adding topic2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2630: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2638: ut_testAddRemoveTopicTwoConsumers: Removing topic1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2650: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2658: ut_testAddRemoveTopicTwoConsumers 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #11 ran for 0.051ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #12 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2706: verifying assignment for 19 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2721: verifying assignment for 18 member(s): 3: RDUT: PASS: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2728: ut_testReassignmentAfterOneConsumerLeaves 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #12 ran for 3.118ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #13 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2762: verifying assignment for 8 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2774: verifying assignment for 9 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2781: ut_testReassignmentAfterOneConsumerAdded 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #13 ran for 0.181ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #14 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2823: verifying assignment for 9 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2836: verifying assignment for 8 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2844: ut_testSameSubscriptions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #14 ran for 1.554ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #15 ] 3: [0025_timers / 2.502s] rd_kafka_poll(): duration 100.076ms 3: [0025_timers / 2.602s] rd_kafka_poll(): duration 100.075ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2895: verifying assignment for 200 member(s): 3: [0045_subscribe_update_mock / 1.501s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 1.501s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 1.501s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 2.702s] rd_kafka_poll(): duration 100.076ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2911: verifying assignment for 150 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2918: ut_testLargeAssignmentWithMultipleConsumersLeaving 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #15 ran for 315.642ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #16 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2957: verifying assignment for 3 
member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2963: ut_testNewSubscription: Adding topic1 to consumer1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2972: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2980: ut_testNewSubscription 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #16 ran for 0.082ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #17 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3006: verifying assignment for 4 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3027: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3063: ut_testMoveExistingAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #17 ran for 0.034ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #18 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness:3111: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3118: ut_testStickiness 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #18 ran for 0.022ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #19 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3144: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3154: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3180: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3192: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3201: ut_testStickiness2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #19 ran for 0.108ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #20 ] 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAssignmentUpdatedForDeletedTopic:3223: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3233: ut_testAssignmentUpdatedForDeletedTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #20 ran for 0.247ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #21 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3255: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3275: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #21 ran for 0.017ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #22 ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3290: ut_testConflictingPreviousAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #22 ran for 0.002ms ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:1054: ut_assignors 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: assignors: PASS 3: [0000_unittests / 2.750s] 0000_unittests: duration 2749.858ms 3: [0000_unittests / 2.750s] ================= Test 0000_unittests PASSED ================= 3: [0045_subscribe_update_mock / 1.601s] Creating topic topic_4 3: [0045_subscribe_update_mock / 1.601s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.025ms 3: [0045_subscribe_update_mock / 1.601s] POLL: not expecting any messages for 300ms 3: [0025_timers / 2.802s] rd_kafka_poll(): duration 100.081ms 3: [
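[editor's note] The sticky-assignor unit tests above (Tests #0..#22) cover the balancing and stickiness rules behind the cooperative-sticky partition assignment strategy. As a hedged sketch, not taken from the test suite, a consumer opts into that assignor via configuration; the broker address and group id are placeholders:

    #include <librdkafka/rdkafka.h>

    /* Sketch: create a consumer that uses the cooperative-sticky assignor. */
    static rd_kafka_t *make_sticky_consumer(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "group.id", "example-group",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "partition.assignment.strategy",
                          "cooperative-sticky", errstr, sizeof(errstr));

        /* On success rd_kafka_new() takes ownership of conf. */
        return rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
    }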
/ 2.850s] Too many tests running (5 >= 5): postponing 0062_stats_event start... 3: [0058_log / 0.000s] ================= Running test 0058_log ================= 3: [0058_log / 0.000s] ==== Stats written to file stats_0058_log_8397226007577078987.json ==== 3: [0058_log / 0.000s] main.queue: Creating producer, not expecting any log messages 3: [0025_timers / 2.902s] rd_kafka_poll(): duration 100.083ms 3: [0025_timers / 3.002s] Call #4: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.002s] rd_kafka_poll(): duration 99.578ms 3: [0045_subscribe_update_mock / 1.901s] CONSUME: duration 300.079ms 3: [0045_subscribe_update_mock / 1.901s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 1.901s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 3.102s] rd_kafka_poll(): duration 100.083ms 3: [0045_subscribe_update_mock / 2.002s] Creating topic topic_5 3: [0045_subscribe_update_mock / 2.002s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.033ms 3: [0045_subscribe_update_mock / 2.002s] POLL: not expecting any messages for 300ms 3: [0025_timers / 3.202s] rd_kafka_poll(): duration 100.074ms 3: [0025_timers / 3.302s] rd_kafka_poll(): duration 100.085ms 3: [0053_stats_timing / 1.100s] Stats (#10): { "name": "rdkafka#producer-36", "client_id": "rdkafka", "type": "producer", "ts":2447667477806, "time":1675737015, "age":1100296, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0025_timers / 3.402s] rd_kafka_poll(): duration 100.086ms 3: [0053_stats_timing / 1.200s] 12 (expected 12) stats callbacks received in 1200ms (expected 1200ms +-25%) 3: [0053_stats_timing / 1.201s] 0053_stats_timing: duration 1200.540ms 3: [0053_stats_timing / 1.201s] ================= Test 0053_stats_timing PASSED ================= 3: [
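[editor's note] Test 0053 above counts statistics callbacks fired at the configured interval ("12 (expected 12) stats callbacks received in 1200ms") and dumps the stats JSON seen in the log. A hedged sketch of wiring up the same mechanism in an application; the 100 ms interval simply mirrors the test output:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: receive the periodic statistics JSON.
     * Returning 0 lets librdkafka free the json buffer itself. */
    static int stats_cb(rd_kafka_t *rk, char *json, size_t json_len,
                        void *opaque) {
        (void)rk; (void)opaque;
        fprintf(stderr, "stats (%zu bytes): %.60s...\n", json_len, json);
        return 0;
    }

    static void enable_stats(rd_kafka_conf_t *conf) {
        char errstr[512];
        rd_kafka_conf_set_stats_cb(conf, stats_cb);
        rd_kafka_conf_set(conf, "statistics.interval.ms", "100",
                          errstr, sizeof(errstr));
    }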
/ 3.466s] Too many tests running (5 >= 5): postponing 0066_plugins start... 3: [0062_stats_event / 0.000s] ================= Running test 0062_stats_event ================= 3: [0062_stats_event / 0.000s] ==== Stats written to file stats_0062_stats_event_6181547169544463346.json ==== 3: [0062_stats_event / 0.000s] Test config file test.conf not found 3: [0062_stats_event / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1675737015.288|CONFWARN|0062_stats_event#producer-40| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0062_stats_event / 0.000s] Created kafka instance 0062_stats_event#producer-40 3: [0045_subscribe_update_mock / 2.302s] CONSUME: duration 300.087ms 3: [0045_subscribe_update_mock / 2.302s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 2.302s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 3.502s] rd_kafka_poll(): duration 100.086ms 3: [0062_stats_event / 0.100s] Stats event 3: [0062_stats_event / 0.100s] Stats: { "name": "0062_stats_event#producer-40", "client_id": "0062_stats_event", "type": "producer", "ts":2447667678316, "time":1675737015, "age":100132, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.100s] STATS_EVENT: duration 100.073ms 3: [0045_subscribe_update_mock / 2.402s] Creating topic topic_6 3: [0045_subscribe_update_mock / 2.402s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.023ms 3: [0045_subscribe_update_mock / 2.402s] POLL: not expecting any messages for 300ms 3: [0025_timers / 3.602s] Call #5: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.602s] rd_kafka_poll(): duration 99.522ms 3: [0062_stats_event / 0.200s] Stats event 3: [0062_stats_event / 0.200s] Stats: { "name": "0062_stats_event#producer-40", "client_id": "0062_stats_event", "type": "producer", "ts":2447667778327, "time":1675737015, "age":200143, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.200s] STATS_EVENT: duration 99.998ms 3: [0025_timers / 3.702s] rd_kafka_poll(): duration 100.086ms 3: [0062_stats_event / 0.300s] Stats event 3: [0062_stats_event / 0.300s] Stats: { "name": "0062_stats_event#producer-40", "client_id": "0062_stats_event", "type": "producer", "ts":2447667878333, "time":1675737015, "age":300149, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.300s] STATS_EVENT: duration 100.003ms 3: [0025_timers / 3.802s] rd_kafka_poll(): duration 100.086ms 3: [0058_log / 1.000s] main.queue: Setting log queue 3: [0058_log / 1.000s] main.queue: Expecting at least one log message 3: [0058_log / 1.000s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-39 initialized (builtin.features 
snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 1.000s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 1.000s] main.queue: Saw 2 logs 3: [0058_log / 1.000s] local.queue: Creating producer, not expecting any log messages 3: [0062_stats_event / 0.400s] Stats event 3: [0062_stats_event / 0.400s] Stats: { "name": "0062_stats_event#producer-40", "client_id": "0062_stats_event", "type": "producer", "ts":2447667978340, "time":1675737015, "age":400156, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.400s] STATS_EVENT: duration 100.000ms 3: [0045_subscribe_update_mock / 2.702s] CONSUME: duration 300.071ms 3: [0045_subscribe_update_mock / 2.702s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 2.702s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 3.902s] rd_kafka_poll(): duration 100.086ms 3: [0062_stats_event / 0.500s] Stats event 3: [0062_stats_event / 0.500s] Stats: { "name": "0062_stats_event#producer-40", "client_id": "0062_stats_event", "type": "producer", "ts":2447668078347, "time":1675737015, "age":500163, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.500s] STATS_EVENT: duration 100.003ms 3: [0062_stats_event / 0.500s] 0062_stats_event: duration 500.351ms 3: [0062_stats_event / 0.500s] ================= Test 0062_stats_event PASSED ================= 3: [0045_subscribe_update_mock / 2.802s] Creating topic topic_7 3: [0045_subscribe_update_mock / 2.802s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.025ms 3: [0045_subscribe_update_mock / 2.802s] POLL: not expecting any messages for 300ms 3: [0025_timers / 4.002s] rd_kafka_poll(): duration 100.076ms 3: [
/ 4.066s] Too many tests running (5 >= 5): postponing 0072_headers_ut start... 3: [0066_plugins / 0.000s] ================= Running test 0066_plugins ================= 3: [0066_plugins / 0.000s] ==== Stats written to file stats_0066_plugins_7538038226102718257.json ==== 3: [0066_plugins / 0.000s] Using topic "rdkafkatest_rnd7cbe029c37894fc5_0066_plugins" 3: [0066_plugins / 0.000s] running test from cwd /usr/src/RPM/BUILD/librdkafka-1.9.2/tests 3: [0066_plugins / 0.000s] set(session.timeout.ms, 6000) 3: [0066_plugins / 0.000s] set(plugin.library.paths, interceptor_test/interceptor_test) 3: [0066_plugins / 0.000s] conf_init(conf 0x7fa4600008e0) called (setting opaque to 0x7fa4741810ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fa4600008e0) for ici 0x7fa460002770 with ici->conf 0x7fa460001030 3: [0066_plugins / 0.000s] set(socket.timeout.ms, 12) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600008e0, "socket.timeout.ms", "12"): 0x7fa460002770 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600008e0, "socket.timeout.ms", "12"): 0x7fa460002770 3: [0066_plugins / 0.000s] set(interceptor_test.config1, one) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600008e0, "interceptor_test.config1", "one"): 0x7fa460002770 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600008e0, interceptor_test.config1, one): 0x7fa460002770 3: [0066_plugins / 0.000s] set(interceptor_test.config2, two) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600008e0, "interceptor_test.config2", "two"): 0x7fa460002770 3: [0066_plugins / 0.000s] set(topic.metadata.refresh.interval.ms, 1234) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0x7fa460003e50, old_conf 0x7fa4600008e0, filter_cnt 0, ici 0x7fa460002770) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fa460003e50) for ici 0x7fa4600020b0 with ici->conf 0x7fa4600045a0 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460003e50, "socket.timeout.ms", "12"): 0x7fa4600020b0 3: [0066_plugins / 0.000s] conf_init(conf 0x7fa4600045a0) called (setting opaque to 0x7fa4741810ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fa4600045a0) for ici 0x7fa460004e50 with ici->conf 0x7fa460004e80 3: [0066_plugins / 0.000s] conf_init(conf 0x7fa460003e50) called (setting opaque to 0x7fa4741810ea) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0x7fa460005d60, old_conf 0x7fa460003e50, filter_cnt 2, ici 0x7fa4600020b0) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fa460005d60) for ici 0x7fa460006660 with ici->conf 0x7fa460006690 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460005d60, "socket.timeout.ms", "12"): 0x7fa460006660 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fa460003e50) for ici 0x7fa460005d30 with ici->conf 0x7fa460005d60 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460003e50, "interceptor_test.config1", "one"): 0x7fa4600020b0 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460003e50, interceptor_test.config1, one): 0x7fa4600020b0 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460003e50, "interceptor_test.config2", "two"): 0x7fa4600020b0 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa460003e50, "session.timeout.ms", "6000"): 0x7fa4600020b0 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600045a0, "session.timeout.ms", "6000"): 0x7fa460004e50 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fa4600045a0, "session.timeout.ms", "6000"): 0x7fa460004e50 3: [0066_plugins / 0.000s] on_new(rk 0x7fa4600076f0, conf 0x7fa460007828, ici->conf 0x7fa4600045a0): 0x7fa4600020b0: #1 3: %4|1675737015.890|CONFWARN|rdkafka#producer-42| [thrd:app]: Configuration property 
session.timeout.ms is a consumer property and will be ignored by this producer instance 3: %5|1675737015.890|CONFWARN|rdkafka#producer-42| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0066_plugins / 0.001s] 0066_plugins: duration 0.764ms 3: [0066_plugins / 0.001s] ================= Test 0066_plugins PASSED ================= 3: [
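[editor's note] Test 0066 above loads an interceptor through the plugin mechanism (set(plugin.library.paths, interceptor_test/interceptor_test) followed by the on_conf_set/on_conf_dup/on_new callbacks). As a sketch only, with a placeholder library name (the interceptor_test library belongs to the test suite), a plugin is attached by path via configuration before the client is created:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: load an interceptor plugin; librdkafka resolves the name
     * with dlopen(), appending the platform's shared-library extension. */
    static void add_plugin(rd_kafka_conf_t *conf) {
        char errstr[512];
        if (rd_kafka_conf_set(conf, "plugin.library.paths",
                              "monitoring-interceptor",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK)
            fprintf(stderr, "plugin load failed: %s\n", errstr);
    }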
/ 4.067s] Too many tests running (5 >= 5): postponing 0074_producev start... 3: [0072_headers_ut / 0.000s] ================= Running test 0072_headers_ut ================= 3: [0072_headers_ut / 0.000s] ==== Stats written to file stats_0072_headers_ut_157483858678902009.json ==== 3: [0072_headers_ut / 0.000s] Using topic "rdkafkatest_0072_headers_ut" 3: %5|1675737015.890|CONFWARN|0072_headers_ut#producer-43| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0072_headers_ut / 0.000s] Created kafka instance 0072_headers_ut#producer-43 3: [0025_timers / 4.102s] rd_kafka_poll(): duration 100.084ms 3: [0025_timers / 4.202s] Call #6: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.202s] rd_kafka_poll(): duration 99.517ms 3: [0045_subscribe_update_mock / 3.102s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 3.102s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 3.102s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 4.302s] rd_kafka_poll(): duration 100.084ms 3: [0045_subscribe_update_mock / 3.202s] Creating topic topic_8 3: [0045_subscribe_update_mock / 3.202s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.024ms 3: [0045_subscribe_update_mock / 3.202s] POLL: not expecting any messages for 300ms 3: [0025_timers / 4.402s] rd_kafka_poll(): duration 100.085ms 3: [0025_timers / 4.502s] rd_kafka_poll(): duration 100.086ms 3: [0025_timers / 4.602s] rd_kafka_poll(): duration 100.086ms 3: [0045_subscribe_update_mock / 3.502s] CONSUME: duration 300.079ms 3: [0045_subscribe_update_mock / 3.502s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 3.502s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 4.703s] rd_kafka_poll(): duration 100.086ms 3: [0045_subscribe_update_mock / 3.602s] Creating topic topic_9 3: [0045_subscribe_update_mock / 3.602s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.027ms 3: [0045_subscribe_update_mock / 3.602s] POLL: not expecting any messages for 300ms 3: [0025_timers / 4.802s] Call #7: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.802s] rd_kafka_poll(): duration 99.510ms 3: [0058_log / 2.001s] local.queue: Setting log queue 3: [0058_log / 2.001s] local.queue: Expecting at least one log message 3: [0058_log / 2.001s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-41 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 2.001s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 2.001s] local.queue: Saw 2 logs 3: [0058_log / 2.001s] 0058_log: duration 2000.822ms 3: [0058_log / 2.001s] ================= Test 0058_log PASSED ================= 3: [
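[editor's note] Test 0058 above ("Setting log queue", "Expecting at least one log message") verifies that client logs can be forwarded to an application-polled queue instead of the default stderr logger. A hedged sketch of that pattern, assuming an existing handle rk whose configuration also had the log.queue property enabled, as the test does:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: forward librdkafka logs to a queue and print them. */
    static void drain_logs(rd_kafka_t *rk) {
        rd_kafka_queue_t *logq = rd_kafka_queue_new(rk);
        rd_kafka_event_t *ev;

        rd_kafka_set_log_queue(rk, logq);

        /* Poll until no log event arrives within 1 s. */
        while ((ev = rd_kafka_queue_poll(logq, 1000))) {
            if (rd_kafka_event_type(ev) == RD_KAFKA_EVENT_LOG) {
                const char *fac, *str;
                int level;
                rd_kafka_event_log(ev, &fac, &str, &level);
                fprintf(stderr, "LOG-%d %s: %s\n", level, fac, str);
            }
            rd_kafka_event_destroy(ev);
        }
        rd_kafka_queue_destroy(logq);
    }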
/ 4.851s] Too many tests running (5 >= 5): postponing 0078_c_from_cpp start... 3: [0074_producev / 0.000s] ================= Running test 0074_producev ================= 3: [0074_producev / 0.000s] ==== Stats written to file stats_0074_producev_3217656269064140578.json ==== 3: %5|1675737016.674|CONFWARN|0074_producev#producer-44| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0074_producev / 0.000s] Created kafka instance 0074_producev#producer-44 3: [0074_producev / 0.000s] produceva() error (expected): Failed to produce message: Broker: Message size too large 3: [0074_producev / 0.000s] 0074_producev: duration 0.236ms 3: [0074_producev / 0.000s] ================= Test 0074_producev PASSED ================= 3: [0025_timers / 4.902s] rd_kafka_poll(): duration 100.083ms 3: [0079_fork / 0.000s] WARN: SKIPPING TEST: Filtered due to negative test flags 3: [
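[editor's note] Tests 0072 and 0074 above cover message headers and the varargs produce API (the "Message size too large" error in 0074 is expected by the test). A minimal sketch combining the two, assuming a producer handle rk; the topic name, payload and header are placeholders:

    #include <string.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: produce one message carrying a header via rd_kafka_producev(). */
    static rd_kafka_resp_err_t produce_with_header(rd_kafka_t *rk) {
        const char *payload = "hello";

        return rd_kafka_producev(
            rk,
            RD_KAFKA_V_TOPIC("example_topic"),
            RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
            RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
            RD_KAFKA_V_HEADER("trace-id", "42", -1), /* -1: use strlen(value) */
            RD_KAFKA_V_END);
    }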
/ 4.952s] Too many tests running (5 >= 5): postponing 0080_admin_ut start... 3: [0078_c_from_cpp / 0.000s] ================= Running test 0078_c_from_cpp ================= 3: [0078_c_from_cpp / 0.000s] ==== Stats written to file stats_0078_c_from_cpp_2780383119200496694.json ==== 3: %5|1675737016.774|CONFWARN|myclient#producer-45| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0078_c_from_cpp / 0.000s] Compare C name myclient#producer-45 to C++ name myclient#producer-45 3: [0078_c_from_cpp / 0.000s] Compare C topic mytopic to C++ topic mytopic 3: [0078_c_from_cpp / 0.000s] 0078_c_from_cpp: duration 0.189ms 3: [0078_c_from_cpp / 0.000s] ================= Test 0078_c_from_cpp PASSED ================= 3: [
/ 4.952s] Too many tests running (5 >= 5): postponing 0084_destroy_flags_local start... 3: [0080_admin_ut / 0.000s] ================= Running test 0080_admin_ut ================= 3: [0080_admin_ut / 0.000s] ==== Stats written to file stats_0080_admin_ut_1253391059928943900.json ==== 3: [0080_admin_ut / 0.000s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 0.000s] Test config file test.conf not found 3: %5|1675737016.775|CONFWARN|0080_admin_ut#producer-46| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 5.002s] rd_kafka_poll(): duration 100.075ms 3: [0080_admin_ut / 0.100s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.100s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.100s] rd_kafka_destroy(): duration 0.140ms 3: [0080_admin_ut / 0.100s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.10s) ] 3: [0080_admin_ut / 0.100s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.100s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 0.100s] Test config file test.conf not found 3: %5|1675737016.875|CONFWARN|0080_admin_ut#producer-47| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 3.903s] CONSUME: duration 300.066ms 3: [0045_subscribe_update_mock / 3.903s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 3.903s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0072_headers_ut / 1.000s] 0072_headers_ut: duration 1000.217ms 3: [0072_headers_ut / 1.000s] ================= Test 0072_headers_ut PASSED ================= 3: [
/ 5.067s] Too many tests running (5 >= 5): postponing 0086_purge_local start... 3: [0084_destroy_flags_local / 0.000s] ================= Running test 0084_destroy_flags_local ================= 3: [0084_destroy_flags_local / 0.000s] ==== Stats written to file stats_0084_destroy_flags_local_600226379788796873.json ==== 3: [0084_destroy_flags_local / 0.000s] Using topic "rdkafkatest_rndfd36fbb711ca4dd_destroy_flags" 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1675737016.890|CONFWARN|0084_destroy_flags_local#producer-48| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.000s] Created kafka instance 0084_destroy_flags_local#producer-48 3: [0084_destroy_flags_local / 0.000s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.000s] rd_kafka_destroy_flags(0x0): duration 0.081ms 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1675737016.891|CONFWARN|0084_destroy_flags_local#producer-49| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.000s] Created kafka instance 0084_destroy_flags_local#producer-49 3: [0084_destroy_flags_local / 0.000s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 0.000s] rd_kafka_destroy_flags(0x8): duration 0.073ms 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1675737016.891|CONFWARN|0084_destroy_flags_local#producer-50| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.001s] Created kafka instance 0084_destroy_flags_local#producer-50 3: [0084_destroy_flags_local / 0.001s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.001s] Produce to rdkafkatest_rndfd36fbb711ca4dd_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.006s] SUM(POLL): duration 0.000ms 3: [0084_destroy_flags_local / 0.006s] PRODUCE: duration 5.035ms 3: [0084_destroy_flags_local / 0.006s] Calling rd_kafka_destroy_flags(0x0) 3: %4|1675737016.896|TERMINATE|0084_destroy_flags_local#producer-50| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.006s] rd_kafka_destroy_flags(0x0): duration 0.756ms 3: [0084_destroy_flags_local / 0.006s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: 
[0084_destroy_flags_local / 0.006s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.006s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.006s] Setting test timeout to 20s * 2.7 3: %5|1675737016.897|CONFWARN|0084_destroy_flags_local#producer-51| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.006s] Created kafka instance 0084_destroy_flags_local#producer-51 3: [0084_destroy_flags_local / 0.006s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.006s] Produce to rdkafkatest_rndfd36fbb711ca4dd_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.011s] SUM(POLL): duration 0.000ms 3: [0084_destroy_flags_local / 0.011s] PRODUCE: duration 4.699ms 3: [0084_destroy_flags_local / 0.011s] Calling rd_kafka_destroy_flags(0x8) 3: %4|1675737016.901|TERMINATE|0084_destroy_flags_local#producer-51| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.012s] rd_kafka_destroy_flags(0x8): duration 0.789ms 3: [0084_destroy_flags_local / 0.012s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.012s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.012s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.012s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.012s] Created kafka instance 0084_destroy_flags_local#consumer-52 3: [0025_timers / 5.102s] rd_kafka_poll(): duration 100.059ms 3: [0080_admin_ut / 0.201s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.201s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.201s] rd_kafka_destroy(): duration 0.152ms 3: [0080_admin_ut / 0.201s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.10s) ] 3: [0080_admin_ut / 0.201s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.201s] Test config file test.conf not found 3: %5|1675737016.976|CONFWARN|0080_admin_ut#producer-53| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 0.201s] [ do_test_options:1588 ] 3: [0080_admin_ut / 0.201s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 0.201s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd77c98a210b18371_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd2ba45d9415ff2f60_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd4335409b5c5eca3e_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd69af86955bc189b5_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd29fd3360318ace1b_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Using topic "rdkafkatest_rnd1299b13543b5dd4c_do_test_CreateTopics" 3: [0080_admin_ut / 0.201s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.201s] CreateTopics: duration 0.093ms 3: [0045_subscribe_update_mock / 4.003s] Creating topic topic_10 3: [0045_subscribe_update_mock / 4.003s] 
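[editor's note] Test 0084 above iterates rd_kafka_destroy_flags() over producer and consumer instances; the 0x8 flag seen in the log is RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE. A short hedged sketch of the consumer case, assuming an existing consumer handle rk:

    #include <librdkafka/rdkafka.h>

    /* Sketch: tear down a consumer without the implicit
     * rd_kafka_consumer_close() (flag 0x8 in the log above). */
    static void fast_shutdown(rd_kafka_t *rk) {
        rd_kafka_destroy_flags(rk, RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE);
    }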
rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.019ms 3: [0045_subscribe_update_mock / 4.003s] POLL: not expecting any messages for 300ms 3: [0025_timers / 5.202s] rd_kafka_poll(): duration 100.083ms 3: [0080_admin_ut / 0.301s] CreateTopics.queue_poll: duration 99.943ms 3: [0080_admin_ut / 0.301s] CreateTopics: got CreateTopicsResult in 99.943s 3: [0080_admin_ut / 0.301s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 0.301s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd4a2300fa45e55c08_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd76ca6a4e74c5d32a_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd363a9685062ac3b1_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd3e8f053b5e4c077b_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd2fabaa1865c8a01d_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Using topic "rdkafkatest_rnd4ac74738116cfdb1_do_test_CreateTopics" 3: [0080_admin_ut / 0.301s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.302s] CreateTopics: duration 0.099ms 3: [0025_timers / 5.302s] rd_kafka_poll(): duration 100.086ms 3: [0080_admin_ut / 0.401s] CreateTopics.wait_background_event_cb: duration 99.947ms 3: [0080_admin_ut / 0.401s] CreateTopics: got CreateTopicsResult in 99.947s 3: [0080_admin_ut / 0.402s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 0.402s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd6ca1a30554a23e4c_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd246f4f6f3e51c4e2_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd50c5ec68684ef842_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd40216d547ae8a977_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd2c3af0dc450baa72_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Using topic "rdkafkatest_rnd46be615d3bd614c0_do_test_CreateTopics" 3: [0080_admin_ut / 0.402s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.402s] CreateTopics: duration 0.068ms 3: [0025_timers / 5.402s] Call #8: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 5.402s] rd_kafka_poll(): duration 99.559ms 3: [0045_subscribe_update_mock / 4.303s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 4.303s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 4.303s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 5.502s] rd_kafka_poll(): duration 100.086ms 3: [0080_admin_ut / 0.602s] CreateTopics.queue_poll: duration 199.954ms 3: [0080_admin_ut / 0.602s] CreateTopics: got CreateTopicsResult in 199.954s 3: [0080_admin_ut / 0.602s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.602s] [ do_test_CreateTopics:101: 
0080_admin_ut#producer-53 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd7d7b9957210ab45_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd4a81eea309fb081_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd114bf3fa604b5a99_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd2ea14c7b2b12a1d1_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd61f5d45c17b444d7_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Using topic "rdkafkatest_rnd71a9267b3c239446_do_test_CreateTopics" 3: [0080_admin_ut / 0.602s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.602s] CreateTopics: duration 0.075ms 3: [0045_subscribe_update_mock / 4.403s] Creating topic topic_11 3: [0045_subscribe_update_mock / 4.403s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.025ms 3: [0045_subscribe_update_mock / 4.403s] POLL: not expecting any messages for 300ms 3: [0084_destroy_flags_local / 0.512s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.513s] rd_kafka_destroy_flags(0x0): duration 0.220ms 3: [0084_destroy_flags_local / 0.513s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.513s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.513s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.513s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.513s] Created kafka instance 0084_destroy_flags_local#consumer-54 3: [0025_timers / 5.602s] rd_kafka_poll(): duration 100.086ms 3: [0025_timers / 5.702s] rd_kafka_poll(): duration 100.087ms 3: [0080_admin_ut / 0.802s] CreateTopics.queue_poll: duration 199.949ms 3: [0080_admin_ut / 0.802s] CreateTopics: got CreateTopicsResult in 199.949s 3: [0080_admin_ut / 0.802s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-53 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.802s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 0.802s] Using topic "rdkafkatest_rnd5d1e024f0ce980ae_do_test_DeleteTopics" 3: [0080_admin_ut / 0.802s] Using topic "rdkafkatest_rnd72501ad87a1dac3e_do_test_DeleteTopics" 3: [0080_admin_ut / 0.802s] Using topic "rdkafkatest_rnd4d0aee026d38c450_do_test_DeleteTopics" 3: [0080_admin_ut / 0.802s] Using topic "rdkafkatest_rnd5ad333797945dede_do_test_DeleteTopics" 3: [0080_admin_ut / 0.802s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 0.802s] DeleteTopics: duration 0.006ms 3: [0025_timers / 5.802s] rd_kafka_poll(): duration 100.086ms 3: [0080_admin_ut / 0.902s] DeleteTopics.queue_poll: duration 100.016ms 3: [0080_admin_ut / 0.902s] DeleteTopics: got DeleteTopicsResult in 100.016s 3: [0080_admin_ut / 0.902s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 0.902s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.902s] Using topic "rdkafkatest_rnd32446ec2016e16bc_do_test_DeleteTopics" 3: [0080_admin_ut / 0.902s] Using topic "rdkafkatest_rnd4004403b6e1a8382_do_test_DeleteTopics" 3: [0080_admin_ut / 0.902s] Using topic "rdkafkatest_rnd1ccecd2947dbf9d1_do_test_DeleteTopics" 3: [0080_admin_ut / 0.902s] Using topic 
"rdkafkatest_rnd602b2ec83e5a4748_do_test_DeleteTopics" 3: [0080_admin_ut / 0.902s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 0.902s] DeleteTopics: duration 0.005ms 3: [0045_subscribe_update_mock / 4.703s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 4.703s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 4.703s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0025_timers / 5.903s] rd_kafka_poll(): duration 100.085ms 3: [0045_subscribe_update_mock / 4.803s] Creating topic topic_12 3: [0045_subscribe_update_mock / 4.803s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.025ms 3: [0045_subscribe_update_mock / 4.803s] POLL: not expecting any messages for 300ms 3: [0025_timers / 6.002s] Call #9: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 6.002s] rd_kafka_poll(): duration 99.505ms 3: [0025_timers / 6.002s] All 10 intervals okay 3: [0025_timers / 6.002s] 0025_timers: duration 6002.196ms 3: [0025_timers / 6.002s] ================= Test 0025_timers PASSED ================= 3: [0080_admin_ut / 1.102s] DeleteTopics.queue_poll: duration 200.018ms 3: [0080_admin_ut / 1.102s] DeleteTopics: got DeleteTopicsResult in 200.018s 3: [0080_admin_ut / 1.102s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.102s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-53 DeleteTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.102s] Using topic "rdkafkatest_rnd4c8418bb10cadf49_do_test_DeleteTopics" 3: [0080_admin_ut / 1.102s] Using topic "rdkafkatest_rnd3e31c8e15dd00cb6_do_test_DeleteTopics" 3: [0080_admin_ut / 1.102s] Using topic "rdkafkatest_rnd711639e223d1ea98_do_test_DeleteTopics" 3: [0080_admin_ut / 1.102s] Using topic "rdkafkatest_rndc7159311c28dbb4_do_test_DeleteTopics" 3: [0080_admin_ut / 1.102s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 1.102s] DeleteTopics: duration 0.004ms 3: [0084_destroy_flags_local / 1.013s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 1.013s] rd_kafka_destroy_flags(0x8): duration 0.203ms 3: [0084_destroy_flags_local / 1.013s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 1.013s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.013s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.013s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.014s] Created kafka instance 0084_destroy_flags_local#consumer-55 3: [
/ 6.103s] Too many tests running (5 >= 5): postponing 0095_all_brokers_down start... 3: [0086_purge_local / 0.000s] ================= Running test 0086_purge_local ================= 3: [0086_purge_local / 0.000s] ==== Stats written to file stats_0086_purge_local_1936277357466168718.json ==== 3: [0086_purge_local / 0.000s] Using topic "rdkafkatest_0086_purge" 3: [0086_purge_local / 0.000s] Test rd_kafka_purge(): local 3: [0086_purge_local / 0.000s] Test config file test.conf not found 3: [0086_purge_local / 0.000s] Setting test timeout to 20s * 2.7 3: %4|1675737017.926|CONFWARN|0086_purge_local#producer-56| [thrd:app]: Configuration property enable.gapless.guarantee is experimental: When set to `true`, any error that could result in a gap in the produced message series when a batch of messages fails, will raise a fatal error (ERR__GAPLESS_GUARANTEE) and stop the producer. Messages failing due to `message.timeout.ms` are not covered by this guarantee. Requires `enable.idempotence=true`. 3: %5|1675737017.926|CONFWARN|0086_purge_local#producer-56| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0086_purge_local / 0.000s] Created kafka instance 0086_purge_local#producer-56 3: [0086_purge_local / 0.000s] Producing 20 messages to topic rdkafkatest_0086_purge 3: [0086_purge_local / 0.000s] local:281: purge(0x2): expecting 20 messages to remain when done 3: [0086_purge_local / 0.000s] local:281: purge(0x2): duration 0.002ms 3: [0086_purge_local / 0.000s] local:285: purge(0x1): expecting 0 messages to remain when done 3: [0086_purge_local / 0.000s] local:285: purge(0x1): duration 0.001ms 3: [0086_purge_local / 0.000s] DeliveryReport for msg #0: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #1: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #2: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #3: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #4: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #5: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #6: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #7: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #8: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #9: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #10: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #11: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #12: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #13: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #14: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #15: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #16: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #17: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #18: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #19: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] 0086_purge_local: duration 0.311ms 3: [0086_purge_local / 0.000s] ================= Test 0086_purge_local PASSED ================= 3: [
/ 6.103s] Too many tests running (5 >= 5): postponing 0097_ssl_verify_local start... 3: [0095_all_brokers_down / 0.000s] ================= Running test 0095_all_brokers_down ================= 3: [0095_all_brokers_down / 0.000s] ==== Stats written to file stats_0095_all_brokers_down_3737179048880531379.json ==== 3: [0095_all_brokers_down / 0.000s] Setting test timeout to 20s * 2.7 3: [0095_all_brokers_down / 0.000s] Test Producer 3: [
/ 6.104s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 0.000s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0080_admin_ut / 1.302s] DeleteTopics.queue_poll: duration 200.020ms 3: [0080_admin_ut / 1.302s] DeleteTopics: got DeleteTopicsResult in 200.020s 3: [0080_admin_ut / 1.302s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.302s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.302s] Using topic "rdkafkatest_rnd601054097000b4d1_do_test_DeleteGroups" 3: [0080_admin_ut / 1.302s] Using topic "rdkafkatest_rnd708f8a493d2e5659_do_test_DeleteGroups" 3: [0080_admin_ut / 1.302s] Using topic "rdkafkatest_rnd7cea357f62dfa522_do_test_DeleteGroups" 3: [0080_admin_ut / 1.302s] Using topic "rdkafkatest_rnd374c029749f52382_do_test_DeleteGroups" 3: [0080_admin_ut / 1.302s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 1.302s] DeleteGroups: duration 0.011ms 3: [0045_subscribe_update_mock / 5.103s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 5.103s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 5.103s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0034_offset_reset_mock / 6.177s] Bringing up the broker 3: [0080_admin_ut / 1.402s] DeleteGroups.queue_poll: duration 100.033ms 3: [0080_admin_ut / 1.402s] DeleteGroups: got DeleteGroupsResult in 100.033s 3: [0080_admin_ut / 1.402s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.402s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.402s] Using topic "rdkafkatest_rnd50186972121f3611_do_test_DeleteGroups" 3: [0080_admin_ut / 1.402s] Using topic "rdkafkatest_rnd433b0260025cd834_do_test_DeleteGroups" 3: [0080_admin_ut / 1.402s] Using topic "rdkafkatest_rnd138d4ccd033f429c_do_test_DeleteGroups" 3: [0080_admin_ut / 1.402s] Using topic "rdkafkatest_rnd70775bb6305c19f7_do_test_DeleteGroups" 3: [0080_admin_ut / 1.402s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.402s] DeleteGroups: duration 0.008ms 3: [0045_subscribe_update_mock / 5.203s] Creating topic topic_13 3: [0045_subscribe_update_mock / 5.203s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.035ms 3: [0045_subscribe_update_mock / 5.203s] POLL: not expecting any messages for 300ms 3: [0080_admin_ut / 1.602s] DeleteGroups.queue_poll: duration 200.019ms 3: [0080_admin_ut / 1.602s] DeleteGroups: got DeleteGroupsResult in 200.019s 3: [0080_admin_ut / 1.602s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.602s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-53 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.602s] Using topic "rdkafkatest_rnd4b1b3c6d50a28a7e_do_test_DeleteGroups" 3: [0080_admin_ut / 1.602s] Using topic "rdkafkatest_rnd6eb6613f179f5528_do_test_DeleteGroups" 3: [0080_admin_ut / 1.602s] Using topic "rdkafkatest_rnd616d69c72ce82a20_do_test_DeleteGroups" 3: [0080_admin_ut / 1.602s] Using topic "rdkafkatest_rnd756f61de5283a3aa_do_test_DeleteGroups" 3: [0080_admin_ut / 1.602s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.602s] 
DeleteGroups: duration 0.007ms 3: [0084_destroy_flags_local / 1.514s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 1.514s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 1.514s] rd_kafka_destroy_flags(0x0): duration 0.129ms 3: [0084_destroy_flags_local / 1.514s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 1.514s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.514s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.514s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.514s] Created kafka instance 0084_destroy_flags_local#consumer-58 3: [0045_subscribe_update_mock / 5.503s] CONSUME: duration 300.080ms 3: [0045_subscribe_update_mock / 5.503s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 5.503s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0034_offset_reset_mock / 6.559s] #0: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.559s] #0: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.559s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 1.802s] DeleteGroups.queue_poll: duration 200.020ms 3: [0080_admin_ut / 1.802s] DeleteGroups: got DeleteGroupsResult in 200.020s 3: [0080_admin_ut / 1.802s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.802s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.802s] Using topic "rdkafkatest_rnd50ba14b801e0bb10_do_test_DeleteRecords" 3: [0080_admin_ut / 1.802s] Using topic "rdkafkatest_rnd6eac7f5e6b991eb8_do_test_DeleteRecords" 3: [0080_admin_ut / 1.802s] Using topic "rdkafkatest_rnd7047e89e22899fe9_do_test_DeleteRecords" 3: [0080_admin_ut / 1.802s] Using topic "rdkafkatest_rnd5227866b50583ca7_do_test_DeleteRecords" 3: [0080_admin_ut / 1.802s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 1.802s] DeleteRecords: duration 0.017ms 3: [0045_subscribe_update_mock / 5.603s] Creating topic topic_14 3: [0045_subscribe_update_mock / 5.603s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.024ms 3: [0045_subscribe_update_mock / 5.603s] POLL: not expecting any messages for 300ms 3: [0080_admin_ut / 1.902s] DeleteRecords.queue_poll: duration 100.012ms 3: [0080_admin_ut / 1.902s] DeleteRecords: got DeleteRecordsResult in 100.012s 3: [0080_admin_ut / 1.902s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 1.902s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.902s] Using topic "rdkafkatest_rnd128a54ba42b710b5_do_test_DeleteRecords" 3: [0080_admin_ut / 1.902s] Using topic "rdkafkatest_rndd8693000f748a39_do_test_DeleteRecords" 3: [0080_admin_ut / 1.902s] Using topic "rdkafkatest_rnd2596b5d744d29598_do_test_DeleteRecords" 3: [0080_admin_ut / 1.902s] Using topic "rdkafkatest_rnd5969adbb75af1f49_do_test_DeleteRecords" 3: [0080_admin_ut / 1.902s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 1.902s] DeleteRecords: duration 0.011ms 3: [0080_admin_ut / 2.102s] DeleteRecords.queue_poll: duration 200.015ms 
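The 0084_destroy_flags_local entries above exercise rd_kafka_destroy_flags(): the handle is torn down while 10000 messages are still queued, so librdkafka logs the TERMINATE warning and drops the queue. An application that wants those messages delivered would call rd_kafka_flush() before destroying the handle. Below is a minimal flush-before-destroy sketch; it is not taken from the test suite, and the broker address "localhost:9092" and topic "test_topic" are placeholders.

/* Flush outstanding messages before destroying the producer so it does not
 * terminate with messages still in queue (the TERMINATE warning above). */
#include <stdio.h>
#include <string.h>
#include <librdkafka/rdkafka.h>

int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Without bootstrap.servers librdkafka logs the CONFWARN seen in the
         * test output and cannot reach any broker. */
        if (rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                fprintf(stderr, "%s\n", errstr);
                return 1;
        }

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
                fprintf(stderr, "%s\n", errstr);
                return 1;
        }

        const char *payload = "hello";
        rd_kafka_producev(rk,
                          RD_KAFKA_V_TOPIC("test_topic"),
                          RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                          RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                          RD_KAFKA_V_END);

        /* Wait for delivery reports instead of dropping queued messages;
         * the test destroys immediately on purpose to exercise that path. */
        rd_kafka_flush(rk, 10 * 1000);
        if (rd_kafka_outq_len(rk) > 0)
                fprintf(stderr, "%d message(s) not delivered\n",
                        rd_kafka_outq_len(rk));

        rd_kafka_destroy(rk);
        return 0;
}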
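The 0086_purge_local run above shows the other side of that trade-off: rd_kafka_purge() deliberately discards queued messages, and each discarded message comes back through the delivery-report callback with the _PURGE_QUEUE error. A short sketch of that pattern, likewise broker-less as in the local test (only the message payload is a placeholder):

/* Purge the producer queue and observe _PURGE_QUEUE delivery reports. */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

static void dr_cb(rd_kafka_t *rk, const rd_kafka_message_t *msg, void *opaque) {
        (void)rk; (void)opaque;
        /* Messages removed by purge(QUEUE) arrive here with ERR__PURGE_QUEUE. */
        if (msg->err)
                fprintf(stderr, "delivery failed: %s\n",
                        rd_kafka_err2name(msg->err));
}

int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();
        rd_kafka_conf_set_dr_msg_cb(conf, dr_cb);

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk)
                return 1;

        for (int i = 0; i < 20; i++)
                rd_kafka_producev(rk,
                                  RD_KAFKA_V_TOPIC("rdkafkatest_0086_purge"),
                                  RD_KAFKA_V_VALUE((void *)"msg", 3),
                                  RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                                  RD_KAFKA_V_END);

        /* Purging in-flight requests leaves locally queued messages alone;
         * purging the queue drops them and triggers the reports above. */
        rd_kafka_purge(rk, RD_KAFKA_PURGE_F_INFLIGHT);
        rd_kafka_purge(rk, RD_KAFKA_PURGE_F_QUEUE);

        rd_kafka_poll(rk, 100); /* serve the delivery-report callbacks */
        rd_kafka_destroy(rk);
        return 0;
}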
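The 0080_admin_ut entries interleaved throughout this section all follow the same shape: an Admin request (CreateTopics, DeleteTopics, DeleteGroups, ...) is dispatched to a temporary result queue with a short request timeout, and the test polls that queue for the result event, which, with no broker configured, comes back carrying a local timeout error. A minimal sketch of that request/queue_poll flow, with a placeholder topic name and partition counts:

/* Issue CreateTopics against a dedicated result queue and poll for the
 * CreateTopicsResult event; with no broker it returns a local timeout. */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

int main(void) {
        char errstr[512];
        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, rd_kafka_conf_new(),
                                      errstr, sizeof(errstr));
        if (!rk)
                return 1;

        rd_kafka_queue_t *q = rd_kafka_queue_new(rk); /* temp result queue */

        rd_kafka_AdminOptions_t *options =
                rd_kafka_AdminOptions_new(rk, RD_KAFKA_ADMIN_OP_CREATETOPICS);
        rd_kafka_AdminOptions_set_request_timeout(options, 100,
                                                  errstr, sizeof(errstr));

        rd_kafka_NewTopic_t *newt =
                rd_kafka_NewTopic_new("example_topic", 3, 1,
                                      errstr, sizeof(errstr));

        rd_kafka_CreateTopics(rk, &newt, 1, options, q);

        /* Block on the temp queue until the result event arrives; without a
         * broker it shows up after ~100ms carrying "Local: Timed out". */
        rd_kafka_event_t *ev = rd_kafka_queue_poll(q, 10 * 1000);
        if (ev && rd_kafka_event_type(ev) == RD_KAFKA_EVENT_CREATETOPICS_RESULT &&
            rd_kafka_event_error(ev))
                fprintf(stderr, "CreateTopics failed: %s\n",
                        rd_kafka_event_error_string(ev));
        if (ev)
                rd_kafka_event_destroy(ev);

        rd_kafka_NewTopic_destroy(newt);
        rd_kafka_AdminOptions_destroy(options);
        rd_kafka_queue_destroy(q);
        rd_kafka_destroy(rk);
        return 0;
}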
3: [0080_admin_ut / 2.103s] DeleteRecords: got DeleteRecordsResult in 200.015s 3: [0080_admin_ut / 2.103s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.103s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.103s] Using topic "rdkafkatest_rnd56f1cba91ca4b01c_do_test_DeleteRecords" 3: [0080_admin_ut / 2.103s] Using topic "rdkafkatest_rnd780bf77d6a7f1876_do_test_DeleteRecords" 3: [0080_admin_ut / 2.103s] Using topic "rdkafkatest_rnd1fe3f2b868835333_do_test_DeleteRecords" 3: [0080_admin_ut / 2.103s] Using topic "rdkafkatest_rnd1adb326d6aff2f25_do_test_DeleteRecords" 3: [0080_admin_ut / 2.103s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 2.103s] DeleteRecords: duration 0.010ms 3: [0045_subscribe_update_mock / 5.904s] CONSUME: duration 300.079ms 3: [0045_subscribe_update_mock / 5.904s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 5.904s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0084_destroy_flags_local / 2.014s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 2.015s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 2.015s] rd_kafka_destroy_flags(0x8): duration 0.168ms 3: [0084_destroy_flags_local / 2.015s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 2.015s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.015s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.015s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.015s] Created kafka instance 0084_destroy_flags_local#consumer-59 3: [
/ 7.103s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.000s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.000s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 1.000s] Test KafkaConsumer 3: [0034_offset_reset_mock / 7.061s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 7.061s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 7.061s] METADATA.WAIT: duration 502.380ms 3: [0080_admin_ut / 2.303s] DeleteRecords.queue_poll: duration 200.016ms 3: [0080_admin_ut / 2.303s] DeleteRecords: got DeleteRecordsResult in 200.016s 3: [0080_admin_ut / 2.303s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-53 DeleteRecords with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.303s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 2.303s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 2.303s] DeleteConsumerGroupOffsets: duration 0.007ms 3: [0080_admin_ut / 2.403s] DeleteConsumerGroupOffsets.queue_poll: duration 100.017ms 3: [0080_admin_ut / 2.403s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.017s 3: [0080_admin_ut / 2.403s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.403s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.403s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.403s] DeleteConsumerGroupOffsets: duration 0.004ms 3: [0080_admin_ut / 2.603s] DeleteConsumerGroupOffsets.queue_poll: duration 200.018ms 3: [0080_admin_ut / 2.603s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.018s 3: [0080_admin_ut / 2.603s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.603s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.603s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.603s] DeleteConsumerGroupOffsets: duration 0.005ms 3: [0084_destroy_flags_local / 2.515s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 2.515s] rd_kafka_destroy_flags(0x0): duration 0.162ms 3: [0084_destroy_flags_local / 2.515s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 2.515s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.515s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.515s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.516s] Created kafka instance 0084_destroy_flags_local#consumer-61 3: [0080_admin_ut / 2.803s] DeleteConsumerGroupOffsets.queue_poll: duration 200.018ms 3: 
[0080_admin_ut / 2.803s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.018s 3: [0080_admin_ut / 2.803s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-53 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.803s] Using topic "rdkafkatest_rnd3925ddb2099193ac_do_test_AclBinding" 3: [0080_admin_ut / 2.803s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 2.803s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 2.803s] Using topic "rdkafkatest_rnd29e844d1a934779_do_test_AclBindingFilter" 3: [0080_admin_ut / 2.803s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 2.803s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 2.803s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 2.803s] Using topic "rdkafkatest_rnd3679bdcd780de62c_do_test_CreateAcls" 3: [0080_admin_ut / 2.803s] Using topic "rdkafkatest_rnd6d16eb230733d285_do_test_CreateAcls" 3: [0080_admin_ut / 2.803s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 2.803s] CreateAcls: duration 0.007ms 3: [0080_admin_ut / 2.903s] CreateAcls.queue_poll: duration 100.014ms 3: [0080_admin_ut / 2.903s] CreateAcls: got CreateAclsResult in 100.014s 3: [0080_admin_ut / 2.903s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.903s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.903s] Using topic "rdkafkatest_rnd79eea13c5bc36a81_do_test_CreateAcls" 3: [0080_admin_ut / 2.903s] Using topic "rdkafkatest_rnd72ccf13d6a3689da_do_test_CreateAcls" 3: [0080_admin_ut / 2.903s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 2.903s] CreateAcls: duration 0.004ms 3: [0080_admin_ut / 3.103s] CreateAcls.queue_poll: duration 200.017ms 3: [0080_admin_ut / 3.103s] CreateAcls: got CreateAclsResult in 200.017s 3: [0080_admin_ut / 3.103s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.103s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.103s] Using topic "rdkafkatest_rnd7e4d0a6a44f477a9_do_test_CreateAcls" 3: [0080_admin_ut / 3.103s] Using topic "rdkafkatest_rnd3a8ec68110d75f24_do_test_CreateAcls" 3: [0080_admin_ut / 3.103s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 3.103s] CreateAcls: duration 0.005ms 3: [0084_destroy_flags_local / 3.016s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 3.016s] rd_kafka_destroy_flags(0x8): duration 0.169ms 3: [0084_destroy_flags_local / 3.016s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 3.016s] 0084_destroy_flags_local: duration 3016.003ms 3: [0084_destroy_flags_local / 3.016s] ================= Test 0084_destroy_flags_local PASSED ================= 3: [
/ 8.103s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 2.000s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 8.184s] Too many tests running (5 >= 5): postponing 0100_thread_interceptors start... 3: [0097_ssl_verify_local / 0.000s] ================= Running test 0097_ssl_verify_local ================= 3: [0097_ssl_verify_local / 0.000s] ==== Stats written to file stats_0097_ssl_verify_local_552685317800679810.json ==== 3: [0097_ssl_verify_local / 0.000s] Feature "ssl" is built-in 3: %7|1675737020.006|OPENSSL|rdkafka#producer-62| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1675737020.007|SSL|rdkafka#producer-62| [thrd:app]: Loading CA certificate from string 3: [0097_ssl_verify_local / 0.000s] Failed to create producer with junk ssl.ca.pem (as expected): ssl.ca.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: %7|1675737020.007|OPENSSL|rdkafka#producer-63| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1675737020.011|SSL|rdkafka#producer-63| [thrd:app]: Loading private key from string 3: [0097_ssl_verify_local / 0.005s] Failed to create producer with junk ssl.key.pem (as expected): ssl.key.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: ANY PRIVATE KEY 3: %7|1675737020.011|OPENSSL|rdkafka#producer-64| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1675737020.015|SSL|rdkafka#producer-64| [thrd:app]: Loading public key from string 3: [0097_ssl_verify_local / 0.009s] Failed to create producer with junk ssl.certificate.pem (as expected): ssl.certificate.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: [0097_ssl_verify_local / 0.009s] 0097_ssl_verify_local: duration 8.584ms 3: [0097_ssl_verify_local / 0.009s] ================= Test 0097_ssl_verify_local PASSED ================= 3: [
/ 8.192s] Too many tests running (5 >= 5): postponing 0103_transactions_local start... 3: [0100_thread_interceptors / 0.000s] ================= Running test 0100_thread_interceptors ================= 3: [0100_thread_interceptors / 0.000s] ==== Stats written to file stats_0100_thread_interceptors_2327210223162506805.json ==== 3: [0100_thread_interceptors / 0.000s] on_conf_dup() interceptor called 3: [0100_thread_interceptors / 0.000s] on_new() interceptor called 3: [
/ 8.193s] on_thread_start(0, main) called 3: [
/ 8.193s] Started thread: main 3: [
/ 8.193s] on_thread_start(2, :0/internal) called 3: [
/ 8.193s] Started thread: :0/internal 3: [
/ 8.193s] on_thread_start(2, 127.0.0.1:1/bootstrap) called 3: [
/ 8.193s] Started thread: 127.0.0.1:1/bootstrap 3: %3|1675737020.016|FAIL|rdkafka#producer-65| [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737020.016|ERROR|rdkafka#producer-65| [thrd:127.0.0.1:1/bootstrap]: 1/1 brokers are down 3: %3|1675737020.016|ERROR|rdkafka#producer-65| [thrd:app]: rdkafka#producer-65: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 8.193s] on_thread_exit(0, main) called 3: [
/ 8.193s] Exiting from thread: main 3: [
/ 8.193s] on_thread_exit(2, 127.0.0.1:1/bootstrap) called 3: [
/ 8.193s] Exiting from thread: 127.0.0.1:1/bootstrap 3: [
/ 8.193s] on_thread_exit(2, :0/internal) called 3: [
/ 8.193s] Exiting from thread: :0/internal 3: [0100_thread_interceptors / 0.001s] 3 thread start calls, 3 thread exit calls seen 3: [0100_thread_interceptors / 0.001s] 0100_thread_interceptors: duration 0.976ms 3: [0100_thread_interceptors / 0.001s] ================= Test 0100_thread_interceptors PASSED ================= 3: [0034_offset_reset_mock / 8.062s] #1: injecting _TRANSPORT, expecting NO_ERROR 3: [0034_offset_reset_mock / 8.062s] ASSIGN.PARTITIONS: duration 0.018ms 3: [0034_offset_reset_mock / 8.062s] ASSIGN: assigned 1 partition(s) 3: %4|1675737020.047|FAIL|0034_offset_reset_mock#consumer-28| [thrd:127.0.0.1:39707/bootstrap]: 127.0.0.1:39707/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1502ms in state UP) 3: [0034_offset_reset_mock / 8.062s] #1: Ignoring Error event: 127.0.0.1:39707/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1502ms in state UP) 3: [0080_admin_ut / 3.303s] CreateAcls.queue_poll: duration 200.018ms 3: [0080_admin_ut / 3.303s] CreateAcls: got CreateAclsResult in 200.018s 3: [0080_admin_ut / 3.303s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-53 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.303s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.303s] Using topic "rdkafkatest_rndce7ef1a79b59719_do_test_DescribeAcls" 3: [0080_admin_ut / 3.303s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 3.303s] DescribeAcls: duration 0.005ms 3: [
/ 8.294s] Too many tests running (5 >= 5): postponing 0104_fetch_from_follower_mock start... 3: [0103_transactions_local / 0.000s] ================= Running test 0103_transactions_local ================= 3: [0103_transactions_local / 0.000s] ==== Stats written to file stats_0103_transactions_local_2517896464099293891.json ==== 3: [0103_transactions_local / 0.000s] [ do_test_txn_local:1168 ] 3: [0103_transactions_local / 0.000s] Test config file test.conf not found 3: %5|1675737020.116|CONFWARN|0103_transactions_local#producer-66| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0103_transactions_local / 0.000s] Created kafka instance 0103_transactions_local#producer-66 3: [0103_transactions_local / 0.000s] Test config file test.conf not found 3: [0103_transactions_local / 0.000s] Created kafka instance 0103_transactions_local#producer-67 3: [0103_transactions_local / 0.000s] Waiting for init_transactions() timeout 7000 ms 3: [0103_transactions_local / 0.000s] Setting test timeout to 9s * 2.7 3: [0080_admin_ut / 3.403s] DescribeAcls.queue_poll: duration 100.020ms 3: [0080_admin_ut / 3.403s] DescribeAcls: got DescribeAclsResult in 100.020s 3: [0080_admin_ut / 3.403s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.403s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.403s] Using topic "rdkafkatest_rnd165a47351afd54fb_do_test_DescribeAcls" 3: [0080_admin_ut / 3.403s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.403s] DescribeAcls: duration 0.004ms 3: [0080_admin_ut / 3.603s] DescribeAcls.queue_poll: duration 200.020ms 3: [0080_admin_ut / 3.603s] DescribeAcls: got DescribeAclsResult in 200.020s 3: [0080_admin_ut / 3.603s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.603s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.603s] Using topic "rdkafkatest_rnd4e58d339363e39ed_do_test_DescribeAcls" 3: [0080_admin_ut / 3.603s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.603s] DescribeAcls: duration 0.005ms 3: [0080_admin_ut / 3.803s] DescribeAcls.queue_poll: duration 200.018ms 3: [0080_admin_ut / 3.803s] DescribeAcls: got DescribeAclsResult in 200.018s 3: [0080_admin_ut / 3.803s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-53 DescribeAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.803s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.803s] Using topic "rdkafkatest_rnd380a82e693405a7_do_test_DeleteAcls" 3: [0080_admin_ut / 3.803s] Using topic "rdkafkatest_rnd213d69123ca685e0_do_test_DeleteAcls" 3: [0080_admin_ut / 3.803s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 3.803s] DeleteAcls: duration 0.005ms 3: [0080_admin_ut / 3.903s] DeleteAcls.queue_poll: duration 100.018ms 3: [0080_admin_ut / 3.903s] DeleteAcls: got DeleteAclsResult in 100.018s 3: [0080_admin_ut / 3.903s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.903s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, options, timeout 100ms ] 3: 
[0080_admin_ut / 3.903s] Using topic "rdkafkatest_rnd72c5995323dbed60_do_test_DeleteAcls" 3: [0080_admin_ut / 3.903s] Using topic "rdkafkatest_rnd5739cd5a293f5720_do_test_DeleteAcls" 3: [0080_admin_ut / 3.903s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 3.903s] DeleteAcls: duration 0.004ms 3: [0080_admin_ut / 4.103s] DeleteAcls.queue_poll: duration 200.036ms 3: [0080_admin_ut / 4.103s] DeleteAcls: got DeleteAclsResult in 200.036s 3: [0080_admin_ut / 4.103s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.103s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 4.103s] Using topic "rdkafkatest_rnd1be9d38c4450b87d_do_test_DeleteAcls" 3: [0080_admin_ut / 4.103s] Using topic "rdkafkatest_rnd307329a615d874c8_do_test_DeleteAcls" 3: [0080_admin_ut / 4.103s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 4.103s] DeleteAcls: duration 0.004ms 3: [
/ 9.103s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 3.000s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 3.000s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 3.001s] 0095_all_brokers_down: duration 3000.985ms 3: [0095_all_brokers_down / 3.001s] ================= Test 0095_all_brokers_down PASSED ================= 3: [
/ 9.204s] Too many tests running (5 >= 5): postponing 0105_transactions_mock start... 3: [0104_fetch_from_follower_mock/ 0.000s] ================= Running test 0104_fetch_from_follower_mock ================= 3: [0104_fetch_from_follower_mock/ 0.000s] ==== Stats written to file stats_0104_fetch_from_follower_mock_2311510987951315683.json ==== 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.000s] [ Test FFF auto.offset.reset=earliest ] 3: %5|1675737021.027|CONFWARN|MOCK#producer-68| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.017s] Created kafka instance 0104_fetch_from_follower_mock#producer-69 3: [0104_fetch_from_follower_mock/ 0.017s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.017s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 0.018s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 0.018s] PRODUCE: duration 1.227ms 3: [0080_admin_ut / 4.303s] DeleteAcls.queue_poll: duration 200.014ms 3: [0080_admin_ut / 4.303s] DeleteAcls: got DeleteAclsResult in 200.014s 3: [0080_admin_ut / 4.303s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-53 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.303s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 4.303s] Creating 2 topics 3: [0080_admin_ut / 4.303s] Deleting 1 topics 3: [0080_admin_ut / 4.303s] Creating 1 topics 3: [0080_admin_ut / 4.303s] Deleting 3 groups 3: [0080_admin_ut / 4.303s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 4.303s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 4.303s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 4.303s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 4.303s] Creating 2 topics 3: [0080_admin_ut / 4.303s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0104_fetch_from_follower_mock/ 0.060s] PRODUCE.DELIVERY.WAIT: duration 41.610ms 3: [0104_fetch_from_follower_mock/ 0.060s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.060s] Created kafka instance 0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 0.060s] ASSIGN.PARTITIONS: duration 0.024ms 3: [0104_fetch_from_follower_mock/ 0.060s] earliest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 0.060s] earliest: consume 1000 messages 3: [0080_admin_ut / 4.403s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 4.403s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: [0080_admin_ut / 4.403s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 
4.403s] [ do_test_mix:1342: PASS (0.10s) ] 3: [0080_admin_ut / 4.403s] [ do_test_configs:1411 ] 3: %4|1675737021.691|OFFSET|0104_fetch_from_follower_mock#consumer-70| [thrd:main]: test [0]: offset reset (at offset 10, broker 2) to cached BEGINNING offset 0: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: [0104_fetch_from_follower_mock/ 1.170s] CONSUME: duration 1109.638ms 3: [0104_fetch_from_follower_mock/ 1.170s] earliest: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 1.170s] Closing consumer 0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 1.170s] CONSUMER.CLOSE: duration 0.159ms 3: [0104_fetch_from_follower_mock/ 1.172s] [ Test FFF auto.offset.reset=earliest PASSED ] 3: [0104_fetch_from_follower_mock/ 1.172s] [ Test FFF auto.offset.reset=latest ] 3: %5|1675737022.199|CONFWARN|MOCK#producer-71| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 1.172s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.172s] Created kafka instance 0104_fetch_from_follower_mock#producer-72 3: [0104_fetch_from_follower_mock/ 1.172s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.172s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 1.173s] SUM(POLL): duration 0.002ms 3: [0104_fetch_from_follower_mock/ 1.173s] PRODUCE: duration 0.843ms 3: [0104_fetch_from_follower_mock/ 1.216s] PRODUCE.DELIVERY.WAIT: duration 42.742ms 3: [0104_fetch_from_follower_mock/ 1.216s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.216s] Created kafka instance 0104_fetch_from_follower_mock#consumer-73 3: [0104_fetch_from_follower_mock/ 1.216s] ASSIGN.PARTITIONS: duration 0.048ms 3: [0104_fetch_from_follower_mock/ 1.216s] latest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 1.216s] latest: not expecting any messages for 5000ms 3: [0080_admin_ut / 6.403s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 6.404s] Test config file test.conf not found 3: %5|1675737023.179|CONFWARN|0080_admin_ut#producer-74| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.404s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-74 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rndefea21e612d69_do_test_DeleteRecords" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd6834928c3a9dc523_do_test_DeleteRecords" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd2f388c8e6fe01aea_do_test_DeleteRecords" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd2b31ea54f8475ec_do_test_DeleteRecords" 3: [0080_admin_ut / 6.404s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 6.404s] DeleteRecords: duration 0.014ms 3: [0080_admin_ut / 6.404s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-74 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 6.404s] Test config file test.conf not found 3: %5|1675737023.179|CONFWARN|0080_admin_ut#producer-75| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.404s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-75 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.404s] Using topic 
"rdkafkatest_rnd1d22591f0f9b0dbf_do_test_DeleteGroups" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd493a0d054013b69d_do_test_DeleteGroups" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd7374c8825f94543b_do_test_DeleteGroups" 3: [0080_admin_ut / 6.404s] Using topic "rdkafkatest_rnd5b110b9841cd9bbc_do_test_DeleteGroups" 3: [0080_admin_ut / 6.404s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 6.404s] DeleteGroups: duration 0.008ms 3: [0080_admin_ut / 6.404s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 6.404s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 6.404s] Test config file test.conf not found 3: %5|1675737023.179|CONFWARN|0080_admin_ut#consumer-76| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.505s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.505s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.505s] rd_kafka_destroy(): duration 0.114ms 3: [0080_admin_ut / 6.505s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.10s) ] 3: [0080_admin_ut / 6.505s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.505s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 6.505s] Test config file test.conf not found 3: %5|1675737023.280|CONFWARN|0080_admin_ut#consumer-77| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %4|1675737023.350|OFFSET|0104_fetch_from_follower_mock#consumer-73| [thrd:main]: test [0]: offset reset (at offset 1000, broker 2) to END: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: [0080_admin_ut / 6.605s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.605s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.611s] rd_kafka_destroy(): duration 6.249ms 3: [0080_admin_ut / 6.611s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.11s) ] 3: [0080_admin_ut / 6.611s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.611s] Test config file test.conf not found 3: %4|1675737023.386|CONFWARN|0080_admin_ut#consumer-78| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1675737023.386|CONFWARN|0080_admin_ut#consumer-78| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.611s] [ do_test_options:1588 ] 3: [0080_admin_ut / 6.611s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 6.611s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rnd15d28e285e91b3c7_do_test_CreateTopics" 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rnd370ff73b1b3839a7_do_test_CreateTopics" 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rnd5aebe49b72720701_do_test_CreateTopics" 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rnd76d5b82736c2bf7f_do_test_CreateTopics" 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rndcae2cef56d6e27e_do_test_CreateTopics" 3: [0080_admin_ut / 6.611s] Using topic "rdkafkatest_rndcbd2b9175380fe7_do_test_CreateTopics" 3: [0080_admin_ut / 6.612s] Call CreateTopics, timeout is 
100ms 3: [0080_admin_ut / 6.612s] CreateTopics: duration 0.072ms 3: [0080_admin_ut / 6.712s] CreateTopics.queue_poll: duration 99.963ms 3: [0080_admin_ut / 6.712s] CreateTopics: got CreateTopicsResult in 99.963s 3: [0080_admin_ut / 6.712s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.712s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rnd475af0b424709c75_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rnd4a0e0f5a73f51261_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rnd59a91d193d2f1f67_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rnd4d1de59c1cc373a2_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rndeeb8158329601ca_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Using topic "rdkafkatest_rnd39ed22bb69a5f905_do_test_CreateTopics" 3: [0080_admin_ut / 6.712s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 6.712s] CreateTopics: duration 0.082ms 3: [0080_admin_ut / 6.812s] CreateTopics.wait_background_event_cb: duration 99.988ms 3: [0080_admin_ut / 6.812s] CreateTopics: got CreateTopicsResult in 99.988s 3: [0080_admin_ut / 6.812s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.812s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd57b45d714491dda0_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd1ebaef483b6795c7_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd1634aac54815c2b6_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd30ee812654d2ee47_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd33dcea131c2ddefc_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Using topic "rdkafkatest_rnd26ab6dea663bee56_do_test_CreateTopics" 3: [0080_admin_ut / 6.812s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 6.812s] CreateTopics: duration 0.064ms 3: [0080_admin_ut / 7.012s] CreateTopics.queue_poll: duration 199.966ms 3: [0080_admin_ut / 7.012s] CreateTopics: got CreateTopicsResult in 199.966s 3: [0080_admin_ut / 7.012s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.012s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd369c4ae13fe50b6f_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd6a0de760d02f10b_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd31b67da31bee7263_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd3b5dd09755db951e_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd603d5d322d8ff290_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Using topic "rdkafkatest_rnd778ef0cf4c4ae1d8_do_test_CreateTopics" 3: [0080_admin_ut / 7.012s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 7.012s] CreateTopics: duration 0.078ms 3: [0034_offset_reset_mock / 11.924s] 
#1: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 11.924s] #1: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 11.924s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 7.212s] CreateTopics.queue_poll: duration 199.949ms 3: [0080_admin_ut / 7.212s] CreateTopics: got CreateTopicsResult in 199.949s 3: [0080_admin_ut / 7.212s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-78 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.212s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-78 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.212s] Using topic "rdkafkatest_rnd45a343eb627f8c9e_do_test_DeleteTopics" 3: [0080_admin_ut / 7.212s] Using topic "rdkafkatest_rnd70c6b8506a8e7985_do_test_DeleteTopics" 3: [0080_admin_ut / 7.212s] Using topic "rdkafkatest_rnd136e0dc44599a697_do_test_DeleteTopics" 3: [0080_admin_ut / 7.212s] Using topic "rdkafkatest_rnd4b1bf07474af7d7_do_test_DeleteTopics" 3: [0080_admin_ut / 7.212s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 7.212s] DeleteTopics: duration 0.010ms 3: [0080_admin_ut / 7.312s] DeleteTopics.queue_poll: duration 100.023ms 3: [0080_admin_ut / 7.312s] DeleteTopics: got DeleteTopicsResult in 100.023s 3: [0080_admin_ut / 7.312s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.312s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-78 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.313s] Using topic "rdkafkatest_rnd61c785934345a0fe_do_test_DeleteTopics" 3: [0080_admin_ut / 7.313s] Using topic "rdkafkatest_rnd6df665c1480373e9_do_test_DeleteTopics" 3: [0080_admin_ut / 7.313s] Using topic "rdkafkatest_rnd75ce95562492b0a3_do_test_DeleteTopics" 3: [0080_admin_ut / 7.313s] Using topic "rdkafkatest_rnd7e87f596586a916_do_test_DeleteTopics" 3: [0080_admin_ut / 7.313s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.313s] DeleteTopics: duration 0.007ms 3: [0080_admin_ut / 7.513s] DeleteTopics.queue_poll: duration 200.026ms 3: [0080_admin_ut / 7.513s] DeleteTopics: got DeleteTopicsResult in 200.026s 3: [0080_admin_ut / 7.513s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.513s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-78 DeleteTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.513s] Using topic "rdkafkatest_rnd2b338f1914eb7064_do_test_DeleteTopics" 3: [0080_admin_ut / 7.513s] Using topic "rdkafkatest_rnd720230775cea0cbc_do_test_DeleteTopics" 3: [0080_admin_ut / 7.513s] Using topic "rdkafkatest_rnd30d9e2c83113b9a3_do_test_DeleteTopics" 3: [0080_admin_ut / 7.513s] Using topic "rdkafkatest_rnd1847dd5306b577e6_do_test_DeleteTopics" 3: [0080_admin_ut / 7.513s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.513s] DeleteTopics: duration 0.007ms 3: [0034_offset_reset_mock / 12.427s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 12.427s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 12.427s] METADATA.WAIT: duration 502.704ms 3: [0080_admin_ut / 7.713s] DeleteTopics.queue_poll: duration 200.025ms 3: [0080_admin_ut / 7.713s] DeleteTopics: got DeleteTopicsResult in 200.025s 3: [0080_admin_ut / 7.713s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.713s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-78 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.713s] Using topic "rdkafkatest_rnd59cb3bd578853a85_do_test_DeleteGroups" 3: [0080_admin_ut / 7.713s] Using topic "rdkafkatest_rnd34456a7647149ba7_do_test_DeleteGroups" 3: [0080_admin_ut / 7.713s] Using topic "rdkafkatest_rnd70142b5500904c4f_do_test_DeleteGroups" 3: [0080_admin_ut / 7.713s] Using topic "rdkafkatest_rnd6fc5914035b76f40_do_test_DeleteGroups" 3: [0080_admin_ut / 7.713s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 7.713s] DeleteGroups: duration 0.009ms 3: [0080_admin_ut / 7.813s] DeleteGroups.queue_poll: duration 100.037ms 3: [0080_admin_ut / 7.813s] DeleteGroups: got DeleteGroupsResult in 100.037s 3: [0080_admin_ut / 7.813s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 7.813s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-78 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.813s] Using topic "rdkafkatest_rnd630fd8ed608c4990_do_test_DeleteGroups" 3: [0080_admin_ut / 7.813s] Using topic "rdkafkatest_rnd2045e8c6767de6b1_do_test_DeleteGroups" 3: [0080_admin_ut / 7.813s] Using topic "rdkafkatest_rnd2625f02824f7a7cd_do_test_DeleteGroups" 3: [0080_admin_ut / 7.813s] Using topic "rdkafkatest_rnd3dc8de8807ed75bb_do_test_DeleteGroups" 3: [0080_admin_ut / 7.813s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 7.813s] DeleteGroups: duration 0.011ms 3: [0080_admin_ut / 8.013s] DeleteGroups.queue_poll: duration 200.033ms 3: [0080_admin_ut / 8.013s] DeleteGroups: got DeleteGroupsResult in 200.033s 3: [0080_admin_ut / 8.013s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.013s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-78 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.013s] Using topic "rdkafkatest_rnd683d48cb2bbf4449_do_test_DeleteGroups" 3: [0080_admin_ut / 8.013s] Using topic "rdkafkatest_rnd4ff0e9a55e0bde21_do_test_DeleteGroups" 3: [0080_admin_ut / 8.013s] Using topic "rdkafkatest_rnd5051f4ec57d968fe_do_test_DeleteGroups" 3: [0080_admin_ut / 8.013s] Using topic "rdkafkatest_rnd439287377b858405_do_test_DeleteGroups" 3: [0080_admin_ut / 8.013s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 8.013s] DeleteGroups: duration 0.011ms 3: [0080_admin_ut / 8.213s] DeleteGroups.queue_poll: duration 200.025ms 3: [0080_admin_ut / 8.213s] DeleteGroups: got DeleteGroupsResult in 200.025s 3: [0080_admin_ut / 8.213s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.213s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.213s] Using topic "rdkafkatest_rnd6cc4d9623594b7af_do_test_DeleteRecords" 3: [0080_admin_ut / 8.213s] Using topic "rdkafkatest_rnd586f90c11d9ebc2a_do_test_DeleteRecords" 3: [0080_admin_ut / 8.213s] Using topic "rdkafkatest_rnd66a8715270b76e14_do_test_DeleteRecords" 3: [0080_admin_ut / 8.213s] Using topic "rdkafkatest_rnd245434114073ad27_do_test_DeleteRecords" 3: [0080_admin_ut / 8.213s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 8.213s] DeleteRecords: duration 0.017ms 3: [0080_admin_ut / 8.313s] DeleteRecords.queue_poll: duration 
100.018ms 3: [0080_admin_ut / 8.313s] DeleteRecords: got DeleteRecordsResult in 100.018s 3: [0080_admin_ut / 8.313s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.313s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.313s] Using topic "rdkafkatest_rnd693ca89a58999e87_do_test_DeleteRecords" 3: [0080_admin_ut / 8.313s] Using topic "rdkafkatest_rnd78848ce5950d3ef_do_test_DeleteRecords" 3: [0080_admin_ut / 8.313s] Using topic "rdkafkatest_rnd5929ead6774dda0f_do_test_DeleteRecords" 3: [0080_admin_ut / 8.313s] Using topic "rdkafkatest_rndf08432f3c39c3c3_do_test_DeleteRecords" 3: [0080_admin_ut / 8.313s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.313s] DeleteRecords: duration 0.015ms 3: [0080_admin_ut / 8.513s] DeleteRecords.queue_poll: duration 200.020ms 3: [0080_admin_ut / 8.513s] DeleteRecords: got DeleteRecordsResult in 200.020s 3: [0080_admin_ut / 8.513s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.513s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.513s] Using topic "rdkafkatest_rnd57da239f2f4e2bf5_do_test_DeleteRecords" 3: [0080_admin_ut / 8.513s] Using topic "rdkafkatest_rnd32b7aa747e0013c7_do_test_DeleteRecords" 3: [0080_admin_ut / 8.513s] Using topic "rdkafkatest_rnd5445d3c2708088fc_do_test_DeleteRecords" 3: [0080_admin_ut / 8.513s] Using topic "rdkafkatest_rnd5ed89833c831c8d_do_test_DeleteRecords" 3: [0080_admin_ut / 8.513s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.513s] DeleteRecords: duration 0.010ms 3: [0034_offset_reset_mock / 13.427s] #2: injecting TOPIC_AUTHORIZATION_FAILED, expecting TOPIC_AUTHORIZATION_FAILED 3: [0034_offset_reset_mock / 13.427s] ASSIGN.PARTITIONS: duration 0.040ms 3: [0034_offset_reset_mock / 13.427s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 13.470s] #2: injected TOPIC_AUTHORIZATION_FAILED, got error _AUTO_OFFSET_RESET: failed to query logical offset: Broker: Topic authorization failed (broker 1) 3: [0034_offset_reset_mock / 13.470s] Waiting for up to 5000ms for metadata update 3: [0034_offset_reset_mock / 13.470s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 13.470s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 13.470s] METADATA.WAIT: duration 0.094ms 3: [0080_admin_ut / 8.713s] DeleteRecords.queue_poll: duration 200.027ms 3: [0080_admin_ut / 8.713s] DeleteRecords: got DeleteRecordsResult in 200.027s 3: [0080_admin_ut / 8.713s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-78 DeleteRecords with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.713s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.713s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 8.713s] DeleteConsumerGroupOffsets: duration 0.009ms 3: [0080_admin_ut / 8.813s] DeleteConsumerGroupOffsets.queue_poll: duration 100.022ms 3: [0080_admin_ut / 8.813s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.022s 3: [0080_admin_ut / 8.813s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.813s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.813s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 8.813s] DeleteConsumerGroupOffsets: duration 0.006ms 3: [0080_admin_ut / 9.013s] DeleteConsumerGroupOffsets.queue_poll: duration 200.022ms 3: [0080_admin_ut / 9.013s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.022s 3: [0080_admin_ut / 9.013s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.013s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.013s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 9.013s] DeleteConsumerGroupOffsets: duration 0.007ms 3: [0080_admin_ut / 9.213s] DeleteConsumerGroupOffsets.queue_poll: duration 200.022ms 3: [0080_admin_ut / 9.213s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.022s 3: [0080_admin_ut / 9.213s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-78 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.213s] Using topic "rdkafkatest_rnd1c3fcd4655de7328_do_test_AclBinding" 3: [0080_admin_ut / 9.213s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 9.213s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 9.213s] Using topic "rdkafkatest_rnd1a8efaaf6c91c232_do_test_AclBindingFilter" 3: [0080_admin_ut / 9.213s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 9.213s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 9.213s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-78 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.213s] Using topic "rdkafkatest_rnd2db7dc265e2181e6_do_test_CreateAcls" 3: [0080_admin_ut / 9.213s] Using topic "rdkafkatest_rnd681746381a7cb588_do_test_CreateAcls" 3: [0080_admin_ut / 9.213s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 9.213s] CreateAcls: duration 0.006ms 3: [0080_admin_ut / 9.313s] CreateAcls.queue_poll: duration 100.028ms 3: [0080_admin_ut / 9.313s] CreateAcls: got CreateAclsResult in 100.028s 3: [0080_admin_ut / 9.313s] [ do_test_CreateAcls:993: 
0080_admin_ut#consumer-78 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.313s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-78 CreaetAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.313s] Using topic "rdkafkatest_rnd13b639954086d6f9_do_test_CreateAcls" 3: [0080_admin_ut / 9.313s] Using topic "rdkafkatest_rnd381b71b37a5eaae8_do_test_CreateAcls" 3: [0080_admin_ut / 9.313s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.313s] CreateAcls: duration 0.007ms 3: [0080_admin_ut / 9.513s] CreateAcls.queue_poll: duration 200.026ms 3: [0080_admin_ut / 9.513s] CreateAcls: got CreateAclsResult in 200.026s 3: [0080_admin_ut / 9.513s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-78 CreaetAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.513s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-78 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.513s] Using topic "rdkafkatest_rnd313e450e5c6fa5c4_do_test_CreateAcls" 3: [0080_admin_ut / 9.513s] Using topic "rdkafkatest_rnd3ad2580f1a7aeda8_do_test_CreateAcls" 3: [0080_admin_ut / 9.513s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.513s] CreateAcls: duration 0.009ms 3: [0034_offset_reset_mock / 14.470s] #3: injecting NO_ERROR, expecting _NO_OFFSET 3: [0034_offset_reset_mock / 14.470s] ASSIGN.PARTITIONS: duration 0.035ms 3: [0034_offset_reset_mock / 14.470s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 14.470s] #3: Ignoring Error event: Failed to query logical offset TAIL(10): Broker: Topic authorization failed 3: [0034_offset_reset_mock / 14.471s] #3: injected NO_ERROR, got error _AUTO_OFFSET_RESET: no previously committed offset available: Local: No offset stored 3: [0034_offset_reset_mock / 14.471s] [ offset_reset_errors:201: PASS (14.47s) ] 3: [0034_offset_reset_mock / 14.471s] 0034_offset_reset_mock: duration 14470.968ms 3: [0034_offset_reset_mock / 14.471s] ================= Test 0034_offset_reset_mock PASSED ================= 3: [
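Note on the 0034_offset_reset_mock lines above: the test injects broker errors and then expects the consumer to surface an _AUTO_OFFSET_RESET error through the normal poll path. A minimal sketch of how an application would observe the same errors from its poll loop (assuming `rk` is an already-configured, subscribed consumer handle; error handling reduced for brevity):

    #include <stdio.h>
    #include <inttypes.h>
    #include <librdkafka/rdkafka.h>

    /* Poll once and report offset-reset / authorization style errors,
     * the same error codes the 0034_offset_reset_mock log lines show. */
    static void poll_once(rd_kafka_t *rk) {
            rd_kafka_message_t *rkm = rd_kafka_consumer_poll(rk, 1000 /* ms */);
            if (!rkm)
                    return; /* nothing within the timeout */

            if (rkm->err == RD_KAFKA_RESP_ERR_NO_ERROR) {
                    printf("message at offset %" PRId64 "\n", rkm->offset);
            } else if (rkm->err == RD_KAFKA_RESP_ERR__AUTO_OFFSET_RESET) {
                    /* Raised (librdkafka >= 1.9) when the auto.offset.reset
                     * policy is triggered, e.g. after the injected
                     * TOPIC_AUTHORIZATION_FAILED seen in the log. */
                    fprintf(stderr, "offset reset: %s\n", rd_kafka_message_errstr(rkm));
            } else if (rkm->err != RD_KAFKA_RESP_ERR__PARTITION_EOF) {
                    fprintf(stderr, "consumer error: %s\n", rd_kafka_message_errstr(rkm));
            }
            rd_kafka_message_destroy(rkm);
    }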
/ 14.634s] Too many tests running (5 >= 5): postponing 0106_cgrp_sess_timeout start... 3: [0105_transactions_mock / 0.000s] ================= Running test 0105_transactions_mock ================= 3: [0105_transactions_mock / 0.000s] ==== Stats written to file stats_0105_transactions_mock_3821660848827048158.json ==== 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] [ do_test_txn_recoverable_errors:194 ] 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] Setting test timeout to 60s * 2.7 3: %5|1675737026.457|MOCK|0105_transactions_mock#producer-79| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39491,127.0.0.1:38817,127.0.0.1:34887 3: [0105_transactions_mock / 0.001s] Created kafka instance 0105_transactions_mock#producer-79 3: %4|1675737026.458|GETPID|0105_transactions_mock#producer-79| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator not available: retrying 3: [0080_admin_ut / 9.713s] CreateAcls.queue_poll: duration 200.010ms 3: [0080_admin_ut / 9.713s] CreateAcls: got CreateAclsResult in 200.010s 3: [0080_admin_ut / 9.713s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-78 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.714s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.714s] Using topic "rdkafkatest_rnd73cbc1970e332f22_do_test_DescribeAcls" 3: [0080_admin_ut / 9.714s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 9.714s] DescribeAcls: duration 0.008ms 3: [0080_admin_ut / 9.814s] DescribeAcls.queue_poll: duration 100.026ms 3: [0080_admin_ut / 9.814s] DescribeAcls: got DescribeAclsResult in 100.026s 3: [0080_admin_ut / 9.814s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.814s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.814s] Using topic "rdkafkatest_rnd39a87aed02d404c6_do_test_DescribeAcls" 3: [0080_admin_ut / 9.814s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 9.814s] DescribeAcls: duration 0.007ms 3: [0080_admin_ut / 10.014s] DescribeAcls.queue_poll: duration 200.023ms 3: [0080_admin_ut / 10.014s] DescribeAcls: got DescribeAclsResult in 200.023s 3: [0080_admin_ut / 10.014s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.014s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.014s] Using topic "rdkafkatest_rnd4a6cf2e511829e8c_do_test_DescribeAcls" 3: [0080_admin_ut / 10.014s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 10.014s] DescribeAcls: duration 0.006ms 3: %4|1675737026.958|GETPID|0105_transactions_mock#producer-79| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Not coordinator: retrying 3: [0080_admin_ut / 10.214s] DescribeAcls.queue_poll: duration 200.024ms 3: [0080_admin_ut / 10.214s] DescribeAcls: got DescribeAclsResult in 200.024s 3: [0080_admin_ut / 10.214s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-78 DescribeAcls with main queue, options, timeout 100ms: 
PASS (0.20s) ] 3: [0080_admin_ut / 10.214s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 10.214s] Using topic "rdkafkatest_rnd322230bc7d249d5a_do_test_DeleteAcls" 3: [0080_admin_ut / 10.214s] Using topic "rdkafkatest_rndf82b2540668047e_do_test_DeleteAcls" 3: [0080_admin_ut / 10.214s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 10.214s] DeleteAcls: duration 0.007ms 3: [0080_admin_ut / 10.314s] DeleteAcls.queue_poll: duration 100.029ms 3: [0080_admin_ut / 10.314s] DeleteAcls: got DeleteAclsResult in 100.029s 3: [0080_admin_ut / 10.314s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 10.314s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.314s] Using topic "rdkafkatest_rnd6da5265615703bd7_do_test_DeleteAcls" 3: [0080_admin_ut / 10.314s] Using topic "rdkafkatest_rnd42eb210c09e4f39c_do_test_DeleteAcls" 3: [0080_admin_ut / 10.314s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.314s] DeleteAcls: duration 0.007ms 3: [0103_transactions_local / 7.000s] init_transactions(): duration 7000.016ms 3: [0103_transactions_local / 7.000s] init_transactions() failed as expected: Failed to initialize Producer ID: Local: Timed out 3: [0103_transactions_local / 7.000s] [ do_test_txn_local:1168: PASS (7.00s) ] 3: [0103_transactions_local / 7.000s] 0103_transactions_local: duration 7000.468ms 3: [0103_transactions_local / 7.000s] ================= Test 0103_transactions_local PASSED ================= 3: [
/ 15.294s] Too many tests running (5 >= 5): postponing 0113_cooperative_rebalance_local start... 3: [0106_cgrp_sess_timeout / 0.000s] ================= Running test 0106_cgrp_sess_timeout ================= 3: [0106_cgrp_sess_timeout / 0.000s] ==== Stats written to file stats_0106_cgrp_sess_timeout_7732310022049897403.json ==== 3: [0106_cgrp_sess_timeout / 0.000s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.000s] [ do_test_session_timeout:152: Test session timeout with sync commit ] 3: %5|1675737027.117|CONFWARN|MOCK#producer-80| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 0.000s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.001s] Created kafka instance 0106_cgrp_sess_timeout#producer-81 3: [0106_cgrp_sess_timeout / 0.001s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.001s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 0.001s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 0.001s] PRODUCE: duration 0.046ms 3: [0106_cgrp_sess_timeout / 0.042s] PRODUCE.DELIVERY.WAIT: duration 41.191ms 3: [0106_cgrp_sess_timeout / 0.042s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.042s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 0.042s] Created kafka instance 0106_cgrp_sess_timeout#consumer-82 3: [0106_cgrp_sess_timeout / 0.043s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0104_fetch_from_follower_mock/ 6.216s] CONSUME: duration 5000.062ms 3: [0104_fetch_from_follower_mock/ 6.216s] test_consumer_poll_no_msgs:4075: latest: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.216s] test_consumer_poll_no_msgs:4075: latest: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.216s] Closing consumer 0104_fetch_from_follower_mock#consumer-73 3: [0104_fetch_from_follower_mock/ 6.216s] CONSUMER.CLOSE: duration 0.052ms 3: [0104_fetch_from_follower_mock/ 6.217s] [ Test FFF auto.offset.reset=latest PASSED ] 3: [0104_fetch_from_follower_mock/ 6.217s] [ Test lagging FFF offset reset ] 3: %5|1675737027.244|CONFWARN|MOCK#producer-83| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 6.217s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.217s] Created kafka instance 0104_fetch_from_follower_mock#producer-84 3: [0104_fetch_from_follower_mock/ 6.217s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.217s] Produce to test [0]: messages #0..10 3: [0104_fetch_from_follower_mock/ 6.218s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 6.218s] PRODUCE: duration 0.015ms 3: [0104_fetch_from_follower_mock/ 6.259s] PRODUCE.DELIVERY.WAIT: duration 41.272ms 3: [0104_fetch_from_follower_mock/ 6.259s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.259s] Created kafka instance 0104_fetch_from_follower_mock#consumer-85 3: [0104_fetch_from_follower_mock/ 6.259s] ASSIGN.PARTITIONS: duration 0.021ms 3: [0104_fetch_from_follower_mock/ 6.259s] lag: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 6.259s] up to wmark: consume 7 messages 3: [0080_admin_ut / 10.514s] DeleteAcls.queue_poll: duration 200.004ms 3: [0080_admin_ut / 10.514s] DeleteAcls: got DeleteAclsResult in 200.004s 3: [0080_admin_ut / 10.514s] [ 
do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.514s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.514s] Using topic "rdkafkatest_rnd7676b5cf19068b25_do_test_DeleteAcls" 3: [0080_admin_ut / 10.514s] Using topic "rdkafkatest_rnd3b9b9da15e8dfc07_do_test_DeleteAcls" 3: [0080_admin_ut / 10.514s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.514s] DeleteAcls: duration 0.006ms 3: [0104_fetch_from_follower_mock/ 6.361s] CONSUME: duration 101.218ms 3: [0104_fetch_from_follower_mock/ 6.361s] up to wmark: consumed 7/7 messages (0/0 EOFs) 3: [0104_fetch_from_follower_mock/ 6.361s] no msgs: not expecting any messages for 3000ms 3: %4|1675737027.458|GETPID|0105_transactions_mock#producer-79| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator load in progress: retrying 3: [0080_admin_ut / 10.714s] DeleteAcls.queue_poll: duration 200.027ms 3: [0080_admin_ut / 10.714s] DeleteAcls: got DeleteAclsResult in 200.027s 3: [0080_admin_ut / 10.714s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-78 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.714s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 10.714s] Creating 2 topics 3: [0080_admin_ut / 10.714s] Deleting 1 topics 3: [0080_admin_ut / 10.714s] Creating 1 topics 3: [0080_admin_ut / 10.714s] Deleting 3 groups 3: [0080_admin_ut / 10.714s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 10.714s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 10.714s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 10.714s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 10.714s] Creating 2 topics 3: [0080_admin_ut / 10.714s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0080_admin_ut / 10.814s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 10.814s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: [0080_admin_ut / 10.814s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.814s] [ do_test_mix:1342: PASS (0.10s) ] 3: [0080_admin_ut / 10.814s] [ do_test_configs:1411 ] 3: [0105_transactions_mock / 1.502s] rd_kafka_init_transactions(rk, 5000): duration 1501.043ms 3: [0105_transactions_mock / 1.502s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: [0105_transactions_mock / 1.502s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 
__attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.008ms 3: [0105_transactions_mock / 2.002s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.008ms 3: [0105_transactions_mock / 2.103s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 0.611ms 3: [0080_admin_ut / 12.814s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 12.814s] Test config file test.conf not found 3: %4|1675737029.589|CONFWARN|0080_admin_ut#consumer-86| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1675737029.589|CONFWARN|0080_admin_ut#consumer-86| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.815s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-86 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd338340ad4f51d737_do_test_DeleteRecords" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd1f14d3006b9eb260_do_test_DeleteRecords" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd49b0821f5053180e_do_test_DeleteRecords" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd480e58240482da2e_do_test_DeleteRecords" 3: [0080_admin_ut / 12.815s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 12.815s] DeleteRecords: duration 0.017ms 3: [0080_admin_ut / 12.815s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-86 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 12.815s] Test config file test.conf not found 3: %4|1675737029.589|CONFWARN|0080_admin_ut#consumer-87| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1675737029.589|CONFWARN|0080_admin_ut#consumer-87| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.815s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-87 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd6ace05b67d179c70_do_test_DeleteGroups" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd46dd7b0c5e99c74d_do_test_DeleteGroups" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rndb4acb920085f5f9_do_test_DeleteGroups" 3: [0080_admin_ut / 12.815s] Using topic "rdkafkatest_rnd616dcc1455b7be77_do_test_DeleteGroups" 3: [0080_admin_ut / 12.815s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 12.815s] DeleteGroups: duration 0.008ms 3: [0080_admin_ut / 12.815s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 12.815s] 0080_admin_ut: duration 12814.930ms 3: [0080_admin_ut / 12.815s] ================= Test 0080_admin_ut PASSED ================= 3: [
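The 0080_admin_ut entries that just finished all follow one pattern: issue an admin request onto a queue, then poll that queue for the result event within the request timeout ("Call X, timeout is Nms" followed by "X.queue_poll: duration"). A rough sketch of that round trip using the public admin API, with a hypothetical topic name and partition count:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* CreateTopics issued on a dedicated queue, result collected with
     * rd_kafka_queue_poll(), mirroring the 0080_admin_ut pattern above.
     * `rk` is assumed to be an existing producer or consumer handle. */
    static void create_topic_example(rd_kafka_t *rk, const char *topic) {
            char errstr[256];
            rd_kafka_queue_t *q = rd_kafka_queue_new(rk);
            rd_kafka_NewTopic_t *newt =
                    rd_kafka_NewTopic_new(topic, 3 /* partitions */,
                                          1 /* replication */, errstr, sizeof(errstr));
            if (!newt) {
                    rd_kafka_queue_destroy(q);
                    return;
            }

            rd_kafka_AdminOptions_t *options =
                    rd_kafka_AdminOptions_new(rk, RD_KAFKA_ADMIN_OP_CREATETOPICS);
            rd_kafka_AdminOptions_set_request_timeout(options, 100 /* ms */,
                                                      errstr, sizeof(errstr));

            rd_kafka_CreateTopics(rk, &newt, 1, options, q);

            /* With no reachable broker (as in these unit tests) the result
             * event arrives after ~100ms carrying a timeout error. */
            rd_kafka_event_t *ev = rd_kafka_queue_poll(q, 10 * 1000);
            if (ev && rd_kafka_event_type(ev) == RD_KAFKA_EVENT_CREATETOPICS_RESULT)
                    printf("CreateTopics result: %s\n",
                           rd_kafka_event_error_string(ev));

            if (ev)
                    rd_kafka_event_destroy(ev);
            rd_kafka_AdminOptions_destroy(options);
            rd_kafka_NewTopic_destroy(newt);
            rd_kafka_queue_destroy(q);
    }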
/ 17.867s] Too many tests running (5 >= 5): postponing 0116_kafkaconsumer_close start... 3: [0113_cooperative_rebalance_local/ 0.000s] ================= Running test 0113_cooperative_rebalance_local ================= 3: [0113_cooperative_rebalance_local/ 0.000s] ==== Stats written to file stats_0113_cooperative_rebalance_local_1299451796071120080.json ==== 3: [0113_cooperative_rebalance_local/ 0.000s] [ a_assign_rapid:674 ] 3: %5|1675737029.690|CONFWARN|MOCK#producer-88| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0105_transactions_mock / 3.305s] rd_kafka_commit_transaction(rk, 5000): duration 1202.060ms 3: [0105_transactions_mock / 3.306s] [ do_test_txn_recoverable_errors:194: PASS (3.31s) ] 3: [0105_transactions_mock / 3.306s] [ do_test_txn_fatal_idempo_errors:305 ] 3: [0105_transactions_mock / 3.306s] Test config file test.conf not found 3: [0105_transactions_mock / 3.306s] Setting test timeout to 60s * 2.7 3: %5|1675737029.763|MOCK|0105_transactions_mock#producer-89| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40925,127.0.0.1:45151,127.0.0.1:34267 3: [0105_transactions_mock / 3.306s] Created kafka instance 0105_transactions_mock#producer-89 3: [0105_transactions_mock / 3.307s] rd_kafka_init_transactions(rk, 5000): duration 0.620ms 3: [0105_transactions_mock / 3.307s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock / 3.307s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0105_transactions_mock / 3.307s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0106_cgrp_sess_timeout / 3.048s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.048s] ASSIGN.PARTITIONS: duration 0.053ms 3: [0106_cgrp_sess_timeout / 3.048s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.048s] consume: consume 10 messages 3: [0104_fetch_from_follower_mock/ 9.361s] CONSUME: duration 3000.077ms 3: [0104_fetch_from_follower_mock/ 9.361s] test_consumer_poll_no_msgs:4075: no msgs: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.361s] test_consumer_poll_no_msgs:4075: no msgs: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.361s] remaining: consume 3 messages 3: [0104_fetch_from_follower_mock/ 9.370s] CONSUME: duration 9.614ms 3: [0104_fetch_from_follower_mock/ 9.370s] remaining: consumed 3/3 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 9.370s] Closing consumer 0104_fetch_from_follower_mock#consumer-85 3: [0104_fetch_from_follower_mock/ 9.371s] CONSUMER.CLOSE: duration 0.114ms 3: [0104_fetch_from_follower_mock/ 9.372s] [ Test 
lagging FFF offset reset PASSED ] 3: [0104_fetch_from_follower_mock/ 9.372s] [ Test unknown follower ] 3: %5|1675737030.399|CONFWARN|MOCK#producer-90| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 9.372s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.373s] Created kafka instance 0104_fetch_from_follower_mock#producer-91 3: [0104_fetch_from_follower_mock/ 9.373s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.373s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 9.373s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 9.373s] PRODUCE: duration 0.749ms 3: [0104_fetch_from_follower_mock/ 9.416s] PRODUCE.DELIVERY.WAIT: duration 42.480ms 3: [0104_fetch_from_follower_mock/ 9.454s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.455s] Created kafka instance 0104_fetch_from_follower_mock#consumer-92 3: [0104_fetch_from_follower_mock/ 9.455s] ASSIGN.PARTITIONS: duration 0.029ms 3: [0104_fetch_from_follower_mock/ 9.455s] unknown follower: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 9.455s] unknown follower: not expecting any messages for 5000ms 3: %5|1675737030.582|FETCH|0104_fetch_from_follower_mock#consumer-92| [thrd:127.0.0.1:36889/bootstrap]: 127.0.0.1:36889/1: test [0]: preferred replica (19) is unknown: refreshing metadata 3: [0113_cooperative_rebalance_local/ 1.005s] Setting test timeout to 10s * 2.7 3: [0113_cooperative_rebalance_local/ 1.017s] Setting test timeout to 20s * 2.7 3: [0113_cooperative_rebalance_local/ 1.017s] pre-commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.017s] topic1[0] offset 11: Success 3: [0113_cooperative_rebalance_local/ 1.017s] topic2[0] offset 22: Success 3: [0113_cooperative_rebalance_local/ 1.018s] a_assign_rapid#consumer-94: incremental assign of 2 partition(s) 3: [0113_cooperative_rebalance_local/ 1.018s] incremental_assign(): 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.018s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.018s] topic2[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.018s] a_assign_rapid#consumer-94: incremental unassign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.018s] incremental_unassign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.018s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.018s] commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.018s] topic1[0] offset 55: Success 3: [0113_cooperative_rebalance_local/ 1.018s] topic2[0] offset 33: Success 3: [0113_cooperative_rebalance_local/ 1.018s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.018s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.018s] topic3[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.018s] Clearing rtt 3: [0113_cooperative_rebalance_local/ 1.018s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.018s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.018s] topic1[0] offset -1001: Success 3: %3|1675737030.764|TXNERR|0105_transactions_mock#producer-89| [thrd:127.0.0.1:45151/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: 
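The a_assign_rapid lines above exercise the incremental (cooperative) assignment API: INCREMENTAL.ASSIGN / INCREMENTAL.UNASSIGN plus lost-partition handling. A minimal cooperative rebalance callback, assuming a consumer configured with partition.assignment.strategy=cooperative-sticky, could look like this sketch:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Cooperative rebalance callback matching the INCREMENTAL.ASSIGN /
     * INCREMENTAL.UNASSIGN and "Partitions were lost" lines in this log.
     * Registered with rd_kafka_conf_set_rebalance_cb(). */
    static void rebalance_cb(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                             rd_kafka_topic_partition_list_t *parts, void *opaque) {
            rd_kafka_error_t *error = NULL;
            (void)opaque;

            if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS) {
                    error = rd_kafka_incremental_assign(rk, parts);
            } else { /* RD_KAFKA_RESP_ERR__REVOKE_PARTITIONS */
                    if (rd_kafka_assignment_lost(rk))
                            fprintf(stderr, "partitions were lost\n");
                    error = rd_kafka_incremental_unassign(rk, parts);
            }

            if (error) {
                    fprintf(stderr, "incremental (un)assign failed: %s\n",
                            rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
            }
    }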
[0105_transactions_mock / 4.307s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 4.308s] rd_kafka_abort_transaction(rk, -1): duration 1.037ms 3: [0105_transactions_mock / 4.308s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 4.308s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.008ms 3: [0105_transactions_mock / 4.309s] rd_kafka_commit_transaction(rk, -1): duration 1.168ms 3: [0105_transactions_mock / 4.310s] [ do_test_txn_fatal_idempo_errors:305: PASS (1.00s) ] 3: [0105_transactions_mock / 4.310s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH ] 3: [0105_transactions_mock / 4.310s] Test config file test.conf not found 3: [0105_transactions_mock / 4.310s] Setting test timeout to 60s * 2.7 3: %5|1675737030.767|MOCK|0105_transactions_mock#producer-95| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39201,127.0.0.1:40083,127.0.0.1:44453 3: [0105_transactions_mock / 4.310s] Created kafka instance 0105_transactions_mock#producer-95 3: [0105_transactions_mock / 4.311s] rd_kafka_init_transactions(rk, -1): duration 0.513ms 3: [0105_transactions_mock / 4.311s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 4.311s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.005ms 3: [0105_transactions_mock / 4.311s] 0105_transactions_mock#producer-95: Flushing 1 messages 3: [0106_cgrp_sess_timeout / 3.651s] CONSUME: duration 602.927ms 3: [0106_cgrp_sess_timeout / 3.651s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 3.651s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: %5|1675737031.085|FETCH|0104_fetch_from_follower_mock#consumer-92| [thrd:127.0.0.1:36889/bootstrap]: 127.0.0.1:36889/1: test [0]: preferred replica (19) lease changing too quickly (0s < 60s): possibly due to unavailable replica or stale cluster state: backing off next fetch 3: [0105_transactions_mock / 5.311s] FLUSH: duration 1000.553ms 3: [0105_transactions_mock / 5.311s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 5.311s] 0105_transactions_mock#producer-95: Flushing 1 messages 3: %3|1675737031.768|TXNERR|0105_transactions_mock#producer-95| [thrd:127.0.0.1:40083/bootstrap]: Current transaction failed in state InTransaction: unknown producer id 
(UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 5.312s] FLUSH: duration 0.151ms 3: %1|1675737031.769|TXNERR|0105_transactions_mock#producer-95| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch (_FENCED) 3: %0|1675737031.769|FATAL|0105_transactions_mock#producer-95| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.313s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.313s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.313s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH: PASS (1.00s) ] 3: [0105_transactions_mock / 5.313s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED ] 3: [0105_transactions_mock / 5.313s] Test config file test.conf not found 3: [0105_transactions_mock / 5.313s] Setting test timeout to 60s * 2.7 3: %5|1675737031.770|MOCK|0105_transactions_mock#producer-96| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44623,127.0.0.1:35419,127.0.0.1:33667 3: [0105_transactions_mock / 5.314s] Created kafka instance 0105_transactions_mock#producer-96 3: [0105_transactions_mock / 5.314s] rd_kafka_init_transactions(rk, -1): duration 0.529ms 3: [0105_transactions_mock / 5.314s] rd_kafka_begin_transaction(rk): duration 0.023ms 3: [0105_transactions_mock / 5.314s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 5.314s] 0105_transactions_mock#producer-96: Flushing 1 messages 3: [0045_subscribe_update_mock / 19.035s] 0045_subscribe_update_mock#consumer-33: Rebalance: _ASSIGN_PARTITIONS: 64 partition(s) 3: [0045_subscribe_update_mock / 19.035s] ASSIGN.PARTITIONS: duration 0.218ms 3: [0045_subscribe_update_mock / 19.035s] assign: assigned 64 partition(s) 3: [0113_cooperative_rebalance_local/ 3.029s] [ a_assign_rapid:674: PASS (3.03s) ] 3: [0113_cooperative_rebalance_local/ 3.029s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695 ] 3: %5|1675737032.719|CONFWARN|MOCK#producer-97| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 3.030s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.031s] Created kafka instance 0113_cooperative_rebalance_local#producer-98 3: [0113_cooperative_rebalance_local/ 3.031s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.031s] Produce to test [0]: messages #0..100 3: 
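The TXNERR/FATAL lines above show the two failure classes these transaction tests provoke: abortable errors (UNKNOWN_PRODUCER_ID, "current transaction must be aborted") and fatal fencing errors (_FENCED). A sketch of how an application is expected to classify the error object returned by rd_kafka_commit_transaction(), assuming `rk` is a transactional producer:

    #include <stdio.h>
    #include <stdlib.h>
    #include <librdkafka/rdkafka.h>

    /* Classify a commit_transaction() failure the way these mock tests do:
     * abortable -> abort_transaction(), retriable -> retry, fatal -> give up. */
    static void commit_or_recover(rd_kafka_t *rk) {
            rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1 /* no timeout */);
            if (!error)
                    return; /* committed */

            fprintf(stderr, "commit failed: %s\n", rd_kafka_error_string(error));

            if (rd_kafka_error_txn_requires_abort(error)) {
                    /* e.g. UNKNOWN_PRODUCER_ID above: abort, then start over */
                    rd_kafka_error_destroy(error);
                    error = rd_kafka_abort_transaction(rk, -1);
            } else if (rd_kafka_error_is_fatal(error)) {
                    /* e.g. _FENCED: a newer producer instance took over;
                     * this handle must be destroyed and recreated */
                    rd_kafka_error_destroy(error);
                    exit(1);
            }
            /* retriable errors: commit_transaction() may simply be called again */

            if (error)
                    rd_kafka_error_destroy(error);
    }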
[0113_cooperative_rebalance_local/ 3.031s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 3.031s] PRODUCE: duration 0.050ms 3: [0113_cooperative_rebalance_local/ 3.032s] PRODUCE.DELIVERY.WAIT: duration 0.516ms 3: [0113_cooperative_rebalance_local/ 3.032s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.032s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 3.032s] Created kafka instance 0113_cooperative_rebalance_local#consumer-99 3: [0113_cooperative_rebalance_local/ 3.032s] p_lost_partitions_heartbeat_illegal_generation_test:2720: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 6.316s] FLUSH: duration 1002.085ms 3: [0105_transactions_mock / 6.316s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.011ms 3: [0105_transactions_mock / 6.316s] 0105_transactions_mock#producer-96: Flushing 1 messages 3: %3|1675737032.773|TXNERR|0105_transactions_mock#producer-96| [thrd:127.0.0.1:44623/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 6.316s] FLUSH: duration 0.130ms 3: %1|1675737032.774|TXNERR|0105_transactions_mock#producer-96| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one (_FENCED) 3: %0|1675737032.774|FATAL|0105_transactions_mock#producer-96| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.317s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.317s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.324s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED: PASS (1.01s) ] 3: [0105_transactions_mock / 6.324s] [ do_test_txn_req_cnt:1071 ] 3: [0105_transactions_mock / 6.324s] Test config file test.conf not found 3: [0105_transactions_mock / 6.324s] Setting test timeout to 60s * 2.7 3: %5|1675737032.781|MOCK|0105_transactions_mock#producer-100| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43875,127.0.0.1:41979,127.0.0.1:33903 3: [0105_transactions_mock / 6.326s] Created kafka instance 0105_transactions_mock#producer-100 3: [0105_transactions_mock / 6.326s] rd_kafka_init_transactions(rk, 5000): duration 0.557ms 3: [0105_transactions_mock / 6.326s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: 
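The rd_kafka_producev() calls in this log appear in their macro-expanded form; in source they are written with the RD_KAFKA_V_...() helpers. A short sketch of the round trip 0105_transactions_mock drives repeatedly (init, begin, produce, send offsets, commit), with the same "mytopic"/"hi" payload as the log; `offsets` and `cgmd` would normally come from the consumer side of an exactly-once pipeline and are left as parameters here:

    #include <stdio.h>
    #include <stdlib.h>
    #include <librdkafka/rdkafka.h>

    static void die_on_error(rd_kafka_error_t *error) {
            if (error) {
                    fprintf(stderr, "txn error: %s\n", rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
                    exit(1);
            }
    }

    /* One transactional round trip: init_transactions -> begin_transaction ->
     * producev -> send_offsets_to_transaction -> commit_transaction. */
    static void txn_roundtrip(rd_kafka_t *rk,
                              rd_kafka_topic_partition_list_t *offsets,
                              rd_kafka_consumer_group_metadata_t *cgmd) {
            die_on_error(rd_kafka_init_transactions(rk, 5000 /* ms */));
            die_on_error(rd_kafka_begin_transaction(rk));

            /* Readable form of the macro-expanded producev() calls in the log:
             * topic "mytopic", partition 0, 2-byte value "hi". */
            rd_kafka_resp_err_t err =
                    rd_kafka_producev(rk,
                                      RD_KAFKA_V_TOPIC("mytopic"),
                                      RD_KAFKA_V_PARTITION(0),
                                      RD_KAFKA_V_VALUE("hi", 2),
                                      RD_KAFKA_V_END);
            if (err)
                    fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));

            die_on_error(rd_kafka_send_offsets_to_transaction(rk, offsets, cgmd, -1));
            die_on_error(rd_kafka_commit_transaction(rk, 5000));
    }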
[0045_subscribe_update_mock / 19.905s] 0045_subscribe_update_mock#consumer-33: Assignment (64 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_2[0], topic_2[1], topic_2[2], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 19.905s] Creating topic topic_15 3: [0045_subscribe_update_mock / 19.905s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.029ms 3: [0045_subscribe_update_mock / 19.905s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 6.528s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 201.941ms 3: [0105_transactions_mock / 6.528s] rd_kafka_abort_transaction(rk, 5000): duration 0.093ms 3: [0105_transactions_mock / 6.529s] [ do_test_txn_req_cnt:1071: PASS (0.20s) ] 3: [0105_transactions_mock / 6.529s] [ do_test_txn_requires_abort_errors:1132 ] 3: [0105_transactions_mock / 6.529s] Test config file test.conf not found 3: [0105_transactions_mock / 6.529s] Setting test timeout to 60s * 2.7 3: %5|1675737032.986|MOCK|0105_transactions_mock#producer-101| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35747,127.0.0.1:36063,127.0.0.1:34639 3: [0105_transactions_mock / 6.529s] Created kafka instance 0105_transactions_mock#producer-101 3: [0105_transactions_mock / 6.530s] rd_kafka_init_transactions(rk, 5000): duration 0.568ms 3: [0105_transactions_mock / 6.530s] rd_kafka_begin_transaction(rk): duration 0.015ms 3: [0105_transactions_mock / 6.530s] 1. 
Fail on produce 3: [0105_transactions_mock / 6.530s] 0105_transactions_mock#producer-101: Flushing 1 messages 3: [0045_subscribe_update_mock / 20.205s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 20.205s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 20.205s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 20.305s] Creating topic topic_16 3: [0045_subscribe_update_mock / 20.305s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.031ms 3: [0045_subscribe_update_mock / 20.305s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 20.605s] CONSUME: duration 300.070ms 3: [0045_subscribe_update_mock / 20.605s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 20.605s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 20.706s] Creating topic topic_17 3: [0045_subscribe_update_mock / 20.706s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.027ms 3: [0045_subscribe_update_mock / 20.706s] POLL: not expecting any messages for 300ms 3: %3|1675737033.987|TXNERR|0105_transactions_mock#producer-101| [thrd:127.0.0.1:35747/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 1 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:736756000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737033.987|PARTCNT|0105_transactions_mock#producer-101| [thrd:127.0.0.1:35747/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 7.530s] FLUSH: duration 1000.687ms 3: [0105_transactions_mock / 7.530s] Error TOPIC_AUTHORIZATION_FAILED: ProduceRequest for mytopic [0] with 1 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:736756000,Epoch:0}, base seq 0): current transaction must be aborted 3: [0105_transactions_mock / 7.531s] rd_kafka_abort_transaction(rk, -1): duration 0.147ms 3: [0105_transactions_mock / 7.531s] 2. Fail on AddPartitionsToTxn 3: [0105_transactions_mock / 7.531s] rd_kafka_begin_transaction(rk): duration 0.017ms 3: %3|1675737033.988|ADDPARTS|0105_transactions_mock#producer-101| [thrd:main]: TxnCoordinator/1: Failed to add partition "mytopic" [0] to transaction: Broker: Topic authorization failed 3: %3|1675737033.988|TXNERR|0105_transactions_mock#producer-101| [thrd:main]: Current transaction failed in state BeginCommit: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) (TOPIC_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.531s] commit_transaction() error TOPIC_AUTHORIZATION_FAILED: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) 3: [0105_transactions_mock / 7.531s] rd_kafka_abort_transaction(rk, -1): duration 0.105ms 3: [0105_transactions_mock / 7.531s] 3. 
Fail on AddOffsetsToTxn 3: [0105_transactions_mock / 7.531s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: %3|1675737033.988|ADDOFFSETS|0105_transactions_mock#producer-101| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed 3: %3|1675737033.988|TXNERR|0105_transactions_mock#producer-101| [thrd:main]: Current transaction failed in state InTransaction: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed (after 0ms) (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.531s] rd_kafka_abort_transaction(rk, -1): duration 0.041ms 3: [0105_transactions_mock / 7.532s] [ do_test_txn_requires_abort_errors:1132: PASS (1.00s) ] 3: [0105_transactions_mock / 7.532s] [ do_test_txn_slow_reinit:390: without sleep ] 3: [0105_transactions_mock / 7.532s] Test config file test.conf not found 3: [0105_transactions_mock / 7.532s] Setting test timeout to 60s * 2.7 3: %5|1675737033.989|MOCK|0105_transactions_mock#producer-102| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44779,127.0.0.1:43661,127.0.0.1:46815 3: [0105_transactions_mock / 7.532s] Created kafka instance 0105_transactions_mock#producer-102 3: [0105_transactions_mock / 7.533s] rd_kafka_init_transactions(rk, -1): duration 0.544ms 3: [0105_transactions_mock / 7.533s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: [0105_transactions_mock / 7.533s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0105_transactions_mock / 7.533s] 0105_transactions_mock#producer-102: Flushing 1 messages 3: [0045_subscribe_update_mock / 21.006s] CONSUME: duration 300.056ms 3: [0045_subscribe_update_mock / 21.006s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.006s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.106s] Creating topic topic_18 3: [0045_subscribe_update_mock / 21.106s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.032ms 3: [0045_subscribe_update_mock / 21.106s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 21.159s] 0045_subscribe_update_mock#consumer-33: Rebalance: _REVOKE_PARTITIONS: 64 partition(s) 3: [0045_subscribe_update_mock / 21.159s] UNASSIGN.PARTITIONS: duration 0.028ms 3: [0045_subscribe_update_mock / 21.159s] unassign: unassigned current partitions 3: [0045_subscribe_update_mock / 21.406s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 21.406s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.406s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.506s] Creating topic topic_19 3: [0045_subscribe_update_mock / 21.506s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.029ms 3: 
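The tests above run against librdkafka's built-in mock cluster rather than real brokers; the "%5 ... MOCK ... Mock cluster enabled" notices appear to come from the test.mock.num.brokers configuration shortcut, while calls such as rd_kafka_mock_topic_create() in the 0045_subscribe_update_mock lines use the mock API from rdkafka_mock.h directly. A rough sketch of the latter, with a hypothetical topic name and broker count:

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>
    #include <librdkafka/rdkafka_mock.h>

    /* Start an in-process mock cluster with 3 brokers and create a topic,
     * roughly what the mock-based tests do before producing/consuming.
     * `rk` is an existing handle (typically a producer created just for this). */
    static rd_kafka_mock_cluster_t *start_mock(rd_kafka_t *rk) {
            rd_kafka_mock_cluster_t *mcluster = rd_kafka_mock_cluster_new(rk, 3);

            /* Clients under test are pointed at this list instead of a real
             * bootstrap.servers value. */
            printf("bootstrap.servers=%s\n",
                   rd_kafka_mock_cluster_bootstraps(mcluster));

            /* Same call as the rd_kafka_mock_topic_create(mcluster, topic, ...)
             * lines logged by 0045_subscribe_update_mock. */
            rd_kafka_mock_topic_create(mcluster, "topic_15",
                                       4 /* partitions */, 1 /* replication */);
            return mcluster;
    }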
[0045_subscribe_update_mock / 21.506s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 21.806s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 21.806s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.806s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 21.906s] Creating topic topic_20 3: [0045_subscribe_update_mock / 21.906s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.038ms 3: [0045_subscribe_update_mock / 21.906s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 8.533s] FLUSH: duration 1000.642ms 3: [0105_transactions_mock / 8.533s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: %3|1675737034.990|TXNERR|0105_transactions_mock#producer-102| [thrd:127.0.0.1:46815/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 8.534s] commit_transaction(-1): duration 0.132ms 3: [0105_transactions_mock / 8.534s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 8.634s] abort_transaction(100): duration 100.093ms 3: [0105_transactions_mock / 8.634s] First abort_transaction() failed: Transactional operation timed out 3: [0105_transactions_mock / 8.634s] Retrying abort 3: [0045_subscribe_update_mock / 22.206s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 22.206s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 22.206s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 22.306s] Creating topic topic_21 3: [0045_subscribe_update_mock / 22.306s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.039ms 3: [0045_subscribe_update_mock / 22.306s] POLL: not expecting any messages for 300ms 3: [0104_fetch_from_follower_mock/ 14.455s] CONSUME: duration 5000.078ms 3: [0104_fetch_from_follower_mock/ 14.455s] test_consumer_poll_no_msgs:4075: unknown follower: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.455s] test_consumer_poll_no_msgs:4075: unknown follower: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.455s] proper follower: consume 1000 messages 3: [0045_subscribe_update_mock / 22.607s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 22.607s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 22.607s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 22.707s] Creating topic topic_22 3: [0045_subscribe_update_mock / 22.707s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 
1): duration 0.060ms 3: [0045_subscribe_update_mock / 22.707s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 6.038s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 6.038s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 6.038s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 6.038s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 6.038s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 6.038s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.042ms 3: [0113_cooperative_rebalance_local/ 6.038s] assign: incremental assign of 4 partition(s) done 3: [0045_subscribe_update_mock / 23.007s] CONSUME: duration 300.061ms 3: [0045_subscribe_update_mock / 23.007s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.007s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.107s] Creating topic topic_23 3: [0045_subscribe_update_mock / 23.107s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.030ms 3: [0045_subscribe_update_mock / 23.107s] POLL: not expecting any messages for 300ms 3: [0104_fetch_from_follower_mock/ 15.078s] CONSUME: duration 623.505ms 3: [0104_fetch_from_follower_mock/ 15.078s] proper follower: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 15.078s] do_test_unknown_follower:223: broker_id: Verifying 1000 received messages (flags 0x80000): expecting msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.078s] do_test_unknown_follower:223: broker_id: Verification of 1000 received messages succeeded: expected msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.078s] Closing consumer 0104_fetch_from_follower_mock#consumer-92 3: [0104_fetch_from_follower_mock/ 15.078s] CONSUMER.CLOSE: duration 0.098ms 3: [0104_fetch_from_follower_mock/ 15.079s] [ Test unknown follower PASSED ] 3: [0104_fetch_from_follower_mock/ 15.079s] [ Test REPLICA_NOT_AVAIALBLE ] 3: %5|1675737036.106|CONFWARN|MOCK#producer-103| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 15.079s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.079s] Created kafka instance 0104_fetch_from_follower_mock#producer-104 3: [0104_fetch_from_follower_mock/ 15.079s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.079s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 15.080s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 15.080s] PRODUCE: duration 0.687ms 3: [0104_fetch_from_follower_mock/ 15.122s] PRODUCE.DELIVERY.WAIT: duration 41.770ms 3: [0104_fetch_from_follower_mock/ 15.122s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.122s] Created kafka instance 0104_fetch_from_follower_mock#consumer-105 3: [0104_fetch_from_follower_mock/ 15.122s] ASSIGN.PARTITIONS: duration 0.027ms 3: [0104_fetch_from_follower_mock/ 15.122s] REPLICA_NOT_AVAIALBLE: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 15.122s] Wait initial metadata: not expecting any messages for 2000ms 3: [0113_cooperative_rebalance_local/ 6.641s] p_lost_partitions_heartbeat_illegal_generation_test:2732: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0045_subscribe_update_mock / 23.407s] CONSUME: 
duration 300.073ms 3: [0045_subscribe_update_mock / 23.407s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.407s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.507s] Creating topic topic_24 3: [0045_subscribe_update_mock / 23.507s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.023ms 3: [0045_subscribe_update_mock / 23.507s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 7.038s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 7.038s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 7.038s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 7.038s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 7.038s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 7.038s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 7.038s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.034ms 3: [0113_cooperative_rebalance_local/ 7.038s] unassign: incremental unassign of 4 partition(s) done 3: [0045_subscribe_update_mock / 23.807s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 23.807s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.807s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 23.907s] Creating topic topic_25 3: [0045_subscribe_update_mock / 23.907s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.038ms 3: [0045_subscribe_update_mock / 23.907s] POLL: not expecting any messages for 300ms 3: %4|1675737037.165|SESSTMOUT|0106_cgrp_sess_timeout#consumer-82| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0045_subscribe_update_mock / 24.207s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 24.207s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 24.207s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 24.307s] Creating topic topic_26 3: [0045_subscribe_update_mock / 24.308s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.038ms 3: [0045_subscribe_update_mock / 24.308s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 7.642s] p_lost_partitions_heartbeat_illegal_generation_test:2737: Waiting for rejoin after lost (_ASSIGN_PARTITIONS) for 12s 3: [0045_subscribe_update_mock / 24.608s] CONSUME: duration 300.076ms 3: [0045_subscribe_update_mock / 24.608s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 24.608s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 24.708s] Creating topic topic_27 3: [0045_subscribe_update_mock / 24.708s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.045ms 3: 
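The 0045_subscribe_update_mock entries above repeatedly create topics on a mock cluster (rd_kafka_mock_topic_create) and then poll for 300 ms while expecting no messages. A minimal sketch of driving that mock cluster through the public rdkafka_mock.h API follows; the 3-broker count and the names rk, mcluster and topic_%d are illustrative assumptions, not taken from the test:

#include <stdio.h>
#include <librdkafka/rdkafka.h>
#include <librdkafka/rdkafka_mock.h>

int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* The mock cluster piggy-backs on a regular client instance. */
        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
        if (!rk) {
                fprintf(stderr, "rd_kafka_new: %s\n", errstr);
                return 1;
        }

        rd_kafka_mock_cluster_t *mcluster = rd_kafka_mock_cluster_new(rk, 3);

        /* Create topics with a varying partition count, mirroring the
         * rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1)
         * calls in the log (replication factor 1). */
        for (int i = 0; i < 8; i++) {
                char topic[32];
                snprintf(topic, sizeof(topic), "topic_%d", i);
                rd_kafka_resp_err_t err =
                        rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1);
                if (err)
                        fprintf(stderr, "mock_topic_create(%s): %s\n",
                                topic, rd_kafka_err2str(err));
        }

        /* A consumer under test would use this as its bootstrap.servers. */
        printf("mock bootstrap.servers: %s\n",
               rd_kafka_mock_cluster_bootstraps(mcluster));

        rd_kafka_mock_cluster_destroy(mcluster);
        rd_kafka_destroy(rk);
        return 0;
}

A consumer pointed at the returned bootstrap string then sees the newly created topics appear in metadata, which is what the subscribe-update polling above is waiting for.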
[0045_subscribe_update_mock / 24.708s] POLL: not expecting any messages for 300ms 3: [0106_cgrp_sess_timeout / 10.652s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 10.652s] Performing sync commit 3: [0045_subscribe_update_mock / 25.008s] CONSUME: duration 300.068ms 3: [0045_subscribe_update_mock / 25.008s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 25.008s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 25.108s] Creating topic topic_28 3: [0045_subscribe_update_mock / 25.108s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.039ms 3: [0045_subscribe_update_mock / 25.108s] POLL: not expecting any messages for 300ms 3: [0104_fetch_from_follower_mock/ 17.122s] CONSUME: duration 2000.075ms 3: [0104_fetch_from_follower_mock/ 17.122s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.122s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.123s] Consume: consume 1000 messages 3: [0045_subscribe_update_mock / 25.408s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 25.408s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 25.408s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 25.508s] Creating topic topic_29 3: [0045_subscribe_update_mock / 25.508s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.024ms 3: [0045_subscribe_update_mock / 25.508s] POLL: not expecting any messages for 300ms 3: [0106_cgrp_sess_timeout / 11.652s] UNASSIGN.PARTITIONS: duration 0.048ms 3: [0106_cgrp_sess_timeout / 11.652s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 11.652s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: [0045_subscribe_update_mock / 25.808s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 25.808s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 25.808s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 18.239s] CONSUME: duration 1116.257ms 3: [0104_fetch_from_follower_mock/ 18.239s] Consume: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 18.239s] Closing consumer 0104_fetch_from_follower_mock#consumer-105 3: [0104_fetch_from_follower_mock/ 18.239s] CONSUMER.CLOSE: duration 0.128ms 3: [0104_fetch_from_follower_mock/ 18.241s] [ Test REPLICA_NOT_AVAIALBLE PASSED ] 3: [0104_fetch_from_follower_mock/ 18.241s] 0104_fetch_from_follower_mock: duration 18241.240ms 3: [0104_fetch_from_follower_mock/ 18.241s] ================= Test 0104_fetch_from_follower_mock PASSED ================= 3: [
/ 27.546s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0116_kafkaconsumer_close / 0.000s] ================= Running test 0116_kafkaconsumer_close ================= 3: [0116_kafkaconsumer_close / 0.000s] ==== Stats written to file stats_0116_kafkaconsumer_close_5970748159755503322.json ==== 3: [0116_kafkaconsumer_close / 0.000s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0 ] 3: %5|1675737039.369|CONFWARN|MOCK#producer-106| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 0.000s] Setting test timeout to 10s * 2.7 3: [0113_cooperative_rebalance_local/ 12.044s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 12.044s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 12.044s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 12.044s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 12.044s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 12.044s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.044ms 3: [0113_cooperative_rebalance_local/ 12.044s] assign: incremental assign of 4 partition(s) done 3: [0106_cgrp_sess_timeout / 15.067s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.067s] ASSIGN.PARTITIONS: duration 0.044ms 3: [0106_cgrp_sess_timeout / 15.067s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.067s] Closing consumer 0106_cgrp_sess_timeout#consumer-82 3: [0106_cgrp_sess_timeout / 15.067s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.067s] Performing sync commit 3: [0113_cooperative_rebalance_local/ 12.547s] Closing consumer 3: [0113_cooperative_rebalance_local/ 12.547s] Closing consumer 0113_cooperative_rebalance_local#consumer-99 3: [0113_cooperative_rebalance_local/ 12.547s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 12.547s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 12.547s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 12.547s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 12.547s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 12.547s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.023ms 3: [0113_cooperative_rebalance_local/ 12.547s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 12.547s] CONSUMER.CLOSE: duration 0.189ms 3: [0113_cooperative_rebalance_local/ 12.547s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 12.548s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 12.548s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695: PASS (9.52s) ] 3: [0113_cooperative_rebalance_local/ 12.548s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0 ] 3: %5|1675737042.238|CONFWARN|MOCK#producer-109| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 12.548s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.549s] Created kafka instance 0113_cooperative_rebalance_local#producer-110 3: [0113_cooperative_rebalance_local/ 12.549s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.549s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 12.549s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 
12.549s] PRODUCE: duration 0.050ms 3: [0113_cooperative_rebalance_local/ 12.549s] PRODUCE.DELIVERY.WAIT: duration 0.511ms 3: [0113_cooperative_rebalance_local/ 12.549s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.549s] Created kafka instance 0113_cooperative_rebalance_local#producer-111 3: [0113_cooperative_rebalance_local/ 12.549s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.550s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 12.550s] SUM(POLL): duration 0.001ms 3: [0113_cooperative_rebalance_local/ 12.550s] PRODUCE: duration 0.052ms 3: [0113_cooperative_rebalance_local/ 12.550s] PRODUCE.DELIVERY.WAIT: duration 0.449ms 3: [0113_cooperative_rebalance_local/ 12.550s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.550s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 12.550s] Created kafka instance 0113_cooperative_rebalance_local#consumer-112 3: [0113_cooperative_rebalance_local/ 12.550s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0106_cgrp_sess_timeout / 16.067s] UNASSIGN.PARTITIONS: duration 0.034ms 3: [0106_cgrp_sess_timeout / 16.067s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 16.067s] CONSUMER.CLOSE: duration 1000.356ms 3: [0106_cgrp_sess_timeout / 16.069s] [ do_test_session_timeout:152: Test session timeout with sync commit: PASS (16.07s) ] 3: [0106_cgrp_sess_timeout / 16.069s] [ do_test_session_timeout:152: Test session timeout with async commit ] 3: %5|1675737043.186|CONFWARN|MOCK#producer-113| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 16.069s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.069s] Created kafka instance 0106_cgrp_sess_timeout#producer-114 3: [0106_cgrp_sess_timeout / 16.069s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.069s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 16.069s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 16.069s] PRODUCE: duration 0.048ms 3: [0106_cgrp_sess_timeout / 16.070s] PRODUCE.DELIVERY.WAIT: duration 0.435ms 3: [0106_cgrp_sess_timeout / 16.070s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.070s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 16.070s] Created kafka instance 0106_cgrp_sess_timeout#consumer-115 3: [0106_cgrp_sess_timeout / 16.070s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0116_kafkaconsumer_close / 5.012s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0: PASS (5.01s) ] 3: [0116_kafkaconsumer_close / 5.012s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0 ] 3: %5|1675737044.381|CONFWARN|MOCK#producer-116| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 5.013s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 18.540s] rd_kafka_abort_transaction(rk, -1): duration 9906.276ms 3: [0105_transactions_mock / 18.540s] abort_transaction(-1): duration 9906.290ms 3: [0105_transactions_mock / 18.540s] rd_kafka_begin_transaction(rk): duration 0.115ms 3: [0105_transactions_mock / 18.540s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = 
("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.010ms 3: [0105_transactions_mock / 18.541s] rd_kafka_commit_transaction(rk, -1): duration 1.284ms 3: [0105_transactions_mock / 18.542s] [ do_test_txn_slow_reinit:390: without sleep: PASS (11.01s) ] 3: [0105_transactions_mock / 18.542s] [ do_test_txn_slow_reinit:390: with sleep ] 3: [0105_transactions_mock / 18.542s] Test config file test.conf not found 3: [0105_transactions_mock / 18.542s] Setting test timeout to 60s * 2.7 3: %5|1675737044.999|MOCK|0105_transactions_mock#producer-119| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45055,127.0.0.1:41893,127.0.0.1:36237 3: [0105_transactions_mock / 18.542s] Created kafka instance 0105_transactions_mock#producer-119 3: [0105_transactions_mock / 18.543s] rd_kafka_init_transactions(rk, -1): duration 0.532ms 3: [0105_transactions_mock / 18.543s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock / 18.543s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock / 18.543s] 0105_transactions_mock#producer-119: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 15.556s] Rebalance #5: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 15.556s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 15.556s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 15.556s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 15.556s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 15.556s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.064ms 3: [0113_cooperative_rebalance_local/ 15.556s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 16.160s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0105_transactions_mock / 19.544s] FLUSH: duration 1000.821ms 3: [0105_transactions_mock / 19.544s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.010ms 3: %3|1675737046.001|TXNERR|0105_transactions_mock#producer-119| [thrd:127.0.0.1:36237/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 19.544s] commit_transaction(-1): duration 0.215ms 3: [0105_transactions_mock / 19.544s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 19.644s] abort_transaction(100): 
duration 100.074ms 3: [0105_transactions_mock / 19.644s] First abort_transaction() failed: Transactional operation timed out 3: [0106_cgrp_sess_timeout / 19.076s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.076s] ASSIGN.PARTITIONS: duration 0.070ms 3: [0106_cgrp_sess_timeout / 19.076s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.076s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 19.679s] CONSUME: duration 603.343ms 3: [0106_cgrp_sess_timeout / 19.679s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 19.679s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0116_kafkaconsumer_close / 8.130s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0: PASS (3.12s) ] 3: [0116_kafkaconsumer_close / 8.130s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0 ] 3: %5|1675737047.499|CONFWARN|MOCK#producer-120| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 8.131s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 36.163s] 0045_subscribe_update_mock#consumer-33: Rebalance: _ASSIGN_PARTITIONS: 129 partition(s) 3: [0045_subscribe_update_mock / 36.163s] ASSIGN.PARTITIONS: duration 0.320ms 3: [0045_subscribe_update_mock / 36.163s] assign: assigned 129 partition(s) 3: [
/ 37.554s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0045_subscribe_update_mock / 36.809s] 0045_subscribe_update_mock#consumer-33: Assignment (129 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_15[0], topic_15[1], topic_15[2], topic_15[3], topic_15[4], topic_15[5], topic_15[6], topic_15[7], topic_16[0], topic_17[0], topic_17[1], topic_18[0], topic_18[1], topic_18[2], topic_19[0], topic_19[1], topic_19[2], topic_19[3], topic_2[0], topic_2[1], topic_2[2], topic_20[0], topic_20[1], topic_20[2], topic_20[3], topic_20[4], topic_21[0], topic_21[1], topic_21[2], topic_21[3], topic_21[4], topic_21[5], topic_22[0], topic_22[1], topic_22[2], topic_22[3], topic_22[4], topic_22[5], topic_22[6], topic_23[0], topic_23[1], topic_23[2], topic_23[3], topic_23[4], topic_23[5], topic_23[6], topic_23[7], topic_24[0], topic_25[0], topic_25[1], topic_26[0], topic_26[1], topic_26[2], topic_27[0], topic_27[1], topic_27[2], topic_27[3], topic_28[0], topic_28[1], topic_28[2], topic_28[3], topic_28[4], topic_29[0], topic_29[1], topic_29[2], topic_29[3], topic_29[4], topic_29[5], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 36.810s] Creating topic topic_30 3: [0045_subscribe_update_mock / 36.810s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.018ms 3: [0045_subscribe_update_mock / 36.810s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 37.110s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 37.110s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 37.110s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 37.210s] Creating topic topic_31 3: [0045_subscribe_update_mock / 37.210s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.055ms 3: [0045_subscribe_update_mock / 37.210s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 37.510s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 37.510s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 37.510s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 37.610s] Creating topic topic_32 3: [0045_subscribe_update_mock / 37.610s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.033ms 3: [0045_subscribe_update_mock / 37.610s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 21.163s] Rebalance #6: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 21.163s] test1 [0] offset 
-1001 3: [0113_cooperative_rebalance_local/ 21.163s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 21.163s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 21.163s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 21.163s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 21.163s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.036ms 3: [0113_cooperative_rebalance_local/ 21.163s] unassign: incremental unassign of 4 partition(s) done 3: [0045_subscribe_update_mock / 37.910s] CONSUME: duration 300.059ms 3: [0045_subscribe_update_mock / 37.910s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 37.910s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 38.010s] Creating topic topic_33 3: [0045_subscribe_update_mock / 38.010s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.034ms 3: [0045_subscribe_update_mock / 38.010s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 38.310s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 38.310s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 38.310s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 38.410s] Creating topic topic_34 3: [0045_subscribe_update_mock / 38.410s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.027ms 3: [0045_subscribe_update_mock / 38.410s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 38.711s] CONSUME: duration 300.071ms 3: [0045_subscribe_update_mock / 38.711s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 38.711s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 38.811s] Creating topic topic_35 3: [0045_subscribe_update_mock / 38.811s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.076ms 3: [0045_subscribe_update_mock / 38.811s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 22.161s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0045_subscribe_update_mock / 39.111s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 39.111s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 39.111s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 39.211s] Creating topic topic_36 3: [0045_subscribe_update_mock / 39.211s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.034ms 3: [0045_subscribe_update_mock / 39.211s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 39.274s] 0045_subscribe_update_mock#consumer-33: Rebalance: _REVOKE_PARTITIONS: 129 partition(s) 3: [0045_subscribe_update_mock / 39.274s] UNASSIGN.PARTITIONS: duration 0.055ms 3: [0045_subscribe_update_mock / 39.274s] unassign: unassigned current partitions 3: [0045_subscribe_update_mock / 39.511s] CONSUME: 
duration 300.074ms 3: [0045_subscribe_update_mock / 39.511s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 39.511s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0116_kafkaconsumer_close / 13.143s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0: PASS (5.01s) ] 3: [0116_kafkaconsumer_close / 13.143s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0 ] 3: %5|1675737052.512|CONFWARN|MOCK#producer-123| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 13.143s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 39.611s] Creating topic topic_37 3: [0045_subscribe_update_mock / 39.611s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.034ms 3: [0045_subscribe_update_mock / 39.611s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 39.911s] CONSUME: duration 300.061ms 3: [0045_subscribe_update_mock / 39.911s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 39.911s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 40.011s] Creating topic topic_38 3: [0045_subscribe_update_mock / 40.011s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.040ms 3: [0045_subscribe_update_mock / 40.011s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 40.311s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 40.311s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 40.311s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: %4|1675737053.322|SESSTMOUT|0106_cgrp_sess_timeout#consumer-115| [thrd:main]: Consumer group session timed out (in join-state steady) after 6128 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0045_subscribe_update_mock / 40.412s] Creating topic topic_39 3: [0045_subscribe_update_mock / 40.412s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.041ms 3: [0045_subscribe_update_mock / 40.412s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 40.712s] CONSUME: duration 300.091ms 3: [0045_subscribe_update_mock / 40.712s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 40.712s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0106_cgrp_sess_timeout / 26.680s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 26.680s] Performing async commit 3: [0045_subscribe_update_mock / 40.812s] Creating topic topic_40 3: [0045_subscribe_update_mock / 40.812s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.039ms 3: [0045_subscribe_update_mock / 40.812s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 41.112s] CONSUME: 
duration 300.075ms 3: [0045_subscribe_update_mock / 41.112s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 41.112s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 41.212s] Creating topic topic_41 3: [0045_subscribe_update_mock / 41.212s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.064ms 3: [0045_subscribe_update_mock / 41.212s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 41.512s] CONSUME: duration 300.082ms 3: [0045_subscribe_update_mock / 41.512s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 41.512s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 41.612s] Creating topic topic_42 3: [0045_subscribe_update_mock / 41.612s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.048ms 3: [0045_subscribe_update_mock / 41.612s] POLL: not expecting any messages for 300ms 3: [0106_cgrp_sess_timeout / 27.680s] UNASSIGN.PARTITIONS: duration 0.069ms 3: [0106_cgrp_sess_timeout / 27.680s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 27.680s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: %4|1675737054.797|COMMITFAIL|0106_cgrp_sess_timeout#consumer-115| [thrd:main]: Offset commit (manual) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [0045_subscribe_update_mock / 41.912s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 41.912s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 41.912s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 42.012s] Creating topic topic_43 3: [0045_subscribe_update_mock / 42.013s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.059ms 3: [0045_subscribe_update_mock / 42.013s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 42.313s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 42.313s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 42.313s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 42.413s] Creating topic topic_44 3: [0045_subscribe_update_mock / 42.413s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.077ms 3: [0045_subscribe_update_mock / 42.413s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 42.713s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 42.713s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 42.713s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0113_cooperative_rebalance_local/ 26.167s] Rebalance #7: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.167s] test1 [0] offset -1001 3: 
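The COMMITFAIL warning above comes from the revoke-side commit that 0106_cgrp_sess_timeout performs ("Performing sync commit" / "Performing async commit") after the group session has already timed out, so the broker rejects it with Unknown member. A hypothetical sketch of that commit-on-revoke step, assuming a plain rd_kafka_commit() of the current positions (the helper name commit_on_revoke is not from the test):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Hypothetical revoke-side commit: commit the current consumed positions
 * before giving up the assignment.  With async=0 the call blocks until the
 * broker answers; if the group session has already expired the broker can
 * reply RD_KAFKA_RESP_ERR_UNKNOWN_MEMBER_ID, which is what the COMMITFAIL
 * warning above reports. */
static void commit_on_revoke(rd_kafka_t *rk, int async) {
        rd_kafka_resp_err_t err = rd_kafka_commit(rk, NULL /* current positions */, async);
        if (err && err != RD_KAFKA_RESP_ERR__NO_OFFSET)
                fprintf(stderr, "offset commit failed: %s\n", rd_kafka_err2str(err));
}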
[0113_cooperative_rebalance_local/ 26.167s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.167s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.078ms 3: [0113_cooperative_rebalance_local/ 26.167s] assign: incremental assign of 8 partition(s) done 3: [0116_kafkaconsumer_close / 16.767s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0: PASS (3.62s) ] 3: [0116_kafkaconsumer_close / 16.767s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0 ] 3: %5|1675737056.135|CONFWARN|MOCK#producer-126| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 16.767s] Setting test timeout to 10s * 2.7 3: [0113_cooperative_rebalance_local/ 26.669s] Closing consumer 3: [0113_cooperative_rebalance_local/ 26.669s] Closing consumer 0113_cooperative_rebalance_local#consumer-112 3: [0113_cooperative_rebalance_local/ 26.669s] Rebalance #8: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.669s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.669s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.033ms 3: [0113_cooperative_rebalance_local/ 26.669s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 26.669s] CONSUMER.CLOSE: duration 0.197ms 3: [0113_cooperative_rebalance_local/ 26.669s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 26.669s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 26.670s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0: PASS (14.12s) ] 3: [0113_cooperative_rebalance_local/ 26.670s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1 ] 3: %5|1675737056.359|CONFWARN|MOCK#producer-129| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 26.670s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.670s] Created kafka instance 0113_cooperative_rebalance_local#producer-130 3: [0113_cooperative_rebalance_local/ 26.670s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.670s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 26.670s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 26.670s] PRODUCE: duration 0.051ms 3: [0113_cooperative_rebalance_local/ 26.712s] PRODUCE.DELIVERY.WAIT: duration 41.797ms 3: [0113_cooperative_rebalance_local/ 26.712s] Test config file test.conf not found 3: 
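The 0113_cooperative_rebalance_local entries exercise the incremental rebalance path (INCREMENTAL.ASSIGN.PARTITIONS / INCREMENTAL.UNASSIGN.PARTITIONS), including the lost-partitions case; the "offset -1001" values are RD_KAFKA_OFFSET_INVALID, i.e. no stored offset yet. Below is a sketch of the usual cooperative rebalance callback shape for this API; the name on_rebalance is illustrative and not the test's own helper:

#include <stdio.h>
#include <string.h>
#include <librdkafka/rdkafka.h>

/* Cooperative-aware rebalance callback: use the incremental assign/unassign
 * calls when the COOPERATIVE protocol is in effect, and plain assign()
 * otherwise.  rd_kafka_assignment_lost() distinguishes a normal revoke from
 * the "Partitions were lost" case seen above. */
static void on_rebalance(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *parts, void *opaque) {
        int cooperative = !strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE");
        rd_kafka_error_t *error = NULL;
        (void)opaque;

        if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS) {
                if (cooperative)
                        error = rd_kafka_incremental_assign(rk, parts);
                else
                        rd_kafka_assign(rk, parts);
        } else if (err == RD_KAFKA_RESP_ERR__REVOKE_PARTITIONS) {
                if (rd_kafka_assignment_lost(rk))
                        fprintf(stderr, "Partitions were lost\n");
                if (cooperative)
                        error = rd_kafka_incremental_unassign(rk, parts);
                else
                        rd_kafka_assign(rk, NULL);
        } else {
                rd_kafka_assign(rk, NULL);
        }

        if (error) {
                fprintf(stderr, "incremental (un)assign failed: %s\n",
                        rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
        }
}

A client registers such a callback with rd_kafka_conf_set_rebalance_cb() before creating the consumer.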
[0113_cooperative_rebalance_local/ 26.712s] Created kafka instance 0113_cooperative_rebalance_local#producer-131 3: [0113_cooperative_rebalance_local/ 26.712s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.712s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 26.712s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 26.712s] PRODUCE: duration 0.054ms 3: [0113_cooperative_rebalance_local/ 26.713s] PRODUCE.DELIVERY.WAIT: duration 0.405ms 3: [0113_cooperative_rebalance_local/ 26.713s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.713s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 26.713s] Created kafka instance 0113_cooperative_rebalance_local#consumer-132 3: [0113_cooperative_rebalance_local/ 26.713s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 31.644s] Retrying abort 3: [0105_transactions_mock / 31.644s] rd_kafka_abort_transaction(rk, -1): duration 0.268ms 3: [0105_transactions_mock / 31.644s] abort_transaction(-1): duration 0.274ms 3: [0105_transactions_mock / 31.644s] rd_kafka_begin_transaction(rk): duration 0.041ms 3: [0105_transactions_mock / 31.644s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.010ms 3: [0105_transactions_mock / 31.646s] rd_kafka_commit_transaction(rk, -1): duration 1.261ms 3: [0105_transactions_mock / 31.646s] [ do_test_txn_slow_reinit:390: with sleep: PASS (13.10s) ] 3: [0105_transactions_mock / 31.646s] [ do_test_txn_endtxn_errors:705 ] 3: [0105_transactions_mock / 31.646s] Testing scenario #0 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 31.646s] Test config file test.conf not found 3: [0105_transactions_mock / 31.646s] Setting test timeout to 60s * 2.7 3: %5|1675737058.103|MOCK|0105_transactions_mock#producer-133| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37155,127.0.0.1:45915,127.0.0.1:46791 3: [0105_transactions_mock / 31.646s] Created kafka instance 0105_transactions_mock#producer-133 3: [0105_transactions_mock / 31.647s] rd_kafka_init_transactions(rk, 5000): duration 0.594ms 3: [0105_transactions_mock / 31.647s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock / 31.647s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.326ms 3: [0106_cgrp_sess_timeout / 31.094s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.094s] ASSIGN.PARTITIONS: duration 0.020ms 3: [0106_cgrp_sess_timeout / 31.094s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.094s] Closing consumer 0106_cgrp_sess_timeout#consumer-115 3: [0106_cgrp_sess_timeout / 31.094s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.094s] Performing async commit 3: [0105_transactions_mock / 32.748s] commit: duration 1100.581ms 3: [0105_transactions_mock / 32.748s] Scenario #0 commit succeeded 3: [0105_transactions_mock / 32.748s] Testing scenario #0 commit&flush with 2 
injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 32.748s] rd_kafka_begin_transaction(rk): duration 0.067ms 3: [0105_transactions_mock / 32.748s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 32.749s] FLUSH: duration 1.171ms 3: [0105_transactions_mock / 32.749s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.162ms 3: [0106_cgrp_sess_timeout / 32.094s] UNASSIGN.PARTITIONS: duration 0.042ms 3: [0106_cgrp_sess_timeout / 32.094s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 32.094s] CONSUMER.CLOSE: duration 1000.247ms 3: [0106_cgrp_sess_timeout / 32.095s] [ do_test_session_timeout:152: Test session timeout with async commit: PASS (16.03s) ] 3: [0106_cgrp_sess_timeout / 32.095s] [ do_test_session_timeout:152: Test session timeout with auto commit ] 3: %5|1675737059.212|CONFWARN|MOCK#producer-134| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 32.095s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.095s] Created kafka instance 0106_cgrp_sess_timeout#producer-135 3: [0106_cgrp_sess_timeout / 32.095s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.095s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 32.095s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 32.095s] PRODUCE: duration 0.061ms 3: [0106_cgrp_sess_timeout / 32.136s] PRODUCE.DELIVERY.WAIT: duration 40.568ms 3: [0106_cgrp_sess_timeout / 32.136s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.136s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 32.137s] Created kafka instance 0106_cgrp_sess_timeout#consumer-136 3: [0106_cgrp_sess_timeout / 32.137s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [
/ 47.562s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0113_cooperative_rebalance_local/ 29.719s] Rebalance #9: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 29.719s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 29.719s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 29.719s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 29.719s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 29.719s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.076ms 3: [0113_cooperative_rebalance_local/ 29.719s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 29.820s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 29.820s] Rebalance #10: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 29.820s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 29.820s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 29.820s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 29.820s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 29.820s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 29.820s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.023ms 3: [0113_cooperative_rebalance_local/ 29.820s] unassign: incremental unassign of 4 partition(s) done 3: [0105_transactions_mock / 33.748s] commit&flush: duration 998.621ms 3: [0105_transactions_mock / 33.748s] Scenario #0 commit&flush succeeded 3: [0105_transactions_mock / 33.748s] Testing scenario #0 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 33.748s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock / 33.748s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.144ms 3: [0113_cooperative_rebalance_local/ 30.820s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0116_kafkaconsumer_close / 21.787s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 21.787s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0 ] 3: %5|1675737061.156|CONFWARN|MOCK#producer-137| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 21.787s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 34.749s] abort: duration 1000.474ms 3: [0105_transactions_mock / 34.749s] Scenario #0 abort succeeded 3: [0105_transactions_mock / 34.749s] Testing scenario #0 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 34.749s] rd_kafka_begin_transaction(rk): duration 0.035ms 3: [0105_transactions_mock / 34.749s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 34.750s] FLUSH: duration 1.172ms 3: [0105_transactions_mock / 34.750s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.142ms 3: [0105_transactions_mock / 35.749s] abort&flush: duration 998.938ms 3: [0105_transactions_mock / 35.749s] Scenario #0 abort&flush succeeded 3: [0105_transactions_mock / 35.749s] Testing scenario #1 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 35.749s] rd_kafka_begin_transaction(rk): duration 0.026ms 3: [0105_transactions_mock 
/ 35.749s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.128ms 3: [0106_cgrp_sess_timeout / 35.142s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.142s] ASSIGN.PARTITIONS: duration 0.073ms 3: [0106_cgrp_sess_timeout / 35.142s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.142s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 35.243s] CONSUME: duration 100.278ms 3: [0106_cgrp_sess_timeout / 35.243s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 35.243s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0105_transactions_mock / 36.750s] commit: duration 1000.451ms 3: [0105_transactions_mock / 36.750s] Scenario #1 commit succeeded 3: [0105_transactions_mock / 36.750s] Testing scenario #1 commit&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 36.750s] rd_kafka_begin_transaction(rk): duration 0.023ms 3: [0105_transactions_mock / 36.750s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 36.751s] FLUSH: duration 1.271ms 3: [0105_transactions_mock / 36.751s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.144ms 3: [0105_transactions_mock / 37.750s] commit&flush: duration 998.823ms 3: [0105_transactions_mock / 37.750s] Scenario #1 commit&flush succeeded 3: [0105_transactions_mock / 37.750s] Testing scenario #1 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 37.750s] rd_kafka_begin_transaction(rk): duration 0.037ms 3: [0105_transactions_mock / 37.750s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.171ms 3: [0116_kafkaconsumer_close / 25.407s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0: PASS (3.62s) ] 3: [0116_kafkaconsumer_close / 25.407s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0 ] 3: %5|1675737064.796|CONFWARN|MOCK#producer-140| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 25.428s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 38.750s] abort: duration 1000.006ms 3: [0105_transactions_mock / 38.750s] Scenario #1 abort succeeded 3: [0105_transactions_mock / 38.750s] Testing scenario #1 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 38.750s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock / 38.750s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 38.752s] FLUSH: duration 1.161ms 3: [0105_transactions_mock / 38.752s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.117ms 3: [0105_transactions_mock / 39.751s] abort&flush: duration 999.193ms 3: [0105_transactions_mock / 39.751s] Scenario #1 abort&flush succeeded 3: [0105_transactions_mock / 39.751s] Testing scenario #2 commit with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 39.751s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0105_transactions_mock / 39.751s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.399ms 3: [0105_transactions_mock / 39.853s] commit: duration 101.745ms 3: [0105_transactions_mock / 39.853s] Scenario #2 commit succeeded 3: [0105_transactions_mock / 39.853s] Testing scenario #2 commit&flush with 1 injected erorrs, expecting 
NO_ERROR 3: [0105_transactions_mock / 39.853s] rd_kafka_begin_transaction(rk): duration 0.137ms 3: [0105_transactions_mock / 39.853s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 39.854s] FLUSH: duration 1.227ms 3: [0105_transactions_mock / 39.855s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.193ms 3: [0105_transactions_mock / 39.956s] commit&flush: duration 100.908ms 3: [0105_transactions_mock / 39.956s] Scenario #2 commit&flush succeeded 3: [0105_transactions_mock / 39.956s] Testing scenario #2 abort with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 39.956s] rd_kafka_begin_transaction(rk): duration 0.080ms 3: [0105_transactions_mock / 39.956s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.507ms 3: [0105_transactions_mock / 40.057s] abort: duration 100.939ms 3: [0105_transactions_mock / 40.057s] Scenario #2 abort succeeded 3: [0105_transactions_mock / 40.057s] Testing scenario #2 abort&flush with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.057s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock / 40.057s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 40.058s] FLUSH: duration 1.205ms 3: [0105_transactions_mock / 40.059s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.120ms 3: [0105_transactions_mock / 40.159s] abort&flush: duration 100.894ms 3: [0105_transactions_mock / 40.159s] Scenario #2 abort&flush succeeded 3: [0105_transactions_mock / 40.159s] Testing scenario #3 commit with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.160s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock / 40.160s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.136ms 3: [0113_cooperative_rebalance_local/ 37.222s] Rebalance #11: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.077ms 3: [0113_cooperative_rebalance_local/ 37.222s] assign: incremental assign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 37.222s] Closing consumer 3: [0113_cooperative_rebalance_local/ 37.222s] Closing consumer 0113_cooperative_rebalance_local#consumer-132 3: [0113_cooperative_rebalance_local/ 37.222s] Rebalance #12: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 
37.222s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.222s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.065ms 3: [0113_cooperative_rebalance_local/ 37.222s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 37.222s] CONSUMER.CLOSE: duration 0.244ms 3: [0113_cooperative_rebalance_local/ 37.222s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 37.223s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 37.223s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1: PASS (10.55s) ] 3: [0113_cooperative_rebalance_local/ 37.223s] [ r_lost_partitions_commit_illegal_generation_test_local:2860 ] 3: %5|1675737066.913|CONFWARN|MOCK#producer-143| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 37.223s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.224s] Created kafka instance 0113_cooperative_rebalance_local#producer-144 3: [0113_cooperative_rebalance_local/ 37.224s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.224s] Produce to test [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 37.224s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 37.224s] PRODUCE: duration 0.045ms 3: [0113_cooperative_rebalance_local/ 37.224s] PRODUCE.DELIVERY.WAIT: duration 0.740ms 3: [0113_cooperative_rebalance_local/ 37.225s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.225s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 37.225s] Created kafka instance 0113_cooperative_rebalance_local#consumer-145 3: [0113_cooperative_rebalance_local/ 37.225s] r_lost_partitions_commit_illegal_generation_test_local:2883: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 40.463s] commit: duration 303.204ms 3: [0105_transactions_mock / 40.463s] Scenario #3 commit succeeded 3: [0105_transactions_mock / 40.463s] Testing scenario #3 commit&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.463s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: [0105_transactions_mock / 40.463s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 40.464s] FLUSH: duration 1.244ms 3: [0105_transactions_mock / 40.464s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.238ms 3: [0105_transactions_mock / 40.767s] commit&flush: duration 302.339ms 3: [0105_transactions_mock / 40.767s] Scenario #3 commit&flush succeeded 3: [0105_transactions_mock / 40.767s] Testing scenario #3 abort with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.767s] rd_kafka_begin_transaction(rk): duration 0.094ms 3: [0105_transactions_mock / 40.767s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.269ms 3: [0045_subscribe_update_mock / 54.279s] 0045_subscribe_update_mock#consumer-33: Rebalance: _ASSIGN_PARTITIONS: 195 partition(s) 3: [0045_subscribe_update_mock / 54.279s] ASSIGN.PARTITIONS: duration 0.342ms 3: [0045_subscribe_update_mock / 54.279s] assign: assigned 195 partition(s) 3: [0105_transactions_mock / 41.069s] abort: duration 302.215ms 3: [0105_transactions_mock / 41.069s] Scenario #3 abort succeeded 3: [0105_transactions_mock / 41.069s] Testing scenario #3 abort&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.070s] 
rd_kafka_begin_transaction(rk): duration 0.118ms 3: [0105_transactions_mock / 41.070s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 41.071s] FLUSH: duration 1.228ms 3: [0105_transactions_mock / 41.071s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.168ms 3: [0045_subscribe_update_mock / 54.714s] 0045_subscribe_update_mock#consumer-33: Assignment (195 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_15[0], topic_15[1], topic_15[2], topic_15[3], topic_15[4], topic_15[5], topic_15[6], topic_15[7], topic_16[0], topic_17[0], topic_17[1], topic_18[0], topic_18[1], topic_18[2], topic_19[0], topic_19[1], topic_19[2], topic_19[3], topic_2[0], topic_2[1], topic_2[2], topic_20[0], topic_20[1], topic_20[2], topic_20[3], topic_20[4], topic_21[0], topic_21[1], topic_21[2], topic_21[3], topic_21[4], topic_21[5], topic_22[0], topic_22[1], topic_22[2], topic_22[3], topic_22[4], topic_22[5], topic_22[6], topic_23[0], topic_23[1], topic_23[2], topic_23[3], topic_23[4], topic_23[5], topic_23[6], topic_23[7], topic_24[0], topic_25[0], topic_25[1], topic_26[0], topic_26[1], topic_26[2], topic_27[0], topic_27[1], topic_27[2], topic_27[3], topic_28[0], topic_28[1], topic_28[2], topic_28[3], topic_28[4], topic_29[0], topic_29[1], topic_29[2], topic_29[3], topic_29[4], topic_29[5], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_30[0], topic_30[1], topic_30[2], topic_30[3], topic_30[4], topic_30[5], topic_30[6], topic_31[0], topic_31[1], topic_31[2], topic_31[3], topic_31[4], topic_31[5], topic_31[6], topic_31[7], topic_32[0], topic_33[0], topic_33[1], topic_34[0], topic_34[1], topic_34[2], topic_35[0], topic_35[1], topic_35[2], topic_35[3], topic_36[0], topic_36[1], topic_36[2], topic_36[3], topic_36[4], topic_37[0], topic_37[1], topic_37[2], topic_37[3], topic_37[4], topic_37[5], topic_38[0], topic_38[1], topic_38[2], topic_38[3], topic_38[4], topic_38[5], topic_38[6], topic_39[0], topic_39[1], topic_39[2], topic_39[3], topic_39[4], topic_39[5], topic_39[6], topic_39[7], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_40[0], topic_41[0], topic_41[1], topic_42[0], topic_42[1], topic_42[2], topic_43[0], topic_43[1], topic_43[2], topic_43[3], topic_44[0], topic_44[1], topic_44[2], topic_44[3], topic_44[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 54.714s] Creating topic topic_45 3: [0045_subscribe_update_mock / 54.714s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.032ms 3: [0045_subscribe_update_mock / 54.714s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 41.373s] abort&flush: duration 302.348ms 3: [0105_transactions_mock / 41.373s] Scenario #3 abort&flush succeeded 3: [0105_transactions_mock / 41.373s] Testing scenario #4 commit with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 41.373s] rd_kafka_begin_transaction(rk): duration 
0.068ms 3: [0105_transactions_mock / 41.374s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.249ms 3: %3|1675737067.831|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 41.375s] commit: duration 0.970ms 3: [0105_transactions_mock / 41.375s] Scenario #4 commit failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.375s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.375s] rd_kafka_abort_transaction(rk, -1): duration 0.092ms 3: [0105_transactions_mock / 41.375s] Testing scenario #4 commit&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 41.375s] rd_kafka_begin_transaction(rk): duration 0.010ms 3: [0105_transactions_mock / 41.375s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 41.376s] FLUSH: duration 0.830ms 3: [0105_transactions_mock / 41.376s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.082ms 3: %3|1675737067.833|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 41.376s] commit&flush: duration 0.067ms 3: [0105_transactions_mock / 41.376s] Scenario #4 commit&flush failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.376s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.376s] rd_kafka_abort_transaction(rk, -1): duration 0.060ms 3: [0105_transactions_mock / 41.376s] Testing scenario #4 abort with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 41.376s] rd_kafka_begin_transaction(rk): duration 0.009ms 3: [0105_transactions_mock / 41.376s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.059ms 3: %3|1675737067.833|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 41.376s] abort: duration 0.101ms 3: [0105_transactions_mock / 41.376s] Scenario #4 abort failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.376s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.376s] rd_kafka_abort_transaction(rk, -1): duration 0.061ms 3: [0105_transactions_mock / 41.376s] Testing scenario #4 abort&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 41.376s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock / 41.376s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 41.377s] FLUSH: duration 0.514ms 3: [0105_transactions_mock / 41.377s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.066ms 3: %3|1675737067.834|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: 
[0105_transactions_mock / 41.377s] abort&flush: duration 0.074ms 3: [0105_transactions_mock / 41.377s] Scenario #4 abort&flush failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.377s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.377s] rd_kafka_abort_transaction(rk, -1): duration 0.062ms 3: [0105_transactions_mock / 41.377s] Testing scenario #5 commit with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 41.377s] rd_kafka_begin_transaction(rk): duration 0.010ms 3: [0105_transactions_mock / 41.377s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.068ms 3: %3|1675737067.834|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 41.378s] commit: duration 0.773ms 3: [0105_transactions_mock / 41.378s] Scenario #5 commit failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.378s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.378s] rd_kafka_abort_transaction(rk, -1): duration 0.060ms 3: [0105_transactions_mock / 41.378s] Testing scenario #5 commit&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 41.378s] rd_kafka_begin_transaction(rk): duration 0.012ms 3: [0105_transactions_mock / 41.378s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 41.379s] FLUSH: duration 0.905ms 3: [0105_transactions_mock / 41.379s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.062ms 3: %3|1675737067.836|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 41.379s] commit&flush: duration 0.048ms 3: [0105_transactions_mock / 41.379s] Scenario #5 commit&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.379s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.379s] rd_kafka_abort_transaction(rk, -1): duration 0.050ms 3: [0105_transactions_mock / 41.379s] Testing scenario #5 abort with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 41.379s] rd_kafka_begin_transaction(rk): duration 0.008ms 3: [0105_transactions_mock / 41.379s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.077ms 3: %3|1675737067.836|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 41.379s] abort: duration 0.081ms 3: 
[0105_transactions_mock / 41.379s] Scenario #5 abort failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.379s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.379s] rd_kafka_abort_transaction(rk, -1): duration 0.060ms 3: [0105_transactions_mock / 41.379s] Testing scenario #5 abort&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 41.379s] rd_kafka_begin_transaction(rk): duration 0.009ms 3: [0105_transactions_mock / 41.379s] 0105_transactions_mock#producer-133: Flushing 1 messages 3: [0105_transactions_mock / 41.380s] FLUSH: duration 0.581ms 3: [0105_transactions_mock / 41.380s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.058ms 3: %3|1675737067.837|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 41.380s] abort&flush: duration 0.055ms 3: [0105_transactions_mock / 41.380s] Scenario #5 abort&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 41.380s] Abortable error, aborting transaction 3: [0105_transactions_mock / 41.380s] rd_kafka_abort_transaction(rk, -1): duration 0.054ms 3: [0105_transactions_mock / 41.380s] Testing scenario #6 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 41.380s] rd_kafka_begin_transaction(rk): duration 0.009ms 3: [0105_transactions_mock / 41.380s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.088ms 3: %1|1675737067.838|TXNERR|0105_transactions_mock#producer-133| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737067.838|FATAL|0105_transactions_mock#producer-133| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 41.381s] commit: duration 0.790ms 3: [0105_transactions_mock / 41.381s] Scenario #6 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 41.381s] Fatal error, destroying producer 3: [0105_transactions_mock / 41.381s] Testing scenario #6 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 41.381s] Test config file test.conf not found 3: [0105_transactions_mock / 41.381s] Setting test timeout to 60s * 2.7 3: %5|1675737067.838|MOCK|0105_transactions_mock#producer-146| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45947,127.0.0.1:34441,127.0.0.1:45519 3: [0105_transactions_mock / 41.383s] Created kafka instance 0105_transactions_mock#producer-146 3: [0105_transactions_mock / 41.383s] rd_kafka_init_transactions(rk, 5000): duration 0.563ms 3: [0105_transactions_mock / 41.383s] rd_kafka_begin_transaction(rk): 
duration 0.024ms 3: [0105_transactions_mock / 41.383s] 0105_transactions_mock#producer-146: Flushing 1 messages 3: [0045_subscribe_update_mock / 55.014s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 55.014s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 55.014s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 55.115s] Creating topic topic_46 3: [0045_subscribe_update_mock / 55.115s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.069ms 3: [0045_subscribe_update_mock / 55.115s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 55.415s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 55.415s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 55.415s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 55.515s] Creating topic topic_47 3: [0045_subscribe_update_mock / 55.515s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.078ms 3: [0045_subscribe_update_mock / 55.515s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 55.815s] CONSUME: duration 300.058ms 3: [0045_subscribe_update_mock / 55.815s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 55.815s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 42.383s] FLUSH: duration 999.642ms 3: [0105_transactions_mock / 42.383s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.178ms 3: %1|1675737068.840|TXNERR|0105_transactions_mock#producer-146| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737068.840|FATAL|0105_transactions_mock#producer-146| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 42.383s] commit&flush: duration 0.146ms 3: [0105_transactions_mock / 42.383s] Scenario #6 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 42.383s] Fatal error, destroying producer 3: [0105_transactions_mock / 42.384s] Testing scenario #6 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.384s] Test config file test.conf not found 3: [0105_transactions_mock / 42.384s] Setting test timeout to 60s * 2.7 3: %5|1675737068.841|MOCK|0105_transactions_mock#producer-147| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:34111,127.0.0.1:38439,127.0.0.1:43275 3: [0105_transactions_mock / 42.384s] Created kafka instance 0105_transactions_mock#producer-147 3: [0105_transactions_mock / 42.385s] rd_kafka_init_transactions(rk, 5000): duration 0.496ms 3: [0105_transactions_mock / 42.385s] rd_kafka_begin_transaction(rk): duration 0.055ms 3: [0105_transactions_mock / 42.385s] 
rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.242ms 3: %1|1675737068.842|TXNERR|0105_transactions_mock#producer-147| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737068.842|FATAL|0105_transactions_mock#producer-147| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 42.385s] abort: duration 0.105ms 3: [0105_transactions_mock / 42.385s] Scenario #6 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 42.385s] Fatal error, destroying producer 3: [0105_transactions_mock / 42.386s] Testing scenario #6 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.386s] Test config file test.conf not found 3: [0105_transactions_mock / 42.386s] Setting test timeout to 60s * 2.7 3: %5|1675737068.842|MOCK|0105_transactions_mock#producer-148| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38763,127.0.0.1:38761,127.0.0.1:43239 3: [0105_transactions_mock / 42.386s] Created kafka instance 0105_transactions_mock#producer-148 3: [0105_transactions_mock / 42.386s] rd_kafka_init_transactions(rk, 5000): duration 0.538ms 3: [0105_transactions_mock / 42.386s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 42.386s] 0105_transactions_mock#producer-148: Flushing 1 messages 3: [0045_subscribe_update_mock / 55.915s] Creating topic topic_48 3: [0045_subscribe_update_mock / 55.915s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.022ms 3: [0045_subscribe_update_mock / 55.915s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 56.215s] CONSUME: duration 300.076ms 3: [0045_subscribe_update_mock / 56.215s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 56.215s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: %4|1675737069.259|SESSTMOUT|0106_cgrp_sess_timeout#consumer-136| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0045_subscribe_update_mock / 56.315s] Creating topic topic_49 3: [0045_subscribe_update_mock / 56.315s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.061ms 3: [0045_subscribe_update_mock / 56.315s] POLL: not expecting any messages for 300ms 3: [0106_cgrp_sess_timeout / 42.243s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 42.243s] UNASSIGN.PARTITIONS: duration 0.082ms 3: [0106_cgrp_sess_timeout / 42.243s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 42.243s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: %4|1675737069.360|COMMITFAIL|0106_cgrp_sess_timeout#consumer-136| [thrd:main]: Offset commit (unassigned partitions) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [
/ 57.570s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0045_subscribe_update_mock / 56.615s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 56.615s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 56.615s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 56.715s] Closing consumer 0045_subscribe_update_mock#consumer-33 3: [0045_subscribe_update_mock / 56.716s] 0045_subscribe_update_mock#consumer-33: Rebalance: _REVOKE_PARTITIONS: 195 partition(s) 3: [0045_subscribe_update_mock / 56.716s] UNASSIGN.PARTITIONS: duration 0.100ms 3: [0045_subscribe_update_mock / 56.716s] unassign: unassigned current partitions 3: [0045_subscribe_update_mock / 56.716s] CONSUMER.CLOSE: duration 0.637ms 3: [0045_subscribe_update_mock / 56.717s] [ do_test_regex_many_mock:378: range with 50 topics: PASS (56.72s) ] 3: [0045_subscribe_update_mock / 56.717s] [ do_test_regex_many_mock:378: cooperative-sticky with 50 topics ] 3: %5|1675737069.704|CONFWARN|MOCK#producer-149| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 56.717s] Test config file test.conf not found 3: [0045_subscribe_update_mock / 56.717s] Setting test timeout to 300s * 2.7 3: [0045_subscribe_update_mock / 56.718s] Created kafka instance 0045_subscribe_update_mock#consumer-150 3: [0045_subscribe_update_mock / 56.718s] Creating topic topic_0 3: [0045_subscribe_update_mock / 56.718s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.023ms 3: [0045_subscribe_update_mock / 56.718s] POLL: not expecting any messages for 300ms 3: [0116_kafkaconsumer_close / 30.440s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0: PASS (5.03s) ] 3: [0116_kafkaconsumer_close / 30.440s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0 ] 3: %5|1675737069.808|CONFWARN|MOCK#producer-151| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 30.440s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 43.387s] FLUSH: duration 1000.453ms 3: [0105_transactions_mock / 43.387s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.163ms 3: %1|1675737069.844|TXNERR|0105_transactions_mock#producer-148| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737069.844|FATAL|0105_transactions_mock#producer-148| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.387s] abort&flush: duration 0.145ms 3: [0105_transactions_mock / 43.387s] Scenario #6 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.387s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.404s] Testing scenario #7 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.404s] Test config file test.conf not found 3: [0105_transactions_mock / 43.404s] Setting test timeout to 60s * 
2.7 3: %5|1675737069.861|MOCK|0105_transactions_mock#producer-154| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46669,127.0.0.1:41425,127.0.0.1:36513 3: [0105_transactions_mock / 43.406s] Created kafka instance 0105_transactions_mock#producer-154 3: [0105_transactions_mock / 43.407s] rd_kafka_init_transactions(rk, 5000): duration 0.640ms 3: [0105_transactions_mock / 43.407s] rd_kafka_begin_transaction(rk): duration 0.031ms 3: [0105_transactions_mock / 43.407s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.363ms 3: %1|1675737069.865|TXNERR|0105_transactions_mock#producer-154| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737069.865|FATAL|0105_transactions_mock#producer-154| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.408s] commit: duration 1.236ms 3: [0105_transactions_mock / 43.408s] Scenario #7 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.408s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.409s] Testing scenario #7 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.409s] Test config file test.conf not found 3: [0105_transactions_mock / 43.409s] Setting test timeout to 60s * 2.7 3: %5|1675737069.866|MOCK|0105_transactions_mock#producer-155| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38975,127.0.0.1:37123,127.0.0.1:42489 3: [0105_transactions_mock / 43.409s] Created kafka instance 0105_transactions_mock#producer-155 3: [0105_transactions_mock / 43.410s] rd_kafka_init_transactions(rk, 5000): duration 0.528ms 3: [0105_transactions_mock / 43.410s] rd_kafka_begin_transaction(rk): duration 0.023ms 3: [0105_transactions_mock / 43.410s] 0105_transactions_mock#producer-155: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 40.231s] Rebalance #13: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 40.231s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 40.231s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 40.231s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 40.231s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 40.231s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.039ms 3: [0113_cooperative_rebalance_local/ 40.231s] assign: incremental assign of 4 partition(s) done 3: [0045_subscribe_update_mock / 57.018s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 57.018s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 57.018s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0113_cooperative_rebalance_local/ 40.331s] consume: consume 50 messages 3: [0113_cooperative_rebalance_local/ 40.332s] CONSUME: duration 0.218ms 3: [0113_cooperative_rebalance_local/ 40.332s] consume: consumed 50/50 messages (0/-1 EOFs) 3: [0113_cooperative_rebalance_local/ 40.332s] r_lost_partitions_commit_illegal_generation_test_local:2901: Waiting for lost partitions 
(_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 40.332s] Rebalance #14: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 40.332s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 40.332s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 40.332s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 40.332s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 40.332s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 40.332s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.024ms 3: [0113_cooperative_rebalance_local/ 40.332s] unassign: incremental unassign of 4 partition(s) done 3: [0045_subscribe_update_mock / 57.118s] Creating topic topic_1 3: [0045_subscribe_update_mock / 57.118s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.072ms 3: [0045_subscribe_update_mock / 57.118s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 57.418s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 57.418s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 57.418s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 57.518s] Creating topic topic_2 3: [0045_subscribe_update_mock / 57.518s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.054ms 3: [0045_subscribe_update_mock / 57.518s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 57.818s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 57.818s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 57.818s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 44.410s] FLUSH: duration 1000.629ms 3: [0105_transactions_mock / 44.411s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.446ms 3: %1|1675737070.868|TXNERR|0105_transactions_mock#producer-155| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737070.868|FATAL|0105_transactions_mock#producer-155| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 44.411s] commit&flush: duration 0.174ms 3: [0105_transactions_mock / 44.411s] Scenario #7 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.411s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.412s] Testing scenario #7 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.412s] Test config file test.conf not found 3: [0105_transactions_mock / 44.412s] Setting test timeout to 60s * 2.7 3: %5|1675737070.868|MOCK|0105_transactions_mock#producer-156| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35927,127.0.0.1:37411,127.0.0.1:45349 3: [0105_transactions_mock / 44.412s] Created kafka instance 0105_transactions_mock#producer-156 3: [0105_transactions_mock / 44.413s] 
rd_kafka_init_transactions(rk, 5000): duration 0.518ms 3: [0105_transactions_mock / 44.413s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock / 44.413s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.319ms 3: %1|1675737070.870|TXNERR|0105_transactions_mock#producer-156| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737070.870|FATAL|0105_transactions_mock#producer-156| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 44.413s] abort: duration 0.134ms 3: [0105_transactions_mock / 44.413s] Scenario #7 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.413s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.413s] Testing scenario #7 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.413s] Test config file test.conf not found 3: [0105_transactions_mock / 44.413s] Setting test timeout to 60s * 2.7 3: %5|1675737070.870|MOCK|0105_transactions_mock#producer-157| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46087,127.0.0.1:39513,127.0.0.1:46347 3: [0105_transactions_mock / 44.414s] Created kafka instance 0105_transactions_mock#producer-157 3: [0105_transactions_mock / 44.414s] rd_kafka_init_transactions(rk, 5000): duration 0.454ms 3: [0105_transactions_mock / 44.414s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock / 44.414s] 0105_transactions_mock#producer-157: Flushing 1 messages 3: [0045_subscribe_update_mock / 57.918s] Creating topic topic_3 3: [0045_subscribe_update_mock / 57.918s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.056ms 3: [0045_subscribe_update_mock / 57.918s] POLL: not expecting any messages for 300ms 3: [0113_cooperative_rebalance_local/ 41.332s] r_lost_partitions_commit_illegal_generation_test_local:2904: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 22s 3: [0045_subscribe_update_mock / 58.219s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 58.219s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 58.219s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 58.319s] Creating topic topic_4 3: [0045_subscribe_update_mock / 58.319s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.068ms 3: [0045_subscribe_update_mock / 58.319s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 58.619s] CONSUME: duration 300.082ms 3: [0045_subscribe_update_mock / 58.619s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 58.619s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 58.719s] Creating topic topic_5 3: [0045_subscribe_update_mock / 58.719s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.039ms 3: [0045_subscribe_update_mock / 58.719s] 
POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 45.415s] FLUSH: duration 1001.002ms 3: [0105_transactions_mock / 45.415s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.218ms 3: %1|1675737071.872|TXNERR|0105_transactions_mock#producer-157| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737071.872|FATAL|0105_transactions_mock#producer-157| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 45.415s] abort&flush: duration 0.114ms 3: [0105_transactions_mock / 45.415s] Scenario #7 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 45.415s] Fatal error, destroying producer 3: [0105_transactions_mock / 45.416s] Testing scenario #8 commit with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 45.416s] Test config file test.conf not found 3: [0105_transactions_mock / 45.416s] Setting test timeout to 60s * 2.7 3: %5|1675737071.873|MOCK|0105_transactions_mock#producer-158| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46457,127.0.0.1:39797,127.0.0.1:42387 3: [0105_transactions_mock / 45.416s] Created kafka instance 0105_transactions_mock#producer-158 3: [0105_transactions_mock / 45.417s] rd_kafka_init_transactions(rk, 5000): duration 0.509ms 3: [0105_transactions_mock / 45.417s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock / 45.417s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.181ms 3: [0045_subscribe_update_mock / 59.019s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 59.019s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 59.019s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 59.119s] Creating topic topic_6 3: [0045_subscribe_update_mock / 59.119s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.036ms 3: [0045_subscribe_update_mock / 59.119s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 59.419s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 59.419s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 59.419s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 59.519s] Creating topic topic_7 3: [0045_subscribe_update_mock / 59.519s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.036ms 3: [0045_subscribe_update_mock / 59.519s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 59.819s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 59.819s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 59.819s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected 
msgids 0..0 (0) 3: %1|1675737072.874|TXNERR|0105_transactions_mock#producer-158| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1675737072.874|FATAL|0105_transactions_mock#producer-158| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 46.417s] commit: duration 1000.359ms 3: [0105_transactions_mock / 46.417s] Scenario #8 commit failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 46.417s] Fatal error, destroying producer 3: [0105_transactions_mock / 46.418s] Testing scenario #8 commit&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 46.418s] Test config file test.conf not found 3: [0105_transactions_mock / 46.418s] Setting test timeout to 60s * 2.7 3: %5|1675737072.875|MOCK|0105_transactions_mock#producer-159| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38743,127.0.0.1:34337,127.0.0.1:34285 3: [0105_transactions_mock / 46.418s] Created kafka instance 0105_transactions_mock#producer-159 3: [0105_transactions_mock / 46.419s] rd_kafka_init_transactions(rk, 5000): duration 0.516ms 3: [0105_transactions_mock / 46.419s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 46.419s] 0105_transactions_mock#producer-159: Flushing 1 messages 3: [0045_subscribe_update_mock / 59.920s] Creating topic topic_8 3: [0045_subscribe_update_mock / 59.920s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.015ms 3: [0045_subscribe_update_mock / 59.920s] POLL: not expecting any messages for 300ms 3: [0116_kafkaconsumer_close / 33.558s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0: PASS (3.12s) ] 3: [0116_kafkaconsumer_close / 33.558s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1 ] 3: %5|1675737072.926|CONFWARN|MOCK#producer-160| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 33.558s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 60.220s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 60.220s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 60.220s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 60.320s] Creating topic topic_9 3: [0045_subscribe_update_mock / 60.320s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.037ms 3: [0045_subscribe_update_mock / 60.320s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 60.620s] CONSUME: duration 300.076ms 3: [0045_subscribe_update_mock / 60.620s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 60.620s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 60.720s] Creating topic 
topic_10 3: [0045_subscribe_update_mock / 60.720s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.043ms 3: [0045_subscribe_update_mock / 60.720s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 47.420s] FLUSH: duration 1000.784ms 3: [0105_transactions_mock / 47.420s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.300ms 3: %1|1675737073.877|TXNERR|0105_transactions_mock#producer-159| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1675737073.877|FATAL|0105_transactions_mock#producer-159| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 47.420s] commit&flush: duration 0.146ms 3: [0105_transactions_mock / 47.420s] Scenario #8 commit&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 47.420s] Fatal error, destroying producer 3: [0105_transactions_mock / 47.442s] Testing scenario #8 abort with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 47.442s] Test config file test.conf not found 3: [0105_transactions_mock / 47.442s] Setting test timeout to 60s * 2.7 3: %5|1675737073.899|MOCK|0105_transactions_mock#producer-163| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39301,127.0.0.1:39387,127.0.0.1:43161 3: [0105_transactions_mock / 47.444s] Created kafka instance 0105_transactions_mock#producer-163 3: [0105_transactions_mock / 47.445s] rd_kafka_init_transactions(rk, 5000): duration 0.718ms 3: [0105_transactions_mock / 47.445s] rd_kafka_begin_transaction(rk): duration 0.031ms 3: [0105_transactions_mock / 47.445s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.390ms 3: %1|1675737073.902|TXNERR|0105_transactions_mock#producer-163| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1675737073.902|FATAL|0105_transactions_mock#producer-163| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 47.445s] abort: duration 0.109ms 3: [0105_transactions_mock / 47.445s] Scenario #8 abort failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 47.445s] Fatal error, destroying producer 3: [0105_transactions_mock / 47.445s] Testing scenario #8 abort&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 47.445s] Test config file test.conf not found 3: [0105_transactions_mock / 47.445s] Setting test timeout to 60s * 2.7 3: %5|1675737073.902|MOCK|0105_transactions_mock#producer-164| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44379,127.0.0.1:36521,127.0.0.1:33119 3: [0105_transactions_mock / 47.446s] Created kafka instance 0105_transactions_mock#producer-164 3: [0105_transactions_mock / 47.446s] rd_kafka_init_transactions(rk, 5000): 
duration 0.525ms 3: [0105_transactions_mock / 47.446s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 47.446s] 0105_transactions_mock#producer-164: Flushing 1 messages 3: [0045_subscribe_update_mock / 61.020s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 61.020s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 61.020s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 61.120s] Creating topic topic_11 3: [0045_subscribe_update_mock / 61.120s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.036ms 3: [0045_subscribe_update_mock / 61.120s] POLL: not expecting any messages for 300ms 3: [0106_cgrp_sess_timeout / 47.159s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.159s] ASSIGN.PARTITIONS: duration 0.045ms 3: [0106_cgrp_sess_timeout / 47.159s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.159s] Closing consumer 0106_cgrp_sess_timeout#consumer-136 3: [0106_cgrp_sess_timeout / 47.159s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.159s] UNASSIGN.PARTITIONS: duration 0.029ms 3: [0106_cgrp_sess_timeout / 47.159s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 47.159s] CONSUMER.CLOSE: duration 0.104ms 3: [0106_cgrp_sess_timeout / 47.160s] [ do_test_session_timeout:152: Test session timeout with auto commit: PASS (15.07s) ] 3: [0106_cgrp_sess_timeout / 47.160s] [ do_test_commit_on_lost:231 ] 3: %5|1675737074.277|CONFWARN|MOCK#producer-165| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 47.160s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.160s] Created kafka instance 0106_cgrp_sess_timeout#producer-166 3: [0106_cgrp_sess_timeout / 47.160s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.160s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 47.161s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 47.161s] PRODUCE: duration 0.047ms 3: [0106_cgrp_sess_timeout / 47.202s] PRODUCE.DELIVERY.WAIT: duration 41.330ms 3: [0106_cgrp_sess_timeout / 47.202s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.202s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 47.202s] Created kafka instance 0106_cgrp_sess_timeout#consumer-167 3: [0106_cgrp_sess_timeout / 47.202s] consume: consume 10 messages 3: [0045_subscribe_update_mock / 61.420s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 61.420s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 61.420s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 61.520s] Creating topic topic_12 3: [0045_subscribe_update_mock / 61.520s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.038ms 3: [0045_subscribe_update_mock / 61.520s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 61.821s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 61.821s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: 
[0045_subscribe_update_mock / 61.821s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 48.447s] FLUSH: duration 1000.767ms 3: [0105_transactions_mock / 48.448s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.258ms 3: %1|1675737074.904|TXNERR|0105_transactions_mock#producer-164| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1675737074.904|FATAL|0105_transactions_mock#producer-164| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 48.448s] abort&flush: duration 0.125ms 3: [0105_transactions_mock / 48.448s] Scenario #8 abort&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 48.448s] Fatal error, destroying producer 3: [0105_transactions_mock / 48.448s] Testing scenario #9 commit with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.448s] Test config file test.conf not found 3: [0105_transactions_mock / 48.448s] Setting test timeout to 60s * 2.7 3: %5|1675737074.905|MOCK|0105_transactions_mock#producer-168| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37677,127.0.0.1:41411,127.0.0.1:33113 3: [0105_transactions_mock / 48.448s] Created kafka instance 0105_transactions_mock#producer-168 3: [0105_transactions_mock / 48.449s] rd_kafka_init_transactions(rk, 5000): duration 0.487ms 3: [0105_transactions_mock / 48.449s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock / 48.449s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.330ms 3: %3|1675737074.907|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 48.450s] commit: duration 1.118ms 3: [0105_transactions_mock / 48.450s] Scenario #9 commit failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.450s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.451s] rd_kafka_abort_transaction(rk, -1): duration 0.128ms 3: [0105_transactions_mock / 48.451s] Testing scenario #9 commit&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.451s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock / 48.451s] 0105_transactions_mock#producer-168: Flushing 1 messages 3: [0045_subscribe_update_mock / 61.921s] Creating topic topic_13 3: [0045_subscribe_update_mock / 61.921s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.033ms 3: [0045_subscribe_update_mock / 61.921s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 48.451s] FLUSH: duration 0.846ms 3: [0105_transactions_mock / 48.452s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.092ms 3: %3|1675737074.908|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in 
state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 48.452s] commit&flush: duration 0.096ms 3: [0105_transactions_mock / 48.452s] Scenario #9 commit&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.452s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.452s] rd_kafka_abort_transaction(rk, -1): duration 0.095ms 3: [0105_transactions_mock / 48.452s] Testing scenario #9 abort with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.452s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 48.452s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.086ms 3: %3|1675737074.909|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 48.452s] abort: duration 0.112ms 3: [0105_transactions_mock / 48.452s] Scenario #9 abort failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.452s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.452s] rd_kafka_abort_transaction(rk, -1): duration 0.106ms 3: [0105_transactions_mock / 48.452s] Testing scenario #9 abort&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.452s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 48.452s] 0105_transactions_mock#producer-168: Flushing 1 messages 3: [0105_transactions_mock / 48.453s] FLUSH: duration 0.333ms 3: [0105_transactions_mock / 48.453s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.070ms 3: %3|1675737074.909|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 48.453s] abort&flush: duration 0.124ms 3: [0105_transactions_mock / 48.453s] Scenario #9 abort&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.453s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.453s] rd_kafka_abort_transaction(rk, -1): duration 0.101ms 3: [0105_transactions_mock / 48.453s] Testing scenario #10 commit with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 48.453s] rd_kafka_begin_transaction(rk): duration 0.015ms 3: [0105_transactions_mock / 48.453s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.069ms 3: %3|1675737074.910|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 48.454s] commit: duration 0.697ms 3: [0105_transactions_mock / 48.454s] Scenario #10 commit failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.454s] Abortable error, aborting 
transaction 3: [0105_transactions_mock / 48.454s] rd_kafka_abort_transaction(rk, -1): duration 0.098ms 3: [0105_transactions_mock / 48.454s] Testing scenario #10 commit&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 48.454s] rd_kafka_begin_transaction(rk): duration 0.018ms 3: [0105_transactions_mock / 48.454s] 0105_transactions_mock#producer-168: Flushing 1 messages 3: [0105_transactions_mock / 48.455s] FLUSH: duration 0.849ms 3: [0105_transactions_mock / 48.455s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.074ms 3: %3|1675737074.912|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 48.455s] commit&flush: duration 0.086ms 3: [0105_transactions_mock / 48.455s] Scenario #10 commit&flush failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.455s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.455s] rd_kafka_abort_transaction(rk, -1): duration 0.088ms 3: [0105_transactions_mock / 48.455s] Testing scenario #10 abort with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 48.455s] rd_kafka_begin_transaction(rk): duration 0.017ms 3: [0105_transactions_mock / 48.455s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.097ms 3: %3|1675737074.912|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 48.455s] abort: duration 0.109ms 3: [0105_transactions_mock / 48.455s] Scenario #10 abort failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.455s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.455s] rd_kafka_abort_transaction(rk, -1): duration 0.104ms 3: [0105_transactions_mock / 48.455s] Testing scenario #10 abort&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 48.455s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 48.455s] 0105_transactions_mock#producer-168: Flushing 1 messages 3: [0105_transactions_mock / 48.456s] FLUSH: duration 0.380ms 3: [0105_transactions_mock / 48.456s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.077ms 3: %3|1675737074.913|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 48.456s] abort&flush: duration 0.096ms 3: [0105_transactions_mock / 48.456s] Scenario #10 abort&flush failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 48.456s] Abortable error, aborting transaction 3: [0105_transactions_mock / 48.456s] rd_kafka_abort_transaction(rk, -1): duration 0.100ms 3: [0105_transactions_mock / 48.456s] Testing scenario #11 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 48.456s] rd_kafka_begin_transaction(rk): duration 0.016ms 3: [0105_transactions_mock / 48.456s] 
rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.080ms 3: %1|1675737074.914|TXNERR|0105_transactions_mock#producer-168| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737074.914|FATAL|0105_transactions_mock#producer-168| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 48.457s] commit: duration 0.681ms 3: [0105_transactions_mock / 48.457s] Scenario #11 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 48.457s] Fatal error, destroying producer 3: [0105_transactions_mock / 48.457s] Testing scenario #11 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 48.457s] Test config file test.conf not found 3: [0105_transactions_mock / 48.457s] Setting test timeout to 60s * 2.7 3: %5|1675737074.914|MOCK|0105_transactions_mock#producer-169| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45707,127.0.0.1:41677,127.0.0.1:44167 3: [0105_transactions_mock / 48.457s] Created kafka instance 0105_transactions_mock#producer-169 3: [0105_transactions_mock / 48.458s] rd_kafka_init_transactions(rk, 5000): duration 0.432ms 3: [0105_transactions_mock / 48.458s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock / 48.458s] 0105_transactions_mock#producer-169: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 45.333s] Rebalance #15: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 45.333s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.044ms 3: [0113_cooperative_rebalance_local/ 45.333s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 45.333s] Closing consumer 3: [0113_cooperative_rebalance_local/ 45.333s] Closing consumer 0113_cooperative_rebalance_local#consumer-145 3: [0113_cooperative_rebalance_local/ 45.333s] Rebalance #16: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 45.333s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 45.333s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.014ms 3: [0113_cooperative_rebalance_local/ 45.333s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 45.333s] CONSUMER.CLOSE: duration 0.118ms 3: [0113_cooperative_rebalance_local/ 45.333s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 45.333s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 45.333s] 0113_cooperative_rebalance_local: duration 45333.477ms 3: [0113_cooperative_rebalance_local/ 45.333s] ================= Test 0113_cooperative_rebalance_local PASSED ================= 3: [
/ 63.201s] Too many tests running (5 >= 5): postponing 0120_asymmetric_subscription start... 3: [0117_mock_errors / 0.000s] ================= Running test 0117_mock_errors ================= 3: [0117_mock_errors / 0.000s] ==== Stats written to file stats_0117_mock_errors_1871247080773747240.json ==== 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] [ do_test_producer_storage_error:53: ] 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1675737075.024|MOCK|0117_mock_errors#producer-170| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:32807,127.0.0.1:38743,127.0.0.1:42101 3: [0117_mock_errors / 0.000s] Created kafka instance 0117_mock_errors#producer-170 3: [0117_mock_errors / 0.000s] 0117_mock_errors#producer-170: Flushing 1 messages 3: [0045_subscribe_update_mock / 62.221s] CONSUME: duration 300.079ms 3: [0045_subscribe_update_mock / 62.221s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 62.221s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 62.321s] Creating topic topic_14 3: [0045_subscribe_update_mock / 62.321s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.038ms 3: [0045_subscribe_update_mock / 62.321s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 62.621s] CONSUME: duration 300.080ms 3: [0045_subscribe_update_mock / 62.621s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 62.621s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 49.459s] FLUSH: duration 1000.651ms 3: [0105_transactions_mock / 49.459s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.285ms 3: %1|1675737075.916|TXNERR|0105_transactions_mock#producer-169| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737075.916|FATAL|0105_transactions_mock#producer-169| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 49.459s] commit&flush: duration 0.141ms 3: [0105_transactions_mock / 49.459s] Scenario #11 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 49.459s] Fatal error, destroying producer 3: [0105_transactions_mock / 49.459s] Testing scenario #11 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 49.459s] Test config file test.conf not found 3: [0105_transactions_mock / 49.459s] Setting test timeout to 60s * 2.7 3: %5|1675737075.916|MOCK|0105_transactions_mock#producer-171| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39083,127.0.0.1:44809,127.0.0.1:38705 3: [0105_transactions_mock / 49.460s] Created kafka instance 0105_transactions_mock#producer-171 3: [0105_transactions_mock / 49.460s] 
rd_kafka_init_transactions(rk, 5000): duration 0.500ms 3: [0105_transactions_mock / 49.460s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 49.461s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.284ms 3: %1|1675737075.918|TXNERR|0105_transactions_mock#producer-171| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737075.918|FATAL|0105_transactions_mock#producer-171| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 49.461s] abort: duration 0.148ms 3: [0105_transactions_mock / 49.461s] Scenario #11 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 49.461s] Fatal error, destroying producer 3: [0105_transactions_mock / 49.461s] Testing scenario #11 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 49.461s] Test config file test.conf not found 3: [0105_transactions_mock / 49.461s] Setting test timeout to 60s * 2.7 3: %5|1675737075.918|MOCK|0105_transactions_mock#producer-172| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45755,127.0.0.1:33425,127.0.0.1:42049 3: [0105_transactions_mock / 49.462s] Created kafka instance 0105_transactions_mock#producer-172 3: [0105_transactions_mock / 49.462s] rd_kafka_init_transactions(rk, 5000): duration 0.527ms 3: [0105_transactions_mock / 49.462s] rd_kafka_begin_transaction(rk): duration 0.015ms 3: [0105_transactions_mock / 49.462s] 0105_transactions_mock#producer-172: Flushing 1 messages 3: [0117_mock_errors / 1.505s] FLUSH: duration 1504.969ms 3: [0117_mock_errors / 1.506s] [ do_test_producer_storage_error:53: : PASS (1.51s) ] 3: [0117_mock_errors / 1.506s] [ do_test_producer_storage_error:53: with too few retries ] 3: [0117_mock_errors / 1.506s] Test config file test.conf not found 3: [0117_mock_errors / 1.506s] Setting test timeout to 10s * 2.7 3: %5|1675737076.530|MOCK|0117_mock_errors#producer-173| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40719,127.0.0.1:38807,127.0.0.1:42157 3: [0117_mock_errors / 1.507s] Created kafka instance 0117_mock_errors#producer-173 3: [0117_mock_errors / 1.507s] 0117_mock_errors#producer-173: Flushing 1 messages 3: [0105_transactions_mock / 50.463s] FLUSH: duration 1000.505ms 3: [0105_transactions_mock / 50.463s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.111ms 3: %1|1675737076.920|TXNERR|0105_transactions_mock#producer-172| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1675737076.920|FATAL|0105_transactions_mock#producer-172| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 50.463s] abort&flush: duration 0.084ms 3: [0105_transactions_mock / 50.463s] Scenario #11 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 50.463s] 
Fatal error, destroying producer 3: [0105_transactions_mock / 50.472s] [ do_test_txn_endtxn_errors:705: PASS (18.83s) ] 3: [0105_transactions_mock / 50.472s] [ do_test_txn_endtxn_infinite:901 ] 3: [0105_transactions_mock / 50.472s] Test config file test.conf not found 3: [0105_transactions_mock / 50.472s] Setting test timeout to 60s * 2.7 3: %5|1675737076.929|MOCK|0105_transactions_mock#producer-174| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:36305,127.0.0.1:40529,127.0.0.1:46639 3: [0105_transactions_mock / 50.474s] Created kafka instance 0105_transactions_mock#producer-174 3: [0105_transactions_mock / 50.474s] rd_kafka_init_transactions(rk, 5000): duration 0.566ms 3: [0105_transactions_mock / 50.474s] rd_kafka_begin_transaction(rk): duration 0.015ms 3: [0105_transactions_mock / 50.474s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0117_mock_errors / 2.009s] FLUSH: duration 501.841ms 3: [0117_mock_errors / 2.036s] [ do_test_producer_storage_error:53: with too few retries: PASS (0.53s) ] 3: [0117_mock_errors / 2.036s] [ do_test_offset_commit_error_during_rebalance:109 ] 3: [0117_mock_errors / 2.036s] Test config file test.conf not found 3: [0117_mock_errors / 2.036s] Setting test timeout to 60s * 2.7 3: %5|1675737077.059|CONFWARN|MOCK#producer-175| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 2.036s] Test config file test.conf not found 3: [0117_mock_errors / 2.037s] Created kafka instance 0117_mock_errors#producer-176 3: [0117_mock_errors / 2.037s] Test config file test.conf not found 3: [0117_mock_errors / 2.037s] Produce to test [-1]: messages #0..100 3: [0117_mock_errors / 2.037s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 2.037s] PRODUCE: duration 0.039ms 3: [0117_mock_errors / 2.079s] PRODUCE.DELIVERY.WAIT: duration 42.035ms 3: [0117_mock_errors / 2.080s] Created kafka instance 0117_mock_errors#consumer-177 3: [0117_mock_errors / 2.080s] Created kafka instance 0117_mock_errors#consumer-178 3: [0117_mock_errors / 2.080s] C1.PRE: consume 1 messages 3: [0106_cgrp_sess_timeout / 50.208s] 0106_cgrp_sess_timeout#consumer-167: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 50.208s] ASSIGN.PARTITIONS: duration 0.045ms 3: [0106_cgrp_sess_timeout / 50.208s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 50.309s] CONSUME: duration 3106.360ms 3: [0106_cgrp_sess_timeout / 50.309s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 50.309s] Waiting for assignment to be lost... 
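The EndTxn scenarios logged above (#10 and #11) exercise the two error classes of librdkafka's transactional producer: abortable broker errors such as INVALID_MSG_SIZE, which the test recovers from with rd_kafka_abort_transaction(), and fatal errors such as _FENCED, after which the test destroys the producer. A minimal sketch of that handling, using the public transactions API and an assumed, already-initialized transactional producer rk (topic and payload are placeholders, error handling abbreviated):

/* Sketch only: abortable errors (scenario #10) -> abort and retry the
 * transaction; fatal errors (scenario #11, _FENCED) -> destroy the producer. */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

static void run_one_transaction(rd_kafka_t *rk) {
        rd_kafka_error_t *error;

        rd_kafka_begin_transaction(rk);

        /* The log shows this call in its macro-expanded form; unexpanded it is: */
        rd_kafka_producev(rk, RD_KAFKA_V_TOPIC("mytopic"),
                          RD_KAFKA_V_VALUE("hi", 2), RD_KAFKA_V_END);

        error = rd_kafka_commit_transaction(rk, -1 /* infinite timeout */);
        if (!error)
                return;                                   /* committed */

        if (rd_kafka_error_txn_requires_abort(error)) {
                /* Abortable error: abort the transaction and start a new one. */
                fprintf(stderr, "commit failed, aborting: %s\n",
                        rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, -1);
                if (error)
                        rd_kafka_error_destroy(error);
        } else if (rd_kafka_error_is_retriable(error)) {
                rd_kafka_error_destroy(error);            /* simply retry the commit */
        } else {
                /* Fatal error (e.g. _FENCED): the producer instance is unusable. */
                fprintf(stderr, "fatal: %s\n", rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
                rd_kafka_destroy(rk);
        }
}

This mirrors the "Abortable error, aborting transaction" and "Fatal error, destroying producer" branches reported by 0105_transactions_mock above.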
3: %6|1675737077.426|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:127.0.0.1:34543/bootstrap]: 127.0.0.1:34543/1: Disconnected (after 3106ms in state UP) 3: %6|1675737077.426|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:34543: Disconnected (after 3106ms in state UP) 3: %3|1675737077.426|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:127.0.0.1:34543/bootstrap]: 127.0.0.1:34543/1: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737077.426|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:34543: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737077.617|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:34543: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0116_kafkaconsumer_close / 38.580s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 38.580s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1 ] 3: %5|1675737077.949|CONFWARN|MOCK#producer-179| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 38.581s] Setting test timeout to 10s * 2.7 3: %3|1675737077.965|FAIL|0106_cgrp_sess_timeout#consumer-167| [thrd:127.0.0.1:34543/bootstrap]: 127.0.0.1:34543/1: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0117_mock_errors / 5.087s] 0117_mock_errors#consumer-177: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.087s] ASSIGN.PARTITIONS: duration 0.046ms 3: [0117_mock_errors / 5.087s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.187s] CONSUME: duration 3106.477ms 3: [0117_mock_errors / 5.187s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.187s] C2.PRE: consume 1 messages 3: [0117_mock_errors / 5.187s] 0117_mock_errors#consumer-178: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.187s] ASSIGN.PARTITIONS: duration 0.098ms 3: [0117_mock_errors / 5.187s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.288s] CONSUME: duration 100.864ms 3: [0117_mock_errors / 5.288s] C2.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.288s] Closing consumer 0117_mock_errors#consumer-178 3: [0117_mock_errors / 5.288s] 0117_mock_errors#consumer-178: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.288s] UNASSIGN.PARTITIONS: duration 0.016ms 3: [0117_mock_errors / 5.288s] unassign: unassigned current partitions 3: [0117_mock_errors / 5.288s] CONSUMER.CLOSE: duration 0.159ms 3: [0117_mock_errors / 5.289s] Committing (should fail) 3: [0117_mock_errors / 5.289s] Commit returned REBALANCE_IN_PROGRESS 3: [0117_mock_errors / 5.289s] C1.PRE: consume 100 messages 3: [0116_kafkaconsumer_close / 42.592s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1: PASS (4.01s) ] 3: [0116_kafkaconsumer_close / 42.592s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1 ] 3: %5|1675737081.960|CONFWARN|MOCK#producer-182| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to 
Kafka cluster 3: [0116_kafkaconsumer_close / 42.592s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 56.077s] commit_transaction(): duration 4602.796ms 3: [0105_transactions_mock / 56.077s] commit returned success 3: [0105_transactions_mock / 56.077s] rd_kafka_begin_transaction(rk): duration 0.035ms 3: [0105_transactions_mock / 56.077s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.010ms 3: [0117_mock_errors / 8.087s] 0117_mock_errors#consumer-177: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 8.087s] UNASSIGN.PARTITIONS: duration 0.016ms 3: [0117_mock_errors / 8.087s] unassign: unassigned current partitions 3: %4|1675737083.325|SESSTMOUT|0106_cgrp_sess_timeout#consumer-167| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Success): revoking assignment and rejoining group 3: [0106_cgrp_sess_timeout / 56.309s] Assignment is lost, committing 3: [0106_cgrp_sess_timeout / 56.309s] commit() returned: _ASSIGNMENT_LOST 3: [0106_cgrp_sess_timeout / 56.309s] Closing consumer 0106_cgrp_sess_timeout#consumer-167 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34543/1: Disconnected (after 3106ms in state UP) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:34543: Disconnected (after 3106ms in state UP) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34543/1: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:34543: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:34543: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:34543/1: Connect to ipv4#127.0.0.1:34543 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0106_cgrp_sess_timeout / 56.309s] 0106_cgrp_sess_timeout#consumer-167: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 56.310s] UNASSIGN.PARTITIONS: duration 0.055ms 3: [0106_cgrp_sess_timeout / 56.310s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 56.310s] CONSUMER.CLOSE: duration 0.199ms 3: [0106_cgrp_sess_timeout / 56.310s] [ do_test_commit_on_lost:231: PASS (9.15s) ] 3: [0106_cgrp_sess_timeout / 56.310s] 0106_cgrp_sess_timeout: duration 56310.219ms 3: [0106_cgrp_sess_timeout / 56.310s] ================= Test 0106_cgrp_sess_timeout PASSED 
================= 3: [
/ 71.705s] Too many tests running (5 >= 5): postponing 0121_clusterid start... 3: [0120_asymmetric_subscription/ 0.000s] ================= Running test 0120_asymmetric_subscription ================= 3: [0120_asymmetric_subscription/ 0.000s] ==== Stats written to file stats_0120_asymmetric_subscription_3961903995481694810.json ==== 3: [0120_asymmetric_subscription/ 0.000s] Test config file test.conf not found 3: %5|1675737083.528|CONFWARN|MOCK#producer-185| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0120_asymmetric_subscription/ 0.001s] [ do_test_asymmetric:71: roundrobin assignor ] 3: [0120_asymmetric_subscription/ 0.001s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 0.001s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 0.001s] Created kafka instance c0#consumer-186 3: [0120_asymmetric_subscription/ 0.001s] rd_kafka_subscribe(c[i], tlist): duration 0.103ms 3: [0120_asymmetric_subscription/ 0.001s] Created kafka instance c1#consumer-187 3: [0120_asymmetric_subscription/ 0.001s] rd_kafka_subscribe(c[i], tlist): duration 0.072ms 3: [0120_asymmetric_subscription/ 0.002s] Created kafka instance c2#consumer-188 3: [0120_asymmetric_subscription/ 0.002s] rd_kafka_subscribe(c[i], tlist): duration 0.029ms 3: [0117_mock_errors / 10.289s] 0117_mock_errors#consumer-177: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 10.290s] ASSIGN.PARTITIONS: duration 0.030ms 3: [0117_mock_errors / 10.290s] assign: assigned 4 partition(s) 3: [0117_mock_errors / 11.293s] CONSUME: duration 6003.974ms 3: [0117_mock_errors / 11.293s] C1.PRE: consumed 100/100 messages (0/-1 EOFs) 3: [0117_mock_errors / 11.293s] 0117_mock_errors#consumer-177: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 11.293s] UNASSIGN.PARTITIONS: duration 0.074ms 3: [0117_mock_errors / 11.293s] unassign: unassigned current partitions 3: [0117_mock_errors / 11.294s] [ do_test_offset_commit_error_during_rebalance:109: PASS (9.26s) ] 3: [0117_mock_errors / 11.294s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true ] 3: [0117_mock_errors / 11.294s] Test config file test.conf not found 3: [0117_mock_errors / 11.294s] Setting test timeout to 60s * 2.7 3: %5|1675737086.318|CONFWARN|MOCK#producer-189| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 11.294s] Test config file test.conf not found 3: [0117_mock_errors / 11.294s] Created kafka instance 0117_mock_errors#producer-190 3: [0117_mock_errors / 11.294s] Test config file test.conf not found 3: [0117_mock_errors / 11.294s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 11.294s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 11.295s] PRODUCE: duration 0.009ms 3: [0117_mock_errors / 11.295s] PRODUCE.DELIVERY.WAIT: duration 0.396ms 3: [0117_mock_errors / 11.295s] Created kafka instance 0117_mock_errors#consumer-191 3: [0117_mock_errors / 11.295s] C1.PRE: consume 1 messages 3: [0116_kafkaconsumer_close / 47.604s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1: PASS (5.01s) ] 3: [0116_kafkaconsumer_close / 47.604s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1 ] 3: %5|1675737086.973|CONFWARN|MOCK#producer-192| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 
47.604s] Setting test timeout to 10s * 2.7 3: [0120_asymmetric_subscription/ 4.002s] c0#consumer-186: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[1], t2[3] 3: [0120_asymmetric_subscription/ 4.002s] c1#consumer-187: Assignment (6 partition(s)): t2[0], t2[2], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 4.003s] c2#consumer-188: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 4.003s] rd_kafka_assignment(c[i], &assignment): duration 0.040ms 3: [0120_asymmetric_subscription/ 4.003s] rd_kafka_assignment(c[i], &assignment): duration 0.053ms 3: [0120_asymmetric_subscription/ 4.003s] rd_kafka_assignment(c[i], &assignment): duration 0.023ms 3: [0120_asymmetric_subscription/ 4.003s] Closing consumer c0#consumer-186 3: [0120_asymmetric_subscription/ 4.003s] CONSUMER.CLOSE: duration 0.189ms 3: [0120_asymmetric_subscription/ 4.004s] Closing consumer c2#consumer-188 3: [0120_asymmetric_subscription/ 4.004s] CONSUMER.CLOSE: duration 0.121ms 3: [0120_asymmetric_subscription/ 4.004s] [ do_test_asymmetric:71: roundrobin assignor: PASS (4.00s) ] 3: [0120_asymmetric_subscription/ 4.004s] [ do_test_asymmetric:71: range assignor ] 3: [0120_asymmetric_subscription/ 4.004s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 4.004s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 4.004s] Created kafka instance c0#consumer-195 3: [0120_asymmetric_subscription/ 4.004s] rd_kafka_subscribe(c[i], tlist): duration 0.024ms 3: [0120_asymmetric_subscription/ 4.005s] Created kafka instance c1#consumer-196 3: [0120_asymmetric_subscription/ 4.005s] rd_kafka_subscribe(c[i], tlist): duration 0.025ms 3: [0120_asymmetric_subscription/ 4.005s] Created kafka instance c2#consumer-197 3: [0120_asymmetric_subscription/ 4.005s] rd_kafka_subscribe(c[i], tlist): duration 0.015ms 3: [0105_transactions_mock / 61.680s] abort_transaction(): duration 4602.746ms 3: [0105_transactions_mock / 61.680s] abort returned success 3: [0105_transactions_mock / 61.680s] [ do_test_txn_endtxn_infinite:901: PASS (11.21s) ] 3: [0105_transactions_mock / 61.680s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down ] 3: [0105_transactions_mock / 61.681s] Test config file test.conf not found 3: [0105_transactions_mock / 61.681s] Setting test timeout to 60s * 2.7 3: %5|1675737088.137|MOCK|0105_transactions_mock#producer-198| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41253,127.0.0.1:38293,127.0.0.1:40331 3: [0105_transactions_mock / 61.681s] Created kafka instance 0105_transactions_mock#producer-198 3: [0105_transactions_mock / 61.681s] Starting transaction 3: [0105_transactions_mock / 61.682s] rd_kafka_init_transactions(rk, 5000): duration 0.611ms 3: [0105_transactions_mock / 61.682s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: [0105_transactions_mock / 61.682s] Test config file test.conf not found 3: [0105_transactions_mock / 61.682s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 61.682s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 61.682s] PRODUCE: duration 0.450ms 3: [0105_transactions_mock / 61.682s] Bringing down coordinator 1 3: %6|1675737088.139|FAIL|0105_transactions_mock#producer-198| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:41253: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock / 61.682s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:41253: Disconnected (after 0ms in state UP) 
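The consumer tests interleaved above (0113_cooperative_rebalance_local, 0106_cgrp_sess_timeout, 0117_mock_errors) log both rebalance styles: eager _ASSIGN_PARTITIONS/_REVOKE_PARTITIONS handled with ASSIGN/UNASSIGN, and cooperative events handled with INCREMENTAL.ASSIGN/INCREMENTAL.UNASSIGN, plus the assignment-lost path after a session timeout. A minimal rebalance callback covering both protocols might look like the sketch below (assumed to be registered via rd_kafka_conf_set_rebalance_cb(); error handling abbreviated):

/* Sketch of a rebalance callback handling EAGER and COOPERATIVE protocols. */
#include <string.h>
#include <librdkafka/rdkafka.h>

static void rebalance_cb(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *partitions,
                         void *opaque) {
        int cooperative = !strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE");
        (void)opaque;

        switch (err) {
        case RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS:
                if (cooperative)
                        rd_kafka_incremental_assign(rk, partitions);
                else
                        rd_kafka_assign(rk, partitions);
                break;

        case RD_KAFKA_RESP_ERR__REVOKE_PARTITIONS:
                if (rd_kafka_assignment_lost(rk)) {
                        /* As in 0106_cgrp_sess_timeout: a commit of the lost
                         * assignment will fail with _ASSIGNMENT_LOST. */
                }
                if (cooperative)
                        rd_kafka_incremental_unassign(rk, partitions);
                else
                        rd_kafka_assign(rk, NULL);
                break;

        default:
                rd_kafka_assign(rk, NULL);
                break;
        }
}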
3: [0105_transactions_mock / 61.682s] 0105_transactions_mock#producer-198 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:41253: Disconnected (after 0ms in state UP) 3: %3|1675737088.257|FAIL|0105_transactions_mock#producer-198| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 61.801s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 61.801s] 0105_transactions_mock#producer-198 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737088.623|FAIL|0105_transactions_mock#producer-198| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 62.167s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 62.167s] 0105_transactions_mock#producer-198 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:41253: Connect to ipv4#127.0.0.1:41253 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0045_subscribe_update_mock / 75.748s] 0045_subscribe_update_mock#consumer-150: incremental rebalance: _ASSIGN_PARTITIONS: 64 partition(s) 3: [0045_subscribe_update_mock / 75.748s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.384ms 3: [0045_subscribe_update_mock / 75.748s] rebalance_cb: incremental assign of 64 partition(s) done 3: [0117_mock_errors / 14.402s] CONSUME: duration 3106.586ms 3: [0117_mock_errors / 14.402s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 14.402s] Closing consumer 0117_mock_errors#consumer-191 3: [0045_subscribe_update_mock / 76.623s] 0045_subscribe_update_mock#consumer-150: Assignment (64 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_2[0], topic_2[1], topic_2[2], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 76.623s] Creating topic topic_15 3: [0045_subscribe_update_mock / 76.623s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.098ms 3: [0045_subscribe_update_mock / 76.623s] POLL: not expecting any messages for 300ms 3: [0117_mock_errors / 14.603s] CONSUMER.CLOSE: duration 201.392ms 3: [0117_mock_errors / 14.604s] Created kafka instance 0117_mock_errors#consumer-199 3: [0117_mock_errors / 14.604s] 
rd_kafka_committed(c2, partitions, 10 * 1000): duration 0.544ms 3: [0117_mock_errors / 14.605s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true: PASS (3.31s) ] 3: [0117_mock_errors / 14.605s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false ] 3: [0117_mock_errors / 14.605s] Test config file test.conf not found 3: [0117_mock_errors / 14.605s] Setting test timeout to 60s * 2.7 3: %5|1675737089.628|CONFWARN|MOCK#producer-200| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 14.605s] Test config file test.conf not found 3: [0117_mock_errors / 14.605s] Created kafka instance 0117_mock_errors#producer-201 3: [0117_mock_errors / 14.605s] Test config file test.conf not found 3: [0117_mock_errors / 14.605s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 14.605s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 14.605s] PRODUCE: duration 0.007ms 3: [0117_mock_errors / 14.606s] PRODUCE.DELIVERY.WAIT: duration 0.387ms 3: [0117_mock_errors / 14.606s] Created kafka instance 0117_mock_errors#consumer-202 3: [0117_mock_errors / 14.606s] C1.PRE: consume 1 messages 3: [0045_subscribe_update_mock / 76.923s] CONSUME: duration 300.085ms 3: [0045_subscribe_update_mock / 76.923s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 76.923s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 77.023s] Creating topic topic_16 3: [0045_subscribe_update_mock / 77.023s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.035ms 3: [0045_subscribe_update_mock / 77.023s] POLL: not expecting any messages for 300ms 3: [0116_kafkaconsumer_close / 50.722s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1: PASS (3.12s) ] 3: [0116_kafkaconsumer_close / 50.722s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1 ] 3: %5|1675737090.091|CONFWARN|MOCK#producer-203| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 50.722s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 77.323s] CONSUME: duration 300.090ms 3: [0045_subscribe_update_mock / 77.323s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 77.323s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 77.423s] Creating topic topic_17 3: [0045_subscribe_update_mock / 77.423s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.031ms 3: [0045_subscribe_update_mock / 77.423s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 77.723s] CONSUME: duration 300.064ms 3: [0045_subscribe_update_mock / 77.723s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 77.723s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 77.824s] Creating topic topic_18 3: [0045_subscribe_update_mock / 77.824s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): 
duration 0.113ms 3: [0045_subscribe_update_mock / 77.824s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 78.124s] CONSUME: duration 300.020ms 3: [0045_subscribe_update_mock / 78.124s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 78.124s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 64.682s] Test config file test.conf not found 3: [0105_transactions_mock / 64.682s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 64.683s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock / 64.683s] PRODUCE: duration 0.350ms 3: [0045_subscribe_update_mock / 78.224s] Creating topic topic_19 3: [0045_subscribe_update_mock / 78.224s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.101ms 3: [0045_subscribe_update_mock / 78.224s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 78.524s] CONSUME: duration 300.093ms 3: [0045_subscribe_update_mock / 78.524s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 78.524s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0120_asymmetric_subscription/ 8.005s] c0#consumer-195: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[0], t2[1] 3: [0120_asymmetric_subscription/ 8.006s] c1#consumer-196: Assignment (6 partition(s)): t2[2], t2[3], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 8.006s] c2#consumer-197: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 8.006s] rd_kafka_assignment(c[i], &assignment): duration 0.020ms 3: [0120_asymmetric_subscription/ 8.006s] rd_kafka_assignment(c[i], &assignment): duration 0.025ms 3: [0120_asymmetric_subscription/ 8.006s] rd_kafka_assignment(c[i], &assignment): duration 0.054ms 3: [0120_asymmetric_subscription/ 8.007s] [ do_test_asymmetric:71: range assignor: PASS (4.00s) ] 3: [0120_asymmetric_subscription/ 8.007s] [ do_test_asymmetric:71: cooperative-sticky assignor ] 3: [0120_asymmetric_subscription/ 8.007s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 8.007s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 8.007s] Created kafka instance c0#consumer-206 3: [0120_asymmetric_subscription/ 8.007s] rd_kafka_subscribe(c[i], tlist): duration 0.015ms 3: [0120_asymmetric_subscription/ 8.007s] Created kafka instance c1#consumer-207 3: [0120_asymmetric_subscription/ 8.007s] rd_kafka_subscribe(c[i], tlist): duration 0.014ms 3: [0120_asymmetric_subscription/ 8.008s] Created kafka instance c2#consumer-208 3: [0120_asymmetric_subscription/ 8.008s] rd_kafka_subscribe(c[i], tlist): duration 0.027ms 3: [0045_subscribe_update_mock / 78.624s] Creating topic topic_20 3: [0045_subscribe_update_mock / 78.624s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.060ms 3: [0045_subscribe_update_mock / 78.624s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 78.924s] CONSUME: duration 300.064ms 3: [0045_subscribe_update_mock / 78.924s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 78.924s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages 
succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 79.024s] Creating topic topic_21 3: [0045_subscribe_update_mock / 79.024s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.050ms 3: [0045_subscribe_update_mock / 79.024s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 79.325s] CONSUME: duration 300.092ms 3: [0045_subscribe_update_mock / 79.325s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 79.325s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 79.425s] Creating topic topic_22 3: [0045_subscribe_update_mock / 79.425s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.046ms 3: [0045_subscribe_update_mock / 79.425s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 79.725s] CONSUME: duration 300.063ms 3: [0045_subscribe_update_mock / 79.725s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 79.725s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0117_mock_errors / 17.711s] CONSUME: duration 3104.735ms 3: [0117_mock_errors / 17.711s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0045_subscribe_update_mock / 79.825s] Creating topic topic_23 3: [0045_subscribe_update_mock / 79.825s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.048ms 3: [0045_subscribe_update_mock / 79.825s] POLL: not expecting any messages for 300ms 3: [0117_mock_errors / 17.912s] rd_kafka_commit(c1, ((void *)0), 0 ): duration 201.354ms 3: [0117_mock_errors / 17.912s] Closing consumer 0117_mock_errors#consumer-202 3: [0117_mock_errors / 17.912s] CONSUMER.CLOSE: duration 0.111ms 3: [0117_mock_errors / 17.913s] Created kafka instance 0117_mock_errors#consumer-209 3: [0117_mock_errors / 17.913s] rd_kafka_committed(c2, partitions, 10 * 1000): duration 0.592ms 3: [0117_mock_errors / 17.914s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false: PASS (3.31s) ] 3: [0117_mock_errors / 17.914s] 0117_mock_errors: duration 17913.717ms 3: [0117_mock_errors / 17.914s] ================= Test 0117_mock_errors PASSED ================= 3: [
/ 81.215s] Too many tests running (5 >= 5): postponing 0124_openssl_invalid_engine start... 3: [0121_clusterid / 0.000s] ================= Running test 0121_clusterid ================= 3: [0121_clusterid / 0.000s] ==== Stats written to file stats_0121_clusterid_5361101891680481712.json ==== 3: [0121_clusterid / 0.000s] Test config file test.conf not found 3: %5|1675737093.038|CONFWARN|MOCK#producer-210| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1675737093.038|CONFWARN|MOCK#producer-211| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0121_clusterid / 0.001s] Test config file test.conf not found 3: [0121_clusterid / 0.001s] Setting test timeout to 10s * 2.7 3: [0121_clusterid / 0.001s] Created kafka instance 0121_clusterid#producer-212 3: [
/ 81.216s] Log: 0121_clusterid#producer-212 level 3 fac FAIL: [thrd:127.0.0.1:41037/bootstrap]: 127.0.0.1:41037/bootstrap: Connect to ipv4#127.0.0.1:41037 failed: Connection refused (after 0ms in state CONNECT) 3: [0045_subscribe_update_mock / 80.125s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 80.125s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 80.125s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 66.683s] Bringing up coordinator 1 3: [0045_subscribe_update_mock / 80.225s] Creating topic topic_24 3: [0045_subscribe_update_mock / 80.225s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.051ms 3: [0045_subscribe_update_mock / 80.225s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 80.525s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 80.525s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 80.525s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 80.625s] Creating topic topic_25 3: [0045_subscribe_update_mock / 80.625s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.066ms 3: [0045_subscribe_update_mock / 80.625s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 67.266s] rd_kafka_commit_transaction(rk, -1): duration 583.155ms 3: [0105_transactions_mock / 67.267s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down: PASS (5.59s) ] 3: [0105_transactions_mock / 67.267s] [ do_test_txn_broker_down_in_txn:1280: Test leader down ] 3: [0105_transactions_mock / 67.267s] Test config file test.conf not found 3: [0105_transactions_mock / 67.267s] Setting test timeout to 60s * 2.7 3: %5|1675737093.724|MOCK|0105_transactions_mock#producer-213| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45475,127.0.0.1:41187,127.0.0.1:37165 3: [0105_transactions_mock / 67.270s] Created kafka instance 0105_transactions_mock#producer-213 3: [0105_transactions_mock / 67.270s] Starting transaction 3: [0105_transactions_mock / 67.276s] rd_kafka_init_transactions(rk, 5000): duration 5.631ms 3: [0105_transactions_mock / 67.276s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock / 67.276s] Test config file test.conf not found 3: [0105_transactions_mock / 67.276s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 67.276s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 67.276s] PRODUCE: duration 0.384ms 3: [0105_transactions_mock / 67.276s] Bringing down leader 2 3: %3|1675737093.735|FAIL|0105_transactions_mock#producer-213| [thrd:127.0.0.1:41187/bootstrap]: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 67.278s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 67.278s] 0105_transactions_mock#producer-213 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT) 3: 
[0045_subscribe_update_mock / 80.926s] CONSUME: duration 300.079ms 3: [0045_subscribe_update_mock / 80.926s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 80.926s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: %3|1675737093.990|FAIL|0105_transactions_mock#producer-213| [thrd:127.0.0.1:41187/bootstrap]: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 67.533s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 67.533s] 0105_transactions_mock#producer-213 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:41187/2: Connect to ipv4#127.0.0.1:41187 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0045_subscribe_update_mock / 81.026s] Creating topic topic_26 3: [0045_subscribe_update_mock / 81.026s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.078ms 3: [0045_subscribe_update_mock / 81.026s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 81.326s] CONSUME: duration 300.086ms 3: [0045_subscribe_update_mock / 81.326s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 81.326s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 81.426s] Creating topic topic_27 3: [0045_subscribe_update_mock / 81.426s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.070ms 3: [0045_subscribe_update_mock / 81.426s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 81.726s] CONSUME: duration 300.072ms 3: [0045_subscribe_update_mock / 81.726s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 81.726s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 81.826s] Creating topic topic_28 3: [0045_subscribe_update_mock / 81.826s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.056ms 3: [0045_subscribe_update_mock / 81.826s] POLL: not expecting any messages for 300ms 3: [0116_kafkaconsumer_close / 55.734s] Closing with queue 3: [0116_kafkaconsumer_close / 55.734s] Attempting second close 3: [0116_kafkaconsumer_close / 55.735s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1: PASS (5.01s) ] 3: [0116_kafkaconsumer_close / 55.735s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1 ] 3: %5|1675737095.104|CONFWARN|MOCK#producer-214| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 55.736s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 82.127s] CONSUME: duration 300.275ms 3: [0045_subscribe_update_mock / 82.127s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: 
[0045_subscribe_update_mock / 82.127s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 82.228s] Creating topic topic_29 3: [0045_subscribe_update_mock / 82.232s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 3.977ms 3: [0045_subscribe_update_mock / 82.232s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 82.532s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 82.532s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 82.532s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 82.532s] 0045_subscribe_update_mock#consumer-150: Assignment (64 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_2[0], topic_2[1], topic_2[2], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 82.532s] Creating topic topic_30 3: [0045_subscribe_update_mock / 82.532s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.040ms 3: [0045_subscribe_update_mock / 82.532s] POLL: not expecting any messages for 300ms 3: [0120_asymmetric_subscription/ 12.008s] c0#consumer-206: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[0], t2[2] 3: [0120_asymmetric_subscription/ 12.008s] c1#consumer-207: Assignment (6 partition(s)): t2[1], t2[3], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 12.008s] c2#consumer-208: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 12.008s] rd_kafka_assignment(c[i], &assignment): duration 0.029ms 3: [0120_asymmetric_subscription/ 12.009s] rd_kafka_assignment(c[i], &assignment): duration 0.018ms 3: [0120_asymmetric_subscription/ 12.009s] rd_kafka_assignment(c[i], &assignment): duration 0.027ms 3: [0120_asymmetric_subscription/ 12.009s] Closing consumer c0#consumer-206 3: [0120_asymmetric_subscription/ 12.009s] CONSUMER.CLOSE: duration 0.101ms 3: [0120_asymmetric_subscription/ 12.010s] Closing consumer c2#consumer-208 3: [0120_asymmetric_subscription/ 12.010s] CONSUMER.CLOSE: duration 0.078ms 3: [0120_asymmetric_subscription/ 12.010s] [ do_test_asymmetric:71: cooperative-sticky assignor: PASS (4.00s) ] 3: [0120_asymmetric_subscription/ 12.011s] 0120_asymmetric_subscription: duration 12010.639ms 3: [0120_asymmetric_subscription/ 12.011s] ================= Test 0120_asymmetric_subscription PASSED ================= 3: [
/ 83.840s] Too many tests running (5 >= 5): postponing 0128_sasl_callback_queue start... 3: [0124_openssl_invalid_engine / 0.000s] ================= Running test 0124_openssl_invalid_engine ================= 3: [0124_openssl_invalid_engine / 0.000s] ==== Stats written to file stats_0124_openssl_invalid_engine_4205585888037383059.json ==== 3: [0124_openssl_invalid_engine / 0.000s] Test config file test.conf not found 3: [0124_openssl_invalid_engine / 0.000s] Setting test timeout to 30s * 2.7 3: %3|1675737095.663|SSL|0124_openssl_invalid_engine#producer-217| [thrd:app]: error:25066067:DSO support routines:dlfcn_load:could not load the shared library: filename(libinvalid_path.so): libinvalid_path.so: cannot open shared object file: No such file or directory 3: %3|1675737095.663|SSL|0124_openssl_invalid_engine#producer-217| [thrd:app]: error:25070067:DSO support routines:DSO_load:could not load the shared library 3: [0124_openssl_invalid_engine / 0.000s] rd_kafka_new() failed (as expected): OpenSSL engine initialization failed in ENGINE_ctrl_cmd_string LOAD: error:260B6084:engine routines:dynamic_load:dso not found 3: [0124_openssl_invalid_engine / 0.000s] 0124_openssl_invalid_engine: duration 0.463ms 3: [0124_openssl_invalid_engine / 0.000s] ================= Test 0124_openssl_invalid_engine PASSED ================= 3: [
/ 83.941s] Too many tests running (5 >= 5): postponing 0131_connect_timeout start... 3: [0128_sasl_callback_queue / 0.000s] ================= Running test 0128_sasl_callback_queue ================= 3: [0128_sasl_callback_queue / 0.000s] ==== Stats written to file stats_0128_sasl_callback_queue_4274123671367900087.json ==== 3: [0128_sasl_callback_queue / 0.000s] Feature "sasl_oauthbearer" is built-in 3: [0128_sasl_callback_queue / 0.000s] [ do_test:64: Use background queue = yes ] 3: %5|1675737095.764|CONFWARN|rdkafka#producer-218| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1675737095.765|ERROR|rdkafka#producer-218| [thrd:background]: Failed to acquire SASL OAUTHBEARER token: Not implemented by this test, but that's okay 3: [
/ 83.942s] Callback called! 3: [0045_subscribe_update_mock / 82.833s] CONSUME: duration 300.635ms 3: [0045_subscribe_update_mock / 82.833s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 82.833s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 82.933s] Creating topic topic_31 3: [0045_subscribe_update_mock / 82.933s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.063ms 3: [0045_subscribe_update_mock / 82.933s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 83.233s] CONSUME: duration 300.081ms 3: [0045_subscribe_update_mock / 83.233s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 83.233s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 83.333s] Creating topic topic_32 3: [0045_subscribe_update_mock / 83.333s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.078ms 3: [0045_subscribe_update_mock / 83.333s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 83.633s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 83.633s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 83.633s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 83.733s] Creating topic topic_33 3: [0045_subscribe_update_mock / 83.734s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.059ms 3: [0045_subscribe_update_mock / 83.734s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 70.278s] Test config file test.conf not found 3: [0105_transactions_mock / 70.278s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 70.278s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 70.278s] PRODUCE: duration 0.533ms 3: [0045_subscribe_update_mock / 84.034s] CONSUME: duration 300.076ms 3: [0045_subscribe_update_mock / 84.034s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 84.034s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [
/ 85.217s] Log: 0121_clusterid#producer-212 level 6 fac FAIL: [thrd:127.0.0.1:33685/bootstrap]: 127.0.0.1:33685/1: Disconnected (after 3000ms in state UP) 3: [
/ 85.217s] Log: 0121_clusterid#producer-212 level 3 fac FAIL: [thrd:127.0.0.1:33685/bootstrap]: 127.0.0.1:33685/1: Connect to ipv4#127.0.0.1:33685 failed: Connection refused (after 0ms in state CONNECT) 3: [0045_subscribe_update_mock / 84.134s] Creating topic topic_34 3: [0045_subscribe_update_mock / 84.134s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.073ms 3: [0045_subscribe_update_mock / 84.134s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 84.434s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 84.434s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 84.434s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 84.534s] Creating topic topic_35 3: [0045_subscribe_update_mock / 84.534s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.053ms 3: [0045_subscribe_update_mock / 84.534s] POLL: not expecting any messages for 300ms 3: [
/ 85.787s] Log: 0121_clusterid#producer-212 level 3 fac FAIL: [thrd:127.0.0.1:33685/bootstrap]: 127.0.0.1:33685/1: Connect to ipv4#127.0.0.1:33685 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0045_subscribe_update_mock / 84.834s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 84.834s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 84.834s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 84.934s] Creating topic topic_36 3: [0045_subscribe_update_mock / 84.934s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.067ms 3: [0045_subscribe_update_mock / 84.934s] POLL: not expecting any messages for 300ms 3: [
/ 86.217s] Log: 0121_clusterid#producer-212 level 4 fac CLUSTERID: [thrd:main]: Broker 127.0.0.1:41037/bootstrap reports different ClusterId "mockCluster1fe91c000394" than previously known "mockCluster1fe91c000238": a client must not be simultaneously connected to multiple clusters 3: [0045_subscribe_update_mock / 85.234s] CONSUME: duration 300.083ms 3: [0045_subscribe_update_mock / 85.235s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 85.235s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0116_kafkaconsumer_close / 58.927s] Closing with queue 3: [0116_kafkaconsumer_close / 58.928s] Attempting second close 3: [0116_kafkaconsumer_close / 58.931s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1: PASS (3.20s) ] 3: [0116_kafkaconsumer_close / 58.931s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1 ] 3: %5|1675737098.299|CONFWARN|MOCK#producer-219| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 58.931s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 85.335s] Creating topic topic_37 3: [0045_subscribe_update_mock / 85.336s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 1.009ms 3: [0045_subscribe_update_mock / 85.336s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 85.636s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 85.636s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 85.636s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 85.736s] Creating topic topic_38 3: [0045_subscribe_update_mock / 85.736s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.061ms 3: [0045_subscribe_update_mock / 85.736s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 72.278s] Bringing up leader 2 3: [0045_subscribe_update_mock / 86.036s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 86.036s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 86.036s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0121_clusterid / 6.003s] 0121_clusterid: duration 6003.354ms 3: [0121_clusterid / 6.003s] ================= Test 0121_clusterid PASSED ================= 3: [0045_subscribe_update_mock / 86.136s] Creating topic topic_39 3: [0045_subscribe_update_mock / 86.136s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.058ms 3: [0045_subscribe_update_mock / 86.136s] POLL: not expecting any messages for 300ms 3: [
/ 87.318s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0131_connect_timeout / 0.000s] ================= Running test 0131_connect_timeout ================= 3: [0131_connect_timeout / 0.000s] ==== Stats written to file stats_0131_connect_timeout_2263946781921639810.json ==== 3: [0131_connect_timeout / 0.000s] Test config file test.conf not found 3: [0131_connect_timeout / 0.000s] Setting test timeout to 20s * 2.7 3: [0131_connect_timeout / 0.003s] Created kafka instance 0131_connect_timeout#producer-222 3: [0045_subscribe_update_mock / 86.436s] CONSUME: duration 300.078ms 3: [0045_subscribe_update_mock / 86.436s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 86.436s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 86.537s] Creating topic topic_40 3: [0045_subscribe_update_mock / 86.537s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.083ms 3: [0045_subscribe_update_mock / 86.537s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 86.837s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 86.837s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 86.837s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 86.937s] Creating topic topic_41 3: [0045_subscribe_update_mock / 86.937s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.149ms 3: [0045_subscribe_update_mock / 86.937s] POLL: not expecting any messages for 300ms 3: [
/ 88.318s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0105_transactions_mock / 73.739s] rd_kafka_commit_transaction(rk, -1): duration 1461.082ms 3: [0105_transactions_mock / 73.740s] [ do_test_txn_broker_down_in_txn:1280: Test leader down: PASS (6.47s) ] 3: [0105_transactions_mock / 73.740s] [ do_test_txns_not_supported:1492 ] 3: [0105_transactions_mock / 73.740s] Test config file test.conf not found 3: [0105_transactions_mock / 73.740s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 73.740s] Created kafka instance 0105_transactions_mock#producer-223 3: %1|1675737100.210|TXNERR|0105_transactions_mock#producer-223| [thrd:main]: Fatal transaction error: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 (_UNSUPPORTED_FEATURE) 3: %0|1675737100.210|FATAL|0105_transactions_mock#producer-223| [thrd:main]: Fatal error: Local: Required feature not supported by broker: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: [0105_transactions_mock / 73.754s] init_transactions() returned _UNSUPPORTED_FEATURE: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: %6|1675737100.212|FAIL|0105_transactions_mock#producer-223| [thrd:127.0.0.1:39367/bootstrap]: 127.0.0.1:39367/3: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 2ms in state UP) 3: [0045_subscribe_update_mock / 87.238s] CONSUME: duration 300.606ms 3: [0045_subscribe_update_mock / 87.238s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 87.238s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0105_transactions_mock / 73.782s] [ do_test_txns_not_supported:1492: PASS (0.04s) ] 3: [0105_transactions_mock / 73.782s] [ do_test_txns_send_offsets_concurrent_is_retried:1551 ] 3: [0105_transactions_mock / 73.782s] Test config file test.conf not found 3: [0105_transactions_mock / 73.782s] Setting test timeout to 60s * 2.7 3: %5|1675737100.239|MOCK|0105_transactions_mock#producer-224| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:33041,127.0.0.1:46679,127.0.0.1:41107 3: [0105_transactions_mock / 73.783s] Created kafka instance 0105_transactions_mock#producer-224 3: [0105_transactions_mock / 73.801s] rd_kafka_init_transactions(rk, 5000): duration 17.814ms 3: [0105_transactions_mock / 73.801s] rd_kafka_begin_transaction(rk): duration 0.097ms 3: [0105_transactions_mock / 73.801s] 0105_transactions_mock#producer-224: Flushing 1 messages 3: [0045_subscribe_update_mock / 87.339s] Creating topic topic_42 3: [0045_subscribe_update_mock / 87.339s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.058ms 3: [0045_subscribe_update_mock / 87.339s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 87.639s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 87.639s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 87.639s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: 
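The 0105_transactions_mock entries above exercise librdkafka's public transactional producer API in its documented order: rd_kafka_init_transactions(), rd_kafka_begin_transaction(), produce, then rd_kafka_commit_transaction(), with init_transactions() failing with _UNSUPPORTED_FEATURE on pre-0.11 brokers as do_test_txns_not_supported expects. The sketch below shows only that call sequence; it is not the test's code, and bootstrap.servers, transactional.id, topic and payload are placeholder values.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Placeholders: the test replaces these with its mock cluster. */
        rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "transactional.id", "example-txn-id",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
            fprintf(stderr, "rd_kafka_new failed: %s\n", errstr);
            return 1;
        }

        /* Fails with _UNSUPPORTED_FEATURE on brokers < 0.11.0, which is
         * exactly what do_test_txns_not_supported checks for above. */
        rd_kafka_error_t *error = rd_kafka_init_transactions(rk, 5000);
        if (error)
            goto fail;

        error = rd_kafka_begin_transaction(rk);
        if (error)
            goto fail;

        rd_kafka_producev(rk,
                          RD_KAFKA_V_TOPIC("mytopic"),
                          RD_KAFKA_V_VALUE("hello", 5),
                          RD_KAFKA_V_END);
        rd_kafka_flush(rk, 10 * 1000);

        error = rd_kafka_commit_transaction(rk, 5000);
        if (error)
            goto fail;

        rd_kafka_destroy(rk);
        return 0;

    fail:
        fprintf(stderr, "transaction failed: %s\n",
                rd_kafka_error_string(error));
        rd_kafka_error_destroy(error);
        rd_kafka_destroy(rk);
        return 1;
    }
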
[0045_subscribe_update_mock / 87.739s] Creating topic topic_43 3: [0045_subscribe_update_mock / 87.739s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.055ms 3: [0045_subscribe_update_mock / 87.739s] POLL: not expecting any messages for 300ms 3: [0128_sasl_callback_queue / 5.001s] [ do_test:64: Use background queue = yes: PASS (5.00s) ] 3: [0128_sasl_callback_queue / 5.001s] [ do_test:64: Use background queue = no ] 3: %5|1675737100.765|CONFWARN|rdkafka#producer-225| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 88.039s] CONSUME: duration 300.073ms 3: [0045_subscribe_update_mock / 88.039s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.039s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.139s] Creating topic topic_44 3: [0045_subscribe_update_mock / 88.139s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.059ms 3: [0045_subscribe_update_mock / 88.139s] POLL: not expecting any messages for 300ms 3: [
/ 89.318s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0105_transactions_mock / 74.786s] FLUSH: duration 984.822ms 3: [0045_subscribe_update_mock / 88.439s] CONSUME: duration 300.076ms 3: [0045_subscribe_update_mock / 88.439s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.439s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.440s] 0045_subscribe_update_mock#consumer-150: Assignment (64 partition(s)): topic_0[0], topic_1[0], topic_1[1], topic_10[0], topic_10[1], topic_10[2], topic_11[0], topic_11[1], topic_11[2], topic_11[3], topic_12[0], topic_12[1], topic_12[2], topic_12[3], topic_12[4], topic_13[0], topic_13[1], topic_13[2], topic_13[3], topic_13[4], topic_13[5], topic_14[0], topic_14[1], topic_14[2], topic_14[3], topic_14[4], topic_14[5], topic_14[6], topic_2[0], topic_2[1], topic_2[2], topic_3[0], topic_3[1], topic_3[2], topic_3[3], topic_4[0], topic_4[1], topic_4[2], topic_4[3], topic_4[4], topic_5[0], topic_5[1], topic_5[2], topic_5[3], topic_5[4], topic_5[5], topic_6[0], topic_6[1], topic_6[2], topic_6[3], topic_6[4], topic_6[5], topic_6[6], topic_7[0], topic_7[1], topic_7[2], topic_7[3], topic_7[4], topic_7[5], topic_7[6], topic_7[7], topic_8[0], topic_9[0], topic_9[1] 3: [0045_subscribe_update_mock / 88.440s] Creating topic topic_45 3: [0045_subscribe_update_mock / 88.440s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.062ms 3: [0045_subscribe_update_mock / 88.440s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 88.740s] CONSUME: duration 300.070ms 3: [0045_subscribe_update_mock / 88.740s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.740s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 88.840s] Creating topic topic_46 3: [0045_subscribe_update_mock / 88.840s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.078ms 3: [0045_subscribe_update_mock / 88.840s] POLL: not expecting any messages for 300ms 3: [0105_transactions_mock / 75.389s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 603.434ms 3: [0105_transactions_mock / 75.390s] rd_kafka_commit_transaction(rk, 5000): duration 0.170ms 3: [0105_transactions_mock / 75.390s] [ do_test_txns_send_offsets_concurrent_is_retried:1551: PASS (1.61s) ] 3: [0105_transactions_mock / 75.390s] [ do_test_txn_coord_req_destroy:1881 ] 3: [0105_transactions_mock / 75.390s] Test config file test.conf not found 3: [0105_transactions_mock / 75.390s] Setting test timeout to 60s * 2.7 3: %5|1675737101.847|MOCK|0105_transactions_mock#producer-226| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35439,127.0.0.1:32769,127.0.0.1:37107 3: [0105_transactions_mock / 75.395s] Created kafka instance 0105_transactions_mock#producer-226 3: [0105_transactions_mock / 75.401s] rd_kafka_init_transactions(rk, 5000): duration 6.163ms 3: [0105_transactions_mock / 75.401s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 75.401s] rd_kafka_begin_transaction(rk): duration 0.078ms 3: [0105_transactions_mock 
/ 75.504s] send_offsets_to_transaction() #0: 3: %3|1675737102.102|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:35439/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 1 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:38810000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737102.102|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:35439/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0045_subscribe_update_mock / 89.140s] CONSUME: duration 300.075ms 3: [0045_subscribe_update_mock / 89.140s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 89.140s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [
/ 90.318s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0045_subscribe_update_mock / 89.240s] Creating topic topic_47 3: [0045_subscribe_update_mock / 89.240s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.082ms 3: [0045_subscribe_update_mock / 89.240s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 89.540s] CONSUME: duration 300.077ms 3: [0045_subscribe_update_mock / 89.540s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 89.540s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 89.641s] Creating topic topic_48 3: [0045_subscribe_update_mock / 89.641s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.064ms 3: [0045_subscribe_update_mock / 89.641s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 89.941s] CONSUME: duration 300.074ms 3: [0045_subscribe_update_mock / 89.941s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 89.941s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 90.041s] Creating topic topic_49 3: [0045_subscribe_update_mock / 90.041s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.070ms 3: [0045_subscribe_update_mock / 90.041s] POLL: not expecting any messages for 300ms 3: [
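The recurring "Creating topic topic_NN" / rd_kafka_mock_topic_create() lines come from librdkafka's built-in mock cluster (rdkafka_mock.h), which the test suite uses in place of real brokers. A minimal sketch of creating such a cluster and one topic follows; it relies on the public mock API named in the log plus rd_kafka_mock_cluster_new()/rd_kafka_mock_cluster_bootstraps(), and the partition and replication counts are illustrative only.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>
    #include <librdkafka/rdkafka_mock.h>

    int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();
        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
            fprintf(stderr, "rd_kafka_new: %s\n", errstr);
            return 1;
        }

        /* Spin up a 3-broker in-process mock cluster. */
        rd_kafka_mock_cluster_t *mcluster = rd_kafka_mock_cluster_new(rk, 3);

        /* Clients point bootstrap.servers at the mock listeners. */
        printf("mock bootstrap.servers: %s\n",
               rd_kafka_mock_cluster_bootstraps(mcluster));

        /* Same call the 0045 test logs: topic, partition count, RF. */
        rd_kafka_resp_err_t err =
            rd_kafka_mock_topic_create(mcluster, "topic_0", 4, 1);
        if (err)
            fprintf(stderr, "mock_topic_create: %s\n", rd_kafka_err2str(err));

        rd_kafka_mock_cluster_destroy(mcluster);
        rd_kafka_destroy(rk);
        return 0;
    }
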
/ 91.319s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0116_kafkaconsumer_close / 63.952s] Closing with queue 3: [0116_kafkaconsumer_close / 63.953s] Attempting second close 3: [0116_kafkaconsumer_close / 63.953s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 63.953s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1 ] 3: %5|1675737103.322|CONFWARN|MOCK#producer-227| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 63.954s] Setting test timeout to 10s * 2.7 3: [0045_subscribe_update_mock / 90.344s] CONSUME: duration 303.122ms 3: [0045_subscribe_update_mock / 90.344s] test_consumer_poll_no_msgs:4075: POLL: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0045_subscribe_update_mock / 90.344s] test_consumer_poll_no_msgs:4075: POLL: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0045_subscribe_update_mock / 90.444s] Closing consumer 0045_subscribe_update_mock#consumer-150 3: [0045_subscribe_update_mock / 90.445s] 0045_subscribe_update_mock#consumer-150: incremental rebalance: _REVOKE_PARTITIONS: 64 partition(s) 3: [0045_subscribe_update_mock / 90.445s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.254ms 3: [0045_subscribe_update_mock / 90.445s] rebalance_cb: incremental unassign of 64 partition(s) done 3: [0105_transactions_mock / 77.504s] rd_kafka_abort_transaction(rk, 5000): duration 0.484ms 3: [0105_transactions_mock / 77.505s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 77.505s] rd_kafka_begin_transaction(rk): duration 0.045ms 3: %3|1675737104.004|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:35439/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:38810000,Epoch:0}, base seq 1): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737104.004|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:35439/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 77.607s] send_offsets_to_transaction() #1: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 77.607s] send_offsets_to_transaction() #1 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 92.319s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 93.319s] 5 test(s) running: 0045_subscribe_update_mock 0105_transactions_mock 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 93.351s] Log: 0131_connect_timeout#producer-222 level 7 fac FAIL: [thrd:127.0.0.1:42757/bootstrap]: 127.0.0.1:42757/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 93.351s] Log: 0131_connect_timeout#producer-222 level 4 fac FAIL: [thrd:127.0.0.1:42757/bootstrap]: 127.0.0.1:42757/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: [0128_sasl_callback_queue / 10.001s] [ do_test:64: Use background queue = no: PASS (5.00s) ] 3: [0128_sasl_callback_queue / 10.001s] 0128_sasl_callback_queue: duration 10001.269ms 3: [0128_sasl_callback_queue / 10.001s] ================= Test 0128_sasl_callback_queue PASSED ================= 3: [0045_subscribe_update_mock / 92.902s] CONSUMER.CLOSE: duration 2457.663ms 3: [0045_subscribe_update_mock / 92.913s] [ do_test_regex_many_mock:378: cooperative-sticky with 50 topics: PASS (36.20s) ] 3: [0045_subscribe_update_mock / 92.913s] [ do_test_regex_many_mock:378: cooperative-sticky with 300 topics ] 3: %5|1675737105.900|CONFWARN|MOCK#producer-230| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 92.913s] Test config file test.conf not found 3: [0045_subscribe_update_mock / 92.913s] Setting test timeout to 300s * 2.7 3: [0045_subscribe_update_mock / 92.919s] Created kafka instance 0045_subscribe_update_mock#consumer-231 3: [0045_subscribe_update_mock / 92.923s] Creating topic topic_0 3: [0045_subscribe_update_mock / 92.923s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 0.154ms 3: [0045_subscribe_update_mock / 92.923s] POLL: not expecting any messages for 100ms 3: [0045_subscribe_update_mock / 92.923s] TEST FAILURE 3: ### Test "0045_subscribe_update_mock (do_test_regex_many_mock:378: cooperative-sticky with 300 topics)" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:4048:test_consumer_poll_no_msgs() at Tue Feb 7 02:31:45 2023: ### 3: ^topic_.* [0] error (offset -1001): Subscribed topic not available: ^topic_.*: Broker: Unknown topic or partition 3: [0105_transactions_mock / 79.608s] rd_kafka_abort_transaction(rk, 5000): duration 0.413ms 3: [0105_transactions_mock / 79.608s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 79.608s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [
/ 94.319s] 3 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0131_connect_timeout 3: [0105_transactions_mock / 79.710s] send_offsets_to_transaction() #2: 3: [
/ 94.351s] Log: 0131_connect_timeout#producer-222 level 7 fac FAIL: [thrd:127.0.0.1:42833/bootstrap]: 127.0.0.1:42833/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 94.351s] Log: 0131_connect_timeout#producer-222 level 4 fac FAIL: [thrd:127.0.0.1:42833/bootstrap]: 127.0.0.1:42833/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: %3|1675737106.308|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737106.308|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
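The 0131_connect_timeout entries above report "Connection setup timed out in state APIVERSION_QUERY"; the knob behind that behaviour is librdkafka's socket.connection.setup.timeout.ms configuration property. A small sketch of setting it on a plain producer is given below with an illustrative 6-second value; the test's own configuration is not shown in this log.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Abandon a broker connection attempt after ~6s; the value is
         * illustrative, not taken from the test. */
        if (rd_kafka_conf_set(conf, "socket.connection.setup.timeout.ms",
                              "6000", errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
            fprintf(stderr, "conf_set: %s\n", errstr);
            return 1;
        }
        rd_kafka_conf_set(conf, "bootstrap.servers", "127.0.0.1:9092",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
            fprintf(stderr, "rd_kafka_new: %s\n", errstr);
            return 1;
        }
        rd_kafka_poll(rk, 1000);  /* serve log/error callbacks */
        rd_kafka_destroy(rk);
        return 0;
    }
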
/ 95.319s] 3 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 0131_connect_timeout 3: [0131_connect_timeout / 8.004s] 0131_connect_timeout: duration 8004.428ms 3: [0131_connect_timeout / 8.004s] ================= Test 0131_connect_timeout PASSED ================= 3: [0116_kafkaconsumer_close / 68.617s] Closing with queue 3: [0116_kafkaconsumer_close / 68.618s] Attempting second close 3: [0116_kafkaconsumer_close / 68.621s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1: PASS (4.67s) ] 3: [0116_kafkaconsumer_close / 68.621s] 0116_kafkaconsumer_close: duration 68620.852ms 3: [0116_kafkaconsumer_close / 68.621s] ================= Test 0116_kafkaconsumer_close PASSED ================= 3: [
/ 96.319s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 81.711s] rd_kafka_abort_transaction(rk, 5000): duration 0.894ms 3: [0105_transactions_mock / 81.711s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 81.711s] rd_kafka_begin_transaction(rk): duration 0.042ms 3: %3|1675737108.211|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737108.211|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 81.814s] send_offsets_to_transaction() #3: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 81.814s] send_offsets_to_transaction() #3 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 97.319s] 1 test(s) running: 0105_transactions_mock 3: [
/ 98.319s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 83.816s] rd_kafka_abort_transaction(rk, 5000): duration 1.633ms 3: [0105_transactions_mock / 83.816s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 83.816s] rd_kafka_begin_transaction(rk): duration 0.062ms 3: [0105_transactions_mock / 83.917s] send_offsets_to_transaction() #4: 3: %3|1675737110.517|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737110.517|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 99.320s] 1 test(s) running: 0105_transactions_mock 3: [
/100.320s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 85.918s] rd_kafka_abort_transaction(rk, 5000): duration 0.336ms 3: [0105_transactions_mock / 85.918s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 85.918s] rd_kafka_begin_transaction(rk): duration 0.034ms 3: %3|1675737112.417|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737112.417|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 86.020s] send_offsets_to_transaction() #5: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 86.020s] send_offsets_to_transaction() #5 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/101.320s] 1 test(s) running: 0105_transactions_mock 3: [
/102.320s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 88.021s] rd_kafka_abort_transaction(rk, 5000): duration 0.385ms 3: [0105_transactions_mock / 88.021s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 88.021s] rd_kafka_begin_transaction(rk): duration 0.045ms 3: [0105_transactions_mock / 88.123s] send_offsets_to_transaction() #6: 3: %3|1675737114.722|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737114.722|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/103.320s] 1 test(s) running: 0105_transactions_mock 3: [
/104.320s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 90.123s] rd_kafka_abort_transaction(rk, 5000): duration 0.304ms 3: [0105_transactions_mock / 90.123s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 90.123s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: %3|1675737116.623|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737116.623|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 90.226s] send_offsets_to_transaction() #7: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 90.226s] send_offsets_to_transaction() #7 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/105.320s] 1 test(s) running: 0105_transactions_mock 3: [
/106.320s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 92.226s] rd_kafka_abort_transaction(rk, 5000): duration 0.483ms 3: [0105_transactions_mock / 92.226s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 92.226s] rd_kafka_begin_transaction(rk): duration 0.059ms 3: [0105_transactions_mock / 92.328s] send_offsets_to_transaction() #8: 3: %3|1675737118.927|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 6): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737118.927|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/107.321s] 1 test(s) running: 0105_transactions_mock 3: [
/108.321s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 94.328s] rd_kafka_abort_transaction(rk, 5000): duration 0.471ms 3: [0105_transactions_mock / 94.328s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 94.328s] rd_kafka_begin_transaction(rk): duration 0.051ms 3: %3|1675737120.828|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 1): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737120.828|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 94.430s] send_offsets_to_transaction() #9: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 94.430s] send_offsets_to_transaction() #9 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
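The long TXNERR / abort_transaction / "failed (expectedly)" cycle above follows librdkafka's documented error-handling contract for transactional calls: an error flagged as requiring abort must be answered with rd_kafka_abort_transaction() and a fresh transaction, a retriable error can simply be retried, and anything else is treated as non-recoverable. A hedged sketch of that decision logic follows; the offsets and consumer group metadata are assumed to be prepared by the consumer side and are not shown.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Returns 0 on success, 1 if the caller must begin a new transaction
     * and retry, -1 on a non-recoverable error. */
    static int
    send_offsets_with_retry(rd_kafka_t *rk,
                            rd_kafka_topic_partition_list_t *offsets,
                            rd_kafka_consumer_group_metadata_t *cgmd) {
        for (;;) {
            rd_kafka_error_t *error =
                rd_kafka_send_offsets_to_transaction(rk, offsets, cgmd, -1);
            if (!error)
                return 0;   /* offsets are now part of the transaction */

            fprintf(stderr, "send_offsets_to_transaction: %s\n",
                    rd_kafka_error_string(error));

            if (rd_kafka_error_txn_requires_abort(error)) {
                /* e.g. TOPIC_AUTHORIZATION_FAILED in the log above */
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, 5000);
                if (error) {
                    rd_kafka_error_destroy(error);
                    return -1;
                }
                return 1;
            }

            if (rd_kafka_error_is_retriable(error)) {
                rd_kafka_error_destroy(error);
                continue;   /* safe to call again */
            }

            rd_kafka_error_destroy(error);
            return -1;      /* fatal or unexpected */
        }
    }
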
/109.321s] 1 test(s) running: 0105_transactions_mock 3: [
/110.321s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 96.430s] rd_kafka_abort_transaction(rk, 5000): duration 0.383ms 3: [0105_transactions_mock / 96.430s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 96.430s] rd_kafka_begin_transaction(rk): duration 0.032ms 3: [0105_transactions_mock / 96.531s] send_offsets_to_transaction() #10: 3: %3|1675737123.131|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 8): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737123.131|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/111.321s] 1 test(s) running: 0105_transactions_mock 3: [
/112.321s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 98.532s] rd_kafka_abort_transaction(rk, 5000): duration 0.363ms 3: [0105_transactions_mock / 98.532s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 98.532s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: %3|1675737125.030|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 10): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737125.031|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 98.633s] send_offsets_to_transaction() #11: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 98.633s] send_offsets_to_transaction() #11 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/113.321s] 1 test(s) running: 0105_transactions_mock 3: [
/114.321s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /100.634s] rd_kafka_abort_transaction(rk, 5000): duration 0.463ms 3: [0105_transactions_mock /100.634s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /100.634s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [
/115.322s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /100.736s] send_offsets_to_transaction() #12: 3: %3|1675737127.334|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 3): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737127.334|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/116.322s] 1 test(s) running: 0105_transactions_mock 3: [
/117.322s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /102.737s] rd_kafka_abort_transaction(rk, 5000): duration 0.504ms 3: [0105_transactions_mock /102.737s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /102.737s] rd_kafka_begin_transaction(rk): duration 0.045ms 3: %3|1675737129.239|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 6): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737129.239|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock /102.839s] send_offsets_to_transaction() #13: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /102.839s] send_offsets_to_transaction() #13 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/118.322s] 1 test(s) running: 0105_transactions_mock 3: [
/119.322s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /104.840s] rd_kafka_abort_transaction(rk, 5000): duration 0.435ms 3: [0105_transactions_mock /104.840s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /104.840s] rd_kafka_begin_transaction(rk): duration 0.188ms 3: [0105_transactions_mock /104.943s] send_offsets_to_transaction() #14: 3: %3|1675737131.541|TXNERR|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:38810000,Epoch:0}, base seq 12): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1675737131.541|PARTCNT|0105_transactions_mock#producer-226| [thrd:127.0.0.1:32769/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/120.322s] 1 test(s) running: 0105_transactions_mock 3: [
/121.322s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /106.943s] rd_kafka_abort_transaction(rk, 5000): duration 0.396ms 3: [0105_transactions_mock /106.944s] [ do_test_txn_coord_req_multi_find:2064 ] 3: [0105_transactions_mock /106.944s] Test config file test.conf not found 3: [0105_transactions_mock /106.944s] Setting test timeout to 60s * 2.7 3: %5|1675737133.401|MOCK|0105_transactions_mock#producer-232| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43825,127.0.0.1:37301,127.0.0.1:33351 3: [0105_transactions_mock /106.948s] Created kafka instance 0105_transactions_mock#producer-232 3: [0105_transactions_mock /106.970s] rd_kafka_init_transactions(rk, 5000): duration 14.130ms 3: [0105_transactions_mock /106.970s] rd_kafka_begin_transaction(rk): duration 0.017ms 3: [0105_transactions_mock /106.970s] 0105_transactions_mock#producer-232: Flushing 3 messages 3: [
/122.322s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /107.947s] FLUSH: duration 976.347ms 3: [
/123.323s] 1 test(s) running: 0105_transactions_mock 3: [
/124.323s] 1 test(s) running: 0105_transactions_mock 3: [
/125.323s] 1 test(s) running: 0105_transactions_mock 3: [
/126.323s] 1 test(s) running: 0105_transactions_mock 3: [
/126.586s] on_response_received_cb: 0105_transactions_mock#producer-232: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.65ms, not done yet: NO_ERROR 3: [
/126.586s] on_response_received_cb: 0105_transactions_mock#producer-232: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.65ms, not done yet: NO_ERROR 3: %6|1675737138.409|FAIL|0105_transactions_mock#producer-232| [thrd:127.0.0.1:37301/bootstrap]: 127.0.0.1:37301/2: Disconnected (after 5000ms in state UP) 3: %6|1675737138.409|FAIL|0105_transactions_mock#producer-232| [thrd:127.0.0.1:33351/bootstrap]: 127.0.0.1:33351/3: Disconnected (after 4981ms in state UP) 3: [
/127.323s] 1 test(s) running: 0105_transactions_mock 3: [
/128.323s] 1 test(s) running: 0105_transactions_mock 3: [
/129.323s] 1 test(s) running: 0105_transactions_mock 3: [
/130.323s] 1 test(s) running: 0105_transactions_mock 3: [
/131.324s] 1 test(s) running: 0105_transactions_mock 3: [
/132.324s] 1 test(s) running: 0105_transactions_mock 3: [
/133.324s] 1 test(s) running: 0105_transactions_mock 3: [
/134.324s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /120.036s] send_offsets_to_transaction() 3: [
/135.324s] 1 test(s) running: 0105_transactions_mock 3: [
/136.324s] 1 test(s) running: 0105_transactions_mock 3: [
/137.324s] 1 test(s) running: 0105_transactions_mock 3: [
/138.324s] 1 test(s) running: 0105_transactions_mock 3: [
/139.324s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /125.036s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:37301/2: Disconnected (after 5000ms in state UP) 3: [0105_transactions_mock /125.036s] 0105_transactions_mock#producer-232 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:37301/2: Disconnected (after 5000ms in state UP) 3: [0105_transactions_mock /125.036s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:33351/3: Disconnected (after 4981ms in state UP) 3: [0105_transactions_mock /125.036s] 0105_transactions_mock#producer-232 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:33351/3: Disconnected (after 4981ms in state UP) 3: [0105_transactions_mock /125.036s] rd_kafka_commit_transaction(rk, 5000): duration 0.529ms 3: [0105_transactions_mock /125.037s] [ do_test_txn_coord_req_multi_find:2064: PASS (18.09s) ] 3: [0105_transactions_mock /125.037s] [ do_test_txn_addparts_req_multi:2209 ] 3: [0105_transactions_mock /125.037s] Test config file test.conf not found 3: [0105_transactions_mock /125.037s] Setting test timeout to 60s * 2.7 3: %5|1675737151.495|MOCK|0105_transactions_mock#producer-233| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44309,127.0.0.1:37153,127.0.0.1:36953 3: [0105_transactions_mock /125.042s] Created kafka instance 0105_transactions_mock#producer-233 3: [0105_transactions_mock /125.094s] rd_kafka_init_transactions(rk, 5000): duration 50.817ms 3: [0105_transactions_mock /125.094s] Running seed transaction 3: [0105_transactions_mock /125.098s] rd_kafka_begin_transaction(rk): duration 3.961ms 3: [0105_transactions_mock /125.098s] rd_kafka_producev(rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { void * __t __attribute__((unused)) = ("seed"); size_t __t2 __attribute__((unused)) = (4); } RD_KAFKA_VTYPE_VALUE; }), (void *)"seed", (size_t)4, RD_KAFKA_VTYPE_END): duration 0.025ms 3: [
/140.325s] 1 test(s) running: 0105_transactions_mock 3: [
/140.676s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.09ms, count 0: NO_ERROR 3: [
/140.676s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.09ms, count 1: NO_ERROR 3: [0105_transactions_mock /126.046s] rd_kafka_commit_transaction(rk, 5000): duration 947.964ms 3: [0105_transactions_mock /126.046s] Running test transaction 3: [0105_transactions_mock /126.046s] rd_kafka_begin_transaction(rk): duration 0.056ms 3: [0105_transactions_mock /126.046s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.014ms 3: [0105_transactions_mock /126.546s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (1); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)1, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.039ms 3: [0105_transactions_mock /126.546s] Waiting for two AddPartitionsToTxnResponse 3: [
/141.325s] 1 test(s) running: 0105_transactions_mock 3: [
/141.688s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1007.06ms, count 0: NO_ERROR 3: [
/141.688s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1007.06ms, count 1: NO_ERROR 3: [0105_transactions_mock /127.064s] 2 AddPartitionsToTxnResponses seen 3: [0105_transactions_mock /127.064s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)2, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.024ms 3: [
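The unwieldy "({ if (0) { ... } RD_KAFKA_VTYPE_TOPIC; })" fragments in the rd_kafka_producev() lines above are simply the preprocessor expansion of the RD_KAFKA_V_* convenience macros. In source form the same produce call is short; the sketch below reconstructs it with the topic passed in as a parameter and the "hi" payload taken from the log, purely for illustration.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    /* Equivalent, unexpanded form of the producev() calls in the log. */
    static void produce_hi(rd_kafka_t *rk, const char *topic, int32_t partition) {
        rd_kafka_resp_err_t err =
            rd_kafka_producev(rk,
                              RD_KAFKA_V_TOPIC(topic),
                              RD_KAFKA_V_PARTITION(partition),
                              RD_KAFKA_V_VALUE("hi", 2),
                              RD_KAFKA_V_END);
        if (err)
            fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));
    }
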
/142.325s] 1 test(s) running: 0105_transactions_mock 3: [
/142.694s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.64ms, count 2: NO_ERROR 3: [
/142.694s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.64ms, count 3: NO_ERROR 3: [
/143.325s] 1 test(s) running: 0105_transactions_mock 3: [
/143.700s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.84ms, count 4: NO_ERROR 3: [
/143.700s] on_response_received_cb: 0105_transactions_mock#producer-233: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.84ms, count 5: NO_ERROR 3: [
/144.325s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /130.074s] rd_kafka_commit_transaction(rk, 10 * 1000): duration 2010.522ms 3: [0105_transactions_mock /130.079s] [ do_test_txn_addparts_req_multi:2209: PASS (5.04s) ] 3: [0105_transactions_mock /130.079s] [ do_test_txns_no_timeout_crash:1615 ] 3: [0105_transactions_mock /130.079s] Test config file test.conf not found 3: [0105_transactions_mock /130.079s] Setting test timeout to 60s * 2.7 3: %5|1675737156.536|MOCK|0105_transactions_mock#producer-234| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:34129,127.0.0.1:45663,127.0.0.1:39589 3: [0105_transactions_mock /130.084s] Created kafka instance 0105_transactions_mock#producer-234 3: [0105_transactions_mock /130.122s] rd_kafka_init_transactions(rk, 5000): duration 37.964ms 3: [0105_transactions_mock /130.126s] rd_kafka_begin_transaction(rk): duration 3.964ms 3: [0105_transactions_mock /130.126s] 0105_transactions_mock#producer-234: Flushing 1 messages 3: [
/145.325s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /131.085s] FLUSH: duration 958.893ms 3: [
/146.325s] 1 test(s) running: 0105_transactions_mock 3: %5|1675737158.547|REQTMOUT|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out AddOffsetsToTxnRequest in flight (after 1004ms, timeout #0) 3: %4|1675737158.547|REQTMOUT|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1675737158.547|FAIL|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:34129: 1 request(s) timed out: disconnect (after 1982ms in state UP) 3: [
/147.325s] 1 test(s) running: 0105_transactions_mock 3: %5|1675737159.552|REQTMOUT|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1675737159.552|FAIL|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:34129: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1675737159.552|REQTMOUT|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1675737159.552|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out MetadataRequest in flight (after 1004ms, timeout #0) 3: %4|1675737159.552|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1675737159.552|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: 1 request(s) timed out: disconnect (after 2010ms in state UP) 3: [
/148.326s] 1 test(s) running: 0105_transactions_mock 3: %5|1675737160.557|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1675737160.557|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1675737160.557|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1675737160.557|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: Timed out MetadataRequest in flight (after 1004ms, timeout #0) 3: %4|1675737160.557|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1675737160.557|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: 1 request(s) timed out: disconnect (after 4009ms in state UP) 3: [
/149.326s] 1 test(s) running: 0105_transactions_mock 3: %4|1675737161.556|REQTMOUT|0105_transactions_mock#producer-234| [thrd:TxnCoordinator]: TxnCoordinator: Timed out 0 in-flight, 0 retry-queued, 1 out-queue, 0 partially-sent requests 3: %5|1675737161.562|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:34129/bootstrap]: 127.0.0.1:34129/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %5|1675737161.562|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1675737161.562|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:34129/bootstrap]: 127.0.0.1:34129/1: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1675737161.562|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1675737161.562|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:34129/bootstrap]: 127.0.0.1:34129/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %4|1675737161.562|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:45663/bootstrap]: 127.0.0.1:45663/2: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1675737162.071|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1675737162.071|FAIL|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY, 1 identical error(s) suppressed) 3: %4|1675737162.071|REQTMOUT|0105_transactions_mock#producer-234| [thrd:127.0.0.1:39589/bootstrap]: 127.0.0.1:39589/3: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: [
/150.326s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /136.085s] send_offsets..() failed with retriable error: Transactional operation timed out 3: [0105_transactions_mock /136.085s] Retrying send_offsets..() 3: %3|1675737162.557|ADDOFFSETS|0105_transactions_mock#producer-234| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Local: Outdated 3: [0105_transactions_mock /136.244s] [ do_test_txns_no_timeout_crash:1615: PASS (6.16s) ] 3: [0105_transactions_mock /136.244s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /136.244s] Test config file test.conf not found 3: [0105_transactions_mock /136.244s] Setting test timeout to 60s * 2.7 3: %5|1675737162.701|MOCK|0105_transactions_mock#producer-235| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39917,127.0.0.1:40865,127.0.0.1:37331 3: [0105_transactions_mock /136.248s] Created kafka instance 0105_transactions_mock#producer-235 3: %1|1675737162.747|TXNERR|0105_transactions_mock#producer-235| [thrd:main]: Fatal transaction error: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1675737162.747|FATAL|0105_transactions_mock#producer-235| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.291s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.305s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.06s) ] 3: [0105_transactions_mock /136.305s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /136.305s] Test config file test.conf not found 3: [0105_transactions_mock /136.305s] Setting test timeout to 60s * 2.7 3: %5|1675737162.762|MOCK|0105_transactions_mock#producer-236| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35831,127.0.0.1:34513,127.0.0.1:41537 3: [0105_transactions_mock /136.309s] Created kafka instance 0105_transactions_mock#producer-236 3: %1|1675737162.795|TXNERR|0105_transactions_mock#producer-236| [thrd:main]: Fatal transaction error: Failed to find transaction coordinator: 127.0.0.1:35831/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1675737162.795|FATAL|0105_transactions_mock#producer-236| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to find transaction coordinator: 127.0.0.1:35831/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.338s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to find transaction coordinator: 127.0.0.1:35831/1: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /136.355s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.05s) ] 3: [0105_transactions_mock /136.355s] [ do_test_txn_flush_timeout:1737 ] 3: [0105_transactions_mock /136.355s] Test config 
file test.conf not found 3: [0105_transactions_mock /136.355s] Setting test timeout to 60s * 2.7 3: %5|1675737162.812|MOCK|0105_transactions_mock#producer-237| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41029,127.0.0.1:42363,127.0.0.1:37851 3: [0105_transactions_mock /136.359s] Created kafka instance 0105_transactions_mock#producer-237 3: [0105_transactions_mock /136.386s] rd_kafka_init_transactions(rk, 5000): duration 19.040ms 3: [0105_transactions_mock /136.390s] rd_kafka_begin_transaction(rk): duration 3.040ms 3: [0105_transactions_mock /136.390s] Test config file test.conf not found 3: [0105_transactions_mock /136.390s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.390s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /136.390s] PRODUCE: duration 0.080ms 3: [0105_transactions_mock /136.390s] Test config file test.conf not found 3: [0105_transactions_mock /136.390s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.390s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /136.390s] PRODUCE: duration 0.066ms 3: [0105_transactions_mock /136.390s] Test config file test.conf not found 3: [0105_transactions_mock /136.390s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /136.390s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /136.390s] PRODUCE: duration 0.071ms 3: [0105_transactions_mock /136.405s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 14.568ms 3: [
/151.326s] 1 test(s) running: 0105_transactions_mock 3: [
/152.326s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /138.405s] Disconnecting transaction coordinator 2 3: %6|1675737164.862|FAIL|0105_transactions_mock#producer-237| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:42363: Disconnected (after 2023ms in state UP) 3: %3|1675737164.862|FAIL|0105_transactions_mock#producer-237| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.406s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:42363: Disconnected (after 2023ms in state UP) 3: [0105_transactions_mock /138.406s] 0105_transactions_mock#producer-237 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:42363: Disconnected (after 2023ms in state UP) 3: [0105_transactions_mock /138.406s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /138.406s] 0105_transactions_mock#producer-237 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737165.148|FAIL|0105_transactions_mock#producer-237| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /138.691s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /138.691s] 0105_transactions_mock#producer-237 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:42363: Connect to ipv4#127.0.0.1:42363 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/153.326s] 1 test(s) running: 0105_transactions_mock 3: [
/154.326s] 1 test(s) running: 0105_transactions_mock 3: [
/155.326s] 1 test(s) running: 0105_transactions_mock 3: [
/156.327s] 1 test(s) running: 0105_transactions_mock 3: [
/157.327s] 1 test(s) running: 0105_transactions_mock 3: [
/158.327s] 1 test(s) running: 0105_transactions_mock 3: [
/159.327s] 1 test(s) running: 0105_transactions_mock 3: [
/160.327s] 1 test(s) running: 0105_transactions_mock 3: %3|1675737172.847|TXNERR|0105_transactions_mock#producer-237| [thrd:main]: Current transaction failed in state BeginCommit: 300 message(s) failed delivery (see individual delivery reports) (_INCONSISTENT) 3: [0105_transactions_mock /146.391s] commit_transaction() failed (expectedly): 300 message(s) failed delivery (see individual delivery reports) 3: [
/161.327s] 1 test(s) running: 0105_transactions_mock 3: [
/162.327s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /148.391s] Aborting and retrying 3: [0105_transactions_mock /148.391s] rd_kafka_abort_transaction(rk, 60000): duration 0.339ms 3: [0105_transactions_mock /148.391s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock /148.391s] Test config file test.conf not found 3: [0105_transactions_mock /148.391s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.391s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /148.391s] PRODUCE: duration 0.083ms 3: [0105_transactions_mock /148.391s] Test config file test.conf not found 3: [0105_transactions_mock /148.391s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.391s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /148.391s] PRODUCE: duration 0.069ms 3: [0105_transactions_mock /148.391s] Test config file test.conf not found 3: [0105_transactions_mock /148.391s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /148.391s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /148.391s] PRODUCE: duration 0.066ms 3: [0105_transactions_mock /148.393s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 1.890ms 3: [
<MAIN> /163.327s] 1 test(s) running: 0105_transactions_mock 3: [
/164.328s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /150.394s] [ do_test_txn_flush_timeout:1737: PASS (14.04s) ] 3: [0105_transactions_mock /150.394s] [ do_test_unstable_offset_commit:2320 ] 3: [0105_transactions_mock /150.394s] Test config file test.conf not found 3: [0105_transactions_mock /150.394s] Setting test timeout to 60s * 2.7 3: %5|1675737176.851|MOCK|0105_transactions_mock#producer-238| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35167,127.0.0.1:34447,127.0.0.1:43797 3: [0105_transactions_mock /150.400s] Created kafka instance 0105_transactions_mock#producer-238 3: [0105_transactions_mock /150.400s] Test config file test.conf not found 3: [0105_transactions_mock /150.401s] Created kafka instance 0105_transactions_mock#consumer-239 3: [0105_transactions_mock /150.407s] rd_kafka_init_transactions(rk, -1): duration 2.930ms 3: [0105_transactions_mock /150.407s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock /150.407s] Test config file test.conf not found 3: [0105_transactions_mock /150.407s] Produce to mytopic [0]: messages #0..100 3: [0105_transactions_mock /150.407s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /150.407s] PRODUCE: duration 0.081ms 3: [0105_transactions_mock /150.409s] rd_kafka_commit_transaction(rk, -1): duration 2.672ms 3: [0105_transactions_mock /150.410s] rd_kafka_commit(c, offsets, 0 ): duration 0.279ms 3: [
/165.328s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /151.013s] #0: committed() returned NO_ERROR (expected NO_ERROR) 3: [0105_transactions_mock /151.213s] #1: committed() returned _TIMED_OUT (expected _TIMED_OUT) 3: [0105_transactions_mock /151.213s] Phase 2: OffsetFetch lookup through assignment 3: [0105_transactions_mock /151.214s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.327ms 3: [0105_transactions_mock /151.214s] assign: incremental assign of 1 partition(s) done 3: [0105_transactions_mock /151.214s] consume: consume exactly 50 messages 3: [
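The committed() lines above come from querying the consumer's committed offsets with a short timeout; while the transaction that committed them is still open the offset fetch may not complete within that timeout, which is why the second call is expected to return _TIMED_OUT. A rough sketch of such a query, assuming c is an existing consumer handle and the topic/partition are placeholders:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch only: fetch the committed offset for mytopic [0] with a 200 ms
 * timeout; unstable (transactional) offsets may make this time out. */
static void query_committed(rd_kafka_t *c) {
        rd_kafka_topic_partition_list_t *parts =
                rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(parts, "mytopic", 0);

        rd_kafka_resp_err_t err = rd_kafka_committed(c, parts, 200);
        if (err)
                fprintf(stderr, "committed() failed: %s\n",
                        rd_kafka_err2str(err));
        else
                fprintf(stderr, "mytopic [0] committed offset: %lld\n",
                        (long long)parts->elems[0].offset);

        rd_kafka_topic_partition_list_destroy(parts);
}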
/166.328s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /152.623s] mytopic [0] reached EOF at offset 100 3: [0105_transactions_mock /152.623s] CONSUME: duration 1409.520ms 3: [0105_transactions_mock /152.623s] consume: consumed 50/50 messages (1/1 EOFs) 3: [0105_transactions_mock /152.625s] [ do_test_unstable_offset_commit:2320: PASS (2.23s) ] 3: [0105_transactions_mock /152.625s] [ do_test_commit_after_msg_timeout:2447 ] 3: [0105_transactions_mock /152.625s] Test config file test.conf not found 3: [0105_transactions_mock /152.625s] Setting test timeout to 60s * 2.7 3: %5|1675737179.082|MOCK|0105_transactions_mock#producer-240| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:36721,127.0.0.1:41549,127.0.0.1:40357 3: [0105_transactions_mock /152.629s] Created kafka instance 0105_transactions_mock#producer-240 3: [0105_transactions_mock /152.638s] Starting transaction 3: [0105_transactions_mock /152.681s] rd_kafka_init_transactions(rk, -1): duration 43.001ms 3: [0105_transactions_mock /152.685s] rd_kafka_begin_transaction(rk): duration 3.967ms 3: [0105_transactions_mock /152.685s] Bringing down 2 3: [0105_transactions_mock /152.690s] Test config file test.conf not found 3: [0105_transactions_mock /152.690s] Produce to test [0]: messages #0..1 3: [0105_transactions_mock /152.690s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /152.690s] PRODUCE: duration 0.014ms 3: %6|1675737179.147|FAIL|0105_transactions_mock#producer-240| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36721: Disconnected (after 23ms in state UP) 3: [0105_transactions_mock /152.691s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36721: Disconnected (after 23ms in state UP) 3: [0105_transactions_mock /152.691s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36721: Disconnected (after 23ms in state UP) 3: %6|1675737179.148|FAIL|0105_transactions_mock#producer-240| [thrd:127.0.0.1:36721/bootstrap]: 127.0.0.1:36721/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 49ms in state UP) 3: [0105_transactions_mock /152.691s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36721/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 49ms in state UP) 3: [0105_transactions_mock /152.691s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36721/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 49ms in state UP) 3: %3|1675737179.149|FAIL|0105_transactions_mock#producer-240| [thrd:127.0.0.1:41549/bootstrap]: 127.0.0.1:41549/2: Connect to ipv4#127.0.0.1:41549 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.692s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:41549/2: Connect to ipv4#127.0.0.1:41549 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.692s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:41549/2: Connect to ipv4#127.0.0.1:41549 failed: Connection refused (after 0ms in state CONNECT) 3: [
/167.328s] 1 test(s) running: 0105_transactions_mock 3: %3|1675737179.226|FAIL|0105_transactions_mock#producer-240| [thrd:127.0.0.1:36721/bootstrap]: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.769s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.770s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737179.247|FAIL|0105_transactions_mock#producer-240| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.790s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /152.790s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737179.689|FAIL|0105_transactions_mock#producer-240| [thrd:127.0.0.1:36721/bootstrap]: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /153.232s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /153.232s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36721/1: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: %3|1675737179.840|FAIL|0105_transactions_mock#producer-240| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /153.383s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /153.383s] 0105_transactions_mock#producer-240 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:36721: Connect to ipv4#127.0.0.1:36721 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
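The "Mock cluster enabled", "Bringing down 2" and "Disconnecting transaction coordinator" messages in this part of the log come from librdkafka's built-in mock broker, which the test manipulates through the rdkafka_mock.h API to provoke the connection errors seen above. A rough, standalone sketch of that kind of setup (the broker ids, transactional id and function name are illustrative; the test's own wiring differs):

#include <stdio.h>
#include <librdkafka/rdkafka.h>
#include <librdkafka/rdkafka_mock.h>

/* Sketch only: attach a 3-broker mock cluster to an existing client
 * instance rk, take broker 2 down and move the transaction coordinator
 * for "txnid" to broker 3. */
static rd_kafka_mock_cluster_t *start_mock(rd_kafka_t *rk) {
        rd_kafka_mock_cluster_t *mcluster = rd_kafka_mock_cluster_new(rk, 3);

        /* Clients point bootstrap.servers at this list; these are the
         * 127.0.0.1 addresses shown in the "Mock cluster enabled" lines. */
        fprintf(stderr, "bootstrap.servers=%s\n",
                rd_kafka_mock_cluster_bootstraps(mcluster));

        rd_kafka_mock_broker_set_down(mcluster, 2);
        rd_kafka_mock_coordinator_set(mcluster, "transaction", "txnid", 3);

        return mcluster;
}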
<MAIN> /168.328s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /169.328s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /170.328s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /171.329s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /172.329s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /173.329s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /174.329s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /175.329s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /176.329s] 1 test(s) running: 0105_transactions_mock
3: [
/177.329s] 1 test(s) running: 0105_transactions_mock 3: %3|1675737189.620|TXNERR|0105_transactions_mock#producer-240| [thrd:127.0.0.1:41549/bootstrap]: Current transaction failed in state BeginCommit: 1 message(s) timed out on test [0] (_TIMED_OUT, requires epoch bump) 3: [0105_transactions_mock /163.164s] commit_transaction() failed (as expected): 1 message(s) timed out on test [0] 3: [
/178.329s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /164.168s] Aborting transaction 3: [0105_transactions_mock /164.437s] rd_kafka_abort_transaction(rk, -1): duration 268.707ms 3: [0105_transactions_mock /164.437s] Attempting second transaction, which should succeed 3: [0105_transactions_mock /164.437s] rd_kafka_begin_transaction(rk): duration 0.030ms 3: [0105_transactions_mock /164.437s] Test config file test.conf not found 3: [0105_transactions_mock /164.437s] Produce to test [0]: messages #0..1 3: [0105_transactions_mock /164.437s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /164.437s] PRODUCE: duration 0.014ms 3: [0105_transactions_mock /164.438s] rd_kafka_commit_transaction(rk, -1): duration 1.494ms 3: [0105_transactions_mock /164.439s] [ do_test_commit_after_msg_timeout:2447: PASS (11.81s) ] 3: [0105_transactions_mock /164.439s] Setting test timeout to 200s * 2.7 3: [0105_transactions_mock /164.439s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators ] 3: [0105_transactions_mock /164.439s] Test config file test.conf not found 3: [0105_transactions_mock /164.439s] Setting test timeout to 60s * 2.7 3: %5|1675737190.896|MOCK|0105_transactions_mock#producer-241| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40129,127.0.0.1:40251,127.0.0.1:39225,127.0.0.1:34035,127.0.0.1:44671 3: [0105_transactions_mock /164.443s] Created kafka instance 0105_transactions_mock#producer-241 3: [0105_transactions_mock /164.443s] Starting transaction 3: [0105_transactions_mock /164.454s] rd_kafka_init_transactions(rk, 5000): duration 11.312ms 3: [0105_transactions_mock /164.454s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /164.454s] rd_kafka_begin_transaction(rk): duration 0.018ms 3: [0105_transactions_mock /164.454s] Test config file test.conf not found 3: [0105_transactions_mock /164.454s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /164.454s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /164.454s] PRODUCE: duration 0.040ms 3: [
/179.329s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /164.957s] PRODUCE.DELIVERY.WAIT: duration 502.853ms 3: [0105_transactions_mock /164.957s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /164.957s] Test config file test.conf not found 3: [0105_transactions_mock /164.957s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /164.958s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /164.958s] PRODUCE: duration 0.068ms 3: [0105_transactions_mock /164.958s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /165.560s] rd_kafka_commit_transaction(rk, -1): duration 602.696ms 3: [0105_transactions_mock /165.560s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /165.560s] rd_kafka_begin_transaction(rk): duration 0.071ms 3: [0105_transactions_mock /165.560s] Test config file test.conf not found 3: [0105_transactions_mock /165.560s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /165.561s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /165.561s] PRODUCE: duration 0.079ms 3: [
/180.329s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /166.063s] PRODUCE.DELIVERY.WAIT: duration 502.959ms 3: [0105_transactions_mock /166.064s] Test config file test.conf not found 3: [0105_transactions_mock /166.064s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /166.064s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /166.064s] PRODUCE: duration 0.062ms 3: [0105_transactions_mock /166.064s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /166.663s] rd_kafka_abort_transaction(rk, -1): duration 599.546ms 3: [0105_transactions_mock /166.663s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /166.663s] rd_kafka_begin_transaction(rk): duration 0.053ms 3: [0105_transactions_mock /166.663s] Test config file test.conf not found 3: [0105_transactions_mock /166.663s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /166.663s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /166.663s] PRODUCE: duration 0.072ms 3: [
/181.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /167.165s] PRODUCE.DELIVERY.WAIT: duration 501.509ms 3: [0105_transactions_mock /167.165s] Test config file test.conf not found 3: [0105_transactions_mock /167.165s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /167.165s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /167.165s] PRODUCE: duration 0.062ms 3: [0105_transactions_mock /167.166s] rd_kafka_abort_transaction(rk, -1): duration 0.915ms 3: [0105_transactions_mock /167.166s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /167.166s] rd_kafka_begin_transaction(rk): duration 0.057ms 3: [0105_transactions_mock /167.166s] Test config file test.conf not found 3: [0105_transactions_mock /167.166s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /167.166s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /167.166s] PRODUCE: duration 0.053ms 3: [0105_transactions_mock /167.668s] PRODUCE.DELIVERY.WAIT: duration 501.548ms 3: [0105_transactions_mock /167.668s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /167.668s] Test config file test.conf not found 3: [0105_transactions_mock /167.668s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /167.668s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /167.668s] PRODUCE: duration 0.059ms 3: [0105_transactions_mock /167.668s] Changing transaction coordinator from 4 to 5 3: [
/182.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /168.168s] rd_kafka_abort_transaction(rk, -1): duration 499.948ms 3: [0105_transactions_mock /168.168s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /168.168s] rd_kafka_begin_transaction(rk): duration 0.065ms 3: [0105_transactions_mock /168.168s] Test config file test.conf not found 3: [0105_transactions_mock /168.168s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /168.168s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.168s] PRODUCE: duration 0.087ms 3: [0105_transactions_mock /168.671s] PRODUCE.DELIVERY.WAIT: duration 502.405ms 3: [0105_transactions_mock /168.671s] Test config file test.conf not found 3: [0105_transactions_mock /168.671s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /168.671s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.671s] PRODUCE: duration 0.077ms 3: [0105_transactions_mock /168.671s] rd_kafka_abort_transaction(rk, -1): duration 0.214ms 3: [0105_transactions_mock /168.671s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /168.671s] rd_kafka_begin_transaction(rk): duration 0.042ms 3: [0105_transactions_mock /168.671s] Test config file test.conf not found 3: [0105_transactions_mock /168.671s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /168.671s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.671s] PRODUCE: duration 0.065ms 3: [
/183.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /169.174s] PRODUCE.DELIVERY.WAIT: duration 502.450ms 3: [0105_transactions_mock /169.174s] Test config file test.conf not found 3: [0105_transactions_mock /169.174s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /169.174s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.174s] PRODUCE: duration 0.067ms 3: [0105_transactions_mock /169.174s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /169.666s] rd_kafka_commit_transaction(rk, -1): duration 492.158ms 3: [0105_transactions_mock /169.666s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /169.666s] rd_kafka_begin_transaction(rk): duration 0.068ms 3: [0105_transactions_mock /169.666s] Test config file test.conf not found 3: [0105_transactions_mock /169.666s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /169.666s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.666s] PRODUCE: duration 0.075ms 3: [
/184.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /170.169s] PRODUCE.DELIVERY.WAIT: duration 502.431ms 3: [0105_transactions_mock /170.169s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /170.169s] Test config file test.conf not found 3: [0105_transactions_mock /170.169s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /170.169s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /170.169s] PRODUCE: duration 0.060ms 3: [0105_transactions_mock /170.666s] rd_kafka_abort_transaction(rk, -1): duration 497.572ms 3: [0105_transactions_mock /170.666s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /170.666s] rd_kafka_begin_transaction(rk): duration 0.062ms 3: [0105_transactions_mock /170.666s] Test config file test.conf not found 3: [0105_transactions_mock /170.666s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /170.666s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /170.666s] PRODUCE: duration 0.078ms 3: [
/185.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /171.169s] PRODUCE.DELIVERY.WAIT: duration 502.451ms 3: [0105_transactions_mock /171.169s] Test config file test.conf not found 3: [0105_transactions_mock /171.169s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /171.169s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /171.169s] PRODUCE: duration 0.082ms 3: [0105_transactions_mock /171.169s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /171.667s] rd_kafka_abort_transaction(rk, -1): duration 498.187ms 3: [0105_transactions_mock /171.667s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /171.667s] rd_kafka_begin_transaction(rk): duration 0.071ms 3: [0105_transactions_mock /171.667s] Test config file test.conf not found 3: [0105_transactions_mock /171.667s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /171.668s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /171.668s] PRODUCE: duration 0.082ms 3: [
/186.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /172.169s] PRODUCE.DELIVERY.WAIT: duration 501.674ms 3: [0105_transactions_mock /172.169s] Test config file test.conf not found 3: [0105_transactions_mock /172.169s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /172.169s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /172.169s] PRODUCE: duration 0.086ms 3: [0105_transactions_mock /172.169s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /172.668s] rd_kafka_abort_transaction(rk, -1): duration 498.637ms 3: [0105_transactions_mock /172.668s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /172.668s] rd_kafka_begin_transaction(rk): duration 0.072ms 3: [0105_transactions_mock /172.668s] Test config file test.conf not found 3: [0105_transactions_mock /172.668s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /172.668s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /172.668s] PRODUCE: duration 0.082ms 3: [
/187.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /173.170s] PRODUCE.DELIVERY.WAIT: duration 501.609ms 3: [0105_transactions_mock /173.170s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /173.170s] Test config file test.conf not found 3: [0105_transactions_mock /173.170s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /173.170s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /173.170s] PRODUCE: duration 0.062ms 3: [0105_transactions_mock /173.170s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /173.668s] rd_kafka_abort_transaction(rk, -1): duration 498.177ms 3: [0105_transactions_mock /173.668s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /173.668s] rd_kafka_begin_transaction(rk): duration 0.050ms 3: [0105_transactions_mock /173.668s] Test config file test.conf not found 3: [0105_transactions_mock /173.668s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /173.668s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /173.668s] PRODUCE: duration 0.073ms 3: [
/188.330s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /174.171s] PRODUCE.DELIVERY.WAIT: duration 502.566ms 3: [0105_transactions_mock /174.171s] Test config file test.conf not found 3: [0105_transactions_mock /174.171s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /174.171s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /174.171s] PRODUCE: duration 0.077ms 3: [0105_transactions_mock /174.172s] rd_kafka_commit_transaction(rk, -1): duration 0.337ms 3: [0105_transactions_mock /174.172s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /174.172s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock /174.172s] Test config file test.conf not found 3: [0105_transactions_mock /174.172s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /174.172s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /174.172s] PRODUCE: duration 0.067ms 3: [0105_transactions_mock /174.674s] PRODUCE.DELIVERY.WAIT: duration 502.536ms 3: [0105_transactions_mock /174.674s] Test config file test.conf not found 3: [0105_transactions_mock /174.674s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /174.674s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /174.674s] PRODUCE: duration 0.062ms 3: [0105_transactions_mock /174.674s] Changing transaction coordinator from 4 to 5 3: [
/189.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /175.170s] rd_kafka_abort_transaction(rk, -1): duration 495.872ms 3: [0105_transactions_mock /175.170s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /175.170s] rd_kafka_begin_transaction(rk): duration 0.096ms 3: [0105_transactions_mock /175.171s] Test config file test.conf not found 3: [0105_transactions_mock /175.171s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /175.171s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /175.171s] PRODUCE: duration 0.085ms 3: [0105_transactions_mock /175.675s] PRODUCE.DELIVERY.WAIT: duration 504.162ms 3: [0105_transactions_mock /175.675s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /175.675s] Test config file test.conf not found 3: [0105_transactions_mock /175.675s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /175.675s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /175.675s] PRODUCE: duration 0.082ms 3: [
/190.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /176.277s] rd_kafka_abort_transaction(rk, -1): duration 602.215ms 3: [0105_transactions_mock /176.277s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /176.277s] rd_kafka_begin_transaction(rk): duration 0.058ms 3: [0105_transactions_mock /176.277s] Test config file test.conf not found 3: [0105_transactions_mock /176.277s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /176.277s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /176.277s] PRODUCE: duration 0.076ms 3: [
/191.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /176.781s] PRODUCE.DELIVERY.WAIT: duration 503.882ms 3: [0105_transactions_mock /176.781s] Test config file test.conf not found 3: [0105_transactions_mock /176.781s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /176.781s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /176.781s] PRODUCE: duration 0.071ms 3: [0105_transactions_mock /176.781s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /177.380s] rd_kafka_abort_transaction(rk, -1): duration 598.565ms 3: [0105_transactions_mock /177.380s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /177.380s] rd_kafka_begin_transaction(rk): duration 0.096ms 3: [0105_transactions_mock /177.380s] Test config file test.conf not found 3: [0105_transactions_mock /177.380s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /177.380s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /177.380s] PRODUCE: duration 0.075ms 3: [
/192.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /177.883s] PRODUCE.DELIVERY.WAIT: duration 502.526ms 3: [0105_transactions_mock /177.883s] Test config file test.conf not found 3: [0105_transactions_mock /177.883s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /177.883s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /177.883s] PRODUCE: duration 0.072ms 3: [0105_transactions_mock /177.883s] rd_kafka_abort_transaction(rk, -1): duration 0.211ms 3: [0105_transactions_mock /177.883s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /177.883s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock /177.883s] Test config file test.conf not found 3: [0105_transactions_mock /177.883s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /177.883s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /177.883s] PRODUCE: duration 0.063ms 3: [0105_transactions_mock /178.386s] PRODUCE.DELIVERY.WAIT: duration 502.448ms 3: [0105_transactions_mock /178.386s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /178.386s] Test config file test.conf not found 3: [0105_transactions_mock /178.386s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /178.386s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /178.386s] PRODUCE: duration 0.078ms 3: [0105_transactions_mock /178.386s] Changing transaction coordinator from 2 to 3 3: [
/193.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /178.880s] rd_kafka_commit_transaction(rk, -1): duration 494.219ms 3: [0105_transactions_mock /178.880s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /178.880s] rd_kafka_begin_transaction(rk): duration 0.079ms 3: [0105_transactions_mock /178.880s] Test config file test.conf not found 3: [0105_transactions_mock /178.880s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /178.880s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /178.880s] PRODUCE: duration 0.079ms 3: [0105_transactions_mock /179.383s] PRODUCE.DELIVERY.WAIT: duration 502.434ms 3: [0105_transactions_mock /179.383s] Test config file test.conf not found 3: [0105_transactions_mock /179.383s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /179.383s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /179.383s] PRODUCE: duration 0.084ms 3: [0105_transactions_mock /179.383s] Changing transaction coordinator from 4 to 5 3: [
/194.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /179.881s] rd_kafka_abort_transaction(rk, -1): duration 498.020ms 3: [0105_transactions_mock /179.881s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /179.881s] rd_kafka_begin_transaction(rk): duration 0.052ms 3: [0105_transactions_mock /179.881s] Test config file test.conf not found 3: [0105_transactions_mock /179.881s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /179.881s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /179.881s] PRODUCE: duration 0.068ms 3: [0105_transactions_mock /180.383s] PRODUCE.DELIVERY.WAIT: duration 501.511ms 3: [0105_transactions_mock /180.383s] Test config file test.conf not found 3: [0105_transactions_mock /180.383s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /180.383s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /180.383s] PRODUCE: duration 0.063ms 3: [0105_transactions_mock /180.383s] Changing transaction coordinator from 1 to 2 3: [
/195.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /180.881s] rd_kafka_abort_transaction(rk, -1): duration 498.218ms 3: [0105_transactions_mock /180.881s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /180.881s] rd_kafka_begin_transaction(rk): duration 0.091ms 3: [0105_transactions_mock /180.881s] Test config file test.conf not found 3: [0105_transactions_mock /180.881s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /180.881s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /180.881s] PRODUCE: duration 0.079ms 3: [0105_transactions_mock /181.384s] PRODUCE.DELIVERY.WAIT: duration 502.741ms 3: [0105_transactions_mock /181.384s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /181.384s] Test config file test.conf not found 3: [0105_transactions_mock /181.384s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /181.384s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /181.384s] PRODUCE: duration 0.064ms 3: [
/196.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /181.987s] rd_kafka_abort_transaction(rk, -1): duration 602.542ms 3: [0105_transactions_mock /181.987s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /181.987s] rd_kafka_begin_transaction(rk): duration 0.095ms 3: [0105_transactions_mock /181.987s] Test config file test.conf not found 3: [0105_transactions_mock /181.987s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /181.987s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /181.987s] PRODUCE: duration 0.079ms 3: [0105_transactions_mock /182.491s] PRODUCE.DELIVERY.WAIT: duration 503.889ms 3: [0105_transactions_mock /182.491s] Test config file test.conf not found 3: [0105_transactions_mock /182.491s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /182.491s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /182.491s] PRODUCE: duration 0.062ms 3: [0105_transactions_mock /182.491s] Changing transaction coordinator from 5 to 1 3: [
/197.331s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /183.093s] rd_kafka_abort_transaction(rk, -1): duration 601.926ms 3: [0105_transactions_mock /183.124s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators: PASS (18.68s) ] 3: [0105_transactions_mock /183.124s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh) ] 3: [0105_transactions_mock /183.124s] Test config file test.conf not found 3: [0105_transactions_mock /183.124s] Setting test timeout to 60s * 2.7 3: %5|1675737209.581|MOCK|0105_transactions_mock#producer-242| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40133,127.0.0.1:35005,127.0.0.1:38259 3: [0105_transactions_mock /183.128s] Created kafka instance 0105_transactions_mock#producer-242 3: [0105_transactions_mock /183.128s] Starting transaction 3: [0105_transactions_mock /183.142s] rd_kafka_init_transactions(rk, 5000): duration 13.570ms 3: [0105_transactions_mock /183.142s] rd_kafka_begin_transaction(rk): duration 0.054ms 3: [0105_transactions_mock /183.142s] Switching to coordinator 2 3: [0105_transactions_mock /183.645s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, 20 * 1000): duration 502.843ms 3: [0105_transactions_mock /183.645s] Test config file test.conf not found 3: [0105_transactions_mock /183.645s] Produce to test [-1]: messages #0..10 3: [0105_transactions_mock /183.645s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /183.645s] PRODUCE: duration 0.014ms 3: [0105_transactions_mock /183.650s] PRODUCE.DELIVERY.WAIT: duration 5.617ms 3: [0105_transactions_mock /183.651s] rd_kafka_commit_transaction(rk, -1): duration 0.132ms 3: [0105_transactions_mock /183.651s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh): PASS (0.53s) ] 3: [0105_transactions_mock /183.651s] [ do_test_out_of_order_seq:2532 ] 3: [0105_transactions_mock /183.651s] Test config file test.conf not found 3: [0105_transactions_mock /183.651s] Setting test timeout to 60s * 2.7 3: %5|1675737210.108|MOCK|0105_transactions_mock#producer-243| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43315,127.0.0.1:42891,127.0.0.1:36327 3: [0105_transactions_mock /183.654s] Created kafka instance 0105_transactions_mock#producer-243 3: [0105_transactions_mock /183.685s] rd_kafka_init_transactions(rk, -1): duration 22.878ms 3: [0105_transactions_mock /183.685s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock /183.685s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock /183.685s] 0105_transactions_mock#producer-243: Flushing 1 messages 3: [
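The very long rd_kafka_producev( rk, ({ if (0) { ... } ... }), ... ) entries in this part of the log are the test framework stringifying each produce call after the preprocessor has expanded the RD_KAFKA_V_TOPIC/RD_KAFKA_V_PARTITION/RD_KAFKA_V_VALUE type-check macros. In source form each of them is roughly the following (the producer handle rk is assumed to exist):

#include <librdkafka/rdkafka.h>

/* Source-level form of the expanded calls in the log: produce the
 * 2-byte value "hi" to partition 0 of "mytopic" on producer rk. */
static rd_kafka_resp_err_t produce_hi(rd_kafka_t *rk) {
        return rd_kafka_producev(rk,
                                 RD_KAFKA_V_TOPIC("mytopic"),
                                 RD_KAFKA_V_PARTITION(0),
                                 RD_KAFKA_V_VALUE("hi", 2),
                                 RD_KAFKA_V_END);
}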
/198.333s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /184.654s] FLUSH: duration 968.689ms 3: [0105_transactions_mock /184.654s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.007ms 3: [0105_transactions_mock /184.654s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /184.654s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /184.654s] Sleeping.. 3: [
<MAIN> /199.334s] 1 test(s) running: 0105_transactions_mock
3: [<MAIN> /200.334s] 1 test(s) running: 0105_transactions_mock
3: [
/201.334s] 1 test(s) running: 0105_transactions_mock 3: %3|1675737213.158|TXNERR|0105_transactions_mock#producer-243| [thrd:127.0.0.1:42891/bootstrap]: Current transaction failed in state InTransaction: skipped sequence numbers (OUT_OF_ORDER_SEQUENCE_NUMBER, requires epoch bump) 3: [
<MAIN> /202.334s] 1 test(s) running: 0105_transactions_mock 3: [
/203.334s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /189.654s] produce() failed as expected: Local: Erroneous state 3: [0105_transactions_mock /189.654s] commit_transaction(-1): duration 0.031ms 3: [0105_transactions_mock /189.654s] commit_transaction() failed (expectedly): skipped sequence numbers 3: [0105_transactions_mock /189.654s] rd_kafka_abort_transaction(rk, -1): duration 0.104ms 3: [0105_transactions_mock /189.654s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock /189.654s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0105_transactions_mock /189.656s] rd_kafka_commit_transaction(rk, -1): duration 2.256ms 3: [0105_transactions_mock /189.657s] [ do_test_out_of_order_seq:2532: PASS (6.01s) ] 3: [0105_transactions_mock /189.657s] [ do_test_topic_disappears_for_awhile:2666 ] 3: [0105_transactions_mock /189.657s] Test config file test.conf not found 3: [0105_transactions_mock /189.657s] Setting test timeout to 60s * 2.7 3: %5|1675737216.114|MOCK|0105_transactions_mock#producer-244| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42461 3: [0105_transactions_mock /189.659s] Created kafka instance 0105_transactions_mock#producer-244 3: [0105_transactions_mock /189.696s] rd_kafka_init_transactions(rk, -1): duration 36.579ms 3: [0105_transactions_mock /189.696s] rd_kafka_begin_transaction(rk): duration 0.011ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t 
__attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } 
RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % 
partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t 
__attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /189.696s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 
__attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms
[... the same macro-expanded rd_kafka_producev() trace entry at [0105_transactions_mock /189.696s] repeats for each remaining message in this batch; every call completes in 0.000-0.001 ms ...]
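The trace entries condensed above are the preprocessor expansion of librdkafka's type-checked produce varargs. In source form the call corresponds to something like the following sketch (handle, topic and partition variables are placeholders, not the test's actual names):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: the RD_KAFKA_V_* convenience macros expand to the
 * "({ if (0) { ... } RD_KAFKA_VTYPE_...; })" blocks seen in the trace
 * above; they type-check each argument and pass a vtype/value pair
 * to rd_kafka_producev(). */
static void produce_hi(rd_kafka_t *rk, const char *topic, int32_t partition) {
        rd_kafka_resp_err_t err = rd_kafka_producev(
                rk,
                RD_KAFKA_V_TOPIC(topic),
                RD_KAFKA_V_PARTITION(partition),
                RD_KAFKA_V_VALUE("hi", 2),
                RD_KAFKA_V_END);
        if (err)
                fprintf(stderr, "producev() failed: %s\n",
                        rd_kafka_err2str(err));
}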
/204.335s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /190.700s] rd_kafka_commit_transaction(rk, -1): duration 1004.015ms 3: [0105_transactions_mock /190.700s] commit_transaction(-1): duration 1004.042ms 3: [0105_transactions_mock /190.700s] Marking topic as non-existent 3: %5|1675737217.157|PARTCNT|0105_transactions_mock#producer-244| [thrd:main]: Topic mytopic partition count changed from 10 to 0 3: [0105_transactions_mock /190.701s] rd_kafka_metadata(rk, 0, ((void *)0), &md, tmout_multip(5000)): duration 0.130ms 3: [
/205.335s] 1 test(s) running: 0105_transactions_mock 3: [
/206.335s] 1 test(s) running: 0105_transactions_mock
3: [0105_transactions_mock /192.701s] Bringing topic back to life
3: [0105_transactions_mock /192.701s] rd_kafka_begin_transaction(rk): duration 0.053ms
[... the macro-expanded rd_kafka_producev() trace entry at [0105_transactions_mock /192.701s] repeats for each message produced in this transaction; every call completes in 0.000-0.001 ms ...]
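The produce loop condensed above runs inside a transaction that is committed about a second later (the rd_kafka_commit_transaction() entry below). A minimal sketch of that begin/produce/commit cycle, assuming an already-configured transactional producer rk and with error handling trimmed:

#include <librdkafka/rdkafka.h>

/* Sketch of the begin/produce/commit cycle driving the trace above.
 * 'rk' must have transactional.id configured and
 * rd_kafka_init_transactions() must already have succeeded. */
static rd_kafka_error_t *produce_batch_in_txn(rd_kafka_t *rk,
                                              const char *topic,
                                              int msgcnt,
                                              int partition_cnt) {
        rd_kafka_error_t *error = rd_kafka_begin_transaction(rk);
        if (error)
                return error;

        for (int i = 0; i < msgcnt; i++) {
                /* Same call as in the producev sketch earlier. */
                rd_kafka_producev(rk,
                                  RD_KAFKA_V_TOPIC(topic),
                                  RD_KAFKA_V_PARTITION((int32_t)(i % partition_cnt)),
                                  RD_KAFKA_V_VALUE("hi", 2),
                                  RD_KAFKA_V_END);
        }

        /* -1 = use the default transaction timeout, as in the log. */
        return rd_kafka_commit_transaction(rk, -1);
}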
/207.336s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /193.700s] rd_kafka_commit_transaction(rk, -1): duration 999.387ms 3: [0105_transactions_mock /193.700s] commit_transaction(-1): duration 999.417ms 3: [0105_transactions_mock /193.700s] Verifying messages by consumtion 3: [0105_transactions_mock /193.700s] Test config file test.conf not found 3: [0105_transactions_mock /193.701s] Created kafka instance 0105_transactions_mock#consumer-245 3: [0105_transactions_mock /193.704s] consume: consume exactly 122 messages 3: [
/208.338s] 1 test(s) running: 0105_transactions_mock 3: [
/209.338s] 1 test(s) running: 0105_transactions_mock 3: [
/210.338s] 1 test(s) running: 0105_transactions_mock 3: [
/211.339s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /196.868s] mytopic [8] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [9] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [3] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [5] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [7] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [0] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [1] reached EOF at offset 12 3: [0105_transactions_mock /196.868s] mytopic [2] reached EOF at offset 12 3: [0105_transactions_mock /197.370s] mytopic [4] reached EOF at offset 13 3: [0105_transactions_mock /197.370s] mytopic [6] reached EOF at offset 13 3: [0105_transactions_mock /197.370s] CONSUME: duration 3666.361ms 3: [0105_transactions_mock /197.370s] consume: consumed 122/122 messages (10/10 EOFs) 3: [0105_transactions_mock /197.372s] [ do_test_topic_disappears_for_awhile:2666: PASS (7.71s) ] 3: [0105_transactions_mock /197.372s] [ do_test_disconnected_group_coord:2802: switch_coord=false ] 3: [0105_transactions_mock /197.372s] Test config file test.conf not found 3: [0105_transactions_mock /197.372s] Setting test timeout to 60s * 2.7 3: %5|1675737223.829|MOCK|0105_transactions_mock#producer-246| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:36979,127.0.0.1:36207,127.0.0.1:36665 3: [0105_transactions_mock /197.372s] Created kafka instance 0105_transactions_mock#producer-246 3: [0105_transactions_mock /197.373s] rd_kafka_init_transactions(rk, -1): duration 0.480ms 3: [0105_transactions_mock /197.373s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock /197.373s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.008ms 3: [0105_transactions_mock /197.373s] 0105_transactions_mock#producer-246: Flushing 1 messages 3: [
/212.339s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /198.373s] FLUSH: duration 1000.367ms 3: [
/213.339s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /199.374s] Calling send_offsets_to_transaction() 3: %3|1675737225.831|FAIL|0105_transactions_mock#producer-246| [thrd:127.0.0.1:36207/bootstrap]: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737226.007|FAIL|0105_transactions_mock#producer-246| [thrd:127.0.0.1:36207/bootstrap]: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/214.339s] 1 test(s) running: 0105_transactions_mock 3: [
/215.339s] 1 test(s) running: 0105_transactions_mock 3: [
/216.339s] 1 test(s) running: 0105_transactions_mock 3: [
/217.008s] Bringing up group coordinator 2..
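The nearly five-second send_offsets_to_transaction() duration reported below is that call retrying internally while the group coordinator is down. In application code it looks roughly like this sketch (group id, topic and offset values are illustrative only):

#include <librdkafka/rdkafka.h>

/* Sketch: attach consumed offsets to the ongoing transaction.
 * The call blocks (up to the timeout) and retries internally while
 * the group coordinator is unavailable, which is what produces the
 * ~5 s durations in the log below. */
static rd_kafka_error_t *send_offsets_sketch(rd_kafka_t *rk) {
        rd_kafka_topic_partition_list_t *offsets =
                rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(offsets, "srctopic", 0)->offset = 12;

        /* Illustrative: group metadata built directly by name; a real
         * application would normally call rd_kafka_consumer_group_metadata()
         * on its consumer instance instead. */
        rd_kafka_consumer_group_metadata_t *cgmetadata =
                rd_kafka_consumer_group_metadata_new("mygroup");

        rd_kafka_error_t *error =
                rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1);

        rd_kafka_consumer_group_metadata_destroy(cgmetadata);
        rd_kafka_topic_partition_list_destroy(offsets);
        return error;
}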
/217.339s] 1 test(s) running: 0105_transactions_mock 3: [
/218.339s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /204.320s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 4946.921ms 3: [0105_transactions_mock /204.320s] send_offsets_to_transaction(-1): duration 4946.949ms 3: [0105_transactions_mock /204.321s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /204.321s] 0105_transactions_mock#producer-246 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /204.321s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /204.321s] 0105_transactions_mock#producer-246 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:36207/2: Connect to ipv4#127.0.0.1:36207 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /204.321s] rd_kafka_commit_transaction(rk, -1): duration 0.179ms 3: [0105_transactions_mock /204.321s] commit_transaction(-1): duration 0.189ms 3: [0105_transactions_mock /204.321s] [ do_test_disconnected_group_coord:2802: switch_coord=false: PASS (6.95s) ] 3: [0105_transactions_mock /204.321s] [ do_test_disconnected_group_coord:2802: switch_coord=true ] 3: [0105_transactions_mock /204.321s] Test config file test.conf not found 3: [0105_transactions_mock /204.321s] Setting test timeout to 60s * 2.7 3: %5|1675737230.778|MOCK|0105_transactions_mock#producer-247| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38035,127.0.0.1:40805,127.0.0.1:46283 3: [0105_transactions_mock /204.326s] Created kafka instance 0105_transactions_mock#producer-247 3: [0105_transactions_mock /204.346s] rd_kafka_init_transactions(rk, -1): duration 19.886ms 3: [0105_transactions_mock /204.346s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock /204.346s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.015ms 3: [0105_transactions_mock /204.346s] 0105_transactions_mock#producer-247: Flushing 1 messages 3: [
/219.339s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /205.327s] FLUSH: duration 980.889ms 3: [
/220.340s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /206.327s] Calling send_offsets_to_transaction() 3: %3|1675737232.785|FAIL|0105_transactions_mock#producer-247| [thrd:127.0.0.1:40805/bootstrap]: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1675737233.083|FAIL|0105_transactions_mock#producer-247| [thrd:127.0.0.1:40805/bootstrap]: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/221.340s] 1 test(s) running: 0105_transactions_mock 3: [
/222.340s] 1 test(s) running: 0105_transactions_mock 3: [
/223.340s] 1 test(s) running: 0105_transactions_mock 3: [
/223.961s] Switching group coordinator to 3 3: [
/224.340s] 1 test(s) running: 0105_transactions_mock 3: [
/225.340s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /211.282s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 4954.967ms 3: [0105_transactions_mock /211.282s] send_offsets_to_transaction(-1): duration 4954.990ms 3: [0105_transactions_mock /211.282s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /211.282s] 0105_transactions_mock#producer-247 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /211.282s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /211.282s] 0105_transactions_mock#producer-247 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:40805/2: Connect to ipv4#127.0.0.1:40805 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /211.282s] rd_kafka_commit_transaction(rk, -1): duration 0.204ms 3: [0105_transactions_mock /211.282s] commit_transaction(-1): duration 0.211ms 3: [0105_transactions_mock /211.283s] [ do_test_disconnected_group_coord:2802: switch_coord=true: PASS (6.96s) ] 3: [0105_transactions_mock /211.283s] 0105_transactions_mock: duration 211282.922ms 3: [0105_transactions_mock /211.283s] ================= Test 0105_transactions_mock PASSED ================= 3: [
/226.340s] ALL-TESTS: duration 226340.234ms 3: [
/226.340s] 10 thread(s) in use by librdkafka, waiting... 3: [
/227.340s] 10 thread(s) in use by librdkafka 3: [
/227.340s] TEST FAILURE 3: ### Test "
" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:1581:test_wait_exit() at Tue Feb 7 02:33:59 2023: ### 3: 10 thread(s) still active in librdkafka 3: test-runner: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:6629: test_fail0: Assertion `0' failed. 1/1 Test #3: RdKafkaTestBrokerLess ............Subprocess aborted***Exception: 227.36 sec 0% tests passed, 1 tests failed out of 1 Total Test time (real) = 227.37 sec The following tests FAILED: 3 - RdKafkaTestBrokerLess (Subprocess aborted) Errors while running CTest Output from these tests are in: /usr/src/RPM/BUILD/librdkafka-1.9.2/Testing/Temporary/LastTest.log Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely. error: Bad exit status from /usr/src/tmp/rpm-tmp.74583 (%check) RPM build errors: Bad exit status from /usr/src/tmp/rpm-tmp.74583 (%check) Command exited with non-zero status 1 128.46user 17.91system 4:03.72elapsed 60%CPU (0avgtext+0avgdata 783944maxresident)k 0inputs+0outputs (0major+4480810minor)pagefaults 0swaps hsh-rebuild: rebuild of `librdkafka-1.9.2-alt1.src.rpm' failed. Command exited with non-zero status 1 1.85user 0.99system 4:09.76elapsed 1%CPU (0avgtext+0avgdata 109332maxresident)k 0inputs+0outputs (32620major+177893minor)pagefaults 0swaps
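For context on the failure itself: all tests passed, but test_wait_exit() aborts the run because librdkafka worker threads are still alive after the last test finishes. The public thread counter that check relies on can be polled along these lines (the one-minute budget here is an assumption for illustration, not the test suite's actual timeout):

#include <stdio.h>
#include <unistd.h>
#include <librdkafka/rdkafka.h>

/* Sketch: after destroying all client instances, poll librdkafka's
 * global thread counter and give the background threads a bounded
 * amount of time to wind down; this is essentially the condition
 * the failing test_wait_exit() check asserts. */
static int wait_for_rdkafka_threads(int timeout_s) {
        for (int i = 0; i < timeout_s; i++) {
                int cnt = rd_kafka_thread_cnt();
                if (cnt == 0)
                        return 0;  /* all librdkafka threads gone */
                fprintf(stderr, "%d thread(s) in use by librdkafka, waiting...\n", cnt);
                sleep(1);
        }
        return -1;  /* threads still active: the condition reported above */
}

To reproduce just this case with full output, the hint in the CTest summary applies: re-run ctest in the build tree with "--rerun-failed --output-on-failure".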