<86>Jan 12 02:36:15 userdel[1132331]: delete user 'rooter'
<86>Jan 12 02:36:15 userdel[1132331]: removed shadow group 'rooter' owned by 'rooter'
<86>Jan 12 02:36:15 groupadd[1132336]: group added to /etc/group: name=rooter, GID=720
<86>Jan 12 02:36:15 groupadd[1132336]: group added to /etc/gshadow: name=rooter
<86>Jan 12 02:36:15 groupadd[1132336]: new group: name=rooter, GID=720
<86>Jan 12 02:36:15 useradd[1132340]: new user: name=rooter, UID=720, GID=720, home=/root, shell=/bin/bash
<86>Jan 12 02:36:15 userdel[1132346]: delete user 'builder'
<86>Jan 12 02:36:15 userdel[1132346]: removed shadow group 'builder' owned by 'builder'
<86>Jan 12 02:36:15 groupadd[1132353]: group added to /etc/group: name=builder, GID=721
<86>Jan 12 02:36:15 groupadd[1132353]: group added to /etc/gshadow: name=builder
<86>Jan 12 02:36:15 groupadd[1132353]: new group: name=builder, GID=721
<86>Jan 12 02:36:15 useradd[1132367]: new user: name=builder, UID=721, GID=721, home=/usr/src, shell=/bin/bash
warning: Macro %cmake_insource not found
<13>Jan 12 02:36:19 rpmi: libuv-1.44.2-alt1 sisyphus+303845.100.1.1 1658053885 installed
<13>Jan 12 02:36:19 rpmi: libjsoncpp24-1.9.4-alt2 sisyphus+286441.100.1.1 1633444232 installed
<13>Jan 12 02:36:19 rpmi: libexpat-2.5.0-alt1 sisyphus+309227.100.1.1 1667075764 installed
<13>Jan 12 02:36:19 rpmi: libidn2-2.3.4-alt1 sisyphus+309023.100.1.1 1666791084 installed
<13>Jan 12 02:36:19 rpmi: libxxhash-0.8.0-alt2 sisyphus+277476.100.2.1 1625621312 installed
<13>Jan 12 02:36:19 rpmi: liblz4-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667412981 installed
<13>Jan 12 02:36:19 rpmi: gcc-c++-common-1.4.27-alt1 sisyphus+278099.1300.1.1 1626028636 installed
<13>Jan 12 02:36:19 rpmi: libstdc++12-devel-12.1.1-alt2 sisyphus+307182.100.1.1 1663781909 installed
<13>Jan 12 02:36:20 rpmi: gcc12-c++-12.1.1-alt2 sisyphus+307182.100.1.1 1663781909 installed
<13>Jan 12 02:36:20 rpmi: rpm-macros-cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Jan 12 02:36:20 rpmi: cmake-modules-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Jan 12 02:36:20 rpmi: librhash-1.3.5-alt3 sisyphus+286141.40.2.1 1632982456 installed
<13>Jan 12 02:36:20 rpmi: publicsuffix-list-dafsa-20221003-alt1 sisyphus+308013.100.1.1 1665137688 installed
<13>Jan 12 02:36:20 rpmi: libpsl-0.21.2-alt1 sisyphus+312536.100.1.1 1672131178 installed
<13>Jan 12 02:36:20 rpmi: libnghttp2-1.51.0-alt1 sisyphus+310565.100.1.1 1669296590 installed
<13>Jan 12 02:36:20 rpmi: openldap-common-2.6.3-alt1 sisyphus+306372.60.8.1 1663095223 installed
<13>Jan 12 02:36:20 rpmi: libverto-0.3.2-alt1_1 sisyphus+279289.100.1.3 1626493868 installed
<13>Jan 12 02:36:20 rpmi: liblmdb-0.9.29-alt1.1 sisyphus+306630.100.1.1 1663072360 installed
<13>Jan 12 02:36:20 rpmi: libkeyutils-1.6.3-alt1 sisyphus+266061.100.1.1 1612919566 installed
<13>Jan 12 02:36:20 rpmi: libcom_err-1.46.4.0.5.4cda-alt1 sisyphus+283826.100.1.1 1629975345 installed
<13>Jan 12 02:36:20 rpmi: libbrotlicommon-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Jan 12 02:36:20 rpmi: libbrotlidec-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Jan 12 02:36:20 rpmi: libp11-kit-0.24.1-alt1 sisyphus+293720.100.1.1 1642535264 installed
<13>Jan 12 02:36:21 rpmi: libtasn1-4.19.0-alt1 sisyphus+305700.100.1.1 1661359624 installed
<13>Jan 12 02:36:21 rpmi: rpm-macros-alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Jan 12 02:36:21 rpmi: alternatives-0.5.2-alt1 sisyphus+300869.100.1.1 1653844113 installed
<13>Jan 12 02:36:21 rpmi: ca-certificates-2022.12.14-alt1 sisyphus+311754.200.1.1 1671046143 installed
<13>Jan 12 02:36:21 rpmi: ca-trust-0.1.4-alt1 sisyphus+308690.100.1.1 1666182992 installed
<13>Jan 12 02:36:21 rpmi: p11-kit-trust-0.24.1-alt1 sisyphus+293720.100.1.1 1642535264 installed
<13>Jan 12 02:36:21 rpmi: libcrypto1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<13>Jan 12 02:36:21 rpmi: libssl1.1-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<86>Jan 12 02:36:21 groupadd[1136696]: group added to /etc/group: name=_keytab, GID=499
<86>Jan 12 02:36:21 groupadd[1136696]: group added to /etc/gshadow: name=_keytab
<86>Jan 12 02:36:21 groupadd[1136696]: new group: name=_keytab, GID=499
<13>Jan 12 02:36:21 rpmi: libkrb5-1.19.4-alt1 sisyphus+310092.100.2.1 1668703482 installed
<86>Jan 12 02:36:21 groupadd[1136771]: group added to /etc/group: name=sasl, GID=498
<86>Jan 12 02:36:21 groupadd[1136771]: group added to /etc/gshadow: name=sasl
<86>Jan 12 02:36:21 groupadd[1136771]: new group: name=sasl, GID=498
<13>Jan 12 02:36:21 rpmi: libsasl2-3-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097224 installed
<13>Jan 12 02:36:21 rpmi: libldap2-2.6.3-alt1 sisyphus+306372.60.8.1 1663095223 installed
<13>Jan 12 02:36:21 rpmi: libcurl-7.87.0-alt1 sisyphus+312113.100.1.1 1671611216 installed
<13>Jan 12 02:36:21 rpmi: libarchive13-3.6.1-alt2 sisyphus+311213.100.1.1 1670244620 installed
<13>Jan 12 02:36:21 rpmi: cmake-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Jan 12 02:36:21 rpmi: ctest-3.23.2-alt1.2 sisyphus+308755.100.1.1 1666345612 installed
<13>Jan 12 02:36:21 rpmi: libsasl2-devel-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097224 installed
<13>Jan 12 02:36:21 rpmi: libssl-devel-1.1.1q-alt1 sisyphus+303203.100.1.1 1657026987 installed
<13>Jan 12 02:36:21 rpmi: gcc-c++-12-alt1 sisyphus+300988.300.1.1 1654033053 installed
<13>Jan 12 02:36:21 rpmi: liblz4-devel-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667412981 installed
<13>Jan 12 02:36:21 rpmi: libxxhash-devel-0.8.0-alt2 sisyphus+277476.100.2.1 1625621312 installed
Building target platforms: x86_64
Building for target x86_64
Wrote: /usr/src/in/nosrpm/librdkafka-1.9.2-alt1.nosrc.rpm (w1.gzdio)
Installing librdkafka-1.9.2-alt1.src.rpm
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.77123
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf librdkafka-1.9.2
+ echo 'Source #0 (librdkafka-1.9.2.tar):'
Source #0 (librdkafka-1.9.2.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/librdkafka-1.9.2.tar
+ cd librdkafka-1.9.2
+ /bin/chmod -c -Rf u+rwX,go-w .
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.77123
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd librdkafka-1.9.2
+ mkdir -p .
+ cmake -DCMAKE_SKIP_INSTALL_RPATH:BOOL=yes '-DCMAKE_C_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' '-DCMAKE_CXX_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' '-DCMAKE_Fortran_FLAGS:STRING=-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto' -DCMAKE_INSTALL_PREFIX=/usr -DINCLUDE_INSTALL_DIR:PATH=/usr/include -DLIB_INSTALL_DIR:PATH=/usr/lib64 -DSYSCONF_INSTALL_DIR:PATH=/etc -DSHARE_INSTALL_PREFIX:PATH=/usr/share -DLIB_DESTINATION=lib64 -DLIB_SUFFIX=64 -S . -B .
-- The C compiler identification is GNU 12.1.1
-- The CXX compiler identification is GNU 12.1.1
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pow in m
-- Looking for pow in m - found
-- Checking for module 'libsasl2'
-- Found libsasl2, version 2.1.27
-- Found LZ4: /usr/lib64/liblz4.so (found version "1.9.4")
-- Found OpenSSL: /usr/lib64/libcrypto.so (found version "1.1.1q")
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Configuring done
-- Generating done
CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_Fortran_FLAGS
    INCLUDE_INSTALL_DIR
    LIB_DESTINATION
    LIB_INSTALL_DIR
    LIB_SUFFIX
    SHARE_INSTALL_PREFIX
    SYSCONF_INSTALL_DIR

-- Build files have been written to: /usr/src/RPM/BUILD/librdkafka-1.9.2
+ make -j8
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdaddr.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 2%] Building C object src/CMakeFiles/rdkafka.dir/crc32c.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdcrc32.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdfnv1a.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 1%] Building C object src/CMakeFiles/rdkafka.dir/rdavl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_buf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 2%] Building C object src/CMakeFiles/rdkafka.dir/rdbuf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_feature.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 6%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_metadata_cache.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object
src/CMakeFiles/rdkafka.dir/rdkafka_metadata.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_conf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 7%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_lz4.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 8%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msg.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 3%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_writer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_offset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 9%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_msgset_reader.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 11%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_queue.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 11%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_range_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_pattern.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_roundrobin_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_op.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 5%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cgrp.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_plain.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 4%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_broker.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_assignment.c.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 10%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_partition.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 14%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_subscription.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_timer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 13%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sticky_assignor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_transport.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 12%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_request.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_interceptor.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 15%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_topic.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 16%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_header.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_background.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_aux.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_idempotence.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 19%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_cert.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 19%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_coord.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 18%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_txnmgr.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdmurmur2.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdports.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' 
[ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_cgrp.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_error.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 21%] Building C object src/CMakeFiles/rdkafka.dir/rdlist.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdrand.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 23%] Building C object src/CMakeFiles/rdkafka.dir/rdregex.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 22%] Building C object src/CMakeFiles/rdkafka.dir/rdlog.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdstring.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdvarint.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 26%] Building C object src/CMakeFiles/rdkafka.dir/tinycthread_extra.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 25%] Building C object src/CMakeFiles/rdkafka.dir/rdmap.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 27%] Building C object src/CMakeFiles/rdkafka.dir/rdxxhash.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 27%] Building C object src/CMakeFiles/rdkafka.dir/cJSON.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 24%] Building C object src/CMakeFiles/rdkafka.dir/rdunittest.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 28%] Building C object src/CMakeFiles/rdkafka.dir/rddl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 17%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_admin.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 25%] Building C object src/CMakeFiles/rdkafka.dir/snappy.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 28%] 
Building C object src/CMakeFiles/rdkafka.dir/rdhdrhistogram.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 28%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_ssl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_cyrus.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 20%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_mock_handlers.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 29%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_plugin.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_scram.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Building C object src/CMakeFiles/rdkafka.dir/rdkafka_sasl_oauthbearer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Linking C shared library librdkafka.so make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Built target rdkafka make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 31%] Building C object examples/CMakeFiles/producer.dir/producer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building C object examples/CMakeFiles/consumer.dir/consumer.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building C object examples/CMakeFiles/misc.dir/misc.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 33%] Building C object examples/CMakeFiles/rdkafka_performance.dir/rdkafka_performance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 34%] Linking C executable consumer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Built target consumer make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 34%] Linking C executable producer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Built target producer make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Building C object examples/CMakeFiles/rdkafka_example.dir/rdkafka_example.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Building C object 
examples/CMakeFiles/rdkafka_complex_consumer_example.dir/rdkafka_complex_consumer_example.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Linking C executable misc make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Built target misc make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 34%] Linking C executable rdkafka_performance make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Building C object tests/interceptor_test/CMakeFiles/interceptor_test.dir/interceptor_test.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 31%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConfImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target rdkafka_performance make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Linking C executable rdkafka_complex_consumer_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target rdkafka_complex_consumer_example make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HeadersImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 35%] Linking C executable rdkafka_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target rdkafka_example make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Linking C shared library interceptor_test.so make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target interceptor_test make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 31%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ConsumerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 32%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/HandleImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MessageImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/KafkaConsumerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/MetadataImpl.cpp.o make[2]: Leaving 
directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/RdKafka.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/QueueImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/ProducerImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Building CXX object src-cpp/CMakeFiles/rdkafka++.dir/TopicPartitionImpl.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Linking CXX shared library librdkafka++.so make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Built target rdkafka++ make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0000-unittests.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0002-unkpart.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Building C object tests/CMakeFiles/test-runner.dir/0001-multiobj.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 45%] Building C object tests/CMakeFiles/test-runner.dir/0003-msgmaxsize.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0005-order.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 45%] Building C object tests/CMakeFiles/test-runner.dir/0004-conf.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0006-symbols.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 46%] Building C object tests/CMakeFiles/test-runner.dir/0007-autotopic.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 47%] Building C object tests/CMakeFiles/test-runner.dir/0008-reqacks.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 47%] Building C object tests/CMakeFiles/test-runner.dir/0009-mock_cluster.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Building CXX object examples/CMakeFiles/producer_cpp.dir/producer.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 48%] Building C object tests/CMakeFiles/test-runner.dir/0011-produce_batch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 48%] Building C object tests/CMakeFiles/test-runner.dir/0012-produce_consume.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0013-null-msgs.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Building CXX object examples/CMakeFiles/openssl_engine_example_cpp.dir/openssl_engine_example.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Building CXX object examples/CMakeFiles/rdkafka_example_cpp.dir/rdkafka_example.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0014-reconsume-191.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Building C object tests/CMakeFiles/test-runner.dir/0015-offset_seeks.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Building C object tests/CMakeFiles/test-runner.dir/0017-compression.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Building C object tests/CMakeFiles/test-runner.dir/0016-client_swname.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Building CXX object examples/CMakeFiles/rdkafka_complex_consumer_example_cpp.dir/rdkafka_complex_consumer_example.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 52%] Building C object tests/CMakeFiles/test-runner.dir/0018-cgrp_term.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 53%] Building C object tests/CMakeFiles/test-runner.dir/0020-destroy_hang.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 54%] Building C object tests/CMakeFiles/test-runner.dir/0021-rkt_destroy.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 52%] Building C object tests/CMakeFiles/test-runner.dir/0019-list_groups.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 54%] Building C object tests/CMakeFiles/test-runner.dir/0022-consume_batch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 49%] Linking CXX executable producer_cpp make[2]: 
Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Built target producer_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 55%] Building C object tests/CMakeFiles/test-runner.dir/0025-timers.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0028-long_topicnames.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 55%] Building C object tests/CMakeFiles/test-runner.dir/0026-consume_pause.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0029-assign_offset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Building CXX object examples/CMakeFiles/kafkatest_verifiable_client.dir/kafkatest_verifiable_client.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 57%] Building C object tests/CMakeFiles/test-runner.dir/0031-get_offsets.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 50%] Linking CXX executable openssl_engine_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0034-offset_reset.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Built target openssl_engine_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 56%] Building C object tests/CMakeFiles/test-runner.dir/0030-offset_commit.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 57%] Building C object tests/CMakeFiles/test-runner.dir/0033-regex_subscribe.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0035-api_version.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Building C object tests/CMakeFiles/test-runner.dir/0036-partial_fetch.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 59%] Building C object tests/CMakeFiles/test-runner.dir/0037-destroy_hang_local.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 60%] Building C object tests/CMakeFiles/test-runner.dir/0040-io_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 59%] Building C object tests/CMakeFiles/test-runner.dir/0038-performance.c.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 60%] Building C object tests/CMakeFiles/test-runner.dir/0039-event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0041-fetch_max_bytes.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 53%] Linking CXX executable rdkafka_complex_consumer_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0043-no_connection.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 63%] Built target rdkafka_complex_consumer_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 61%] Building C object tests/CMakeFiles/test-runner.dir/0042-many_topics.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 62%] Building C object tests/CMakeFiles/test-runner.dir/0044-partition_cnt.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 62%] Building C object tests/CMakeFiles/test-runner.dir/0045-subscribe_update.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 63%] Building C object tests/CMakeFiles/test-runner.dir/0046-rkt_cache.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 63%] Building C object tests/CMakeFiles/test-runner.dir/0047-partial_buf_tmout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0048-partitioner.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0049-consume_conn_close.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 64%] Building C object tests/CMakeFiles/test-runner.dir/0050-subscribe_adds.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 65%] Building C object tests/CMakeFiles/test-runner.dir/0051-assign_adds.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 65%] Building C object tests/CMakeFiles/test-runner.dir/0052-msg_timestamps.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 51%] Linking CXX executable rdkafka_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Built target rdkafka_example_cpp make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering 
directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 67%] Building C object tests/CMakeFiles/test-runner.dir/0056-balanced_group_mt.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building C object tests/CMakeFiles/test-runner.dir/0055-producer_latency.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0054-offset_time.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 66%] Building CXX object tests/CMakeFiles/test-runner.dir/0053-stats_cb.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0057-invalid_topic.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0058-log.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building C object tests/CMakeFiles/test-runner.dir/0062-stats_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 70%] Building C object tests/CMakeFiles/test-runner.dir/0064-interceptors.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 68%] Building CXX object tests/CMakeFiles/test-runner.dir/0059-bsearch.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building CXX object tests/CMakeFiles/test-runner.dir/0060-op_prio.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 69%] Building CXX object tests/CMakeFiles/test-runner.dir/0061-consumer_lag.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 72%] Building C object tests/CMakeFiles/test-runner.dir/0068-produce_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 72%] Building C object tests/CMakeFiles/test-runner.dir/0069-consumer_add_parts.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0073-headers.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0072-headers_ut.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0074-producev.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0066-plugins.cpp.o make[2]: Leaving directory 
'/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 74%] Building C object tests/CMakeFiles/test-runner.dir/0075-retry.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 75%] Building C object tests/CMakeFiles/test-runner.dir/0076-produce_retry.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 75%] Building C object tests/CMakeFiles/test-runner.dir/0077-compaction.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building C object tests/CMakeFiles/test-runner.dir/0079-fork.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 58%] Linking CXX executable kafkatest_verifiable_client make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 77%] Built target kafkatest_verifiable_client make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 70%] Building CXX object tests/CMakeFiles/test-runner.dir/0063-clusterid.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0065-yield.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building C object tests/CMakeFiles/test-runner.dir/0080-admin_ut.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 71%] Building CXX object tests/CMakeFiles/test-runner.dir/0067-empty_topic.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 78%] Building C object tests/CMakeFiles/test-runner.dir/0084-destroy_flags.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 78%] Building C object tests/CMakeFiles/test-runner.dir/0083-cb_event.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 78%] Building C object tests/CMakeFiles/test-runner.dir/0086-purge.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 73%] Building CXX object tests/CMakeFiles/test-runner.dir/0070-null_empty.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 79%] Building C object tests/CMakeFiles/test-runner.dir/0088-produce_metadata_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 80%] Building C object tests/CMakeFiles/test-runner.dir/0089-max_poll_interval.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 77%] Building C object tests/CMakeFiles/test-runner.dir/0081-admin.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' 
make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 80%] Building C object tests/CMakeFiles/test-runner.dir/0090-idempotence.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 81%] Building C object tests/CMakeFiles/test-runner.dir/0092-mixed_msgver.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 81%] Building C object tests/CMakeFiles/test-runner.dir/0091-max_poll_interval_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building C object tests/CMakeFiles/test-runner.dir/0093-holb.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building C object tests/CMakeFiles/test-runner.dir/0094-idempotence_msg_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 77%] Building CXX object tests/CMakeFiles/test-runner.dir/0082-fetch_max_bytes.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building C object tests/CMakeFiles/test-runner.dir/0099-commit_metadata.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 76%] Building CXX object tests/CMakeFiles/test-runner.dir/0078-c_from_cpp.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 85%] Building C object tests/CMakeFiles/test-runner.dir/0102-static_group_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 86%] Building C object tests/CMakeFiles/test-runner.dir/0104-fetch_from_follower_mock.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 83%] Building CXX object tests/CMakeFiles/test-runner.dir/0098-consumer-txn.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 85%] Building C object tests/CMakeFiles/test-runner.dir/0103-transactions.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 79%] Building CXX object tests/CMakeFiles/test-runner.dir/0085-headers.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building C object tests/CMakeFiles/test-runner.dir/0106-cgrp_sess_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 82%] Building CXX object tests/CMakeFiles/test-runner.dir/0095-all_brokers_down.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building C object tests/CMakeFiles/test-runner.dir/0107-topic_recreate.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building CXX object 
tests/CMakeFiles/test-runner.dir/0101-fetch-from-follower.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building C object tests/CMakeFiles/test-runner.dir/0112-assign_unknown_part.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 84%] Building CXX object tests/CMakeFiles/test-runner.dir/0100-thread_interceptors.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 86%] Building C object tests/CMakeFiles/test-runner.dir/0105-transactions_mock.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 83%] Building CXX object tests/CMakeFiles/test-runner.dir/0097-ssl_verify.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0110-batch_size.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 91%] Building C object tests/CMakeFiles/test-runner.dir/0117-mock_errors.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 91%] Building C object tests/CMakeFiles/test-runner.dir/0118-commit_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building C object tests/CMakeFiles/test-runner.dir/0120-asymmetric_subscription.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building C object tests/CMakeFiles/test-runner.dir/0121-clusterid.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0114-sticky_partitioning.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 93%] Building C object tests/CMakeFiles/test-runner.dir/0122-buffer_cleaning_after_rebalance.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 88%] Building CXX object tests/CMakeFiles/test-runner.dir/0111-delay_create_topics.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 93%] Building C object tests/CMakeFiles/test-runner.dir/0123-connections_max_idle.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0124-openssl_invalid_engine.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0125-immediate_flush.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 94%] Building C object tests/CMakeFiles/test-runner.dir/0126-oauthbearer_oidc.c.o make[2]: 
Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 87%] Building CXX object tests/CMakeFiles/test-runner.dir/0109-auto_create_topics.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0115-producer_auth.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 92%] Building CXX object tests/CMakeFiles/test-runner.dir/0119-consumer_auth.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 95%] Building C object tests/CMakeFiles/test-runner.dir/0129-fetch_aborted_msgs.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 90%] Building CXX object tests/CMakeFiles/test-runner.dir/0116-kafkaconsumer_close.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 96%] Building C object tests/CMakeFiles/test-runner.dir/0131-connect_timeout.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building C object tests/CMakeFiles/test-runner.dir/0132-strategy_ordering.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 96%] Building C object tests/CMakeFiles/test-runner.dir/0130-store_offsets.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 98%] Building C object tests/CMakeFiles/test-runner.dir/rusage.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem_ctrl.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Building C object tests/CMakeFiles/test-runner.dir/sockem.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 95%] Building CXX object tests/CMakeFiles/test-runner.dir/0128-sasl_callback_queue.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building CXX object tests/CMakeFiles/test-runner.dir/8000-idle.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 97%] Building C object tests/CMakeFiles/test-runner.dir/test.c.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 98%] Building CXX object tests/CMakeFiles/test-runner.dir/testcpp.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 89%] Building CXX object tests/CMakeFiles/test-runner.dir/0113-cooperative_rebalance.cpp.o make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Linking CXX executable 
test-runner make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target test-runner make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + exit 0 Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.60126 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + /bin/chmod -Rf u+rwX -- /usr/src/tmp/librdkafka-buildroot + : + /bin/rm -rf -- /usr/src/tmp/librdkafka-buildroot + PATH=/usr/libexec/rpm-build:/usr/src/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/games + cd librdkafka-1.9.2 + make 'INSTALL=/usr/libexec/rpm-build/install -p' install DESTDIR=/usr/src/tmp/librdkafka-buildroot make: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[1]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 30%] Built target rdkafka make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka++ make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 36%] Built target rdkafka++ make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target producer_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target producer_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target consumer make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 37%] Built target consumer make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_performance make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 38%] Built target rdkafka_performance make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 39%] Built target rdkafka_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_complex_consumer_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 40%] Built target rdkafka_complex_consumer_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target openssl_engine_example_cpp make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 41%] Built target openssl_engine_example_cpp make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target misc make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target misc make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target rdkafka_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 42%] Built target rdkafka_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target 
rdkafka_complex_consumer_example make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 43%] Built target rdkafka_complex_consumer_example make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target kafkatest_verifiable_client make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 44%] Built target kafkatest_verifiable_client make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target test-runner make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [ 99%] Built target test-runner make[2]: Entering directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Consolidate compiler generated dependencies of target interceptor_test make[2]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' [100%] Built target interceptor_test make[1]: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' Install the project... -- Install configuration: "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaConfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaConfigVersion.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/FindLZ4.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaTargets.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/cmake/RdKafka/RdKafkaTargets-noconfig.cmake -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka.so.1 -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafka_mock.h -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka++.pc -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so.1 -- Set runtime path of "/usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so.1" to "" -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/lib64/librdkafka++.so -- Installing: /usr/src/tmp/librdkafka-buildroot/usr/include/librdkafka/rdkafkacpp.h make: Leaving directory '/usr/src/RPM/BUILD/librdkafka-1.9.2' + mkdir -p /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig + cp /usr/src/RPM/SOURCES/rdkafka.pc /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/ + /usr/bin/subst 's|@VERSION@|1.9.2|g' /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka++.pc /usr/src/tmp/librdkafka-buildroot/usr/lib64/pkgconfig/rdkafka.pc + rm -f '/usr/src/tmp/librdkafka-buildroot/usr/lib64/*.a' + rm -f /usr/src/tmp/librdkafka-buildroot/usr/share/licenses/librdkafka/LICENSES.txt + /usr/lib/rpm/brp-alt Cleaning files in /usr/src/tmp/librdkafka-buildroot (auto) mode of './usr/lib64/librdkafka++.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) mode of './usr/lib64/librdkafka.so.1' changed from 0755 (rwxr-xr-x) to 0644 (rw-r--r--) Verifying and fixing files in /usr/src/tmp/librdkafka-buildroot (binconfig,pkgconfig,libtool,desktop,gnuconfig) /usr/lib64/pkgconfig/rdkafka++.pc: Cflags: '-I${includedir}' --> '' Checking contents of files in /usr/src/tmp/librdkafka-buildroot/ (default) Compressing files in /usr/src/tmp/librdkafka-buildroot (auto) Adjusting library links in /usr/src/tmp/librdkafka-buildroot 
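The install step above lays down librdkafka.so.1, the headers and the pkg-config files (rdkafka.pc / rdkafka++.pc, with @VERSION@ substituted to 1.9.2), which is all a dependent package needs to compile against the library. Below is a minimal downstream sanity check, not part of the build output; it assumes the installed headers and rdkafka.pc are on the compiler's and pkg-config's search paths, and the file name and build command are illustrative only.

    /* version.c - sketch of a downstream consumer of the files installed above.
     * Build (illustrative): cc version.c $(pkg-config --cflags --libs rdkafka) */
    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
        /* RD_KAFKA_VERSION is the compile-time (header) version; rd_kafka_version()
         * and rd_kafka_version_str() report the runtime library version.  For this
         * 1.9.2 build both should correspond to 0x010902ff, as seen in the log. */
        printf("librdkafka %s (0x%08x), header 0x%08x\n",
               rd_kafka_version_str(),
               (unsigned)rd_kafka_version(),
               (unsigned)RD_KAFKA_VERSION);
        return 0;
    }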
./usr/lib64: (from :0) librdkafka.so.1 -> librdkafka.so.1 librdkafka++.so.1 -> librdkafka++.so.1 Verifying ELF objects in /usr/src/tmp/librdkafka-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal) Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.90595 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd librdkafka-1.9.2 + ctest -VV -R RdKafkaTestBrokerLess UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl UpdateCTestConfiguration from :/usr/src/RPM/BUILD/librdkafka-1.9.2/DartConfiguration.tcl Test project /usr/src/RPM/BUILD/librdkafka-1.9.2 Constructing a list of tests Done constructing a list of tests Updating test list for fixtures Added 0 tests to meet fixture requirements Checking test dependency graph... Checking test dependency graph end test 3 Start 3: RdKafkaTestBrokerLess 3: Test command: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test-runner "-p5" "-l" 3: Test timeout computed to be: 10000000 3: [
<MAIN> / 0.000s] Test config file test.conf not found
3: [<MAIN> / 0.000s] Setting test timeout to 10s * 1.0
3: [<MAIN> / 0.000s] Git version: HEAD
3: [<MAIN> / 0.000s] Broker version: 2.4.0.0 (2.4.0.0)
3: [<MAIN> / 0.000s] Tests to run : all
3: [<MAIN> / 0.000s] Test mode : bare
3: [<MAIN> / 0.000s] Test scenario: default
3: [<MAIN> / 0.000s] Test filter : local tests only
3: [<MAIN> / 0.000s] Test timeout multiplier: 2.7
3: [<MAIN> / 0.000s] Action on test failure: continue other tests
3: [<MAIN> / 0.000s] Current directory: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests
3: [<MAIN> / 0.000s] Setting test timeout to 30s * 2.7
3: [<MAIN> / 0.001s] Too many tests running (5 >= 5): postponing 0025_timers start...
3: [0006_symbols / 0.000s] ================= Running test 0006_symbols =================
3: [0006_symbols / 0.000s] ==== Stats written to file stats_0006_symbols_6470528275816183109.json ====
3: [0006_symbols / 0.000s] 0006_symbols: duration 0.000ms
3: [0006_symbols / 0.000s] ================= Test 0006_symbols PASSED =================
3: [<MAIN>
/ 0.001s] Too many tests running (5 >= 5): postponing 0033_regex_subscribe_local start... 3: [0000_unittests / 0.000s] ================= Running test 0000_unittests ================= 3: [0000_unittests / 0.000s] ==== Stats written to file stats_0000_unittests_3169237199597645842.json ==== 3: [0000_unittests / 0.000s] builtin.features = snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer 3: [0022_consume_batch_local / 0.000s] ================= Running test 0022_consume_batch_local ================= 3: [0022_consume_batch_local / 0.000s] ==== Stats written to file stats_0022_consume_batch_local_6345829295200233511.json ==== 3: [0022_consume_batch_local / 0.000s] [ do_test_consume_batch_oauthbearer_cb:170 ] 3: [0009_mock_cluster / 0.000s] ================= Running test 0009_mock_cluster ================= 3: [0009_mock_cluster / 0.000s] ==== Stats written to file stats_0009_mock_cluster_5256625185970581690.json ==== 3: [0009_mock_cluster / 0.000s] Using topic "rdkafkatest_rnd75e06c71081686e0_0009_mock_cluster" 3: [0009_mock_cluster / 0.000s] Test config file test.conf not found 3: [0025_timers / 0.000s] ================= Running test 0025_timers ================= 3: [0025_timers / 0.000s] ==== Stats written to file stats_0025_timers_8576999135539193036.json ==== 3: [0025_timers / 0.000s] Test config file test.conf not found 3: [0025_timers / 0.000s] Setting test timeout to 200s * 2.7 3: [0004_conf / 0.000s] ================= Running test 0004_conf ================= 3: [0004_conf / 0.000s] ==== Stats written to file stats_0004_conf_6569717530608324615.json ==== 3: [0004_conf / 0.000s] Test config file test.conf not found 3: [0004_conf / 0.000s] Setting test timeout to 10s * 2.7 3: [0004_conf / 0.000s] Using topic "rdkafkatest_0004" 3: %7|1673491054.023|OPENSSL|rdkafka#producer-1| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: [0004_conf / 0.009s] : on_new() called 3: %5|1673491054.027|CONFWARN|0025_timers#consumer-4| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 0.012s] Created kafka instance 0025_timers#consumer-4 3: [0025_timers / 0.012s] rd_kafka_new(): duration 11.787ms 3: [0025_timers / 0.012s] Starting wait loop for 10 expected stats_cb calls with an interval of 600ms 3: %5|1673491054.027|CONFWARN|0022_consume_batch_local#consumer-2| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0022_consume_batch_local / 0.014s] Created kafka instance 0022_consume_batch_local#consumer-2 3: [0022_consume_batch_local / 0.014s] Refresh callback called 3: %3|1673491054.027|ERROR|0022_consume_batch_local#consumer-2| [thrd:app]: Failed to acquire SASL OAUTHBEARER token: Refresh called 3: %5|1673491054.029|CONFWARN|MOCK#producer-3| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0009_mock_cluster / 0.014s] Test config file test.conf not found 3: [0009_mock_cluster / 0.014s] Setting test timeout to 30s * 2.7 3: %7|1673491054.030|INIT|my id#producer-5| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-5 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x80c) 3: %4|1673491054.030|CONFWARN|my id#producer-5| [thrd:app]: 
Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1673491054.030|CONFWARN|my id#producer-5| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.013s] Created kafka instance my id#producer-5 3: %7|1673491054.030|TOPIC|my id#producer-5| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1673491054.030|TOPPARNEW|my id#producer-5| [thrd:app]: NEW rdkafkatest_0004 [-1] 0x7fd7f0000de0 refcnt 0x7fd7f0000e70 (at rd_kafka_topic_new0:468) 3: %7|1673491054.030|METADATA|my id#producer-5| [thrd:app]: Hinted cache of 1/1 topic(s) being queried 3: %7|1673491054.030|METADATA|my id#producer-5| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1673491054.030|DESTROY|my id#producer-5| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1673491054.030|DESTROY|my id#producer-5| [thrd:main]: Destroy internal 3: %7|1673491054.030|DESTROY|my id#producer-5| [thrd:main]: Removing all topics 3: %7|1673491054.030|TOPPARREMOVE|my id#producer-5| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0x7fd7f0000de0 3: %7|1673491054.030|DESTROY|my id#producer-5| [thrd:main]: rdkafkatest_0004 [-1]: 0x7fd7f0000de0 DESTROY_FINAL 3: [0009_mock_cluster / 0.022s] Created kafka instance 0009_mock_cluster#producer-6 3: %7|1673491054.038|INIT|my id#producer-7| [thrd:app]: librdkafka v1.9.2 (0x10902ff) my id#producer-7 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x80c) 3: %4|1673491054.038|CONFWARN|my id#producer-7| [thrd:app]: Configuration property auto.offset.reset is a consumer property and will be ignored by this producer instance 3: %5|1673491054.038|CONFWARN|my id#producer-7| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.021s] Created kafka instance my id#producer-7 3: %7|1673491054.038|TOPIC|my id#producer-7| [thrd:app]: New local topic: rdkafkatest_0004 3: %7|1673491054.038|TOPPARNEW|my id#producer-7| [thrd:app]: NEW rdkafkatest_0004 [-1] 0x7fd7f0000de0 refcnt 0x7fd7f0000e70 (at rd_kafka_topic_new0:468) 3: %7|1673491054.038|METADATA|my id#producer-7| [thrd:app]: Hinted cache of 1/1 topic(s) being queried 3: %7|1673491054.038|METADATA|my id#producer-7| [thrd:app]: Skipping metadata refresh of 1 topic(s): leader query: no usable brokers 3: %7|1673491054.038|DESTROY|my id#producer-7| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1673491054.038|DESTROY|my id#producer-7| [thrd:main]: Destroy internal 3: %7|1673491054.038|DESTROY|my id#producer-7| [thrd:main]: Removing all topics 3: %7|1673491054.038|TOPPARREMOVE|my id#producer-7| [thrd:main]: Removing toppar rdkafkatest_0004 [-1] 0x7fd7f0000de0 3: %7|1673491054.038|DESTROY|my id#producer-7| [thrd:main]: rdkafkatest_0004 [-1]: 0x7fd7f0000de0 DESTROY_FINAL 3: %7|1673491054.038|INIT|rdkafka#producer-1| [thrd:app]: librdkafka v1.9.2 (0x10902ff) rdkafka#producer-1 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x201) 3: %5|1673491054.038|CONFWARN|rdkafka#producer-1| [thrd:app]: No 
`bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %7|1673491054.038|DESTROY|rdkafka#producer-1| [thrd:app]: Terminating instance (destroy flags none (0x0)) 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:app]: Interrupting timers 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:app]: Sending TERMINATE to internal main thread 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:app]: Joining internal main thread 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread terminating 3: %7|1673491054.038|DESTROY|rdkafka#producer-1| [thrd:main]: Destroy internal 3: %7|1673491054.038|BROADCAST|rdkafka#producer-1| [thrd:main]: Broadcasting state change 3: %7|1673491054.038|DESTROY|rdkafka#producer-1| [thrd:main]: Removing all topics 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:main]: Purging reply queue 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:main]: Decommissioning internal broker 3: %7|1673491054.038|TERMINATE|rdkafka#producer-1| [thrd:main]: Join 1 broker thread(s) 3: %7|1673491054.038|BROADCAST|rdkafka#producer-1| [thrd::0/internal]: Broadcasting state change 3: %7|1673491054.039|TERMINATE|rdkafka#producer-1| [thrd:main]: Internal main thread termination done 3: %7|1673491054.039|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying op queues 3: %7|1673491054.039|TERMINATE|rdkafka#producer-1| [thrd:app]: Destroying SSL CTX 3: %7|1673491054.039|TERMINATE|rdkafka#producer-1| [thrd:app]: Termination done: freeing resources 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: empty tqh[0] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: prepend 2,1,0 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: append 1,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 1,0,2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:158: ut_tq_test: Testing TAILQ: insert 2,0,1 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:345: unittest_sysqueue 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sysqueue: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:393: ut_strcasestr: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:409: ut_strcasestr 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:590: ut_string_split: BEGIN:  3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdstring.c:616: ut_string_split 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: string: PASS 3: [0004_conf / 0.032s] Incremental S2F tests 3: [0004_conf / 0.032s] Set: generic,broker,queue,cgrp 3: [0004_conf / 0.032s] Now: generic,broker,queue,cgrp 3: [0004_conf / 0.032s] Set: -broker,+queue,topic 3: [0004_conf / 0.032s] Now: generic,topic,queue,cgrp 3: [0004_conf / 0.032s] Set: -all,security,-fetch,+metadata 3: 
[0004_conf / 0.032s] Now: metadata,security 3: [0004_conf / 0.032s] Error reporting for S2F properties 3: [0004_conf / 0.032s] Ok: Invalid value "invalid-value" for configuration property "debug" 3: [0004_conf / 0.032s] Verifying that ssl.ca.location is not overwritten (#3566) 3: %3|1673491054.049|SSL|rdkafka#producer-9| [thrd:app]: error:02001002:system library:fopen:No such file or directory: fopen('/?/does/!/not/exist!','r') 3: %3|1673491054.050|SSL|rdkafka#producer-9| [thrd:app]: error:2006D080:BIO routines:BIO_new_file:no such file 3: [0004_conf / 0.032s] rd_kafka_new() failed as expected: ssl.ca.location failed: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib 3: [0004_conf / 0.032s] Canonical tests 3: [0004_conf / 0.032s] Set: request.required.acks=0 expect 0 (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=-1 expect -1 (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=1 expect 1 (topic) 3: [0004_conf / 0.032s] Set: acks=3 expect 3 (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=393 expect 393 (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=bad expect (null) (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=all expect -1 (topic) 3: [0004_conf / 0.032s] Set: request.required.acks=all expect -1 (global) 3: [0004_conf / 0.032s] Set: acks=0 expect 0 (topic) 3: [0004_conf / 0.032s] Set: sasl.mechanisms=GSSAPI expect GSSAPI (global) 3: [0004_conf / 0.032s] Set: sasl.mechanisms=PLAIN expect PLAIN (global) 3: [0004_conf / 0.032s] Set: sasl.mechanisms=GSSAPI,PLAIN expect (null) (global) 3: [0004_conf / 0.032s] Set: sasl.mechanisms= expect (null) (global) 3: [0004_conf / 0.032s] Set: linger.ms=12555.3 expect 12555.3 (global) 3: [0004_conf / 0.033s] Set: linger.ms=1500.000 expect 1500 (global) 3: [0004_conf / 0.033s] Set: linger.ms=0.0001 expect 0.0001 (global) 3: %4|1673491054.050|CONFWARN|0009_mock_cluster#consumer-8| [thrd:app]: Configuration property dr_msg_cb is a producer property and will be ignored by this consumer instance 3: [0009_mock_cluster / 0.035s] Created kafka instance 0009_mock_cluster#consumer-8 3: [0009_mock_cluster / 0.035s] Test config file test.conf not found 3: [0009_mock_cluster / 0.035s] Produce to rdkafkatest_rnd75e06c71081686e0_0009_mock_cluster [-1]: messages #0..100 3: [0009_mock_cluster / 0.035s] SUM(POLL): duration 0.000ms 3: [0009_mock_cluster / 0.035s] PRODUCE: duration 0.096ms 3: %5|1673491054.052|CONFWARN|rdkafka#producer-10| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %4|1673491054.058|CONFWARN|rdkafka#producer-12| [thrd:app]: Configuration property partition.assignment.strategy is a consumer property and will be ignored by this producer instance 3: %5|1673491054.058|CONFWARN|rdkafka#producer-12| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0009_mock_cluster / 0.043s] PRODUCE.DELIVERY.WAIT: duration 8.539ms 3: [0009_mock_cluster / 0.043s] Produce to rdkafkatest_rnd75e06c71081686e0_0009_mock_cluster [-1]: messages #0..100 3: [0009_mock_cluster / 0.043s] SUM(POLL): duration 0.001ms 3: [0009_mock_cluster / 0.043s] PRODUCE: duration 0.054ms 3: [0004_conf / 0.042s] Ok: `acks` must be set to `all` when `enable.idempotence` is true 3: [0004_conf / 0.042s] Ok: Java TrustStores are not supported, use `ssl.ca.location` and a certificate file instead. See https://github.com/edenhill/librdkafka/wiki/Using-SSL-with-librdkafka for more information. 
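The 0004_conf entries above show two distinct validation points: an invalid value for `debug` is rejected already by rd_kafka_conf_set(), while individually valid but incompatible settings (enable.idempotence=true with acks other than all) are only reported when rd_kafka_new() creates the instance. The following standalone sketch, not part of the test suite, reproduces those two checks through the public C API; the printed error strings are whatever the library reports.

    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Invalid property values are rejected immediately at set time. */
        if (rd_kafka_conf_set(conf, "debug", "invalid-value",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK)
            printf("set debug failed: %s\n", errstr);

        /* Each of these is accepted on its own... */
        rd_kafka_conf_set(conf, "enable.idempotence", "true", errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "acks", "1", errstr, sizeof(errstr));

        /* ...but the combination is caught when the instance is created. */
        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
        if (!rk) {
            /* e.g. "`acks` must be set to `all` when `enable.idempotence` is true" */
            printf("rd_kafka_new failed: %s\n", errstr);
            rd_kafka_conf_destroy(conf); /* conf is only consumed by rd_kafka_new on success */
            return 1;
        }
        rd_kafka_destroy(rk);
        return 0;
    }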
3: [0004_conf / 0.042s] Ok: Java JAAS configuration is not supported, see https://github.com/edenhill/librdkafka/wiki/Using-SASL-with-librdkafka for more information. 3: [0004_conf / 0.042s] Ok: Internal property "interceptors" not settable 3: %5|1673491054.062|CONFWARN|rdkafka#producer-13| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1673491054.062|TOPICCONF|rdkafka#producer-13| [thrd:app]: Incompatible configuration settings for topic "mytopic": `acks` must be set to `all` when `enable.idempotence` is true 3: [0009_mock_cluster / 0.055s] PRODUCE.DELIVERY.WAIT: duration 11.723ms 3: [0009_mock_cluster / 0.055s] ASSIGN.PARTITIONS: duration 0.183ms 3: [0009_mock_cluster / 0.055s] CONSUME: assigned 4 partition(s) 3: [0009_mock_cluster / 0.055s] CONSUME: consume 100 messages 3: %5|1673491054.074|CONFWARN|rdkafka#producer-14| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1673491054.082|CONFWARN|rdkafka#producer-15| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1673491054.085|CONFWARN|rdkafka#producer-16| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1673491054.085|TOPICCONF|rdkafka#producer-16| [thrd:app]: Incompatible configuration settings for topic "mytopic": `queuing.strategy` must be set to `fifo` when `enable.idempotence` is true 3: %4|1673491054.089|CONFWARN|rdkafka#consumer-17| [thrd:app]: Configuration property queue.buffering.max.ms is a producer property and will be ignored by this consumer instance 3: [0004_conf / 0.072s] Instance config linger.ms=123 3: [0004_conf / 0.072s] Instance config group.id=test1 3: [0004_conf / 0.072s] Instance config enable.auto.commit=false 3: [0004_conf / 0.088s] [ do_test_default_topic_conf:381 ] 3: [0004_conf / 0.088s] [ do_test_default_topic_conf:381: PASS (0.00s) ] 3: [0004_conf / 0.088s] [ do_message_timeout_linger_checks:447 ] 3: %5|1673491054.112|CONFWARN|rdkafka#producer-18| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.095s] #0 "default and L and M": rd_kafka_new() succeeded 3: %5|1673491054.121|CONFWARN|rdkafka#producer-19| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.104s] #1 "set L such that L=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: %5|1673491054.153|CONFWARN|rdkafka#producer-22| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0004_conf / 0.136s] #5 "set M such that L>=M": rd_kafka_new() succeeded 3: [0004_conf / 0.149s] #6 "set L and M such that L>=M": rd_kafka_new() failed: `message.timeout.ms` must be greater than `linger.ms` 3: [0004_conf / 0.149s] [ do_message_timeout_linger_checks:447: PASS (0.06s) ] 3: [0004_conf / 0.149s] 0004_conf: duration 149.133ms 3: [0004_conf / 0.149s] ================= Test 0004_conf PASSED ================= 3: [
/ 0.159s] Too many tests running (5 >= 5): postponing 0034_offset_reset_mock start... 3: [0033_regex_subscribe_local / 0.000s] ================= Running test 0033_regex_subscribe_local ================= 3: [0033_regex_subscribe_local / 0.000s] ==== Stats written to file stats_0033_regex_subscribe_local_6710757131114170526.json ==== 3: [0025_timers / 0.216s] rd_kafka_poll(): duration 100.081ms 3: [0033_regex_subscribe_local / 0.077s] 0033_regex_subscribe_local: duration 76.537ms 3: [0033_regex_subscribe_local / 0.077s] ================= Test 0033_regex_subscribe_local PASSED ================= 3: [
/ 0.237s] Too many tests running (5 >= 5): postponing 0037_destroy_hang_local start... 3: [0034_offset_reset_mock / 0.000s] ================= Running test 0034_offset_reset_mock ================= 3: [0034_offset_reset_mock / 0.000s] ==== Stats written to file stats_0034_offset_reset_mock_8700115682296417938.json ==== 3: [0034_offset_reset_mock / 0.000s] [ offset_reset_errors:201 ] 3: %5|1673491054.245|CONFWARN|MOCK#producer-24| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0034_offset_reset_mock / 0.001s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.002s] Created kafka instance 0034_offset_reset_mock#producer-25 3: [0034_offset_reset_mock / 0.002s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.002s] Produce to topic [0]: messages #0..10 3: [0034_offset_reset_mock / 0.002s] SUM(POLL): duration 0.001ms 3: [0034_offset_reset_mock / 0.002s] PRODUCE: duration 0.013ms 3: [0034_offset_reset_mock / 0.013s] PRODUCE.DELIVERY.WAIT: duration 11.054ms 3: [0034_offset_reset_mock / 0.019s] Test config file test.conf not found 3: [0034_offset_reset_mock / 0.019s] Setting test timeout to 300s * 2.7 3: [0034_offset_reset_mock / 0.021s] Created kafka instance 0034_offset_reset_mock#consumer-26 3: [0034_offset_reset_mock / 0.021s] Waiting for up to 5000ms for metadata update 3: [0034_offset_reset_mock / 0.059s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 0.059s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 0.059s] METADATA.WAIT: duration 37.943ms 3: [0025_timers / 0.318s] rd_kafka_poll(): duration 101.639ms 3: [0025_timers / 0.418s] rd_kafka_poll(): duration 100.081ms 3: [0025_timers / 0.518s] rd_kafka_poll(): duration 100.074ms 3: [0025_timers / 0.612s] Call #0: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 0.612s] rd_kafka_poll(): duration 93.883ms 3: [0025_timers / 0.712s] rd_kafka_poll(): duration 100.081ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:457: unittest_untyped_map: 500000 map_get iterations took 577.347ms = 1us/get 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:474: unittest_untyped_map: Total time over 100000 entries took 749.239ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:477: unittest_untyped_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 2 person Hedvig Lindahl 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:305: unittest_typed_map: enumerated key 1 person Roy McPhearsome 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmap.c:323: unittest_typed_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: map: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1353: do_unittest_write_read 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1518: do_unittest_write_split_seek 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1608: do_unittest_write_read_payload_correctness 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1676: do_unittest_write_iov 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdbuf.c:1866: do_unittest_erase 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdbuf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: 
PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdvarint.c:107: do_test_rd_uvarint_enc_i64 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdvarint: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:411: unittest_rd_crc32c: Calculate CRC32C using hardware (SSE42) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:422: unittest_rd_crc32c: Calculate CRC32C using software 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/crc32c.c:429: unittest_rd_crc32c 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: crc32c: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:1985: unittest_msgq_order: FIFO: testing in FIFO mode 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2172: unittest_msg_seq_wrap 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 2 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 1us, 0.2500us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: get baseline insert time 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 2 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 3 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 2 messages in 0us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 0us, 0.0000us/msg 3: RDUT: PASS: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: single-message ranges 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 5 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 6 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 1.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 7 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 1 messages into destq with 8 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.2500us/msg over 4 messages in 1us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: many messages 3: [0025_timers / 0.812s] rd_kafka_poll(): duration 100.076ms 3: [0025_timers / 0.912s] rd_kafka_poll(): duration 100.080ms 3: [0022_consume_batch_local / 1.014s] refresh_called = 1 3: [0025_timers / 1.012s] rd_kafka_poll(): duration 100.065ms 3: [0022_consume_batch_local / 1.018s] 0022_consume_batch_local: duration 1018.147ms 3: [0022_consume_batch_local / 1.018s] ================= Test 0022_consume_batch_local PASSED ================= 3: [
/ 1.024s] Too many tests running (5 >= 5): postponing 0039_event_log start... 3: [0037_destroy_hang_local / 0.000s] ================= Running test 0037_destroy_hang_local ================= 3: [0037_destroy_hang_local / 0.000s] ==== Stats written to file stats_0037_destroy_hang_local_4270445542198815966.json ==== 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: [0037_destroy_hang_local / 0.000s] Setting test timeout to 30s * 2.7 3: [0037_destroy_hang_local / 0.000s] Using topic "rdkafkatest_legacy_consumer_early_destroy" 3: [0037_destroy_hang_local / 0.000s] legacy_consumer_early_destroy: pass #0 3: [0037_destroy_hang_local / 0.000s] Test config file test.conf not found 3: %5|1673491055.041|CONFWARN|0037_destroy_hang_local#consumer-27| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.010s] Created kafka instance 0037_destroy_hang_local#consumer-27 3: [0037_destroy_hang_local / 0.019s] legacy_consumer_early_destroy: pass #1 3: [0037_destroy_hang_local / 0.019s] Test config file test.conf not found 3: %5|1673491055.050|CONFWARN|0037_destroy_hang_local#consumer-28| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0037_destroy_hang_local / 0.019s] Created kafka instance 0037_destroy_hang_local#consumer-28 3: [0025_timers / 1.112s] rd_kafka_poll(): duration 100.081ms 3: [0009_mock_cluster / 1.157s] CONSUME: duration 1101.506ms 3: [0009_mock_cluster / 1.157s] CONSUME: consumed 100/100 messages (0/-1 EOFs) 3: [0009_mock_cluster / 1.158s] 0009_mock_cluster: duration 1158.311ms 3: [0009_mock_cluster / 1.158s] ================= Test 0009_mock_cluster PASSED ================= 3: [0025_timers / 1.212s] Call #1: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.212s] rd_kafka_poll(): duration 99.522ms 3: [
/ 1.266s] Too many tests running (5 >= 5): postponing 0039_event start... 3: [0039_event_log / 0.000s] ================= Running test 0039_event_log ================= 3: [0039_event_log / 0.000s] ==== Stats written to file stats_0039_event_log_2599896536948070483.json ==== 3: [0039_event_log / 0.011s] Created kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.011s] rd_kafka_set_log_queue(rk, eventq): duration 0.003ms 3: [0039_event_log / 0.011s] Got log event: level: 7 ctx: queue fac: WAKEUPFD: msg: [thrd:app]: 0:65534/bootstrap: Enabled low-latency ops queue wake-ups 3: [0039_event_log / 0.011s] Destroying kafka instance 0039_event_log#producer-29 3: [0039_event_log / 0.020s] 0039_event_log: duration 20.250ms 3: [0039_event_log / 0.020s] ================= Test 0039_event_log PASSED ================= 3: [0034_offset_reset_mock / 1.059s] #0: injecting _TRANSPORT, expecting NO_ERROR 3: [0034_offset_reset_mock / 1.059s] Bringing down the broker 3: %6|1673491055.304|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:45729: Disconnected (after 996ms in state UP) 3: [0034_offset_reset_mock / 1.060s] ASSIGN.PARTITIONS: duration 0.060ms 3: [0034_offset_reset_mock / 1.060s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 1.060s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:45729: Disconnected (after 996ms in state UP) 3: %6|1673491055.305|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:45729/bootstrap]: 127.0.0.1:45729/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1003ms in state UP) 3: %3|1673491055.305|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:45729: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.060s] #0: Ignoring Error event: 127.0.0.1:45729/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1003ms in state UP) 3: [0034_offset_reset_mock / 1.060s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:45729: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.060s] #0: Ignoring Error event: 2/2 brokers are down 3: %3|1673491055.305|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:45729/bootstrap]: 127.0.0.1:45729/1: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT) 3: [0034_offset_reset_mock / 1.060s] #0: Ignoring Error event: 127.0.0.1:45729/1: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT) 3: [0025_timers / 1.312s] rd_kafka_poll(): duration 100.076ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 4315956 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 5603us, 0.0013us/msg 3: [
/ 1.386s] Too many tests running (5 >= 5): postponing 0043_no_connection start... 3: [0039_event / 0.000s] ================= Running test 0039_event ================= 3: [0039_event / 0.000s] ==== Stats written to file stats_0039_event_4623212127786273319.json ==== 3: [0039_event / 0.005s] Created kafka instance 0039_event#producer-30 3: %3|1673491055.410|FAIL|0039_event#producer-30| [thrd:0:65534/bootstrap]: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0039_event / 0.009s] Got Error event: _TRANSPORT: 0:65534/bootstrap: Connect to ipv4#0.0.0.0:65534 failed: Connection refused (after 0ms in state CONNECT) 3: [0039_event / 0.009s] Destroying kafka instance 0039_event#producer-30 3: [0039_event / 0.013s] 0039_event: duration 12.982ms 3: [0039_event / 0.013s] ================= Test 0039_event PASSED ================= 3: [0025_timers / 1.413s] rd_kafka_poll(): duration 100.737ms 3: %3|1673491055.496|FAIL|0034_offset_reset_mock#consumer-26| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:45729: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0034_offset_reset_mock / 1.251s] #0: Ignoring Error event: GroupCoordinator: 127.0.0.1:45729: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/ 1.506s] Too many tests running (5 >= 5): postponing 0045_subscribe_update_mock start... 3: [0043_no_connection / 0.000s] ================= Running test 0043_no_connection ================= 3: [0043_no_connection / 0.000s] ==== Stats written to file stats_0043_no_connection_6363875660197263453.json ==== 3: [0043_no_connection / 0.000s] Test config file test.conf not found 3: [0043_no_connection / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1673491055.526|CONFWARN|0043_no_connection#producer-31| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0043_no_connection / 0.005s] Created kafka instance 0043_no_connection#producer-31 3: [0043_no_connection / 0.005s] Test config file test.conf not found 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [-1]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.085ms 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [0]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.001ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.066ms 3: [0043_no_connection / 0.005s] Produce to test_producer_no_connection [1]: messages #0..100 3: [0043_no_connection / 0.005s] SUM(POLL): duration 0.000ms 3: [0043_no_connection / 0.005s] PRODUCE: duration 0.060ms 3: [0025_timers / 1.514s] rd_kafka_poll(): duration 100.965ms 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: many messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 100001 messages into destq with 165288 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 21us, 0.0002us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 50001 messages into destq with 265289 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1878us, 0.0376us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 20001 messages into destq with 315290 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2368us, 0.1184us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 335291 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2672us, 0.0452us/msg 3: [0025_timers / 1.614s] rd_kafka_poll(): duration 100.078ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 394420 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 5587us, 0.0643us/msg 3: %3|1673491055.656|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:45729/bootstrap]: 127.0.0.1:45729/1: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT, 1 identical 
error(s) suppressed) 3: [0034_offset_reset_mock / 1.411s] #0: Ignoring Error event: 127.0.0.1:45729/1: Connect to ipv4#127.0.0.1:45729 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0025_timers / 1.714s] rd_kafka_poll(): duration 100.079ms 3: [0025_timers / 1.812s] Call #2: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 1.812s] rd_kafka_poll(): duration 97.927ms 3: [0025_timers / 1.912s] rd_kafka_poll(): duration 100.078ms 3: [0025_timers / 2.012s] rd_kafka_poll(): duration 100.065ms 3: [0037_destroy_hang_local / 1.020s] 0037_destroy_hang_local: duration 1019.629ms 3: [0037_destroy_hang_local / 1.020s] ================= Test 0037_destroy_hang_local PASSED ================= 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4000001 messages into destq with 481243 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 6795us, 0.0017us/msg 3: [0025_timers / 2.112s] rd_kafka_poll(): duration 100.080ms 3: [
/ 2.143s] Too many tests running (5 >= 5): postponing 0046_rkt_cache start... 3: [0045_subscribe_update_mock / 0.000s] ================= Running test 0045_subscribe_update_mock ================= 3: [0045_subscribe_update_mock / 0.000s] ==== Stats written to file stats_0045_subscribe_update_mock_6451362331463313107.json ==== 3: [0045_subscribe_update_mock / 0.000s] [ do_test_regex_many_mock:378: range with 50 topics ] 3: %5|1673491056.160|CONFWARN|MOCK#producer-32| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0045_subscribe_update_mock / 0.009s] Test config file test.conf not found 3: [0045_subscribe_update_mock / 0.009s] Setting test timeout to 300s * 2.7 3: [0045_subscribe_update_mock / 0.017s] Created kafka instance 0045_subscribe_update_mock#consumer-33 3: [0045_subscribe_update_mock / 0.021s] Creating topic topic_0 3: [0045_subscribe_update_mock / 0.025s] rd_kafka_mock_topic_create(mcluster, topic, 1 + (i % 8), 1): duration 3.962ms 3: [0045_subscribe_update_mock / 0.025s] POLL: not expecting any messages for 300ms 3: [0045_subscribe_update_mock / 0.025s] TEST FAILURE 3: ### Test "0045_subscribe_update_mock (do_test_regex_many_mock:378: range with 50 topics)" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:4048:test_consumer_poll_no_msgs() at Thu Jan 12 02:37:36 2023: ### 3: ^topic_.* [0] error (offset -1001): Subscribed topic not available: ^topic_.*: Broker: Unknown topic or partition 3: [0025_timers / 2.213s] rd_kafka_poll(): duration 100.511ms 3: [
/ 2.244s] Too many tests running (5 >= 5): postponing 0053_stats_timing start... 3: [0046_rkt_cache / 0.000s] ================= Running test 0046_rkt_cache ================= 3: [0046_rkt_cache / 0.000s] ==== Stats written to file stats_0046_rkt_cache_7115575480178094635.json ==== 3: [0046_rkt_cache / 0.000s] Using topic "rdkafkatest_0046_rkt_cache" 3: [0046_rkt_cache / 0.000s] Test config file test.conf not found 3: %5|1673491056.260|CONFWARN|0046_rkt_cache#producer-34| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0046_rkt_cache / 0.007s] Created kafka instance 0046_rkt_cache#producer-34 3: [0046_rkt_cache / 0.007s] Test config file test.conf not found 3: [0046_rkt_cache / 0.015s] 0046_rkt_cache: duration 14.962ms 3: [0046_rkt_cache / 0.015s] ================= Test 0046_rkt_cache PASSED ================= 3: [
/ 2.261s] Too many tests running (5 >= 5): postponing 0058_log start... 3: [0053_stats_timing / 0.000s] ================= Running test 0053_stats_timing ================= 3: [0053_stats_timing / 0.000s] ==== Stats written to file stats_0053_stats_timing_1273207867383116873.json ==== 3: [0025_timers / 2.314s] rd_kafka_poll(): duration 100.955ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0045us/msg over 4315956 messages in 19321us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: issue #2508 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 145952 messages into destq with 154875 messages 3: [0053_stats_timing / 0.105s] Stats (#0): { "name": "rdkafka#producer-35", "client_id": "rdkafka", "type": "producer", "ts":201470376038, "time":1673491056, "age":104777, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 3485us, 0.0116us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2508 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 59129 messages into destq with 154875 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 2us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 86823 messages into destq with 214004 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.0000us/msg 3: [0025_timers / 2.412s] Call #3: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 2.412s] rd_kafka_poll(): duration 98.164ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0000us/msg over 145952 messages in 3us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2217: unittest_msgq_insert_all_sort: Testing msgq insert (all) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2250: unittest_msgq_insert_all_sort: Begin insert of 86 messages into destq with 199999 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2258: unittest_msgq_insert_all_sort: Done: took 1us, 0.0000us/msg 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2289: unittest_msgq_insert_all_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2313: unittest_msgq_insert_each_sort: Testing msgq insert (each) efficiency: issue #2450 (v1.2.1 regression) 3: RDUT: 
INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 199999 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.5000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 5 messages into destq with 200001 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 4 messages into destq with 200006 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 1us, 0.2500us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200010 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200012 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 61 messages into destq with 200015 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200076 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200078 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 2 messages into destq with 200080 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2350: unittest_msgq_insert_each_sort: Begin insert of 3 messages into destq with 200082 messages 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2361: unittest_msgq_insert_each_sort: Done: took 0us, 0.0000us/msg 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2386: unittest_msgq_insert_each_sort: Total: 0.0233us/msg over 86 messages in 2us 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msg.c:2402: unittest_msgq_insert_each_sort 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: msg: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdmurmur2.c:166: unittest_murmur2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: 
unittest: murmurhash: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdfnv1a.c:112: unittest_fnv1a 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: fnv1a: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:468: ut_high_sigfig 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:495: ut_quantile 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:514: ut_mean 3: [0043_no_connection / 1.006s] 300 messages in queue 3: %4|1673491056.527|TERMINATE|0043_no_connection#producer-31| [thrd:app]: Producer terminating with 300 messages (30000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:536: ut_stddev 3: [0043_no_connection / 1.006s] rd_kafka_destroy(): duration 0.182ms 3: [0043_no_connection / 1.006s] 0043_no_connection: duration 1006.098ms 3: [0043_no_connection / 1.006s] ================= Test 0043_no_connection PASSED ================= 3: [0025_timers / 2.512s] rd_kafka_poll(): duration 100.078ms 3: [
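Editor's note: the TERMINATE warning from 0043_no_connection above ("use flush() to wait for outstanding message delivery") is expected there, since that test deliberately destroys a producer with queued messages. In ordinary application code the shutdown pattern is roughly the following sketch; the function name is illustrative and not part of the test suite.

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: drain outstanding messages before destroying a producer,
 * avoiding the "Producer terminating with N messages" warning above. */
static void shutdown_producer(rd_kafka_t *rk) {
        /* Wait up to 10s for delivery reports of queued/in-flight messages. */
        rd_kafka_flush(rk, 10 * 1000);

        if (rd_kafka_outq_len(rk) > 0)
                fprintf(stderr, "%d message(s) were not delivered\n",
                        rd_kafka_outq_len(rk));

        rd_kafka_destroy(rk);
}
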
/ 2.519s] Too many tests running (5 >= 5): postponing 0062_stats_event start... 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:555: ut_totalcount 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:573: ut_max 3: [0058_log / 0.000s] ================= Running test 0058_log ================= 3: [0058_log / 0.000s] ==== Stats written to file stats_0058_log_8314510949710315239.json ==== 3: [0058_log / 0.000s] main.queue: Creating producer, not expecting any log messages 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:590: ut_min 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:609: ut_reset 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:623: ut_nan 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:638: ut_sigfigs 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:654: ut_minmax_trackable 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:664: ut_unitmagnitude_overflow 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdhdrhistogram.c:697: ut_subbucketmask_overflow 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: rdhdrhistogram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4311: unittest_conf: Safified client.software.name="aba.-va" 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4319: unittest_conf: Safified client.software.version="1.2.3.4.5----a" 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_conf.c:4323: unittest_conf 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: conf: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_broker.c:2081: rd_ut_reconnect_backoff 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: broker: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4827: unittest_idempotent_producer: Verifying idempotent producer error handling 3: %5|1673491056.550|CONFWARN|rdkafka#producer-37| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: 
unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4994: unittest_idempotent_producer: Got DeliveryReport event with 3 message(s) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:4999: unittest_idempotent_producer: DR for message: Success: (persistence=2) 3: [0025_timers / 2.612s] rd_kafka_poll(): duration 100.080ms 3: [0025_timers / 2.712s] rd_kafka_poll(): duration 100.080ms 3: [0025_timers / 2.812s] rd_kafka_poll(): duration 100.080ms 3: [0025_timers / 2.912s] rd_kafka_poll(): duration 100.087ms 3: [0025_timers / 3.012s] Call #4: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.012s] rd_kafka_poll(): duration 99.449ms 3: [0025_timers / 3.112s] rd_kafka_poll(): duration 100.075ms 3: [0025_timers / 3.212s] rd_kafka_poll(): duration 100.080ms 3: [0025_timers / 3.312s] rd_kafka_poll(): duration 100.079ms 3: [0053_stats_timing / 1.105s] Stats (#10): { "name": "rdkafka#producer-35", "client_id": "rdkafka", "type": "producer", "ts":201471376223, "time":1673491057, "age":1104962, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0025_timers / 3.412s] rd_kafka_poll(): duration 100.080ms 3: [0053_stats_timing / 1.205s] 12 (expected 12) stats callbacks received in 1200ms (expected 1200ms +-25%) 3: [0053_stats_timing / 1.205s] 0053_stats_timing: duration 1205.368ms 3: [0053_stats_timing / 1.205s] ================= Test 0053_stats_timing PASSED ================= 3: [0025_timers / 3.512s] rd_kafka_poll(): duration 100.083ms 3: [0058_log / 1.004s] main.queue: Setting log queue 3: [0058_log / 1.004s] main.queue: Expecting at least one log message 3: [0058_log / 1.004s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-36 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 1.004s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 1.004s] main.queue: Saw 2 logs 3: [0058_log / 1.004s] local.queue: Creating producer, not expecting any log messages 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_request.c:5022: unittest_idempotent_producer 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: request: PASS 3: 
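Editor's note: 0053_stats_timing above verifies that statistics callbacks fire at the configured interval (12 callbacks in ~1200 ms). A minimal sketch of how an application would receive that JSON, assuming a 100 ms interval similar to what the test appears to use; names are illustrative.

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Invoked from rd_kafka_poll() context with the emitted stats JSON. */
static int stats_cb(rd_kafka_t *rk, char *json, size_t json_len, void *opaque) {
        printf("Stats (%zu bytes): %.*s\n", json_len, (int)json_len, json);
        return 0; /* 0 = librdkafka frees the json buffer */
}

static rd_kafka_t *create_producer_with_stats(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "statistics.interval.ms", "100",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set_stats_cb(conf, stats_cb);

        /* Ownership of conf passes to rd_kafka_new() on success. */
        return rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
}
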
RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1581: do_unittest_config_no_principal_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1610: do_unittest_config_empty_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1677: do_unittest_config_empty_value_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1715: do_unittest_config_value_with_quote_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1640: do_unittest_config_unrecognized_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1464: do_unittest_config_defaults 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1507: do_unittest_config_explicit_scope_and_life 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1551: do_unittest_config_all_explicit_values 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1754: do_unittest_config_extensions 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1773: do_unittest_illegal_extension_keys_should_fail 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_oauthbearer.c:1806: do_unittest_odd_extension_size_should_fail 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: sasl_oauthbearer: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_msgset_reader.c:1781: unittest_aborted_txns 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: aborted_txns: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5705: unittest_consumer_group_metadata 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5776: unittest_set_intersect 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5825: unittest_set_subtract 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5852: unittest_map_to_list 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_cgrp.c:5882: unittest_list_to_map 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: cgrp: PASS 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:910: unittest_scram_nonce 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sasl_scram.c:949: unittest_scram_safe 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: scram: PASS 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case Symmetrical subscription: roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case 1*3 partitions (asymmetrical): roundrobin assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): range assignor 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:934: ut_assignors: Test case #2121 (asymmetrical): roundrobin assignor 3: 
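Editor's note: the do_unittest_config_* cases above exercise parsing of the sasl.oauthbearer.config property used by the built-in unsecured-JWT token handler. A hedged sketch of what such a configuration might look like; the key/value pairs are examples only, so consult the librdkafka CONFIGURATION documentation for the exact key set.

#include <librdkafka/rdkafka.h>

/* Sketch: enable SASL/OAUTHBEARER with the built-in unsecured token
 * handler (development/testing only). Property values are illustrative. */
static void configure_oauthbearer(rd_kafka_conf_t *conf) {
        char errstr[512];

        rd_kafka_conf_set(conf, "security.protocol", "SASL_PLAINTEXT",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "sasl.mechanisms", "OAUTHBEARER",
                          errstr, sizeof(errstr));
        /* space-separated key=value pairs, e.g. principal, scope, lifeSeconds */
        rd_kafka_conf_set(conf, "sasl.oauthbearer.config",
                          "principal=admin scope=test lifeSeconds=3600",
                          errstr, sizeof(errstr));
}
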
RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #0 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNoTopic:2211: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2217: ut_testOneConsumerNoTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #0 ran for 0.024ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #1 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerNonexistentTopic:2237: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2243: ut_testOneConsumerNonexistentTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #1 ran for 0.013ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #2 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerOneTopic:2269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2275: ut_testOneConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #2 ran for 0.020ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #3 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOnlyAssignsPartitionsFromSubscribedTopics:2300: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2306: ut_testOnlyAssignsPartitionsFromSubscribedTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #3 ran for 0.016ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #4 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testOneConsumerMultipleTopics:2329: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2335: ut_testOneConsumerMultipleTopics 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #4 ran for 0.015ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #5 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicOnePartition:2358: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2365: ut_testTwoConsumersOneTopicOnePartition 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: 
rd_kafka_sticky_assignor_unittest: [ Test #5 ran for 0.019ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #6 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersOneTopicTwoPartitions:2389: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2396: ut_testTwoConsumersOneTopicTwoPartitions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #6 ran for 0.018ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #7 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMultipleConsumersMixedTopicSubscriptions:2424: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2432: ut_testMultipleConsumersMixedTopicSubscriptions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #7 ran for 0.037ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #8 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testTwoConsumersTwoTopicsSixPartitions:2459: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2466: ut_testTwoConsumersTwoTopicsSixPartitions 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #8 ran for 0.029ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #9 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2487: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2501: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveConsumerOneTopic:2514: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2522: ut_testAddRemoveConsumerOneTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #9 ran for 0.048ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #10 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testPoorRoundRobinAssignmentScenario:2576: verifying assignment for 4 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2585: ut_testPoorRoundRobinAssignmentScenario 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #10 ran for 0.045ms ] 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #11 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2609: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2615: ut_testAddRemoveTopicTwoConsumers: Adding topic2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2630: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2638: ut_testAddRemoveTopicTwoConsumers: Removing topic1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAddRemoveTopicTwoConsumers:2650: verifying assignment for 2 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2658: ut_testAddRemoveTopicTwoConsumers 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #11 ran for 0.071ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #12 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2706: verifying assignment for 19 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerLeaves:2721: verifying assignment for 18 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2728: ut_testReassignmentAfterOneConsumerLeaves 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #12 ran for 4.312ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #13 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2762: verifying assignment for 8 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testReassignmentAfterOneConsumerAdded:2774: verifying assignment for 9 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2781: ut_testReassignmentAfterOneConsumerAdded 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #13 ran for 0.267ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #14 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2823: verifying assignment for 9 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testSameSubscriptions:2836: verifying assignment for 8 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2844: ut_testSameSubscriptions 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #14 ran for 6.628ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #15 ] 3: [
/ 3.574s] Too many tests running (5 >= 5): postponing 0066_plugins start... 3: [0062_stats_event / 0.000s] ================= Running test 0062_stats_event ================= 3: [0062_stats_event / 0.000s] ==== Stats written to file stats_0062_stats_event_5663715277246615709.json ==== 3: [0062_stats_event / 0.000s] Test config file test.conf not found 3: [0062_stats_event / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1673491057.583|CONFWARN|0062_stats_event#producer-41| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0062_stats_event / 0.000s] Created kafka instance 0062_stats_event#producer-41 3: [0025_timers / 3.612s] Call #5: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 3.612s] rd_kafka_poll(): duration 99.469ms 3: [0062_stats_event / 0.100s] Stats event 3: [0062_stats_event / 0.100s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":201471684385, "time":1673491057, "age":100112, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.100s] STATS_EVENT: duration 99.856ms 3: [0025_timers / 3.712s] rd_kafka_poll(): duration 100.080ms 3: [0062_stats_event / 0.200s] Stats event 3: [0062_stats_event / 0.200s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":201471784404, "time":1673491057, "age":200131, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.200s] STATS_EVENT: duration 100.005ms 3: [0025_timers / 3.812s] rd_kafka_poll(): duration 100.079ms 3: [0062_stats_event / 0.300s] Stats event 3: [0062_stats_event / 0.300s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":201471884424, "time":1673491057, "age":300151, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.300s] STATS_EVENT: duration 100.013ms 3: [0025_timers / 3.912s] rd_kafka_poll(): duration 100.079ms 3: [0062_stats_event / 0.400s] Stats event 3: [0062_stats_event / 0.400s] Stats: { "name": "0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":201471984443, "time":1673491057, "age":400170, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.400s] STATS_EVENT: duration 100.014ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2895: verifying assignment for 200 member(s): 3: [0025_timers / 4.012s] rd_kafka_poll(): duration 100.067ms 3: [0062_stats_event / 0.500s] Stats event 3: [0062_stats_event / 0.500s] Stats: { "name": 
"0062_stats_event#producer-41", "client_id": "0062_stats_event", "type": "producer", "ts":201472084463, "time":1673491058, "age":500190, "replyq":0, "msg_cnt":0, "msg_size":0, "msg_max":100000, "msg_size_max":1073741824, "simple_cnt":0, "metadata_cache_cnt":0, "brokers":{ }, "topics":{ } , "tx":0, "tx_bytes":0, "rx":0, "rx_bytes":0, "txmsgs":0, "txmsg_bytes":0, "rxmsgs":0, "rxmsg_bytes":0} 3: [0062_stats_event / 0.500s] STATS_EVENT: duration 100.018ms 3: [0062_stats_event / 0.501s] 0062_stats_event: duration 500.968ms 3: [0062_stats_event / 0.501s] ================= Test 0062_stats_event PASSED ================= 3: [
/ 4.075s] Too many tests running (5 >= 5): postponing 0072_headers_ut start... 3: [0066_plugins / 0.000s] ================= Running test 0066_plugins ================= 3: [0066_plugins / 0.000s] ==== Stats written to file stats_0066_plugins_5630478247686415362.json ==== 3: [0066_plugins / 0.000s] Using topic "rdkafkatest_rnd5c5231ae3ad9067e_0066_plugins" 3: [0066_plugins / 0.000s] running test from cwd /usr/src/RPM/BUILD/librdkafka-1.9.2/tests 3: [0066_plugins / 0.000s] set(session.timeout.ms, 6000) 3: [0066_plugins / 0.000s] set(plugin.library.paths, interceptor_test/interceptor_test) 3: [0066_plugins / 0.000s] conf_init(conf 0x7fd7b0001a40) called (setting opaque to 0x7fd7fc14f0ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fd7b0001a40) for ici 0x7fd7b0001610 with ici->conf 0x7fd7b0002670 3: [0066_plugins / 0.000s] set(socket.timeout.ms, 12) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0001a40, "socket.timeout.ms", "12"): 0x7fd7b0001610 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0001a40, "socket.timeout.ms", "12"): 0x7fd7b0001610 3: [0066_plugins / 0.000s] set(interceptor_test.config1, one) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0001a40, "interceptor_test.config1", "one"): 0x7fd7b0001610 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0001a40, interceptor_test.config1, one): 0x7fd7b0001610 3: [0066_plugins / 0.000s] set(interceptor_test.config2, two) 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0001a40, "interceptor_test.config2", "two"): 0x7fd7b0001610 3: [0066_plugins / 0.000s] set(topic.metadata.refresh.interval.ms, 1234) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0x7fd7b0003720, old_conf 0x7fd7b0001a40, filter_cnt 0, ici 0x7fd7b0001610) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fd7b0003720) for ici 0x7fd7b0004040 with ici->conf 0x7fd7b0004070 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0003720, "socket.timeout.ms", "12"): 0x7fd7b0004040 3: [0066_plugins / 0.000s] conf_init(conf 0x7fd7b0004070) called (setting opaque to 0x7fd7fc14f0ea) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fd7b0004070) for ici 0x7fd7b0004e20 with ici->conf 0x7fd7b0004e50 3: [0066_plugins / 0.000s] conf_init(conf 0x7fd7b0003720) called (setting opaque to 0x7fd7fc14f0ea) 3: [0066_plugins / 0.000s] on_conf_dup(new_conf 0x7fd7b0005d30, old_conf 0x7fd7b0003720, filter_cnt 2, ici 0x7fd7b0004040) 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fd7b0005d30) for ici 0x7fd7b0006630 with ici->conf 0x7fd7b0006660 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0005d30, "socket.timeout.ms", "12"): 0x7fd7b0006630 3: [0066_plugins / 0.000s] conf_init0(conf 0x7fd7b0003720) for ici 0x7fd7b0005d00 with ici->conf 0x7fd7b0005d30 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0003720, "interceptor_test.config1", "one"): 0x7fd7b0004040 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0003720, interceptor_test.config1, one): 0x7fd7b0004040 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0003720, "interceptor_test.config2", "two"): 0x7fd7b0004040 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0003720, "session.timeout.ms", "6000"): 0x7fd7b0004040 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0004070, "session.timeout.ms", "6000"): 0x7fd7b0004e20 3: [0066_plugins / 0.000s] on_conf_set(conf 0x7fd7b0004070, "session.timeout.ms", "6000"): 0x7fd7b0004e20 3: [0066_plugins / 0.000s] on_new(rk 0x7fd7b00076c0, conf 0x7fd7b00077f8, ici->conf 0x7fd7b0004070): 0x7fd7b0004040: #1 3: %4|1673491058.093|CONFWARN|rdkafka#producer-42| [thrd:app]: Configuration property 
session.timeout.ms is a consumer property and will be ignored by this producer instance 3: %5|1673491058.093|CONFWARN|rdkafka#producer-42| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0066_plugins / 0.018s] 0066_plugins: duration 17.718ms 3: [0066_plugins / 0.018s] ================= Test 0066_plugins PASSED ================= 3: [0025_timers / 4.112s] rd_kafka_poll(): duration 100.082ms 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testLargeAssignmentWithMultipleConsumersLeaving:2911: verifying assignment for 150 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2918: ut_testLargeAssignmentWithMultipleConsumersLeaving 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #15 ran for 630.982ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #16 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2957: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2963: ut_testNewSubscription: Adding topic1 to consumer1 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNewSubscription:2972: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:2980: ut_testNewSubscription 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #16 ran for 0.142ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #17 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3006: verifying assignment for 4 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testMoveExistingAssignments:3027: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3063: ut_testMoveExistingAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #17 ran for 0.056ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #18 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness:3111: verifying assignment for 3 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3118: ut_testStickiness 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #18 ran for 0.032ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #19 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3144: verifying assignment for 1 member(s): 3: RDUT: INFO: 
/usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3154: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3168: verifying assignment for 3 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3180: verifying assignment for 2 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testStickiness2:3192: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3201: ut_testStickiness2 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #19 ran for 0.172ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #20 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testAssignmentUpdatedForDeletedTopic:3223: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3233: ut_testAssignmentUpdatedForDeletedTopic 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #20 ran for 0.397ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #21 ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3255: verifying assignment for 1 member(s): 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:1947: verifyValidityAndBalance0: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted:3269: verifying assignment for 1 member(s): 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3275: ut_testNoExceptionThrownWhenOnlySubscribedTopicDeleted 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #21 ran for 0.023ms ] 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3399: rd_kafka_sticky_assignor_unittest: [ Test #22 ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3290: ut_testConflictingPreviousAssignments 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_sticky_assignor.c:3401: rd_kafka_sticky_assignor_unittest: [ Test #22 ran for 0.003ms ] 3: RDUT: PASS: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdkafka_assignor.c:1054: ut_assignors 3: RDUT: INFO: /usr/src/RPM/BUILD/librdkafka-1.9.2/src/rdunittest.c:501: rd_unittest: unittest: assignors: PASS 3: [0000_unittests / 4.190s] 0000_unittests: duration 4189.949ms 3: [0000_unittests / 4.190s] ================= Test 0000_unittests PASSED ================= 3: [
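Editor's note: the [ Test #0 ] .. [ Test #22 ] blocks above are the sticky assignor's built-in unit tests. In application code an assignor is selected per consumer via partition.assignment.strategy; a minimal sketch follows, with the broker address, group and function name as placeholders.

#include <librdkafka/rdkafka.h>

/* Sketch: create a consumer using the cooperative-sticky assignor
 * exercised by the unit tests above. All names are placeholders. */
static rd_kafka_t *create_sticky_consumer(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "group.id", "example-group",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "partition.assignment.strategy",
                          "cooperative-sticky", errstr, sizeof(errstr));

        return rd_kafka_new(RD_KAFKA_CONSUMER, conf, errstr, sizeof(errstr));
}
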
/ 4.194s] Too many tests running (5 >= 5): postponing 0078_c_from_cpp start... 3: [0074_producev / 0.000s] ================= Running test 0074_producev ================= 3: [0074_producev / 0.000s] ==== Stats written to file stats_0074_producev_2335336641817293601.json ==== 3: [0072_headers_ut / 0.000s] ================= Running test 0072_headers_ut ================= 3: [0072_headers_ut / 0.000s] ==== Stats written to file stats_0072_headers_ut_477139918410826843.json ==== 3: [0072_headers_ut / 0.000s] Using topic "rdkafkatest_0072_headers_ut" 3: %5|1673491058.202|CONFWARN|0072_headers_ut#producer-44| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0072_headers_ut / 0.000s] Created kafka instance 0072_headers_ut#producer-44 3: %5|1673491058.204|CONFWARN|0074_producev#producer-43| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0074_producev / 0.002s] Created kafka instance 0074_producev#producer-43 3: [0074_producev / 0.002s] produceva() error (expected): Failed to produce message: Broker: Message size too large 3: [0074_producev / 0.002s] 0074_producev: duration 2.243ms 3: [0074_producev / 0.002s] ================= Test 0074_producev PASSED ================= 3: [0079_fork / 0.000s] WARN: SKIPPING TEST: Filtered due to negative test flags 3: [
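Editor's note: 0074_producev above calls the varargs producer API and treats the "Message size too large" result as the expected outcome. The call shape is roughly the following sketch (topic name and payload are placeholders); rd_kafka_producev() returns an error code synchronously when the message is rejected before enqueueing.

#include <librdkafka/rdkafka.h>
#include <stdio.h>
#include <string.h>

/* Sketch: produce one message with the varargs API and check the
 * synchronous error code, as test 0074_producev does. */
static void produce_one(rd_kafka_t *rk, const char *topic) {
        const char *payload = "hello";
        rd_kafka_resp_err_t err;

        err = rd_kafka_producev(rk,
                                RD_KAFKA_V_TOPIC(topic),
                                RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                                RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                                RD_KAFKA_V_END);
        if (err)
                fprintf(stderr, "producev failed: %s\n", rd_kafka_err2str(err));
}
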
/ 4.196s] Too many tests running (5 >= 5): postponing 0080_admin_ut start... 3: [0078_c_from_cpp / 0.000s] ================= Running test 0078_c_from_cpp ================= 3: [0078_c_from_cpp / 0.000s] ==== Stats written to file stats_0078_c_from_cpp_5381928236441700936.json ==== 3: %5|1673491058.205|CONFWARN|myclient#producer-45| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0078_c_from_cpp / 0.000s] Compare C name myclient#producer-45 to C++ name myclient#producer-45 3: [0078_c_from_cpp / 0.000s] Compare C topic mytopic to C++ topic mytopic 3: [0078_c_from_cpp / 0.000s] 0078_c_from_cpp: duration 0.292ms 3: [0078_c_from_cpp / 0.000s] ================= Test 0078_c_from_cpp PASSED ================= 3: [
/ 4.197s] Too many tests running (5 >= 5): postponing 0084_destroy_flags_local start... 3: [0080_admin_ut / 0.000s] ================= Running test 0080_admin_ut ================= 3: [0080_admin_ut / 0.000s] ==== Stats written to file stats_0080_admin_ut_1082004075193399381.json ==== 3: [0080_admin_ut / 0.000s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 0.000s] Test config file test.conf not found 3: %5|1673491058.205|CONFWARN|0080_admin_ut#producer-46| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 4.213s] Call #6: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.213s] rd_kafka_poll(): duration 100.145ms 3: [0080_admin_ut / 0.100s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.100s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.101s] rd_kafka_destroy(): duration 0.292ms 3: [0080_admin_ut / 0.101s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.10s) ] 3: [0080_admin_ut / 0.101s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.101s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 0.101s] Test config file test.conf not found 3: %5|1673491058.306|CONFWARN|0080_admin_ut#producer-47| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0025_timers / 4.313s] rd_kafka_poll(): duration 100.078ms 3: [0080_admin_ut / 0.201s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 0.201s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 0.201s] rd_kafka_destroy(): duration 0.353ms 3: [0080_admin_ut / 0.201s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.10s) ] 3: [0080_admin_ut / 0.201s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 0.201s] Test config file test.conf not found 3: %5|1673491058.406|CONFWARN|0080_admin_ut#producer-48| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 0.201s] [ do_test_options:1588 ] 3: [0080_admin_ut / 0.201s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 0.202s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rndfd5d8db4a47b891_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd33ea88fb6bd690e4_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd29e5132244279816_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd36ceaff585c7ee9_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd7eb9d2a6a07d6b9_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Using topic "rdkafkatest_rnd7b4eb0d2009c15a0_do_test_CreateTopics" 3: [0080_admin_ut / 0.202s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.202s] CreateTopics: duration 0.087ms 3: [0025_timers / 4.414s] rd_kafka_poll(): duration 100.967ms 3: [0080_admin_ut / 0.302s] CreateTopics.queue_poll: duration 99.949ms 3: [0080_admin_ut / 0.302s] CreateTopics: got CreateTopicsResult in 99.949s 3: [0080_admin_ut / 0.302s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 0.302s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 
CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd748b81704ebf8fc2_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd50ddb31e09989640_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd4756e6401037ba7a_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd1207597773f844c3_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd5574d1cc03ce1d9e_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Using topic "rdkafkatest_rnd604b770037b8a699_do_test_CreateTopics" 3: [0080_admin_ut / 0.302s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 0.302s] CreateTopics: duration 0.079ms 3: [0025_timers / 4.514s] rd_kafka_poll(): duration 100.088ms 3: [0058_log / 2.005s] local.queue: Setting log queue 3: [0058_log / 2.005s] local.queue: Expecting at least one log message 3: [0058_log / 2.005s] Log: level 7, facility INIT, str [thrd:app]: librdkafka v1.9.2 (0x10902ff) 0058_log#producer-38 initialized (builtin.features snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,sasl_oauthbearer, CMAKE GNU GNU PKGCONFIG HDRHISTOGRAM LIBDL PLUGINS SSL SASL_SCRAM SASL_OAUTHBEARER SASL_CYRUS LZ4_EXT C11THREADS CRC32C_HW SNAPPY SOCKEM, debug 0x1) 3: [0058_log / 2.005s] Log: level 5, facility CONFWARN, str [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0058_log / 2.005s] local.queue: Saw 2 logs 3: [0058_log / 2.005s] 0058_log: duration 2004.953ms 3: [0058_log / 2.005s] ================= Test 0058_log PASSED ================= 3: [
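Editor's note: 0058_log, which passes just above, first lets log messages go to the default handler and then redirects them to a queue, checking they arrive there. A sketch of that forwarding setup, assuming log.queue=true was set on the configuration; the function name is illustrative.

#include <librdkafka/rdkafka.h>
#include <stdio.h>

/* Sketch: route librdkafka logs to an application queue and read them
 * back as RD_KAFKA_EVENT_LOG events, similar to test 0058_log. */
static void drain_logs(rd_kafka_t *rk) {
        rd_kafka_queue_t *logq = rd_kafka_queue_new(rk);
        rd_kafka_event_t *ev;

        rd_kafka_set_log_queue(rk, logq); /* assumes log.queue=true in conf */

        while ((ev = rd_kafka_queue_poll(logq, 100))) {
                if (rd_kafka_event_type(ev) == RD_KAFKA_EVENT_LOG) {
                        const char *fac, *str;
                        int level;
                        rd_kafka_event_log(ev, &fac, &str, &level);
                        printf("level %d, facility %s: %s\n", level, fac, str);
                }
                rd_kafka_event_destroy(ev);
        }
        rd_kafka_queue_destroy(logq);
}
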
/ 4.535s] Too many tests running (5 >= 5): postponing 0086_purge_local start... 3: [0084_destroy_flags_local / 0.000s] ================= Running test 0084_destroy_flags_local ================= 3: [0084_destroy_flags_local / 0.000s] ==== Stats written to file stats_0084_destroy_flags_local_3119603815815166396.json ==== 3: [0084_destroy_flags_local / 0.000s] Using topic "rdkafkatest_rnd5168cd370f1a9fa4_destroy_flags" 3: [0084_destroy_flags_local / 0.000s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.000s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.000s] Setting test timeout to 20s * 2.7 3: %5|1673491058.543|CONFWARN|0084_destroy_flags_local#producer-49| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.000s] Created kafka instance 0084_destroy_flags_local#producer-49 3: [0084_destroy_flags_local / 0.000s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.012s] rd_kafka_destroy_flags(0x0): duration 11.598ms 3: [0084_destroy_flags_local / 0.012s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.012s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.012s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.012s] Setting test timeout to 20s * 2.7 3: %5|1673491058.555|CONFWARN|0084_destroy_flags_local#producer-50| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.012s] Created kafka instance 0084_destroy_flags_local#producer-50 3: [0084_destroy_flags_local / 0.012s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 0.013s] rd_kafka_destroy_flags(0x8): duration 0.895ms 3: [0084_destroy_flags_local / 0.013s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.013s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.013s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.013s] Setting test timeout to 20s * 2.7 3: %5|1673491058.556|CONFWARN|0084_destroy_flags_local#producer-51| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.013s] Created kafka instance 0084_destroy_flags_local#producer-51 3: [0084_destroy_flags_local / 0.013s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.013s] Produce to rdkafkatest_rnd5168cd370f1a9fa4_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.021s] SUM(POLL): duration 0.001ms 3: [0084_destroy_flags_local / 0.021s] PRODUCE: duration 7.984ms 3: [0084_destroy_flags_local / 0.021s] Calling rd_kafka_destroy_flags(0x0) 3: %4|1673491058.565|TERMINATE|0084_destroy_flags_local#producer-51| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.022s] rd_kafka_destroy_flags(0x0): duration 1.022ms 3: [0084_destroy_flags_local / 0.022s] [ test destroy_flags 0x0 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: 
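Editor's note: the destroy_flags values 0x0 and 0x8 cycled through by 0084_destroy_flags_local correspond, in the public API, to a plain destroy and to skipping the final consumer close. Roughly:

#include <librdkafka/rdkafka.h>

/* Sketch: terminate a consumer without performing the final leave-group /
 * consumer close; this is what destroy_flags 0x8 selects in the test. */
static void fast_consumer_shutdown(rd_kafka_t *rk) {
        /* RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE == 0x8 */
        rd_kafka_destroy_flags(rk, RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE);
}
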
[0084_destroy_flags_local / 0.022s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.022s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.022s] Setting test timeout to 20s * 2.7 3: %5|1673491058.566|CONFWARN|0084_destroy_flags_local#producer-52| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0084_destroy_flags_local / 0.022s] Created kafka instance 0084_destroy_flags_local#producer-52 3: [0084_destroy_flags_local / 0.022s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.022s] Produce to rdkafkatest_rnd5168cd370f1a9fa4_destroy_flags [-1]: messages #0..10000 3: [0084_destroy_flags_local / 0.030s] SUM(POLL): duration 0.001ms 3: [0084_destroy_flags_local / 0.030s] PRODUCE: duration 7.164ms 3: [0084_destroy_flags_local / 0.030s] Calling rd_kafka_destroy_flags(0x8) 3: %4|1673491058.573|TERMINATE|0084_destroy_flags_local#producer-52| [thrd:app]: Producer terminating with 10000 messages (1000000 bytes) still in queue or transit: use flush() to wait for outstanding message delivery 3: [0084_destroy_flags_local / 0.031s] rd_kafka_destroy_flags(0x8): duration 1.007ms 3: [0084_destroy_flags_local / 0.031s] [ test destroy_flags 0x8 for client_type 0, produce_cnt 10000, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.031s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.031s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.031s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.031s] Created kafka instance 0084_destroy_flags_local#consumer-53 3: [0080_admin_ut / 0.408s] CreateTopics.wait_background_event_cb: duration 106.272ms 3: [0080_admin_ut / 0.408s] CreateTopics: got CreateTopicsResult in 106.272s 3: [0080_admin_ut / 0.408s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.11s) ] 3: [0080_admin_ut / 0.408s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd650aa4bb29c54c21_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd6cf641e513cd22da_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd6844f2b81469387a_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd5cd074286328c83d_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd2dae27466cc15e7d_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Using topic "rdkafkatest_rnd75050d867cf918f8_do_test_CreateTopics" 3: [0080_admin_ut / 0.408s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.409s] CreateTopics: duration 0.078ms 3: [0025_timers / 4.614s] rd_kafka_poll(): duration 100.077ms 3: [0025_timers / 4.714s] rd_kafka_poll(): duration 100.079ms 3: [0080_admin_ut / 0.609s] CreateTopics.queue_poll: duration 199.957ms 3: [0080_admin_ut / 0.609s] CreateTopics: got CreateTopicsResult in 199.957s 3: [0080_admin_ut / 0.609s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.609s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.609s] 
Using topic "rdkafkatest_rnd70c66fd70f15dbb_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Using topic "rdkafkatest_rnd5c8138ca74bf7b59_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Using topic "rdkafkatest_rnd3cccafca2c7821f2_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Using topic "rdkafkatest_rnd6817beee0e15dbae_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Using topic "rdkafkatest_rnd77325e9373208069_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Using topic "rdkafkatest_rnd4cdc28eb6016c24f_do_test_CreateTopics" 3: [0080_admin_ut / 0.609s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 0.609s] CreateTopics: duration 0.075ms 3: [0025_timers / 4.813s] Call #7: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 4.813s] rd_kafka_poll(): duration 98.584ms 3: [0025_timers / 4.913s] rd_kafka_poll(): duration 100.059ms 3: [0080_admin_ut / 0.809s] CreateTopics.queue_poll: duration 199.962ms 3: [0080_admin_ut / 0.809s] CreateTopics: got CreateTopicsResult in 199.962s 3: [0080_admin_ut / 0.809s] [ do_test_CreateTopics:101: 0080_admin_ut#producer-48 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 0.809s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-48 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 0.809s] Using topic "rdkafkatest_rnd2daf9241485bb507_do_test_DeleteTopics" 3: [0080_admin_ut / 0.809s] Using topic "rdkafkatest_rnd79ab1c236846299e_do_test_DeleteTopics" 3: [0080_admin_ut / 0.809s] Using topic "rdkafkatest_rnd252c292f5cd3e460_do_test_DeleteTopics" 3: [0080_admin_ut / 0.809s] Using topic "rdkafkatest_rnd7989930352da5075_do_test_DeleteTopics" 3: [0080_admin_ut / 0.809s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 0.809s] DeleteTopics: duration 0.008ms 3: [0025_timers / 5.013s] rd_kafka_poll(): duration 100.065ms 3: [0084_destroy_flags_local / 0.531s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 0.532s] rd_kafka_destroy_flags(0x0): duration 0.241ms 3: [0084_destroy_flags_local / 0.532s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 0.532s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 0.532s] Test config file test.conf not found 3: [0084_destroy_flags_local / 0.532s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 0.532s] Created kafka instance 0084_destroy_flags_local#consumer-54 3: [0080_admin_ut / 0.909s] DeleteTopics.queue_poll: duration 100.029ms 3: [0080_admin_ut / 0.909s] DeleteTopics: got DeleteTopicsResult in 100.029s 3: [0080_admin_ut / 0.909s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 0.909s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-48 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 0.909s] Using topic "rdkafkatest_rnd499542dd01e296bf_do_test_DeleteTopics" 3: [0080_admin_ut / 0.909s] Using topic "rdkafkatest_rnd47df5dfc468e5bd5_do_test_DeleteTopics" 3: [0080_admin_ut / 0.909s] Using topic "rdkafkatest_rnd525a792d4eebc4f9_do_test_DeleteTopics" 3: [0080_admin_ut / 0.909s] Using topic "rdkafkatest_rnd377fb99079f54745_do_test_DeleteTopics" 3: [0080_admin_ut / 0.909s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 0.909s] DeleteTopics: duration 0.007ms 3: [0025_timers / 5.113s] rd_kafka_poll(): duration 100.074ms 3: [0072_headers_ut / 1.000s] 0072_headers_ut: duration 1000.399ms 3: 
[0072_headers_ut / 1.000s] ================= Test 0072_headers_ut PASSED ================= 3: [0025_timers / 5.213s] rd_kafka_poll(): duration 100.081ms 3: [
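Editor's note: 0072_headers_ut, which passes just above, covers the message-header API. A compact sketch of attaching headers on produce; topic, header names and values are placeholders.

#include <librdkafka/rdkafka.h>
#include <string.h>

/* Sketch: attach headers to a produced message; on success the headers
 * object is owned by the message, on failure it must be destroyed. */
static void produce_with_headers(rd_kafka_t *rk, const char *topic) {
        const char *payload = "payload";
        rd_kafka_headers_t *hdrs = rd_kafka_headers_new(2);
        rd_kafka_resp_err_t err;

        rd_kafka_header_add(hdrs, "source", -1, "build-log", -1);
        rd_kafka_header_add(hdrs, "version", -1, "1.9.2", -1);

        err = rd_kafka_producev(rk,
                                RD_KAFKA_V_TOPIC(topic),
                                RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                                RD_KAFKA_V_HEADERS(hdrs),
                                RD_KAFKA_V_MSGFLAGS(RD_KAFKA_MSG_F_COPY),
                                RD_KAFKA_V_END);
        if (err)
                rd_kafka_headers_destroy(hdrs);
}
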
/ 5.295s] Too many tests running (5 >= 5): postponing 0095_all_brokers_down start... 3: [0086_purge_local / 0.000s] ================= Running test 0086_purge_local ================= 3: [0086_purge_local / 0.000s] ==== Stats written to file stats_0086_purge_local_3129154855818441961.json ==== 3: [0086_purge_local / 0.000s] Using topic "rdkafkatest_0086_purge" 3: [0086_purge_local / 0.000s] Test rd_kafka_purge(): local 3: [0086_purge_local / 0.000s] Test config file test.conf not found 3: [0086_purge_local / 0.000s] Setting test timeout to 20s * 2.7 3: %4|1673491059.303|CONFWARN|0086_purge_local#producer-55| [thrd:app]: Configuration property enable.gapless.guarantee is experimental: When set to `true`, any error that could result in a gap in the produced message series when a batch of messages fails, will raise a fatal error (ERR__GAPLESS_GUARANTEE) and stop the producer. Messages failing due to `message.timeout.ms` are not covered by this guarantee. Requires `enable.idempotence=true`. 3: %5|1673491059.303|CONFWARN|0086_purge_local#producer-55| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0086_purge_local / 0.000s] Created kafka instance 0086_purge_local#producer-55 3: [0086_purge_local / 0.000s] Producing 20 messages to topic rdkafkatest_0086_purge 3: [0086_purge_local / 0.000s] local:281: purge(0x2): expecting 20 messages to remain when done 3: [0086_purge_local / 0.000s] local:281: purge(0x2): duration 0.005ms 3: [0086_purge_local / 0.000s] local:285: purge(0x1): expecting 0 messages to remain when done 3: [0086_purge_local / 0.000s] local:285: purge(0x1): duration 0.003ms 3: [0086_purge_local / 0.000s] DeliveryReport for msg #0: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #1: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #2: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #3: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #4: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #5: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #6: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #7: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #8: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #9: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #10: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #11: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #12: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #13: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #14: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #15: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #16: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #17: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #18: _PURGE_QUEUE 3: [0086_purge_local / 0.000s] DeliveryReport for msg #19: _PURGE_QUEUE 3: [0086_purge_local / 0.008s] 0086_purge_local: duration 8.217ms 3: [0086_purge_local / 0.008s] ================= Test 0086_purge_local PASSED ================= 3: [0080_admin_ut / 1.109s] DeleteTopics.queue_poll: duration 200.022ms 3: [0080_admin_ut / 1.109s] DeleteTopics: got DeleteTopicsResult in 200.022s 3: [0080_admin_ut / 1.109s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.109s] [ do_test_DeleteTopics:300: 0080_admin_ut#producer-48 DeleteTopics 
with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.109s] Using topic "rdkafkatest_rnd42fab99a6839ad8e_do_test_DeleteTopics" 3: [0080_admin_ut / 1.109s] Using topic "rdkafkatest_rnd58b756dc19416110_do_test_DeleteTopics" 3: [0080_admin_ut / 1.109s] Using topic "rdkafkatest_rnd50516c7c66cd328a_do_test_DeleteTopics" 3: [0080_admin_ut / 1.109s] Using topic "rdkafkatest_rnd40f0d5be4783cb0f_do_test_DeleteTopics" 3: [0080_admin_ut / 1.109s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 1.109s] DeleteTopics: duration 0.006ms 3: [
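Editor's note: the 0080_admin_ut entries above all follow the same Admin API pattern: the request (here DeleteTopics) is issued against a queue, either a temporary queue or the client's main queue, and the result arrives as an event fetched with rd_kafka_queue_poll() after roughly the configured operation timeout. A minimal sketch of that call-and-poll pattern, assuming an existing client handle `rk` and a placeholder topic name (not taken from the log):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: issue a DeleteTopics request on a dedicated queue and poll
 * that queue for the result event, mirroring the
 * "Call DeleteTopics ... DeleteTopics.queue_poll" lines above.
 * `rk` is assumed to be an already-created producer or consumer. */
static void delete_topic_example(rd_kafka_t *rk) {
        char errstr[512];
        rd_kafka_queue_t *queue = rd_kafka_queue_new(rk);
        rd_kafka_AdminOptions_t *options =
                rd_kafka_AdminOptions_new(rk, RD_KAFKA_ADMIN_OP_DELETETOPICS);
        rd_kafka_DeleteTopic_t *del_topics[1];

        /* 200ms overall timeout, as several of the test cases use. */
        rd_kafka_AdminOptions_set_request_timeout(options, 200,
                                                  errstr, sizeof(errstr));

        del_topics[0] = rd_kafka_DeleteTopic_new("example_topic");

        rd_kafka_DeleteTopics(rk, del_topics, 1, options, queue);

        /* Wait for the DeleteTopicsResult event (NULL on timeout). */
        rd_kafka_event_t *event = rd_kafka_queue_poll(queue, 1000);
        if (event) {
                fprintf(stderr, "DeleteTopics result: %s\n",
                        rd_kafka_event_error_string(event));
                rd_kafka_event_destroy(event);
        }

        rd_kafka_DeleteTopic_destroy(del_topics[0]);
        rd_kafka_AdminOptions_destroy(options);
        rd_kafka_queue_destroy(queue);
}
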
/ 5.306s] Too many tests running (5 >= 5): postponing 0097_ssl_verify_local start... 3: [0095_all_brokers_down / 0.000s] ================= Running test 0095_all_brokers_down ================= 3: [0095_all_brokers_down / 0.000s] ==== Stats written to file stats_0095_all_brokers_down_6480032200206423692.json ==== 3: [0095_all_brokers_down / 0.000s] Setting test timeout to 20s * 2.7 3: [0095_all_brokers_down / 0.000s] Test Producer 3: [
/ 5.310s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 0.005s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0025_timers / 5.314s] rd_kafka_poll(): duration 100.667ms 3: [0025_timers / 5.413s] Call #8: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 5.413s] rd_kafka_poll(): duration 98.926ms 3: [0080_admin_ut / 1.309s] DeleteTopics.queue_poll: duration 200.038ms 3: [0080_admin_ut / 1.309s] DeleteTopics: got DeleteTopicsResult in 200.038s 3: [0080_admin_ut / 1.309s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 1.309s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-48 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.309s] Using topic "rdkafkatest_rnd145ff3fb3a047543_do_test_DeleteGroups" 3: [0080_admin_ut / 1.309s] Using topic "rdkafkatest_rnd77a77a35420f863c_do_test_DeleteGroups" 3: [0080_admin_ut / 1.309s] Using topic "rdkafkatest_rnd2602a4a71529658_do_test_DeleteGroups" 3: [0080_admin_ut / 1.309s] Using topic "rdkafkatest_rnd2a55afda278c5379_do_test_DeleteGroups" 3: [0080_admin_ut / 1.309s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 1.309s] DeleteGroups: duration 0.021ms 3: [0025_timers / 5.513s] rd_kafka_poll(): duration 100.033ms 3: [0084_destroy_flags_local / 1.033s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 1.034s] rd_kafka_destroy_flags(0x8): duration 0.846ms 3: [0084_destroy_flags_local / 1.034s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 1.034s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.034s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.034s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.034s] Created kafka instance 0084_destroy_flags_local#consumer-57 3: [0080_admin_ut / 1.409s] DeleteGroups.queue_poll: duration 100.058ms 3: [0080_admin_ut / 1.409s] DeleteGroups: got DeleteGroupsResult in 100.058s 3: [0080_admin_ut / 1.409s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.409s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-48 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.409s] Using topic "rdkafkatest_rnd4e267ab823df42de_do_test_DeleteGroups" 3: [0080_admin_ut / 1.409s] Using topic "rdkafkatest_rnd7a66a3ee17bbbd95_do_test_DeleteGroups" 3: [0080_admin_ut / 1.409s] Using topic "rdkafkatest_rnd25c1d99d424601ea_do_test_DeleteGroups" 3: [0080_admin_ut / 1.409s] Using topic "rdkafkatest_rnd5e4a196b781c52ca_do_test_DeleteGroups" 3: [0080_admin_ut / 1.409s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.409s] DeleteGroups: duration 0.011ms 3: [0025_timers / 5.613s] rd_kafka_poll(): duration 100.075ms 3: [0025_timers / 5.713s] rd_kafka_poll(): duration 100.080ms 3: [0080_admin_ut / 1.609s] DeleteGroups.queue_poll: duration 200.043ms 3: [0080_admin_ut / 1.609s] DeleteGroups: got DeleteGroupsResult in 200.043s 3: [0080_admin_ut / 1.609s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.609s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-48 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.609s] Using topic "rdkafkatest_rnd1131c6e415c9d2fb_do_test_DeleteGroups" 3: 
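Editor's note: the 0084_destroy_flags_local entries above repeatedly call rd_kafka_destroy_flags(0x8); flag 0x8 corresponds to RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE, which skips the final consumer-group leave/close on shutdown. A minimal sketch of that teardown pattern, assuming a local consumer with a placeholder group id and no reachable brokers (as in the test):

#include <librdkafka/rdkafka.h>

/* Sketch: create a local consumer, subscribe, then tear it down with
 * RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE (0x8), mirroring the
 * "destroy_flags 0x8 ... subscribe 1" cases in test 0084. */
int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Placeholder group id; like the test, no broker is needed. */
        rd_kafka_conf_set(conf, "group.id", "destroy-flags-demo",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_CONSUMER, conf,
                                      errstr, sizeof(errstr));
        if (!rk)
                return 1;
        rd_kafka_poll_set_consumer(rk);

        rd_kafka_topic_partition_list_t *topics =
                rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(topics, "example_topic",
                                          RD_KAFKA_PARTITION_UA);
        rd_kafka_subscribe(rk, topics);
        rd_kafka_topic_partition_list_destroy(topics);

        /* Fast teardown: do not run the consumer close sequence. */
        rd_kafka_destroy_flags(rk, RD_KAFKA_DESTROY_F_NO_CONSUMER_CLOSE);
        return 0;
}
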
[0080_admin_ut / 1.609s] Using topic "rdkafkatest_rnd72119a103c9ec4a7_do_test_DeleteGroups" 3: [0080_admin_ut / 1.609s] Using topic "rdkafkatest_rnd420907e5350c53aa_do_test_DeleteGroups" 3: [0080_admin_ut / 1.609s] Using topic "rdkafkatest_rnd24d872351ac05ec1_do_test_DeleteGroups" 3: [0080_admin_ut / 1.609s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 1.609s] DeleteGroups: duration 0.010ms 3: [0025_timers / 5.813s] rd_kafka_poll(): duration 100.080ms 3: [0025_timers / 5.913s] rd_kafka_poll(): duration 100.081ms 3: [0080_admin_ut / 1.810s] DeleteGroups.queue_poll: duration 200.043ms 3: [0080_admin_ut / 1.810s] DeleteGroups: got DeleteGroupsResult in 200.043s 3: [0080_admin_ut / 1.810s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 1.810s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 1.810s] Using topic "rdkafkatest_rnd4e4db4bb7529deb2_do_test_DeleteRecords" 3: [0080_admin_ut / 1.810s] Using topic "rdkafkatest_rnd18d914b0f3e8a79_do_test_DeleteRecords" 3: [0080_admin_ut / 1.810s] Using topic "rdkafkatest_rnd3cada9c15b7b443f_do_test_DeleteRecords" 3: [0080_admin_ut / 1.810s] Using topic "rdkafkatest_rnd21a42105510d9dbc_do_test_DeleteRecords" 3: [0080_admin_ut / 1.810s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 1.810s] DeleteRecords: duration 0.028ms 3: [0025_timers / 6.013s] Call #9: after 600ms, 0% outside interval 600 >-60 <+120 3: [0025_timers / 6.013s] rd_kafka_poll(): duration 99.508ms 3: [0025_timers / 6.013s] All 10 intervals okay 3: [0025_timers / 6.013s] 0025_timers: duration 6012.901ms 3: [0025_timers / 6.013s] ================= Test 0025_timers PASSED ================= 3: [
/ 6.020s] Too many tests running (5 >= 5): postponing 0100_thread_interceptors start... 3: [0097_ssl_verify_local / 0.000s] ================= Running test 0097_ssl_verify_local ================= 3: [0097_ssl_verify_local / 0.000s] ==== Stats written to file stats_0097_ssl_verify_local_1549160765260012346.json ==== 3: [0097_ssl_verify_local / 0.000s] Feature "ssl" is built-in 3: %7|1673491060.029|OPENSSL|rdkafka#producer-58| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1673491060.029|SSL|rdkafka#producer-58| [thrd:app]: Loading CA certificate from string 3: [0097_ssl_verify_local / 0.000s] Failed to create producer with junk ssl.ca.pem (as expected): ssl.ca.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: %7|1673491060.029|OPENSSL|rdkafka#producer-59| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1673491060.036|SSL|rdkafka#producer-59| [thrd:app]: Loading private key from string 3: [0097_ssl_verify_local / 0.008s] Failed to create producer with junk ssl.key.pem (as expected): ssl.key.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: ANY PRIVATE KEY 3: %7|1673491060.036|OPENSSL|rdkafka#producer-60| [thrd:app]: Using OpenSSL version OpenSSL 1.1.1q 5 Jul 2022 (0x1010111f, librdkafka built with 0x1010111f) 3: %7|1673491060.042|SSL|rdkafka#producer-60| [thrd:app]: Loading public key from string 3: [0097_ssl_verify_local / 0.014s] Failed to create producer with junk ssl.certificate.pem (as expected): ssl.certificate.pem failed: not in PEM format?: crypto/pem/pem_lib.c:745: error:0909006C:PEM routines:get_name:no start line: Expecting: CERTIFICATE 3: [0097_ssl_verify_local / 0.014s] 0097_ssl_verify_local: duration 14.489ms 3: [0097_ssl_verify_local / 0.014s] ================= Test 0097_ssl_verify_local PASSED ================= 3: [0084_destroy_flags_local / 1.538s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 1.538s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 1.539s] rd_kafka_destroy_flags(0x0): duration 0.253ms 3: [0084_destroy_flags_local / 1.539s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 1.539s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode ] 3: [0084_destroy_flags_local / 1.539s] Test config file test.conf not found 3: [0084_destroy_flags_local / 1.539s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 1.539s] Created kafka instance 0084_destroy_flags_local#consumer-61 3: [0080_admin_ut / 1.910s] DeleteRecords.queue_poll: duration 100.026ms 3: [0080_admin_ut / 1.910s] DeleteRecords: got DeleteRecordsResult in 100.026s 3: [0080_admin_ut / 1.910s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 1.910s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 1.910s] Using topic "rdkafkatest_rnd131d23f917dfe3cc_do_test_DeleteRecords" 3: [0080_admin_ut / 1.910s] Using topic "rdkafkatest_rnda9e31923d72d3d3_do_test_DeleteRecords" 3: [0080_admin_ut / 1.910s] Using topic "rdkafkatest_rnd3f6c374558c4ac4a_do_test_DeleteRecords" 3: [0080_admin_ut 
/ 1.910s] Using topic "rdkafkatest_rnd615216b139d2db34_do_test_DeleteRecords" 3: [0080_admin_ut / 1.910s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 1.910s] DeleteRecords: duration 0.020ms 3: [
/ 6.135s] Too many tests running (5 >= 5): postponing 0103_transactions_local start...
3: [0100_thread_interceptors / 0.000s] ================= Running test 0100_thread_interceptors =================
3: [0100_thread_interceptors / 0.000s] ==== Stats written to file stats_0100_thread_interceptors_8106595740179230799.json ====
3: [0100_thread_interceptors / 0.000s] on_conf_dup() interceptor called
3: [0100_thread_interceptors / 0.000s] on_new() interceptor called
3: [ / 6.136s] on_thread_start(0, main) called
3: [ / 6.136s] Started thread: main
3: [ / 6.136s] on_thread_start(2, :0/internal) called
3: [ / 6.136s] Started thread: :0/internal
3: [ / 6.136s] on_thread_start(2, 127.0.0.1:1/bootstrap) called
3: [ / 6.136s] Started thread: 127.0.0.1:1/bootstrap
3: %3|1673491060.144|FAIL|rdkafka#producer-62| [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT)
3: %3|1673491060.144|ERROR|rdkafka#producer-62| [thrd:127.0.0.1:1/bootstrap]: 1/1 brokers are down
3: %3|1673491060.147|ERROR|rdkafka#producer-62| [thrd:app]: rdkafka#producer-62: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT)
3: [ / 6.140s] on_thread_exit(0, main) called
3: [ / 6.140s] Exiting from thread: main
3: [ / 6.142s] on_thread_exit(2, 127.0.0.1:1/bootstrap) called
3: [ / 6.142s] Exiting from thread: 127.0.0.1:1/bootstrap
3: [ / 6.143s] on_thread_exit(2, :0/internal) called
3: [ / 6.143s] Exiting from thread: :0/internal
3: [0100_thread_interceptors / 0.009s] 3 thread start calls, 3 thread exit calls seen
3: [0100_thread_interceptors / 0.009s] 0100_thread_interceptors: duration 8.925ms
3: [0100_thread_interceptors / 0.009s] ================= Test 0100_thread_interceptors PASSED =================
3: [
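Editor's note: the FAIL/ERROR lines in the block above, and the "Error: Local: Broker transport failure" / "All broker connections are down" lines from 0095_all_brokers_down, are what the client reports when its bootstrap address is deliberately unreachable. A minimal sketch of receiving those errors through a configured error callback, assuming the same intentionally dead address 127.0.0.1:1 used by the test:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Error callback: local errors such as RD_KAFKA_RESP_ERR__TRANSPORT and
 * RD_KAFKA_RESP_ERR__ALL_BROKERS_DOWN are delivered here while the
 * application calls rd_kafka_poll(). */
static void error_cb(rd_kafka_t *rk, int err, const char *reason,
                     void *opaque) {
        fprintf(stderr, "Error: %s: %s\n",
                rd_kafka_err2str((rd_kafka_resp_err_t)err), reason);
}

int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Unreachable broker, as in test 0095. */
        rd_kafka_conf_set(conf, "bootstrap.servers", "127.0.0.1:1",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set_error_cb(conf, error_cb);

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk)
                return 1;

        /* Serve callbacks for a few seconds; expect "Connection refused"
         * errors like the log lines above. */
        for (int i = 0; i < 30; i++)
                rd_kafka_poll(rk, 100);

        rd_kafka_destroy(rk);
        return 0;
}
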
/ 6.144s] Too many tests running (5 >= 5): postponing 0104_fetch_from_follower_mock start... 3: [0103_transactions_local / 0.000s] ================= Running test 0103_transactions_local ================= 3: [0103_transactions_local / 0.000s] ==== Stats written to file stats_0103_transactions_local_8942140182384771915.json ==== 3: [0103_transactions_local / 0.000s] [ do_test_txn_local:1168 ] 3: [0103_transactions_local / 0.000s] Test config file test.conf not found 3: %5|1673491060.157|CONFWARN|0103_transactions_local#producer-63| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0103_transactions_local / 0.005s] Created kafka instance 0103_transactions_local#producer-63 3: [0103_transactions_local / 0.012s] Test config file test.conf not found 3: [0103_transactions_local / 0.013s] Created kafka instance 0103_transactions_local#producer-64 3: [0103_transactions_local / 0.013s] Waiting for init_transactions() timeout 7000 ms 3: [0103_transactions_local / 0.013s] Setting test timeout to 9s * 2.7 3: [0080_admin_ut / 2.110s] DeleteRecords.queue_poll: duration 200.027ms 3: [0080_admin_ut / 2.110s] DeleteRecords: got DeleteRecordsResult in 200.027s 3: [0080_admin_ut / 2.110s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.110s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.110s] Using topic "rdkafkatest_rnd7f3043190d4aa402_do_test_DeleteRecords" 3: [0080_admin_ut / 2.110s] Using topic "rdkafkatest_rnd649456467141dd29_do_test_DeleteRecords" 3: [0080_admin_ut / 2.110s] Using topic "rdkafkatest_rnd49e968aa269d5e2b_do_test_DeleteRecords" 3: [0080_admin_ut / 2.110s] Using topic "rdkafkatest_rnd264e30d46ec1dadf_do_test_DeleteRecords" 3: [0080_admin_ut / 2.110s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 2.110s] DeleteRecords: duration 0.018ms 3: [
/ 6.310s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.004s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 1.004s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 1.004s] Test KafkaConsumer 3: [0080_admin_ut / 2.310s] DeleteRecords.queue_poll: duration 200.037ms 3: [0080_admin_ut / 2.310s] DeleteRecords: got DeleteRecordsResult in 200.037s 3: [0080_admin_ut / 2.310s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-48 DeleteRecords with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.310s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 2.310s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 2.310s] DeleteConsumerGroupOffsets: duration 0.013ms 3: [0084_destroy_flags_local / 2.041s] Calling rd_kafka_unsubscribe 3: [0084_destroy_flags_local / 2.041s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 2.041s] rd_kafka_destroy_flags(0x8): duration 0.249ms 3: [0084_destroy_flags_local / 2.041s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 1, unsubscribe 1, local mode: PASS ] 3: [0084_destroy_flags_local / 2.041s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.041s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.041s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.041s] Created kafka instance 0084_destroy_flags_local#consumer-66 3: [0080_admin_ut / 2.410s] DeleteConsumerGroupOffsets.queue_poll: duration 100.034ms 3: [0080_admin_ut / 2.410s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.034s 3: [0080_admin_ut / 2.410s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.410s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.410s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.410s] DeleteConsumerGroupOffsets: duration 0.005ms 3: [0034_offset_reset_mock / 6.412s] Bringing up the broker 3: [0080_admin_ut / 2.610s] DeleteConsumerGroupOffsets.queue_poll: duration 200.036ms 3: [0080_admin_ut / 2.610s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.036s 3: [0080_admin_ut / 2.610s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.610s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.610s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 2.610s] DeleteConsumerGroupOffsets: duration 0.007ms 3: [0080_admin_ut / 2.810s] DeleteConsumerGroupOffsets.queue_poll: duration 200.036ms 3: [0080_admin_ut / 2.810s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.036s 3: [0080_admin_ut / 2.810s] [ 
do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#producer-48 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 2.810s] Using topic "rdkafkatest_rnd415dbcec749be58f_do_test_AclBinding" 3: [0080_admin_ut / 2.810s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 2.810s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 2.810s] Using topic "rdkafkatest_rnd63ebb99142eb4e38_do_test_AclBindingFilter" 3: [0080_admin_ut / 2.810s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 2.810s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 2.810s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 2.810s] Using topic "rdkafkatest_rnd3da700820996353_do_test_CreateAcls" 3: [0080_admin_ut / 2.810s] Using topic "rdkafkatest_rnd1e669277257e910d_do_test_CreateAcls" 3: [0080_admin_ut / 2.810s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 2.810s] CreateAcls: duration 0.010ms 3: [0084_destroy_flags_local / 2.542s] Calling rd_kafka_destroy_flags(0x0) 3: [0084_destroy_flags_local / 2.542s] rd_kafka_destroy_flags(0x0): duration 0.276ms 3: [0084_destroy_flags_local / 2.542s] [ test destroy_flags 0x0 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 2.542s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode ] 3: [0084_destroy_flags_local / 2.542s] Test config file test.conf not found 3: [0084_destroy_flags_local / 2.542s] Setting test timeout to 20s * 2.7 3: [0084_destroy_flags_local / 2.542s] Created kafka instance 0084_destroy_flags_local#consumer-67 3: [0080_admin_ut / 2.910s] CreateAcls.queue_poll: duration 100.031ms 3: [0080_admin_ut / 2.910s] CreateAcls: got CreateAclsResult in 100.031s 3: [0080_admin_ut / 2.910s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 2.910s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 2.910s] Using topic "rdkafkatest_rnd71a7010f33e64bfa_do_test_CreateAcls" 3: [0080_admin_ut / 2.910s] Using topic "rdkafkatest_rnd3eca2c4804c42508_do_test_CreateAcls" 3: [0080_admin_ut / 2.910s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 2.910s] CreateAcls: duration 0.006ms 3: [0034_offset_reset_mock / 6.965s] #0: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.965s] #0: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 6.965s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 3.110s] CreateAcls.queue_poll: duration 200.037ms 3: [0080_admin_ut / 3.110s] CreateAcls: got CreateAclsResult in 200.037s 3: [0080_admin_ut / 3.110s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.110s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.110s] Using topic "rdkafkatest_rnd4bc62fc649685dda_do_test_CreateAcls" 3: [0080_admin_ut / 3.110s] Using topic "rdkafkatest_rnd4236f8dc0b32670c_do_test_CreateAcls" 3: [0080_admin_ut / 3.110s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 3.110s] CreateAcls: duration 0.006ms 3: [
/ 7.310s] Log: [thrd:127.0.0.1:1/bootstrap]: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 2.004s] Error: Local: Broker transport failure: 127.0.0.1:1/bootstrap: Connect to ipv4#127.0.0.1:1 failed: Connection refused (after 0ms in state CONNECT) 3: [0080_admin_ut / 3.311s] CreateAcls.queue_poll: duration 200.035ms 3: [0080_admin_ut / 3.311s] CreateAcls: got CreateAclsResult in 200.035s 3: [0080_admin_ut / 3.311s] [ do_test_CreateAcls:993: 0080_admin_ut#producer-48 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.311s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.311s] Using topic "rdkafkatest_rnd222d0a2523890f8d_do_test_DescribeAcls" 3: [0080_admin_ut / 3.311s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 3.311s] DescribeAcls: duration 0.009ms 3: [0084_destroy_flags_local / 3.043s] Calling rd_kafka_destroy_flags(0x8) 3: [0084_destroy_flags_local / 3.043s] rd_kafka_destroy_flags(0x8): duration 0.350ms 3: [0084_destroy_flags_local / 3.043s] [ test destroy_flags 0x8 for client_type 1, produce_cnt 0, subscribe 0, unsubscribe 0, local mode: PASS ] 3: [0084_destroy_flags_local / 3.043s] 0084_destroy_flags_local: duration 3043.055ms 3: [0084_destroy_flags_local / 3.043s] ================= Test 0084_destroy_flags_local PASSED ================= 3: [
/ 7.579s] Too many tests running (5 >= 5): postponing 0105_transactions_mock start... 3: [0104_fetch_from_follower_mock/ 0.000s] ================= Running test 0104_fetch_from_follower_mock ================= 3: [0104_fetch_from_follower_mock/ 0.000s] ==== Stats written to file stats_0104_fetch_from_follower_mock_4973454206459278341.json ==== 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.000s] [ Test FFF auto.offset.reset=earliest ] 3: %5|1673491061.587|CONFWARN|MOCK#producer-68| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 0.000s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.005s] Created kafka instance 0104_fetch_from_follower_mock#producer-69 3: [0104_fetch_from_follower_mock/ 0.005s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.005s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 0.006s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 0.006s] PRODUCE: duration 1.115ms 3: [0080_admin_ut / 3.411s] DescribeAcls.queue_poll: duration 100.021ms 3: [0080_admin_ut / 3.411s] DescribeAcls: got DescribeAclsResult in 100.021s 3: [0080_admin_ut / 3.411s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.411s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.411s] Using topic "rdkafkatest_rnd2a9cffdc411e1f5e_do_test_DescribeAcls" 3: [0080_admin_ut / 3.411s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.411s] DescribeAcls: duration 0.005ms 3: [0104_fetch_from_follower_mock/ 0.083s] PRODUCE.DELIVERY.WAIT: duration 76.799ms 3: [0104_fetch_from_follower_mock/ 0.092s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 0.092s] Created kafka instance 0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 0.097s] ASSIGN.PARTITIONS: duration 4.387ms 3: [0104_fetch_from_follower_mock/ 0.097s] earliest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 0.097s] earliest: consume 1000 messages 3: [0034_offset_reset_mock / 7.473s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 7.473s] All expected topics (not?) 
seen in metadata 3: [0034_offset_reset_mock / 7.473s] METADATA.WAIT: duration 508.843ms 3: [0080_admin_ut / 3.611s] DescribeAcls.queue_poll: duration 200.033ms 3: [0080_admin_ut / 3.611s] DescribeAcls: got DescribeAclsResult in 200.033s 3: [0080_admin_ut / 3.611s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.611s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.611s] Using topic "rdkafkatest_rnd6177f75029cd42f6_do_test_DescribeAcls" 3: [0080_admin_ut / 3.611s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 3.611s] DescribeAcls: duration 0.006ms 3: [0080_admin_ut / 3.811s] DescribeAcls.queue_poll: duration 200.035ms 3: [0080_admin_ut / 3.811s] DescribeAcls: got DescribeAclsResult in 200.035s 3: [0080_admin_ut / 3.811s] [ do_test_DescribeAcls:1108: 0080_admin_ut#producer-48 DescribeAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 3.811s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 3.811s] Using topic "rdkafkatest_rnd4e68c361460c4d96_do_test_DeleteAcls" 3: [0080_admin_ut / 3.811s] Using topic "rdkafkatest_rnd1b0f201f18522c0b_do_test_DeleteAcls" 3: [0080_admin_ut / 3.811s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 3.811s] DeleteAcls: duration 0.008ms 3: [0080_admin_ut / 3.911s] DeleteAcls.queue_poll: duration 100.032ms 3: [0080_admin_ut / 3.911s] DeleteAcls: got DeleteAclsResult in 100.032s 3: [0080_admin_ut / 3.911s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 3.911s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 3.911s] Using topic "rdkafkatest_rnd6ca9abc2415d50f3_do_test_DeleteAcls" 3: [0080_admin_ut / 3.911s] Using topic "rdkafkatest_rnd71406ea2e0768ae_do_test_DeleteAcls" 3: [0080_admin_ut / 3.911s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 3.911s] DeleteAcls: duration 0.006ms 3: %4|1673491062.303|OFFSET|0104_fetch_from_follower_mock#consumer-70| [thrd:main]: test [0]: offset reset (at offset 10, broker 2) to cached BEGINNING offset 0: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: [0080_admin_ut / 4.111s] DeleteAcls.queue_poll: duration 200.034ms 3: [0080_admin_ut / 4.111s] DeleteAcls: got DeleteAclsResult in 200.034s 3: [0080_admin_ut / 4.111s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.111s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 4.111s] Using topic "rdkafkatest_rnd35f936826affc07c_do_test_DeleteAcls" 3: [0080_admin_ut / 4.111s] Using topic "rdkafkatest_rnd70f2b6e639d3a68a_do_test_DeleteAcls" 3: [0080_admin_ut / 4.111s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 4.111s] DeleteAcls: duration 0.007ms 3: [
/ 8.310s] Log: [thrd:127.0.0.1:2/bootstrap]: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 3.004s] Error: Local: Broker transport failure: 127.0.0.1:2/bootstrap: Connect to ipv4#127.0.0.1:2 failed: Connection refused (after 0ms in state CONNECT) 3: [0095_all_brokers_down / 3.004s] Error: Local: All broker connections are down: 2/2 brokers are down 3: [0095_all_brokers_down / 3.005s] 0095_all_brokers_down: duration 3005.167ms 3: [0095_all_brokers_down / 3.005s] ================= Test 0095_all_brokers_down PASSED ================= 3: [
/ 8.312s] Too many tests running (5 >= 5): postponing 0106_cgrp_sess_timeout start... 3: [0105_transactions_mock / 0.000s] ================= Running test 0105_transactions_mock ================= 3: [0105_transactions_mock / 0.000s] ==== Stats written to file stats_0105_transactions_mock_835738578076649822.json ==== 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] [ do_test_txn_recoverable_errors:194 ] 3: [0105_transactions_mock / 0.000s] Test config file test.conf not found 3: [0105_transactions_mock / 0.000s] Setting test timeout to 60s * 2.7 3: %5|1673491062.320|MOCK|0105_transactions_mock#producer-71| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42633,127.0.0.1:43533,127.0.0.1:34601 3: [0105_transactions_mock / 0.008s] Created kafka instance 0105_transactions_mock#producer-71 3: %4|1673491062.362|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator not available: retrying 3: [0080_admin_ut / 4.311s] DeleteAcls.queue_poll: duration 200.036ms 3: [0080_admin_ut / 4.311s] DeleteAcls: got DeleteAclsResult in 200.036s 3: [0080_admin_ut / 4.311s] [ do_test_DeleteAcls:1221: 0080_admin_ut#producer-48 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 4.311s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 4.311s] Creating 2 topics 3: [0080_admin_ut / 4.311s] Deleting 1 topics 3: [0080_admin_ut / 4.311s] Creating 1 topics 3: [0080_admin_ut / 4.311s] Deleting 3 groups 3: [0080_admin_ut / 4.311s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 4.311s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 4.311s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 4.311s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 4.311s] Creating 2 topics 3: [0080_admin_ut / 4.311s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0080_admin_ut / 4.411s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 4.411s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: [0080_admin_ut / 4.411s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 4.411s] [ do_test_mix:1342: PASS (0.10s) ] 3: [0080_admin_ut / 4.411s] [ do_test_configs:1411 ] 3: [0034_offset_reset_mock / 8.473s] #1: injecting _TRANSPORT, expecting NO_ERROR 3: [0034_offset_reset_mock / 8.474s] ASSIGN.PARTITIONS: duration 0.072ms 3: [0034_offset_reset_mock / 8.474s] ASSIGN: assigned 1 partition(s) 3: %4|1673491062.719|FAIL|0034_offset_reset_mock#consumer-26| [thrd:127.0.0.1:45729/bootstrap]: 127.0.0.1:45729/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL 
authentication (after 1550ms in state UP) 3: [0034_offset_reset_mock / 8.474s] #1: Ignoring Error event: 127.0.0.1:45729/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 1550ms in state UP) 3: [0104_fetch_from_follower_mock/ 1.225s] CONSUME: duration 1128.265ms 3: [0104_fetch_from_follower_mock/ 1.225s] earliest: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 1.225s] Closing consumer 0104_fetch_from_follower_mock#consumer-70 3: [0104_fetch_from_follower_mock/ 1.225s] CONSUMER.CLOSE: duration 0.216ms 3: [0104_fetch_from_follower_mock/ 1.227s] [ Test FFF auto.offset.reset=earliest PASSED ] 3: [0104_fetch_from_follower_mock/ 1.227s] [ Test FFF auto.offset.reset=latest ] 3: %5|1673491062.813|CONFWARN|MOCK#producer-72| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 1.227s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.231s] Created kafka instance 0104_fetch_from_follower_mock#producer-73 3: [0104_fetch_from_follower_mock/ 1.231s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.231s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 1.231s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 1.231s] PRODUCE: duration 0.796ms 3: %4|1673491062.855|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Not coordinator: retrying 3: [0104_fetch_from_follower_mock/ 1.296s] PRODUCE.DELIVERY.WAIT: duration 64.409ms 3: [0104_fetch_from_follower_mock/ 1.308s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 1.309s] Created kafka instance 0104_fetch_from_follower_mock#consumer-74 3: [0104_fetch_from_follower_mock/ 1.309s] ASSIGN.PARTITIONS: duration 0.035ms 3: [0104_fetch_from_follower_mock/ 1.309s] latest: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 1.309s] latest: not expecting any messages for 5000ms 3: %4|1673491063.355|GETPID|0105_transactions_mock#producer-71| [thrd:main]: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Coordinator load in progress: retrying 3: [0105_transactions_mock / 1.536s] rd_kafka_init_transactions(rk, 5000): duration 1517.245ms 3: [0105_transactions_mock / 1.536s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: [0105_transactions_mock / 1.536s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.016ms 3: %4|1673491064.030|OFFSET|0104_fetch_from_follower_mock#consumer-74| [thrd:main]: test [0]: offset reset (at offset 1000, broker 2) to END: fetch failed due to requested offset not available on the broker: Broker: Offset out of range 3: [0105_transactions_mock / 2.007s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) 
= (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0105_transactions_mock / 2.109s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 0.959ms 3: [0080_admin_ut / 6.411s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 6.412s] Test config file test.conf not found 3: %5|1673491064.617|CONFWARN|0080_admin_ut#producer-75| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.412s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-75 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.412s] Using topic "rdkafkatest_rnd5f5237987d4024de_do_test_DeleteRecords" 3: [0080_admin_ut / 6.412s] Using topic "rdkafkatest_rnd433f95581e1c63e0_do_test_DeleteRecords" 3: [0080_admin_ut / 6.412s] Using topic "rdkafkatest_rnd20449e70f05c51e_do_test_DeleteRecords" 3: [0080_admin_ut / 6.412s] Using topic "rdkafkatest_rnd6784c1ba443b42c3_do_test_DeleteRecords" 3: [0080_admin_ut / 6.412s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 6.412s] DeleteRecords: duration 0.017ms 3: [0080_admin_ut / 6.412s] [ do_test_DeleteRecords:520: 0080_admin_ut#producer-75 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 6.418s] Test config file test.conf not found 3: %5|1673491064.623|CONFWARN|0080_admin_ut#producer-76| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.418s] [ do_test_DeleteGroups:402: 0080_admin_ut#producer-76 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 6.418s] Using topic "rdkafkatest_rnd1a382c2a09b1cbdf_do_test_DeleteGroups" 3: [0080_admin_ut / 6.418s] Using topic "rdkafkatest_rnd67c452505f3d6e6a_do_test_DeleteGroups" 3: [0080_admin_ut / 6.418s] Using topic "rdkafkatest_rnd1c5f3fe41261522d_do_test_DeleteGroups" 3: [0080_admin_ut / 6.418s] Using topic "rdkafkatest_rnd205b8dc97dd73734_do_test_DeleteGroups" 3: [0080_admin_ut / 6.418s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 6.418s] DeleteGroups: duration 0.011ms 3: [0080_admin_ut / 6.418s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 6.427s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq ] 3: [0080_admin_ut / 6.427s] Test config file test.conf not found 3: %5|1673491064.632|CONFWARN|0080_admin_ut#consumer-77| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.528s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.528s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.528s] rd_kafka_destroy(): duration 0.215ms 3: [0080_admin_ut / 6.528s] [ do_test_unclean_destroy:1505: Test unclean destroy using tempq: PASS (0.10s) ] 3: [0080_admin_ut / 6.528s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.528s] [ do_test_unclean_destroy:1505: Test unclean destroy using mainq ] 3: [0080_admin_ut / 6.528s] Test config file test.conf not found 3: %5|1673491064.733|CONFWARN|0080_admin_ut#consumer-78| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.628s] Giving rd_kafka_destroy() 5s to finish, despite Admin API request being processed 3: [0080_admin_ut / 6.628s] Setting test timeout to 5s * 2.7 3: [0080_admin_ut / 6.628s] rd_kafka_destroy(): duration 0.187ms 3: [0080_admin_ut / 6.628s] 
[ do_test_unclean_destroy:1505: Test unclean destroy using mainq: PASS (0.10s) ] 3: [0080_admin_ut / 6.628s] Setting test timeout to 60s * 2.7 3: [0080_admin_ut / 6.628s] Test config file test.conf not found 3: %4|1673491064.833|CONFWARN|0080_admin_ut#consumer-79| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1673491064.833|CONFWARN|0080_admin_ut#consumer-79| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 6.629s] [ do_test_options:1588 ] 3: [0080_admin_ut / 6.629s] [ do_test_options:1588: PASS (0.00s) ] 3: [0080_admin_ut / 6.629s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd3c2e95236ec4512a_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd573db54207167d35_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd189b06360e2a841f_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd4e943cb8792a449b_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd867e34304c3686a_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Using topic "rdkafkatest_rnd67ba1adb02038d49_do_test_CreateTopics" 3: [0080_admin_ut / 6.629s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 6.629s] CreateTopics: duration 0.078ms 3: [0080_admin_ut / 6.729s] CreateTopics.queue_poll: duration 99.961ms 3: [0080_admin_ut / 6.729s] CreateTopics: got CreateTopicsResult in 99.961s 3: [0080_admin_ut / 6.729s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.729s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms ] 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd5d67ebb0407d730_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd6d5b4075484319f3_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd770d0c5530076c43_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd136c4c394268be70_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd1143836e7e975393_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Using topic "rdkafkatest_rnd5527083955d508d6_do_test_CreateTopics" 3: [0080_admin_ut / 6.729s] Call CreateTopics, timeout is 100ms 3: [0080_admin_ut / 6.729s] CreateTopics: duration 0.077ms 3: [0080_admin_ut / 6.829s] CreateTopics.wait_background_event_cb: duration 99.990ms 3: [0080_admin_ut / 6.829s] CreateTopics: got CreateTopicsResult in 99.990s 3: [0080_admin_ut / 6.829s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, no options, background_event_cb, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 6.829s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 6.829s] Using topic "rdkafkatest_rnd5b438c66e700f0c_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Using topic "rdkafkatest_rnd6448d2013d044bc4_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Using topic "rdkafkatest_rnd33d02223456c2f07_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Using topic 
"rdkafkatest_rnd12b0bba32d2649e2_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Using topic "rdkafkatest_rnd34d0ea7b32fcc89d_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Using topic "rdkafkatest_rnd65f6de7220580913_do_test_CreateTopics" 3: [0080_admin_ut / 6.829s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 6.830s] CreateTopics: duration 0.078ms 3: [0080_admin_ut / 7.029s] CreateTopics.queue_poll: duration 199.959ms 3: [0080_admin_ut / 7.029s] CreateTopics: got CreateTopicsResult in 199.959s 3: [0080_admin_ut / 7.030s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.030s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd3154fe9317651568_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd5bf08d1e2ad161a1_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd26e7a9733c14e50f_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd60a316f1113bed48_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd217501a516f0260e_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Using topic "rdkafkatest_rnd707170787b38f810_do_test_CreateTopics" 3: [0080_admin_ut / 7.030s] Call CreateTopics, timeout is 200ms 3: [0080_admin_ut / 7.030s] CreateTopics: duration 0.076ms 3: [0080_admin_ut / 7.230s] CreateTopics.queue_poll: duration 199.962ms 3: [0080_admin_ut / 7.230s] CreateTopics: got CreateTopicsResult in 199.962s 3: [0080_admin_ut / 7.230s] [ do_test_CreateTopics:101: 0080_admin_ut#consumer-79 CreateTopics with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 7.230s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-79 DeleteTopics with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.230s] Using topic "rdkafkatest_rnd389823e62f091a33_do_test_DeleteTopics" 3: [0080_admin_ut / 7.230s] Using topic "rdkafkatest_rnd52f5ab3a05823fbf_do_test_DeleteTopics" 3: [0080_admin_ut / 7.230s] Using topic "rdkafkatest_rnd41b9d5d7001bf51d_do_test_DeleteTopics" 3: [0080_admin_ut / 7.230s] Using topic "rdkafkatest_rnd546fe8e1768ac052_do_test_DeleteTopics" 3: [0080_admin_ut / 7.230s] Call DeleteTopics, timeout is 100ms 3: [0080_admin_ut / 7.230s] DeleteTopics: duration 0.008ms 3: [0080_admin_ut / 7.330s] DeleteTopics.queue_poll: duration 100.036ms 3: [0080_admin_ut / 7.330s] DeleteTopics: got DeleteTopicsResult in 100.036s 3: [0080_admin_ut / 7.330s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.330s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-79 DeleteTopics with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.330s] Using topic "rdkafkatest_rnd3318bdba27656933_do_test_DeleteTopics" 3: [0080_admin_ut / 7.330s] Using topic "rdkafkatest_rnd5c819ec45370c6cd_do_test_DeleteTopics" 3: [0080_admin_ut / 7.330s] Using topic "rdkafkatest_rnd429e03780dd69d57_do_test_DeleteTopics" 3: [0080_admin_ut / 7.330s] Using topic "rdkafkatest_rnd6ad5dc350dde0a01_do_test_DeleteTopics" 3: [0080_admin_ut / 7.330s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.330s] DeleteTopics: duration 0.008ms 3: [0105_transactions_mock / 3.311s] rd_kafka_commit_transaction(rk, 5000): duration 1202.589ms 3: [0105_transactions_mock / 3.312s] [ do_test_txn_recoverable_errors:194: PASS (3.31s) ] 3: [0105_transactions_mock / 3.312s] [ do_test_txn_fatal_idempo_errors:305 ] 3: 
[0105_transactions_mock / 3.312s] Test config file test.conf not found 3: [0105_transactions_mock / 3.312s] Setting test timeout to 60s * 2.7 3: %5|1673491065.632|MOCK|0105_transactions_mock#producer-80| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45545,127.0.0.1:43537,127.0.0.1:45275 3: [0105_transactions_mock / 3.313s] Created kafka instance 0105_transactions_mock#producer-80 3: [0105_transactions_mock / 3.334s] rd_kafka_init_transactions(rk, 5000): duration 20.882ms 3: [0105_transactions_mock / 3.334s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: [0105_transactions_mock / 3.334s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock / 3.334s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0080_admin_ut / 7.530s] DeleteTopics.queue_poll: duration 200.034ms 3: [0080_admin_ut / 7.530s] DeleteTopics: got DeleteTopicsResult in 200.034s 3: [0080_admin_ut / 7.530s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.530s] [ do_test_DeleteTopics:300: 0080_admin_ut#consumer-79 DeleteTopics with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.530s] Using topic "rdkafkatest_rnd69c72a7615a73dd7_do_test_DeleteTopics" 3: [0080_admin_ut / 7.530s] Using topic "rdkafkatest_rnd1b86cefa10aed3e9_do_test_DeleteTopics" 3: [0080_admin_ut / 7.530s] Using topic "rdkafkatest_rnd51bc22e627c6e786_do_test_DeleteTopics" 3: [0080_admin_ut / 7.530s] Using topic "rdkafkatest_rnd7151eada62f8102f_do_test_DeleteTopics" 3: [0080_admin_ut / 7.530s] Call DeleteTopics, timeout is 200ms 3: [0080_admin_ut / 7.530s] DeleteTopics: duration 0.007ms 3: [0080_admin_ut / 7.730s] DeleteTopics.queue_poll: duration 200.034ms 3: [0080_admin_ut / 7.730s] DeleteTopics: got DeleteTopicsResult in 200.034s 3: [0080_admin_ut / 7.730s] [ do_test_DeleteTopics:371 ] 3: [0080_admin_ut / 7.730s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-79 DeleteGroups with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 7.730s] Using topic "rdkafkatest_rnd9dc08e912c6ec7f_do_test_DeleteGroups" 3: [0080_admin_ut / 7.730s] Using topic "rdkafkatest_rnd79e8363d5a613957_do_test_DeleteGroups" 3: [0080_admin_ut / 7.730s] Using topic "rdkafkatest_rnd3385cf775212e4d_do_test_DeleteGroups" 3: [0080_admin_ut / 7.730s] Using topic "rdkafkatest_rnd67eab58a3bd080de_do_test_DeleteGroups" 3: [0080_admin_ut / 7.730s] Call DeleteGroups, timeout is 100ms 3: [0080_admin_ut / 7.730s] DeleteGroups: duration 0.015ms 3: [0080_admin_ut / 7.830s] DeleteGroups.queue_poll: duration 100.042ms 3: [0080_admin_ut / 7.830s] DeleteGroups: got DeleteGroupsResult in 100.042s 3: [0080_admin_ut / 7.830s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 7.830s] [ do_test_DeleteGroups:402: 
0080_admin_ut#consumer-79 DeleteGroups with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 7.830s] Using topic "rdkafkatest_rnd242a48813ae060c5_do_test_DeleteGroups" 3: [0080_admin_ut / 7.830s] Using topic "rdkafkatest_rnd4152c09d65e41e58_do_test_DeleteGroups" 3: [0080_admin_ut / 7.830s] Using topic "rdkafkatest_rnd3afc55e215c2a97f_do_test_DeleteGroups" 3: [0080_admin_ut / 7.830s] Using topic "rdkafkatest_rnd5c6edeaa6e15139c_do_test_DeleteGroups" 3: [0080_admin_ut / 7.830s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 7.830s] DeleteGroups: duration 0.010ms 3: [0080_admin_ut / 8.030s] DeleteGroups.queue_poll: duration 200.046ms 3: [0080_admin_ut / 8.030s] DeleteGroups: got DeleteGroupsResult in 200.046s 3: [0080_admin_ut / 8.030s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.030s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-79 DeleteGroups with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.030s] Using topic "rdkafkatest_rnd3d2812b238f07d6f_do_test_DeleteGroups" 3: [0080_admin_ut / 8.030s] Using topic "rdkafkatest_rnd4185da6a7fc6162b_do_test_DeleteGroups" 3: [0080_admin_ut / 8.030s] Using topic "rdkafkatest_rnd46c71ac62c5bb69f_do_test_DeleteGroups" 3: [0080_admin_ut / 8.030s] Using topic "rdkafkatest_rndda4202c308e453c_do_test_DeleteGroups" 3: [0080_admin_ut / 8.030s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 8.030s] DeleteGroups: duration 0.010ms 3: [0080_admin_ut / 8.231s] DeleteGroups.queue_poll: duration 200.043ms 3: [0080_admin_ut / 8.231s] DeleteGroups: got DeleteGroupsResult in 200.043s 3: [0080_admin_ut / 8.231s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 8.231s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-79 DeleteRecords with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.231s] Using topic "rdkafkatest_rnd4202f476292aef26_do_test_DeleteRecords" 3: [0080_admin_ut / 8.231s] Using topic "rdkafkatest_rnd413d192513bf175d_do_test_DeleteRecords" 3: [0080_admin_ut / 8.231s] Using topic "rdkafkatest_rnd50f1d6ac328f0400_do_test_DeleteRecords" 3: [0080_admin_ut / 8.231s] Using topic "rdkafkatest_rnd76b7278c5acddf95_do_test_DeleteRecords" 3: [0080_admin_ut / 8.231s] Call DeleteRecords, timeout is 100ms 3: [0080_admin_ut / 8.231s] DeleteRecords: duration 0.020ms 3: [0034_offset_reset_mock / 12.257s] #1: message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 12.257s] #1: got expected message at offset 0 (NO_ERROR) 3: [0034_offset_reset_mock / 12.257s] Waiting for up to 5000ms for metadata update 3: [0080_admin_ut / 8.331s] DeleteRecords.queue_poll: duration 100.027ms 3: [0080_admin_ut / 8.331s] DeleteRecords: got DeleteRecordsResult in 100.027s 3: [0080_admin_ut / 8.331s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-79 DeleteRecords with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.331s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-79 DeleteRecords with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.331s] Using topic "rdkafkatest_rnd4555f07f709f5dc9_do_test_DeleteRecords" 3: [0080_admin_ut / 8.331s] Using topic "rdkafkatest_rnd352f18ed488e4d77_do_test_DeleteRecords" 3: [0080_admin_ut / 8.331s] Using topic "rdkafkatest_rnd65c08c171d19ce77_do_test_DeleteRecords" 3: [0080_admin_ut / 8.331s] Using topic "rdkafkatest_rnd45ece5509ead498_do_test_DeleteRecords" 3: [0080_admin_ut / 8.331s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.331s] DeleteRecords: duration 0.019ms 3: %3|1673491066.636|TXNERR|0105_transactions_mock#producer-80| 
[thrd:127.0.0.1:45545/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 4.317s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 4.317s] rd_kafka_abort_transaction(rk, -1): duration 0.270ms 3: [0105_transactions_mock / 4.317s] rd_kafka_begin_transaction(rk): duration 0.080ms 3: [0105_transactions_mock / 4.317s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0105_transactions_mock / 4.320s] rd_kafka_commit_transaction(rk, -1): duration 2.219ms 3: [0105_transactions_mock / 4.320s] [ do_test_txn_fatal_idempo_errors:305: PASS (1.01s) ] 3: [0105_transactions_mock / 4.320s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH ] 3: [0105_transactions_mock / 4.320s] Test config file test.conf not found 3: [0105_transactions_mock / 4.320s] Setting test timeout to 60s * 2.7 3: %5|1673491066.640|MOCK|0105_transactions_mock#producer-81| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46503,127.0.0.1:38205,127.0.0.1:42393 3: [0105_transactions_mock / 4.321s] Created kafka instance 0105_transactions_mock#producer-81 3: [0105_transactions_mock / 4.340s] rd_kafka_init_transactions(rk, -1): duration 15.283ms 3: [0105_transactions_mock / 4.340s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: [0105_transactions_mock / 4.340s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock / 4.340s] 0105_transactions_mock#producer-81: Flushing 1 messages 3: [0080_admin_ut / 8.531s] DeleteRecords.queue_poll: duration 200.023ms 3: [0080_admin_ut / 8.531s] DeleteRecords: got DeleteRecordsResult in 200.023s 3: [0080_admin_ut / 8.531s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-79 DeleteRecords with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.531s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-79 DeleteRecords with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.531s] Using topic "rdkafkatest_rnd57fa2f3c45b18ef2_do_test_DeleteRecords" 3: [0080_admin_ut / 8.531s] Using topic "rdkafkatest_rnd6fcef2f012f6851e_do_test_DeleteRecords" 3: [0080_admin_ut / 8.531s] Using topic "rdkafkatest_rnd5b7438714c3dd19a_do_test_DeleteRecords" 3: [0080_admin_ut / 8.531s] Using topic "rdkafkatest_rnd10b98bb189c4b24_do_test_DeleteRecords" 3: [0080_admin_ut / 8.531s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 8.531s] DeleteRecords: duration 0.019ms 3: [0080_admin_ut / 8.731s] DeleteRecords.queue_poll: duration 200.027ms 3: [0080_admin_ut / 8.731s] DeleteRecords: got DeleteRecordsResult in 200.027s 3: [0080_admin_ut / 8.731s] [ do_test_DeleteRecords:520: 
0080_admin_ut#consumer-79 DeleteRecords with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 8.731s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 8.731s] Call DeleteConsumerGroupOffsets, timeout is 100ms 3: [0080_admin_ut / 8.731s] DeleteConsumerGroupOffsets: duration 0.011ms 3: [0034_offset_reset_mock / 12.760s] Metadata verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 12.760s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 12.760s] METADATA.WAIT: duration 502.493ms 3: [0080_admin_ut / 8.831s] DeleteConsumerGroupOffsets.queue_poll: duration 100.026ms 3: [0080_admin_ut / 8.831s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 100.026s 3: [0080_admin_ut / 8.831s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 8.831s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 8.831s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 8.831s] DeleteConsumerGroupOffsets: duration 0.008ms 3: [0103_transactions_local / 7.013s] init_transactions(): duration 7000.024ms 3: [0103_transactions_local / 7.013s] init_transactions() failed as expected: Failed to initialize Producer ID: Local: Timed out 3: [0103_transactions_local / 7.014s] [ do_test_txn_local:1168: PASS (7.01s) ] 3: [0103_transactions_local / 7.014s] 0103_transactions_local: duration 7013.890ms 3: [0103_transactions_local / 7.014s] ================= Test 0103_transactions_local PASSED ================= 3: [0080_admin_ut / 9.031s] DeleteConsumerGroupOffsets.queue_poll: duration 200.027ms 3: [0080_admin_ut / 9.031s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.027s 3: [0080_admin_ut / 9.031s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.031s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.031s] Call DeleteConsumerGroupOffsets, timeout is 200ms 3: [0080_admin_ut / 9.031s] DeleteConsumerGroupOffsets: duration 0.008ms 3: [
/ 13.258s] Too many tests running (5 >= 5): postponing 0113_cooperative_rebalance_local start... 3: [0106_cgrp_sess_timeout / 0.000s] ================= Running test 0106_cgrp_sess_timeout ================= 3: [0106_cgrp_sess_timeout / 0.000s] ==== Stats written to file stats_0106_cgrp_sess_timeout_373322720308458277.json ==== 3: [0106_cgrp_sess_timeout / 0.000s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.000s] [ do_test_session_timeout:152: Test session timeout with sync commit ] 3: %5|1673491067.267|CONFWARN|MOCK#producer-82| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 0.003s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.006s] Created kafka instance 0106_cgrp_sess_timeout#producer-83 3: [0106_cgrp_sess_timeout / 0.006s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.006s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 0.006s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 0.006s] PRODUCE: duration 0.062ms 3: [0106_cgrp_sess_timeout / 0.060s] PRODUCE.DELIVERY.WAIT: duration 54.605ms 3: [0106_cgrp_sess_timeout / 0.065s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 0.065s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 0.074s] Created kafka instance 0106_cgrp_sess_timeout#consumer-84 3: [0106_cgrp_sess_timeout / 0.078s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0080_admin_ut / 9.231s] DeleteConsumerGroupOffsets.queue_poll: duration 200.034ms 3: [0080_admin_ut / 9.231s] DeleteConsumerGroupOffsets: got DeleteConsumerGroupOffsetsResult in 200.034s 3: [0080_admin_ut / 9.231s] [ do_test_DeleteConsumerGroupOffsets:621: 0080_admin_ut#consumer-79 DeleteConsumerGroupOffsets with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.231s] Using topic "rdkafkatest_rnd1862614f4bf569d0_do_test_AclBinding" 3: [0080_admin_ut / 9.231s] [ do_test_AclBinding:721 ] 3: [0080_admin_ut / 9.231s] [ do_test_AclBinding:721: PASS (0.00s) ] 3: [0080_admin_ut / 9.231s] Using topic "rdkafkatest_rnd6eed29c42606817b_do_test_AclBindingFilter" 3: [0080_admin_ut / 9.231s] [ do_test_AclBindingFilter:853 ] 3: [0080_admin_ut / 9.231s] [ do_test_AclBindingFilter:853: PASS (0.00s) ] 3: [0080_admin_ut / 9.231s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.231s] Using topic "rdkafkatest_rnd7c83af0c30f01e3b_do_test_CreateAcls" 3: [0080_admin_ut / 9.231s] Using topic "rdkafkatest_rnd4f3170a13dc0c832_do_test_CreateAcls" 3: [0080_admin_ut / 9.231s] Call CreateAcls, timeout is 100ms 3: [0080_admin_ut / 9.231s] CreateAcls: duration 0.008ms 3: [0080_admin_ut / 9.331s] CreateAcls.queue_poll: duration 100.031ms 3: [0080_admin_ut / 9.331s] CreateAcls: got CreateAclsResult in 100.031s 3: [0080_admin_ut / 9.331s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.331s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.331s] Using topic "rdkafkatest_rnd44af35982023474d_do_test_CreateAcls" 3: [0080_admin_ut / 9.331s] Using topic "rdkafkatest_rnd704fcc323b665d24_do_test_CreateAcls" 3: [0080_admin_ut / 9.331s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.331s] CreateAcls: duration 0.008ms 3: [0105_transactions_mock / 5.323s] FLUSH: 
duration 982.926ms 3: [0105_transactions_mock / 5.323s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [0105_transactions_mock / 5.323s] 0105_transactions_mock#producer-81: Flushing 1 messages 3: %3|1673491067.643|TXNERR|0105_transactions_mock#producer-81| [thrd:127.0.0.1:42393/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 5.323s] FLUSH: duration 0.124ms 3: %1|1673491067.644|TXNERR|0105_transactions_mock#producer-81| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch (_FENCED) 3: %0|1673491067.644|FATAL|0105_transactions_mock#producer-81| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.324s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.324s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: Producer attempted an operation with an old epoch 3: [0105_transactions_mock / 5.325s] [ do_test_txn_fenced_reinit:511: With error INVALID_PRODUCER_EPOCH: PASS (1.00s) ] 3: [0105_transactions_mock / 5.325s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED ] 3: [0105_transactions_mock / 5.325s] Test config file test.conf not found 3: [0105_transactions_mock / 5.325s] Setting test timeout to 60s * 2.7 3: %5|1673491067.645|MOCK|0105_transactions_mock#producer-85| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35005,127.0.0.1:42445,127.0.0.1:37735 3: [0105_transactions_mock / 5.326s] Created kafka instance 0105_transactions_mock#producer-85 3: [0105_transactions_mock / 5.370s] rd_kafka_init_transactions(rk, -1): duration 41.265ms 3: [0105_transactions_mock / 5.370s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 5.370s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.015ms 3: [0105_transactions_mock / 5.370s] 0105_transactions_mock#producer-85: Flushing 1 messages 3: [0080_admin_ut / 9.531s] CreateAcls.queue_poll: duration 200.032ms 3: [0080_admin_ut / 9.531s] CreateAcls: got CreateAclsResult in 200.032s 3: [0080_admin_ut / 9.531s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with temp queue, options, 
timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.531s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.531s] Using topic "rdkafkatest_rnd7af126e335a5bcb1_do_test_CreateAcls" 3: [0080_admin_ut / 9.531s] Using topic "rdkafkatest_rnd2c05baed30203fd0_do_test_CreateAcls" 3: [0080_admin_ut / 9.531s] Call CreateAcls, timeout is 200ms 3: [0080_admin_ut / 9.531s] CreateAcls: duration 0.007ms 3: [0104_fetch_from_follower_mock/ 6.309s] CONSUME: duration 5000.063ms 3: [0104_fetch_from_follower_mock/ 6.309s] test_consumer_poll_no_msgs:4075: latest: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.309s] test_consumer_poll_no_msgs:4075: latest: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 6.309s] Closing consumer 0104_fetch_from_follower_mock#consumer-74 3: [0104_fetch_from_follower_mock/ 6.309s] CONSUMER.CLOSE: duration 0.040ms 3: [0104_fetch_from_follower_mock/ 6.310s] [ Test FFF auto.offset.reset=latest PASSED ] 3: [0104_fetch_from_follower_mock/ 6.310s] [ Test lagging FFF offset reset ] 3: %5|1673491067.896|CONFWARN|MOCK#producer-86| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 6.310s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.310s] Created kafka instance 0104_fetch_from_follower_mock#producer-87 3: [0104_fetch_from_follower_mock/ 6.310s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.310s] Produce to test [0]: messages #0..10 3: [0104_fetch_from_follower_mock/ 6.310s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 6.310s] PRODUCE: duration 0.015ms 3: [0080_admin_ut / 9.731s] CreateAcls.queue_poll: duration 200.036ms 3: [0080_admin_ut / 9.732s] CreateAcls: got CreateAclsResult in 200.036s 3: [0080_admin_ut / 9.732s] [ do_test_CreateAcls:993: 0080_admin_ut#consumer-79 CreaetAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 9.732s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 9.732s] Using topic "rdkafkatest_rnd7e340a2811c64704_do_test_DescribeAcls" 3: [0080_admin_ut / 9.732s] Call DescribeAcls, timeout is 100ms 3: [0080_admin_ut / 9.732s] DescribeAcls: duration 0.008ms 3: [0104_fetch_from_follower_mock/ 6.368s] PRODUCE.DELIVERY.WAIT: duration 57.084ms 3: [0104_fetch_from_follower_mock/ 6.375s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 6.375s] Created kafka instance 0104_fetch_from_follower_mock#consumer-88 3: [0104_fetch_from_follower_mock/ 6.379s] ASSIGN.PARTITIONS: duration 3.521ms 3: [0104_fetch_from_follower_mock/ 6.379s] lag: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 6.379s] up to wmark: consume 7 messages 3: [0034_offset_reset_mock / 13.761s] #2: injecting TOPIC_AUTHORIZATION_FAILED, expecting TOPIC_AUTHORIZATION_FAILED 3: [0034_offset_reset_mock / 13.761s] ASSIGN.PARTITIONS: duration 0.062ms 3: [0034_offset_reset_mock / 13.761s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 13.767s] #2: injected TOPIC_AUTHORIZATION_FAILED, got error _AUTO_OFFSET_RESET: failed to query logical offset: Broker: Topic authorization failed (broker 1) 3: [0034_offset_reset_mock / 13.767s] Waiting for up to 5000ms for metadata update 3: [0034_offset_reset_mock / 13.771s] Metadata 
verification succeeded: 1 desired topics seen, 0 undesired topics not seen 3: [0034_offset_reset_mock / 13.771s] All expected topics (not?) seen in metadata 3: [0034_offset_reset_mock / 13.771s] METADATA.WAIT: duration 3.989ms 3: [0080_admin_ut / 9.832s] DescribeAcls.queue_poll: duration 100.030ms 3: [0080_admin_ut / 9.832s] DescribeAcls: got DescribeAclsResult in 100.030s 3: [0080_admin_ut / 9.832s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 9.832s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 9.832s] Using topic "rdkafkatest_rnd4d3a0e470292d87d_do_test_DescribeAcls" 3: [0080_admin_ut / 9.832s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 9.832s] DescribeAcls: duration 0.007ms 3: [0104_fetch_from_follower_mock/ 6.494s] CONSUME: duration 115.129ms 3: [0104_fetch_from_follower_mock/ 6.494s] up to wmark: consumed 7/7 messages (0/0 EOFs) 3: [0104_fetch_from_follower_mock/ 6.494s] no msgs: not expecting any messages for 3000ms 3: [0080_admin_ut / 10.032s] DescribeAcls.queue_poll: duration 200.029ms 3: [0080_admin_ut / 10.032s] DescribeAcls: got DescribeAclsResult in 200.029s 3: [0080_admin_ut / 10.032s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.032s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.032s] Using topic "rdkafkatest_rnd1bb11b9c25343d84_do_test_DescribeAcls" 3: [0080_admin_ut / 10.032s] Call DescribeAcls, timeout is 200ms 3: [0080_admin_ut / 10.032s] DescribeAcls: duration 0.007ms 3: [0080_admin_ut / 10.232s] DescribeAcls.queue_poll: duration 200.032ms 3: [0080_admin_ut / 10.232s] DescribeAcls: got DescribeAclsResult in 200.032s 3: [0080_admin_ut / 10.232s] [ do_test_DescribeAcls:1108: 0080_admin_ut#consumer-79 DescribeAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.232s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with temp queue, no options, timeout 100ms ] 3: [0080_admin_ut / 10.232s] Using topic "rdkafkatest_rnd484467700b800e8c_do_test_DeleteAcls" 3: [0080_admin_ut / 10.232s] Using topic "rdkafkatest_rnd382ac2a223b89fe1_do_test_DeleteAcls" 3: [0080_admin_ut / 10.232s] Call DeleteAcls, timeout is 100ms 3: [0080_admin_ut / 10.232s] DeleteAcls: duration 0.007ms 3: [0080_admin_ut / 10.332s] DeleteAcls.queue_poll: duration 100.035ms 3: [0080_admin_ut / 10.332s] DeleteAcls: got DeleteAclsResult in 100.035s 3: [0080_admin_ut / 10.332s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with temp queue, no options, timeout 100ms: PASS (0.10s) ] 3: [0080_admin_ut / 10.332s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with temp queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.332s] Using topic "rdkafkatest_rnd57bde02739365b5d_do_test_DeleteAcls" 3: [0080_admin_ut / 10.332s] Using topic "rdkafkatest_rnd3c54eb055cec2f30_do_test_DeleteAcls" 3: [0080_admin_ut / 10.332s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.332s] DeleteAcls: duration 0.007ms 3: [0105_transactions_mock / 6.328s] FLUSH: duration 958.777ms 3: [0105_transactions_mock / 6.328s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { 
int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.005ms 3: [0105_transactions_mock / 6.328s] 0105_transactions_mock#producer-85: Flushing 1 messages 3: %3|1673491068.648|TXNERR|0105_transactions_mock#producer-85| [thrd:127.0.0.1:37735/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 6.329s] FLUSH: duration 0.100ms 3: %1|1673491068.649|TXNERR|0105_transactions_mock#producer-85| [thrd:main]: Fatal transaction error: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one (_FENCED) 3: %0|1673491068.649|FATAL|0105_transactions_mock#producer-85| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.330s] abort_transaction() failed: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.330s] Fatal error: _FENCED: Producer fenced by newer instance: Failed to acquire transactional PID from broker TxnCoordinator/2: Broker: There is a newer producer with the same transactionalId which fences the current one 3: [0105_transactions_mock / 6.330s] [ do_test_txn_fenced_reinit:511: With error PRODUCER_FENCED: PASS (1.01s) ] 3: [0105_transactions_mock / 6.330s] [ do_test_txn_req_cnt:1071 ] 3: [0105_transactions_mock / 6.330s] Test config file test.conf not found 3: [0105_transactions_mock / 6.330s] Setting test timeout to 60s * 2.7 3: %5|1673491068.650|MOCK|0105_transactions_mock#producer-89| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45985,127.0.0.1:39905,127.0.0.1:38813 3: [0105_transactions_mock / 6.331s] Created kafka instance 0105_transactions_mock#producer-89 3: [0105_transactions_mock / 6.334s] rd_kafka_init_transactions(rk, 5000): duration 3.146ms 3: [0105_transactions_mock / 6.334s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: [0080_admin_ut / 10.532s] DeleteAcls.queue_poll: duration 200.035ms 3: [0080_admin_ut / 10.532s] DeleteAcls: got DeleteAclsResult in 200.035s 3: [0080_admin_ut / 10.532s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with temp queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.532s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with main queue, options, timeout 100ms ] 3: [0080_admin_ut / 10.532s] Using topic "rdkafkatest_rnd7bc7ce8254b74c54_do_test_DeleteAcls" 3: [0080_admin_ut / 10.532s] Using topic "rdkafkatest_rnd28e199006ab4f847_do_test_DeleteAcls" 3: [0080_admin_ut / 10.532s] Call DeleteAcls, timeout is 200ms 3: [0080_admin_ut / 10.532s] DeleteAcls: duration 0.007ms 3: [0105_transactions_mock / 6.536s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 201.976ms 3: [0105_transactions_mock / 6.536s] rd_kafka_abort_transaction(rk, 5000): 
duration 0.127ms 3: [0105_transactions_mock / 6.537s] [ do_test_txn_req_cnt:1071: PASS (0.21s) ] 3: [0105_transactions_mock / 6.537s] [ do_test_txn_requires_abort_errors:1132 ] 3: [0105_transactions_mock / 6.537s] Test config file test.conf not found 3: [0105_transactions_mock / 6.537s] Setting test timeout to 60s * 2.7 3: %5|1673491068.856|MOCK|0105_transactions_mock#producer-90| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39263,127.0.0.1:46187,127.0.0.1:38039 3: [0105_transactions_mock / 6.541s] Created kafka instance 0105_transactions_mock#producer-90 3: [0105_transactions_mock / 6.574s] rd_kafka_init_transactions(rk, 5000): duration 33.609ms 3: [0105_transactions_mock / 6.578s] rd_kafka_begin_transaction(rk): duration 3.948ms 3: [0105_transactions_mock / 6.578s] 1. Fail on produce 3: [0105_transactions_mock / 6.579s] 0105_transactions_mock#producer-90: Flushing 1 messages 3: [0080_admin_ut / 10.733s] DeleteAcls.queue_poll: duration 201.173ms 3: [0080_admin_ut / 10.733s] DeleteAcls: got DeleteAclsResult in 201.173s 3: [0080_admin_ut / 10.733s] [ do_test_DeleteAcls:1221: 0080_admin_ut#consumer-79 DeleteAcls with main queue, options, timeout 100ms: PASS (0.20s) ] 3: [0080_admin_ut / 10.733s] [ do_test_mix:1342 ] 3: [0080_admin_ut / 10.733s] Creating 2 topics 3: [0080_admin_ut / 10.733s] Deleting 1 topics 3: [0080_admin_ut / 10.733s] Creating 1 topics 3: [0080_admin_ut / 10.733s] Deleting 3 groups 3: [0080_admin_ut / 10.733s] Deleting offsets from 3 partitions 3: [0080_admin_ut / 10.733s] Creating (up to) 15 partitions for topic "topicD" 3: [0080_admin_ut / 10.733s] Deleting committed offsets for group mygroup and 3 partitions 3: [0080_admin_ut / 10.733s] Provoking invalid DeleteConsumerGroupOffsets call 3: [0080_admin_ut / 10.733s] Creating 2 topics 3: [0080_admin_ut / 10.733s] Got event DeleteConsumerGroupOffsetsResult: Exactly one DeleteConsumerGroupOffsets must be passed 3: [0034_offset_reset_mock / 14.771s] #3: injecting NO_ERROR, expecting _NO_OFFSET 3: [0034_offset_reset_mock / 14.772s] ASSIGN.PARTITIONS: duration 0.065ms 3: [0034_offset_reset_mock / 14.772s] ASSIGN: assigned 1 partition(s) 3: [0034_offset_reset_mock / 14.772s] #3: Ignoring Error event: Failed to query logical offset TAIL(10): Broker: Topic authorization failed 3: [0034_offset_reset_mock / 14.772s] #3: injected NO_ERROR, got error _AUTO_OFFSET_RESET: no previously committed offset available: Local: No offset stored 3: [0034_offset_reset_mock / 14.772s] [ offset_reset_errors:201: PASS (14.77s) ] 3: [0034_offset_reset_mock / 14.772s] 0034_offset_reset_mock: duration 14772.367ms 3: [0034_offset_reset_mock / 14.772s] ================= Test 0034_offset_reset_mock PASSED ================= 3: [0080_admin_ut / 10.833s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.833s] Got event DeleteTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.833s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.833s] Got event DeleteGroupsResult: Success 3: [0080_admin_ut / 10.833s] Got event DeleteRecordsResult: Failed to query partition leaders: Local: Timed out 3: [0080_admin_ut / 10.833s] Got event CreatePartitionsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.833s] Got event DeleteConsumerGroupOffsetsResult: Failed while waiting for response from broker: Local: Timed out 3: 
[0080_admin_ut / 10.833s] Got event CreateTopicsResult: Failed while waiting for controller: Local: Timed out 3: [0080_admin_ut / 10.833s] [ do_test_mix:1342: PASS (0.10s) ] 3: [0080_admin_ut / 10.833s] [ do_test_configs:1411 ] 3: [
/ 15.109s] Too many tests running (5 >= 5): postponing 0116_kafkaconsumer_close start... 3: [0113_cooperative_rebalance_local/ 0.000s] ================= Running test 0113_cooperative_rebalance_local ================= 3: [0113_cooperative_rebalance_local/ 0.000s] ==== Stats written to file stats_0113_cooperative_rebalance_local_8844451532794841101.json ==== 3: [0113_cooperative_rebalance_local/ 0.000s] [ a_assign_rapid:674 ] 3: %5|1673491069.118|CONFWARN|MOCK#producer-91| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 0.016s] Setting test timeout to 10s * 2.7 3: %3|1673491069.862|TXNERR|0105_transactions_mock#producer-90| [thrd:127.0.0.1:38039/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 1 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:120557000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491069.862|PARTCNT|0105_transactions_mock#producer-90| [thrd:127.0.0.1:38039/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 7.542s] FLUSH: duration 963.761ms 3: [0105_transactions_mock / 7.542s] Error TOPIC_AUTHORIZATION_FAILED: ProduceRequest for mytopic [2] with 1 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:120557000,Epoch:0}, base seq 0): current transaction must be aborted 3: [0105_transactions_mock / 7.543s] rd_kafka_abort_transaction(rk, -1): duration 0.841ms 3: [0105_transactions_mock / 7.543s] 2. Fail on AddPartitionsToTxn 3: [0105_transactions_mock / 7.543s] rd_kafka_begin_transaction(rk): duration 0.021ms 3: %3|1673491069.864|ADDPARTS|0105_transactions_mock#producer-90| [thrd:main]: TxnCoordinator/1: Failed to add partition "mytopic" [2] to transaction: Broker: Topic authorization failed 3: %3|1673491069.864|TXNERR|0105_transactions_mock#producer-90| [thrd:main]: Current transaction failed in state BeginCommit: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) (TOPIC_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.545s] commit_transaction() error TOPIC_AUTHORIZATION_FAILED: Failed to add partition(s) to transaction on broker TxnCoordinator/1: Broker: Topic authorization failed (after 0 ms) 3: [0105_transactions_mock / 7.545s] rd_kafka_abort_transaction(rk, -1): duration 0.135ms 3: [0105_transactions_mock / 7.545s] 3. 
Fail on AddOffsetsToTxn 3: [0105_transactions_mock / 7.545s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: %3|1673491069.864|ADDOFFSETS|0105_transactions_mock#producer-90| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed 3: %3|1673491069.864|TXNERR|0105_transactions_mock#producer-90| [thrd:main]: Current transaction failed in state InTransaction: Failed to add offsets to transaction on broker TxnCoordinator/1: Broker: Group authorization failed (after 0ms) (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 7.546s] rd_kafka_abort_transaction(rk, -1): duration 0.098ms 3: [0105_transactions_mock / 7.546s] [ do_test_txn_requires_abort_errors:1132: PASS (1.01s) ] 3: [0105_transactions_mock / 7.546s] [ do_test_txn_slow_reinit:390: without sleep ] 3: [0105_transactions_mock / 7.546s] Test config file test.conf not found 3: [0105_transactions_mock / 7.546s] Setting test timeout to 60s * 2.7 3: %5|1673491069.866|MOCK|0105_transactions_mock#producer-93| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37359,127.0.0.1:37169,127.0.0.1:33647 3: [0105_transactions_mock / 7.547s] Created kafka instance 0105_transactions_mock#producer-93 3: [0105_transactions_mock / 7.587s] rd_kafka_init_transactions(rk, -1): duration 36.207ms 3: [0105_transactions_mock / 7.587s] rd_kafka_begin_transaction(rk): duration 0.016ms 3: [0105_transactions_mock / 7.587s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock / 7.587s] 0105_transactions_mock#producer-93: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 1.028s] Setting test timeout to 20s * 2.7 3: [0113_cooperative_rebalance_local/ 1.028s] pre-commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.028s] topic1[0] offset 11: Success 3: [0113_cooperative_rebalance_local/ 1.028s] topic2[0] offset 22: Success 3: [0113_cooperative_rebalance_local/ 1.076s] a_assign_rapid#consumer-94: incremental assign of 2 partition(s) 3: [0113_cooperative_rebalance_local/ 1.076s] incremental_assign(): 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.076s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.076s] topic2[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.080s] a_assign_rapid#consumer-94: incremental unassign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.080s] incremental_unassign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.080s] topic1[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.080s] commit: 2 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.080s] topic1[0] offset 55: Success 3: [0113_cooperative_rebalance_local/ 1.080s] topic2[0] offset 33: Success 3: [0113_cooperative_rebalance_local/ 1.080s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.080s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.080s] topic3[0] offset -1001: Success 3: [0113_cooperative_rebalance_local/ 1.080s] Clearing rtt 
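Note on the 0105_transactions_mock entries above: the long "rd_kafka_producev( rk, ({ if (0) { ... } RD_KAFKA_VTYPE_TOPIC; }), ... )" blobs are the macro-expanded form of librdkafka's vararg produce call as logged by the test harness. A minimal sketch of the unexpanded transactional sequence these tests exercise (init_transactions, begin_transaction, producev, commit, abort on abortable errors) is given below for orientation; the bootstrap address, transactional.id and error handling are illustrative assumptions, not code from the test suite.

/* Illustrative sketch of the transactional produce sequence seen in the
 * 0105_transactions_mock entries above (not code from the test suite).
 * Assumes librdkafka >= 1.4 with transaction support. */
#include <stdio.h>
#include <librdkafka/rdkafka.h>

static void produce_one_txn_message(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        /* Placeholder values; the tests use a mock cluster instead. */
        rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "transactional.id", "example-txn-id",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
                fprintf(stderr, "rd_kafka_new failed: %s\n", errstr);
                return;
        }

        rd_kafka_error_t *error;

        /* Corresponds to the logged rd_kafka_init_transactions(rk, -1) and
         * rd_kafka_begin_transaction(rk) calls. */
        error = rd_kafka_init_transactions(rk, -1);
        if (!error)
                error = rd_kafka_begin_transaction(rk);

        if (!error) {
                /* Unexpanded form of the macro-expanded producev() calls in the log. */
                rd_kafka_resp_err_t err =
                        rd_kafka_producev(rk,
                                          RD_KAFKA_V_TOPIC("mytopic"),
                                          RD_KAFKA_V_PARTITION(0),
                                          RD_KAFKA_V_VALUE("hi", 2),
                                          RD_KAFKA_V_END);
                if (err)
                        fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));

                rd_kafka_flush(rk, 10 * 1000);
                error = rd_kafka_commit_transaction(rk, -1);
        }

        if (error) {
                fprintf(stderr, "txn error: %s\n", rd_kafka_error_string(error));
                /* Abortable failures (e.g. the TOPIC_AUTHORIZATION_FAILED cases
                 * logged above) are handled by aborting and starting over. */
                if (rd_kafka_error_txn_requires_abort(error)) {
                        rd_kafka_error_t *abort_error =
                                rd_kafka_abort_transaction(rk, -1);
                        if (abort_error)
                                rd_kafka_error_destroy(abort_error);
                }
                rd_kafka_error_destroy(error);
        }

        rd_kafka_destroy(rk);
}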
3: [0113_cooperative_rebalance_local/ 1.080s] a_assign_rapid#consumer-94: incremental assign of 1 partition(s) 3: [0113_cooperative_rebalance_local/ 1.080s] incremental_assign(): 1 TopicPartition(s): 3: [0113_cooperative_rebalance_local/ 1.080s] topic1[0] offset -1001: Success 3: [0106_cgrp_sess_timeout / 3.108s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.108s] ASSIGN.PARTITIONS: duration 0.085ms 3: [0106_cgrp_sess_timeout / 3.108s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 3.108s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 3.223s] CONSUME: duration 115.138ms 3: [0106_cgrp_sess_timeout / 3.223s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 3.223s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0105_transactions_mock / 8.548s] FLUSH: duration 961.496ms 3: [0105_transactions_mock / 8.548s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: %3|1673491070.868|TXNERR|0105_transactions_mock#producer-93| [thrd:127.0.0.1:33647/bootstrap]: Current transaction failed in state BeginCommit: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 8.550s] commit_transaction(-1): duration 1.355ms 3: [0105_transactions_mock / 8.550s] commit_transaction() failed (expectedly): unknown producer id 3: [0105_transactions_mock / 8.650s] abort_transaction(100): duration 100.076ms 3: [0105_transactions_mock / 8.650s] First abort_transaction() failed: Transactional operation timed out 3: [0105_transactions_mock / 8.650s] Retrying abort 3: [0080_admin_ut / 12.834s] [ do_test_configs:1411: PASS (2.00s) ] 3: [0080_admin_ut / 12.834s] Test config file test.conf not found 3: %4|1673491071.039|CONFWARN|0080_admin_ut#consumer-95| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than `socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1673491071.039|CONFWARN|0080_admin_ut#consumer-95| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.834s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-95 DeleteRecords with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.834s] Using topic "rdkafkatest_rnd1ba5168249ef3e70_do_test_DeleteRecords" 3: [0080_admin_ut / 12.834s] Using topic "rdkafkatest_rnd6326103f60544c1a_do_test_DeleteRecords" 3: [0080_admin_ut / 12.834s] Using topic "rdkafkatest_rnd6a1285be5375dc71_do_test_DeleteRecords" 3: [0080_admin_ut / 12.834s] Using topic "rdkafkatest_rnd1bbaa93e6503aca1_do_test_DeleteRecords" 3: [0080_admin_ut / 12.834s] Call DeleteRecords, timeout is 200ms 3: [0080_admin_ut / 12.834s] DeleteRecords: duration 0.017ms 3: [0080_admin_ut / 12.834s] [ do_test_DeleteRecords:520: 0080_admin_ut#consumer-95 DeleteRecords with main queue, options, destroy, timeout 100ms: PASS (0.00s) ] 3: [0080_admin_ut / 12.842s] Test config file test.conf not found 3: %4|1673491071.047|CONFWARN|0080_admin_ut#consumer-96| [thrd:app]: Configuration property `fetch.wait.max.ms` (500) should be set lower than 
`socket.timeout.ms` (100) by at least 1000ms to avoid blocking and timing out sub-sequent requests 3: %5|1673491071.047|CONFWARN|0080_admin_ut#consumer-96| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0080_admin_ut / 12.843s] [ do_test_DeleteGroups:402: 0080_admin_ut#consumer-96 DeleteGroups with main queue, options, destroy, timeout 100ms ] 3: [0080_admin_ut / 12.843s] Using topic "rdkafkatest_rnd91b992247c0642b_do_test_DeleteGroups" 3: [0080_admin_ut / 12.843s] Using topic "rdkafkatest_rnd1523ec71074fa34b_do_test_DeleteGroups" 3: [0080_admin_ut / 12.843s] Using topic "rdkafkatest_rnd5986ab30625dfab8_do_test_DeleteGroups" 3: [0080_admin_ut / 12.843s] Using topic "rdkafkatest_rnd9e27bc87537c6cc_do_test_DeleteGroups" 3: [0080_admin_ut / 12.843s] Call DeleteGroups, timeout is 200ms 3: [0080_admin_ut / 12.843s] DeleteGroups: duration 0.014ms 3: [0080_admin_ut / 12.843s] [ do_test_DeleteGroups:497 ] 3: [0080_admin_ut / 12.850s] 0080_admin_ut: duration 12850.328ms 3: [0080_admin_ut / 12.850s] ================= Test 0080_admin_ut PASSED ================= 3: [
/ 17.048s] Too many tests running (5 >= 5): postponing 0117_mock_errors start... 3: [0116_kafkaconsumer_close / 0.000s] ================= Running test 0116_kafkaconsumer_close ================= 3: [0116_kafkaconsumer_close / 0.000s] ==== Stats written to file stats_0116_kafkaconsumer_close_545560336592724792.json ==== 3: [0116_kafkaconsumer_close / 0.000s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0 ] 3: %5|1673491071.061|CONFWARN|MOCK#producer-97| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 0.005s] Setting test timeout to 10s * 2.7 3: [0104_fetch_from_follower_mock/ 9.498s] CONSUME: duration 3003.758ms 3: [0104_fetch_from_follower_mock/ 9.498s] test_consumer_poll_no_msgs:4075: no msgs: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.498s] test_consumer_poll_no_msgs:4075: no msgs: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 9.503s] remaining: consume 3 messages 3: [0104_fetch_from_follower_mock/ 9.507s] CONSUME: duration 3.958ms 3: [0104_fetch_from_follower_mock/ 9.507s] remaining: consumed 3/3 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 9.507s] Closing consumer 0104_fetch_from_follower_mock#consumer-88 3: [0104_fetch_from_follower_mock/ 9.507s] CONSUMER.CLOSE: duration 0.351ms 3: [0104_fetch_from_follower_mock/ 9.512s] [ Test lagging FFF offset reset PASSED ] 3: [0104_fetch_from_follower_mock/ 9.512s] [ Test unknown follower ] 3: %5|1673491071.102|CONFWARN|MOCK#producer-100| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 9.516s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.519s] Created kafka instance 0104_fetch_from_follower_mock#producer-101 3: [0104_fetch_from_follower_mock/ 9.519s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.519s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 9.519s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 9.519s] PRODUCE: duration 0.749ms 3: [0104_fetch_from_follower_mock/ 9.613s] PRODUCE.DELIVERY.WAIT: duration 93.721ms 3: [0104_fetch_from_follower_mock/ 9.618s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 9.622s] Created kafka instance 0104_fetch_from_follower_mock#consumer-102 3: [0104_fetch_from_follower_mock/ 9.629s] ASSIGN.PARTITIONS: duration 7.048ms 3: [0104_fetch_from_follower_mock/ 9.629s] unknown follower: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 9.629s] unknown follower: not expecting any messages for 5000ms 3: %5|1673491071.325|FETCH|0104_fetch_from_follower_mock#consumer-102| [thrd:127.0.0.1:34623/bootstrap]: 127.0.0.1:34623/1: test [0]: preferred replica (19) is unknown: refreshing metadata 3: %5|1673491071.827|FETCH|0104_fetch_from_follower_mock#consumer-102| [thrd:127.0.0.1:34623/bootstrap]: 127.0.0.1:34623/1: test [0]: preferred replica (19) lease changing too quickly (0s < 60s): possibly due to unavailable replica or stale cluster state: backing off next fetch 3: [0113_cooperative_rebalance_local/ 3.044s] [ a_assign_rapid:674: PASS (3.04s) ] 3: [0113_cooperative_rebalance_local/ 3.044s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695 ] 3: %5|1673491072.161|CONFWARN|MOCK#producer-103| [thrd:app]: No `bootstrap.servers` 
configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 3.044s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.044s] Created kafka instance 0113_cooperative_rebalance_local#producer-104 3: [0113_cooperative_rebalance_local/ 3.044s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.044s] Produce to test [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 3.044s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 3.044s] PRODUCE: duration 0.065ms 3: [0113_cooperative_rebalance_local/ 3.112s] PRODUCE.DELIVERY.WAIT: duration 67.442ms 3: [0113_cooperative_rebalance_local/ 3.113s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 3.113s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 3.116s] Created kafka instance 0113_cooperative_rebalance_local#consumer-105 3: [0113_cooperative_rebalance_local/ 3.125s] p_lost_partitions_heartbeat_illegal_generation_test:2720: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0113_cooperative_rebalance_local/ 6.155s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 6.155s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 6.155s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 6.155s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 6.155s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 6.155s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.151ms 3: [0113_cooperative_rebalance_local/ 6.155s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 6.758s] p_lost_partitions_heartbeat_illegal_generation_test:2732: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0116_kafkaconsumer_close / 5.027s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=0: PASS (5.03s) ] 3: [0116_kafkaconsumer_close / 5.027s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0 ] 3: %5|1673491076.084|CONFWARN|MOCK#producer-106| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 5.028s] Setting test timeout to 10s * 2.7 3: [0104_fetch_from_follower_mock/ 14.629s] CONSUME: duration 5000.074ms 3: [0104_fetch_from_follower_mock/ 14.629s] test_consumer_poll_no_msgs:4075: unknown follower: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.629s] test_consumer_poll_no_msgs:4075: unknown follower: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 14.629s] proper follower: consume 1000 messages 3: [0113_cooperative_rebalance_local/ 7.156s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 7.156s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 7.156s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 7.156s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 7.156s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 7.156s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 7.156s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.112ms 3: [0113_cooperative_rebalance_local/ 7.156s] unassign: incremental unassign of 4 partition(s) done 3: [0104_fetch_from_follower_mock/ 15.262s] CONSUME: duration 633.619ms 3: 
[0104_fetch_from_follower_mock/ 15.262s] proper follower: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 15.262s] do_test_unknown_follower:223: broker_id: Verifying 1000 received messages (flags 0x80000): expecting msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.262s] do_test_unknown_follower:223: broker_id: Verification of 1000 received messages succeeded: expected msgids 0..1000 (1000) 3: [0104_fetch_from_follower_mock/ 15.262s] Closing consumer 0104_fetch_from_follower_mock#consumer-102 3: [0104_fetch_from_follower_mock/ 15.263s] CONSUMER.CLOSE: duration 0.240ms 3: [0104_fetch_from_follower_mock/ 15.263s] [ Test unknown follower PASSED ] 3: [0104_fetch_from_follower_mock/ 15.263s] [ Test REPLICA_NOT_AVAIALBLE ] 3: %5|1673491076.850|CONFWARN|MOCK#producer-109| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0104_fetch_from_follower_mock/ 15.264s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.268s] Created kafka instance 0104_fetch_from_follower_mock#producer-110 3: [0104_fetch_from_follower_mock/ 15.268s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.268s] Produce to test [0]: messages #0..1000 3: [0104_fetch_from_follower_mock/ 15.268s] SUM(POLL): duration 0.000ms 3: [0104_fetch_from_follower_mock/ 15.268s] PRODUCE: duration 0.751ms 3: [0113_cooperative_rebalance_local/ 7.762s] p_lost_partitions_heartbeat_illegal_generation_test:2737: Waiting for rejoin after lost (_ASSIGN_PARTITIONS) for 12s 3: [0104_fetch_from_follower_mock/ 15.325s] PRODUCE.DELIVERY.WAIT: duration 56.180ms 3: [0104_fetch_from_follower_mock/ 15.330s] Test config file test.conf not found 3: [0104_fetch_from_follower_mock/ 15.330s] Created kafka instance 0104_fetch_from_follower_mock#consumer-111 3: [0104_fetch_from_follower_mock/ 15.334s] ASSIGN.PARTITIONS: duration 3.393ms 3: [0104_fetch_from_follower_mock/ 15.334s] REPLICA_NOT_AVAIALBLE: assigned 1 partition(s) 3: [0104_fetch_from_follower_mock/ 15.334s] Wait initial metadata: not expecting any messages for 2000ms 3: %4|1673491077.377|SESSTMOUT|0106_cgrp_sess_timeout#consumer-84| [thrd:main]: Consumer group session timed out (in join-state steady) after 6002 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0106_cgrp_sess_timeout / 10.224s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 10.224s] Performing sync commit 3: [0106_cgrp_sess_timeout / 11.224s] UNASSIGN.PARTITIONS: duration 0.029ms 3: [0106_cgrp_sess_timeout / 11.224s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 11.224s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: [0104_fetch_from_follower_mock/ 17.334s] CONSUME: duration 2000.073ms 3: [0104_fetch_from_follower_mock/ 17.334s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verifying 0 received messages (flags 0xf): expecting msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.334s] test_consumer_poll_no_msgs:4075: Wait initial metadata: Verification of 0 received messages succeeded: expected msgids 0..0 (0) 3: [0104_fetch_from_follower_mock/ 17.334s] Consume: consume 1000 messages 3: [0104_fetch_from_follower_mock/ 18.485s] CONSUME: duration 1151.319ms 3: [0104_fetch_from_follower_mock/ 18.485s] Consume: consumed 1000/1000 messages (0/1 EOFs) 3: [0104_fetch_from_follower_mock/ 18.485s] Closing consumer 
0104_fetch_from_follower_mock#consumer-111 3: [0104_fetch_from_follower_mock/ 18.486s] CONSUMER.CLOSE: duration 0.269ms 3: [0104_fetch_from_follower_mock/ 18.486s] [ Test REPLICA_NOT_AVAIALBLE PASSED ] 3: [0104_fetch_from_follower_mock/ 18.486s] 0104_fetch_from_follower_mock: duration 18486.371ms 3: [0104_fetch_from_follower_mock/ 18.486s] ================= Test 0104_fetch_from_follower_mock PASSED ================= 3: [0116_kafkaconsumer_close / 9.078s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=0: PASS (4.05s) ] 3: [0116_kafkaconsumer_close / 9.078s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0 ] 3: %5|1673491080.134|CONFWARN|MOCK#producer-112| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 9.078s] Setting test timeout to 10s * 2.7 3: [
/ 26.166s] Too many tests running (5 >= 5): postponing 0120_asymmetric_subscription start... 3: [0117_mock_errors / 0.000s] ================= Running test 0117_mock_errors ================= 3: [0117_mock_errors / 0.000s] ==== Stats written to file stats_0117_mock_errors_51744500036205279.json ==== 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] [ do_test_producer_storage_error:53: ] 3: [0117_mock_errors / 0.000s] Test config file test.conf not found 3: [0117_mock_errors / 0.000s] Setting test timeout to 10s * 2.7 3: %5|1673491080.177|MOCK|0117_mock_errors#producer-115| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46235,127.0.0.1:39721,127.0.0.1:46207 3: [0117_mock_errors / 0.001s] Created kafka instance 0117_mock_errors#producer-115 3: [0117_mock_errors / 0.001s] 0117_mock_errors#producer-115: Flushing 1 messages 3: [0105_transactions_mock / 18.555s] rd_kafka_abort_transaction(rk, -1): duration 9904.935ms 3: [0105_transactions_mock / 18.555s] abort_transaction(-1): duration 9904.953ms 3: [0105_transactions_mock / 18.555s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock / 18.555s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [0105_transactions_mock / 18.557s] rd_kafka_commit_transaction(rk, -1): duration 2.408ms 3: [0105_transactions_mock / 18.559s] [ do_test_txn_slow_reinit:390: without sleep: PASS (11.01s) ] 3: [0105_transactions_mock / 18.559s] [ do_test_txn_slow_reinit:390: with sleep ] 3: [0105_transactions_mock / 18.559s] Test config file test.conf not found 3: [0105_transactions_mock / 18.559s] Setting test timeout to 60s * 2.7 3: %5|1673491080.879|MOCK|0105_transactions_mock#producer-116| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:32795,127.0.0.1:44905,127.0.0.1:38815 3: [0105_transactions_mock / 18.559s] Created kafka instance 0105_transactions_mock#producer-116 3: [0105_transactions_mock / 18.589s] rd_kafka_init_transactions(rk, -1): duration 25.890ms 3: [0105_transactions_mock / 18.589s] rd_kafka_begin_transaction(rk): duration 0.113ms 3: [0105_transactions_mock / 18.589s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.016ms 3: [0105_transactions_mock / 18.589s] 0105_transactions_mock#producer-116: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 12.160s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 12.160s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 12.160s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 12.160s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 12.160s] test [3] offset -1001 3: 
[0113_cooperative_rebalance_local/ 12.160s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.042ms 3: [0113_cooperative_rebalance_local/ 12.160s] assign: incremental assign of 4 partition(s) done 3: [0117_mock_errors / 1.523s] FLUSH: duration 1522.122ms 3: [0117_mock_errors / 1.523s] [ do_test_producer_storage_error:53: : PASS (1.52s) ] 3: [0117_mock_errors / 1.523s] [ do_test_producer_storage_error:53: with too few retries ] 3: [0117_mock_errors / 1.524s] Test config file test.conf not found 3: [0117_mock_errors / 1.524s] Setting test timeout to 10s * 2.7 3: %5|1673491081.700|MOCK|0117_mock_errors#producer-117| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39291,127.0.0.1:45941,127.0.0.1:44171 3: [0117_mock_errors / 1.528s] Created kafka instance 0117_mock_errors#producer-117 3: [0117_mock_errors / 1.528s] 0117_mock_errors#producer-117: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 12.663s] Closing consumer 3: [0113_cooperative_rebalance_local/ 12.663s] Closing consumer 0113_cooperative_rebalance_local#consumer-105 3: [0113_cooperative_rebalance_local/ 12.665s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 12.665s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 12.665s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 12.665s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 12.665s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 12.666s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.814ms 3: [0113_cooperative_rebalance_local/ 12.666s] unassign: incremental unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 12.666s] CONSUMER.CLOSE: duration 2.993ms 3: [0113_cooperative_rebalance_local/ 12.666s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 12.666s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 12.667s] [ p_lost_partitions_heartbeat_illegal_generation_test:2695: PASS (9.62s) ] 3: [0113_cooperative_rebalance_local/ 12.667s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0 ] 3: %5|1673491081.784|CONFWARN|MOCK#producer-118| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 12.671s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.672s] Created kafka instance 0113_cooperative_rebalance_local#producer-119 3: [0113_cooperative_rebalance_local/ 12.672s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.672s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 12.672s] SUM(POLL): duration 0.001ms 3: [0113_cooperative_rebalance_local/ 12.672s] PRODUCE: duration 0.066ms 3: [0113_cooperative_rebalance_local/ 12.745s] PRODUCE.DELIVERY.WAIT: duration 72.814ms 3: [0113_cooperative_rebalance_local/ 12.749s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.749s] Created kafka instance 0113_cooperative_rebalance_local#producer-120 3: [0113_cooperative_rebalance_local/ 12.749s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.749s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 12.749s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 12.749s] PRODUCE: duration 0.072ms 3: [0105_transactions_mock / 19.565s] FLUSH: duration 976.228ms 3: [0105_transactions_mock / 19.565s] rd_kafka_producev( rk, ({ if (0) { 
const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.004ms 3: %3|1673491081.885|TXNERR|0105_transactions_mock#producer-116| [thrd:127.0.0.1:32795/bootstrap]: Current transaction failed in state InTransaction: unknown producer id (UNKNOWN_PRODUCER_ID, requires epoch bump) 3: [0105_transactions_mock / 19.566s] commit_transaction(-1): duration 1.233ms 3: [0105_transactions_mock / 19.566s] commit_transaction() failed (expectedly): unknown producer id 3: [0113_cooperative_rebalance_local/ 12.796s] PRODUCE.DELIVERY.WAIT: duration 46.372ms 3: [0113_cooperative_rebalance_local/ 12.800s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 12.800s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 12.800s] Created kafka instance 0113_cooperative_rebalance_local#consumer-121 3: [0113_cooperative_rebalance_local/ 12.800s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 19.667s] abort_transaction(100): duration 100.345ms 3: [0105_transactions_mock / 19.667s] First abort_transaction() failed: Transactional operation timed out 3: [0117_mock_errors / 2.039s] FLUSH: duration 511.130ms 3: [0117_mock_errors / 2.040s] [ do_test_producer_storage_error:53: with too few retries: PASS (0.52s) ] 3: [0117_mock_errors / 2.040s] [ do_test_offset_commit_error_during_rebalance:109 ] 3: [0117_mock_errors / 2.040s] Test config file test.conf not found 3: [0117_mock_errors / 2.040s] Setting test timeout to 60s * 2.7 3: %5|1673491082.216|CONFWARN|MOCK#producer-122| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 2.043s] Test config file test.conf not found 3: [0117_mock_errors / 2.043s] Created kafka instance 0117_mock_errors#producer-123 3: [0117_mock_errors / 2.043s] Test config file test.conf not found 3: [0117_mock_errors / 2.043s] Produce to test [-1]: messages #0..100 3: [0117_mock_errors / 2.043s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 2.043s] PRODUCE: duration 0.074ms 3: [0117_mock_errors / 2.103s] PRODUCE.DELIVERY.WAIT: duration 59.350ms 3: [0117_mock_errors / 2.108s] Created kafka instance 0117_mock_errors#consumer-124 3: [0117_mock_errors / 2.108s] Created kafka instance 0117_mock_errors#consumer-125 3: [0117_mock_errors / 2.121s] C1.PRE: consume 1 messages 3: [0106_cgrp_sess_timeout / 15.131s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.131s] ASSIGN.PARTITIONS: duration 0.214ms 3: [0106_cgrp_sess_timeout / 15.131s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.131s] Closing consumer 0106_cgrp_sess_timeout#consumer-84 3: [0106_cgrp_sess_timeout / 15.131s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 15.131s] Performing sync commit 3: [0106_cgrp_sess_timeout / 16.131s] UNASSIGN.PARTITIONS: duration 0.028ms 3: [0106_cgrp_sess_timeout / 16.131s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 16.131s] CONSUMER.CLOSE: duration 1000.569ms 3: [0106_cgrp_sess_timeout / 16.132s] [ do_test_session_timeout:152: Test session timeout with sync commit: PASS (16.13s) ] 
3: [0106_cgrp_sess_timeout / 16.132s] [ do_test_session_timeout:152: Test session timeout with async commit ] 3: %5|1673491083.399|CONFWARN|MOCK#producer-126| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 16.136s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.136s] Created kafka instance 0106_cgrp_sess_timeout#producer-127 3: [0106_cgrp_sess_timeout / 16.136s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.136s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 16.136s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 16.136s] PRODUCE: duration 0.073ms 3: [0106_cgrp_sess_timeout / 16.196s] PRODUCE.DELIVERY.WAIT: duration 59.464ms 3: [0106_cgrp_sess_timeout / 16.200s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 16.200s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 16.204s] Created kafka instance 0106_cgrp_sess_timeout#consumer-128 3: [0106_cgrp_sess_timeout / 16.208s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0113_cooperative_rebalance_local/ 15.819s] Rebalance #5: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 15.819s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 15.819s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 15.819s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 15.819s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 15.819s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.222ms 3: [0113_cooperative_rebalance_local/ 15.819s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 15.920s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0116_kafkaconsumer_close / 14.120s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=0: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 14.120s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0 ] 3: %5|1673491085.176|CONFWARN|MOCK#producer-129| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 14.120s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 5.127s] 0117_mock_errors#consumer-124: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.127s] ASSIGN.PARTITIONS: duration 0.058ms 3: [0117_mock_errors / 5.127s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.233s] CONSUME: duration 3111.930ms 3: [0117_mock_errors / 5.233s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.233s] C2.PRE: consume 1 messages 3: [0117_mock_errors / 5.233s] 0117_mock_errors#consumer-125: Rebalance: _ASSIGN_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.233s] ASSIGN.PARTITIONS: duration 0.067ms 3: [0117_mock_errors / 5.233s] assign: assigned 2 partition(s) 3: [0117_mock_errors / 5.333s] CONSUME: duration 100.599ms 3: [0117_mock_errors / 5.333s] C2.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 5.333s] Closing consumer 0117_mock_errors#consumer-125 3: [0117_mock_errors / 5.333s] 0117_mock_errors#consumer-125: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 5.333s] UNASSIGN.PARTITIONS: duration 0.024ms 3: [0117_mock_errors / 5.333s] unassign: unassigned current partitions 3: [0117_mock_errors / 5.334s] CONSUMER.CLOSE: 
duration 0.222ms 3: [0117_mock_errors / 5.334s] Committing (should fail) 3: [0117_mock_errors / 5.334s] Commit returned REBALANCE_IN_PROGRESS 3: [0117_mock_errors / 5.334s] C1.PRE: consume 100 messages 3: [0106_cgrp_sess_timeout / 19.232s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.232s] ASSIGN.PARTITIONS: duration 0.174ms 3: [0106_cgrp_sess_timeout / 19.232s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 19.232s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 19.835s] CONSUME: duration 602.656ms 3: [0106_cgrp_sess_timeout / 19.835s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 19.835s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0117_mock_errors / 8.127s] 0117_mock_errors#consumer-124: Rebalance: _REVOKE_PARTITIONS: 2 partition(s) 3: [0117_mock_errors / 8.127s] UNASSIGN.PARTITIONS: duration 0.025ms 3: [0117_mock_errors / 8.127s] unassign: unassigned current partitions 3: [0116_kafkaconsumer_close / 17.292s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=0: PASS (3.17s) ] 3: [0116_kafkaconsumer_close / 17.292s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0 ] 3: %5|1673491088.348|CONFWARN|MOCK#producer-132| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 17.292s] Setting test timeout to 10s * 2.7 3: [
/ 36.188s] Too many tests running (5 >= 5): postponing 0120_asymmetric_subscription start... 3: [0117_mock_errors / 10.335s] 0117_mock_errors#consumer-124: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 10.335s] ASSIGN.PARTITIONS: duration 0.048ms 3: [0117_mock_errors / 10.335s] assign: assigned 4 partition(s) 3: [0113_cooperative_rebalance_local/ 21.426s] Rebalance #6: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 21.426s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 21.426s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 21.426s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 21.426s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 21.426s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 21.426s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.036ms 3: [0113_cooperative_rebalance_local/ 21.426s] unassign: incremental unassign of 4 partition(s) done 3: [0117_mock_errors / 10.436s] CONSUME: duration 5101.640ms 3: [0117_mock_errors / 10.436s] C1.PRE: consumed 100/100 messages (0/-1 EOFs) 3: [0117_mock_errors / 10.436s] 0117_mock_errors#consumer-124: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0117_mock_errors / 10.436s] UNASSIGN.PARTITIONS: duration 0.033ms 3: [0117_mock_errors / 10.436s] unassign: unassigned current partitions 3: [0117_mock_errors / 10.437s] [ do_test_offset_commit_error_during_rebalance:109: PASS (8.40s) ] 3: [0117_mock_errors / 10.437s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true ] 3: [0117_mock_errors / 10.437s] Test config file test.conf not found 3: [0117_mock_errors / 10.437s] Setting test timeout to 60s * 2.7 3: %5|1673491090.613|CONFWARN|MOCK#producer-135| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 10.438s] Test config file test.conf not found 3: [0117_mock_errors / 10.441s] Created kafka instance 0117_mock_errors#producer-136 3: [0117_mock_errors / 10.441s] Test config file test.conf not found 3: [0117_mock_errors / 10.441s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 10.441s] SUM(POLL): duration 0.000ms 3: [0117_mock_errors / 10.441s] PRODUCE: duration 0.008ms 3: [0117_mock_errors / 10.449s] PRODUCE.DELIVERY.WAIT: duration 7.763ms 3: [0117_mock_errors / 10.456s] Created kafka instance 0117_mock_errors#consumer-137 3: [0117_mock_errors / 10.460s] C1.PRE: consume 1 messages 3: [0113_cooperative_rebalance_local/ 21.922s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0116_kafkaconsumer_close / 22.311s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=0: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 22.311s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0 ] 3: %5|1673491093.367|CONFWARN|MOCK#producer-138| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 22.311s] Setting test timeout to 10s * 2.7 3: %4|1673491093.499|SESSTMOUT|0106_cgrp_sess_timeout#consumer-128| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0117_mock_errors / 13.609s] CONSUME: duration 3148.945ms 3: [0117_mock_errors / 13.609s] C1.PRE: 
consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 13.609s] Closing consumer 0117_mock_errors#consumer-137 3: [0105_transactions_mock / 31.667s] Retrying abort 3: [0105_transactions_mock / 31.667s] rd_kafka_abort_transaction(rk, -1): duration 0.138ms 3: [0105_transactions_mock / 31.667s] abort_transaction(-1): duration 0.148ms 3: [0105_transactions_mock / 31.667s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 31.667s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.010ms 3: [0117_mock_errors / 13.811s] CONSUMER.CLOSE: duration 201.616ms 3: [0117_mock_errors / 13.811s] Created kafka instance 0117_mock_errors#consumer-141 3: [0105_transactions_mock / 31.687s] rd_kafka_commit_transaction(rk, -1): duration 19.865ms 3: [0117_mock_errors / 13.832s] rd_kafka_committed(c2, partitions, 10 * 1000): duration 21.249ms 3: [0105_transactions_mock / 31.702s] [ do_test_txn_slow_reinit:390: with sleep: PASS (13.14s) ] 3: [0105_transactions_mock / 31.702s] [ do_test_txn_endtxn_errors:705 ] 3: [0105_transactions_mock / 31.702s] Testing scenario #0 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 31.702s] Test config file test.conf not found 3: [0105_transactions_mock / 31.702s] Setting test timeout to 60s * 2.7 3: %5|1673491094.022|MOCK|0105_transactions_mock#producer-142| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37731,127.0.0.1:37017,127.0.0.1:40007 3: [0117_mock_errors / 13.848s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=true: PASS (3.41s) ] 3: [0117_mock_errors / 13.848s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false ] 3: [0105_transactions_mock / 31.705s] Created kafka instance 0105_transactions_mock#producer-142 3: [0117_mock_errors / 13.848s] Test config file test.conf not found 3: [0117_mock_errors / 13.848s] Setting test timeout to 60s * 2.7 3: %5|1673491094.033|CONFWARN|MOCK#producer-143| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0117_mock_errors / 13.863s] Test config file test.conf not found 3: [0117_mock_errors / 13.863s] Created kafka instance 0117_mock_errors#producer-144 3: [0117_mock_errors / 13.863s] Test config file test.conf not found 3: [0117_mock_errors / 13.863s] Produce to test [-1]: messages #0..1 3: [0117_mock_errors / 13.863s] SUM(POLL): duration 0.001ms 3: [0117_mock_errors / 13.863s] PRODUCE: duration 0.007ms 3: [0105_transactions_mock / 31.728s] rd_kafka_init_transactions(rk, 5000): duration 22.690ms 3: [0105_transactions_mock / 31.728s] rd_kafka_begin_transaction(rk): duration 0.159ms 3: [0117_mock_errors / 13.871s] PRODUCE.DELIVERY.WAIT: duration 7.613ms 3: [0117_mock_errors / 13.880s] Created kafka instance 0117_mock_errors#consumer-145 3: [0117_mock_errors / 13.880s] C1.PRE: consume 1 messages 3: [0105_transactions_mock / 31.752s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 23.768ms 3: [0106_cgrp_sess_timeout / 26.836s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 26.836s] Performing 
async commit 3: [0106_cgrp_sess_timeout / 27.836s] UNASSIGN.PARTITIONS: duration 0.077ms 3: [0106_cgrp_sess_timeout / 27.836s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 27.836s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: %4|1673491095.103|COMMITFAIL|0106_cgrp_sess_timeout#consumer-128| [thrd:main]: Offset commit (manual) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [0105_transactions_mock / 32.966s] commit: duration 1214.185ms 3: [0105_transactions_mock / 32.966s] Scenario #0 commit succeeded 3: [0105_transactions_mock / 32.966s] Testing scenario #0 commit&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 32.966s] rd_kafka_begin_transaction(rk): duration 0.063ms 3: [0105_transactions_mock / 32.966s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 32.967s] FLUSH: duration 1.225ms 3: [0105_transactions_mock / 32.968s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.004ms 3: [0113_cooperative_rebalance_local/ 26.429s] Rebalance #7: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.429s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.429s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.129ms 3: [0113_cooperative_rebalance_local/ 26.429s] assign: incremental assign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 26.922s] Closing consumer 3: [0113_cooperative_rebalance_local/ 26.922s] Closing consumer 0113_cooperative_rebalance_local#consumer-121 3: [0113_cooperative_rebalance_local/ 26.922s] Rebalance #8: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 26.922s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 26.922s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.035ms 3: [0113_cooperative_rebalance_local/ 26.922s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 26.922s] CONSUMER.CLOSE: duration 0.364ms 3: [0113_cooperative_rebalance_local/ 26.922s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 26.923s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 26.923s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=0: PASS (14.26s) ] 3: [0113_cooperative_rebalance_local/ 26.923s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1 ] 3: %5|1673491096.041|CONFWARN|MOCK#producer-146| [thrd:app]: No `bootstrap.servers` 
configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 26.926s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.926s] Created kafka instance 0113_cooperative_rebalance_local#producer-147 3: [0113_cooperative_rebalance_local/ 26.927s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.927s] Produce to test1 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 26.927s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 26.927s] PRODUCE: duration 0.076ms 3: [0113_cooperative_rebalance_local/ 26.993s] PRODUCE.DELIVERY.WAIT: duration 66.168ms 3: [0113_cooperative_rebalance_local/ 26.997s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.997s] Created kafka instance 0113_cooperative_rebalance_local#producer-148 3: [0113_cooperative_rebalance_local/ 26.997s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 26.997s] Produce to test2 [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 26.997s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 26.997s] PRODUCE: duration 0.065ms 3: [0113_cooperative_rebalance_local/ 27.057s] PRODUCE.DELIVERY.WAIT: duration 59.514ms 3: [0113_cooperative_rebalance_local/ 27.058s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 27.058s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 27.063s] Created kafka instance 0113_cooperative_rebalance_local#consumer-149 3: [0113_cooperative_rebalance_local/ 27.063s] q_lost_partitions_illegal_generation_test:2801: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 34.070s] commit&flush: duration 1101.656ms 3: [0105_transactions_mock / 34.070s] Scenario #0 commit&flush succeeded 3: [0105_transactions_mock / 34.070s] Testing scenario #0 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 34.070s] rd_kafka_begin_transaction(rk): duration 0.223ms 3: [0105_transactions_mock / 34.071s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.152ms 3: [0116_kafkaconsumer_close / 25.948s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=0: PASS (3.64s) ] 3: [0116_kafkaconsumer_close / 25.948s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0 ] 3: %5|1673491097.005|CONFWARN|MOCK#producer-150| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 25.949s] Setting test timeout to 10s * 2.7 3: [0117_mock_errors / 17.006s] CONSUME: duration 3126.515ms 3: [0117_mock_errors / 17.006s] C1.PRE: consumed 1/1 messages (0/-1 EOFs) 3: [0117_mock_errors / 17.208s] rd_kafka_commit(c1, ((void *)0), 0 ): duration 201.260ms 3: [0117_mock_errors / 17.208s] Closing consumer 0117_mock_errors#consumer-145 3: [0117_mock_errors / 17.208s] CONSUMER.CLOSE: duration 0.098ms 3: [0117_mock_errors / 17.208s] Created kafka instance 0117_mock_errors#consumer-153 3: [0117_mock_errors / 17.234s] rd_kafka_committed(c2, partitions, 10 * 1000): duration 25.283ms 3: [0117_mock_errors / 17.246s] [ do_test_offset_commit_request_timed_out:190: enable.auto.commit=false: PASS (3.40s) ] 3: [0117_mock_errors / 17.246s] 0117_mock_errors: duration 17246.371ms 3: [0117_mock_errors / 17.246s] ================= Test 0117_mock_errors PASSED ================= 3: [
/ 43.415s] Too many tests running (5 >= 5): postponing 0121_clusterid start... 3: [0120_asymmetric_subscription/ 0.000s] ================= Running test 0120_asymmetric_subscription ================= 3: [0120_asymmetric_subscription/ 0.000s] ==== Stats written to file stats_0120_asymmetric_subscription_8493651571420542336.json ==== 3: [0120_asymmetric_subscription/ 0.000s] Test config file test.conf not found 3: %5|1673491097.426|CONFWARN|MOCK#producer-154| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0120_asymmetric_subscription/ 0.017s] [ do_test_asymmetric:71: roundrobin assignor ] 3: [0120_asymmetric_subscription/ 0.017s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 0.017s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 0.018s] Created kafka instance c0#consumer-155 3: [0120_asymmetric_subscription/ 0.018s] rd_kafka_subscribe(c[i], tlist): duration 0.107ms 3: [0120_asymmetric_subscription/ 0.018s] Created kafka instance c1#consumer-156 3: [0120_asymmetric_subscription/ 0.018s] rd_kafka_subscribe(c[i], tlist): duration 0.129ms 3: [0120_asymmetric_subscription/ 0.019s] Created kafka instance c2#consumer-157 3: [0120_asymmetric_subscription/ 0.019s] rd_kafka_subscribe(c[i], tlist): duration 0.137ms 3: [0105_transactions_mock / 35.182s] abort: duration 1110.480ms 3: [0105_transactions_mock / 35.182s] Scenario #0 abort succeeded 3: [0105_transactions_mock / 35.182s] Testing scenario #0 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 35.182s] rd_kafka_begin_transaction(rk): duration 0.202ms 3: [0105_transactions_mock / 35.182s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 35.184s] FLUSH: duration 2.157ms 3: [0105_transactions_mock / 35.184s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.172ms 3: [0106_cgrp_sess_timeout / 31.246s] Rebalance #3: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.246s] ASSIGN.PARTITIONS: duration 0.209ms 3: [0106_cgrp_sess_timeout / 31.246s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.246s] Closing consumer 0106_cgrp_sess_timeout#consumer-128 3: [0106_cgrp_sess_timeout / 31.246s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 31.246s] Performing async commit 3: [0105_transactions_mock / 36.389s] abort&flush: duration 1204.413ms 3: [0105_transactions_mock / 36.389s] Scenario #0 abort&flush succeeded 3: [0105_transactions_mock / 36.389s] Testing scenario #1 commit with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 36.389s] rd_kafka_begin_transaction(rk): duration 0.195ms 3: [0105_transactions_mock / 36.391s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.172ms 3: [0113_cooperative_rebalance_local/ 30.076s] Rebalance #9: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 30.076s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 30.076s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 30.076s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 30.076s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 30.076s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.143ms 3: [0113_cooperative_rebalance_local/ 30.076s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 30.177s] q_lost_partitions_illegal_generation_test:2823: Waiting for lost 
partitions (_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 30.177s] Rebalance #10: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 30.177s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 30.177s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 30.177s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 30.177s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 30.177s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 30.177s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.148ms 3: [0113_cooperative_rebalance_local/ 30.177s] unassign: incremental unassign of 4 partition(s) done 3: [0106_cgrp_sess_timeout / 32.247s] UNASSIGN.PARTITIONS: duration 0.063ms 3: [0106_cgrp_sess_timeout / 32.247s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 32.247s] CONSUMER.CLOSE: duration 1000.527ms 3: [0106_cgrp_sess_timeout / 32.248s] [ do_test_session_timeout:152: Test session timeout with async commit: PASS (16.12s) ] 3: [0106_cgrp_sess_timeout / 32.248s] [ do_test_session_timeout:152: Test session timeout with auto commit ] 3: %5|1673491099.514|CONFWARN|MOCK#producer-158| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 32.251s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.263s] Created kafka instance 0106_cgrp_sess_timeout#producer-159 3: [0106_cgrp_sess_timeout / 32.263s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.263s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 32.263s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 32.263s] PRODUCE: duration 0.069ms 3: [0106_cgrp_sess_timeout / 32.328s] PRODUCE.DELIVERY.WAIT: duration 64.897ms 3: [0106_cgrp_sess_timeout / 32.329s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 32.329s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 32.329s] Created kafka instance 0106_cgrp_sess_timeout#consumer-160 3: [0106_cgrp_sess_timeout / 32.333s] Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 37.596s] commit: duration 1204.570ms 3: [0105_transactions_mock / 37.596s] Scenario #1 commit succeeded 3: [0105_transactions_mock / 37.596s] Testing scenario #1 commit&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 37.596s] rd_kafka_begin_transaction(rk): duration 0.043ms 3: [0105_transactions_mock / 37.596s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 37.598s] FLUSH: duration 2.161ms 3: [0105_transactions_mock / 37.598s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.131ms 3: [0113_cooperative_rebalance_local/ 31.177s] q_lost_partitions_illegal_generation_test:2830: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 12s 3: [0105_transactions_mock / 38.803s] commit&flush: duration 1205.387ms 3: [0105_transactions_mock / 38.803s] Scenario #1 commit&flush succeeded 3: [0105_transactions_mock / 38.804s] Testing scenario #1 abort with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 38.804s] rd_kafka_begin_transaction(rk): duration 0.041ms 3: [0105_transactions_mock / 38.806s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.165ms 3: [0120_asymmetric_subscription/ 4.021s] c0#consumer-155: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[1], t2[3] 3: [0120_asymmetric_subscription/ 
4.021s] c1#consumer-156: Assignment (6 partition(s)): t2[0], t2[2], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 4.021s] c2#consumer-157: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 4.021s] rd_kafka_assignment(c[i], &assignment): duration 0.153ms 3: [0120_asymmetric_subscription/ 4.021s] rd_kafka_assignment(c[i], &assignment): duration 0.021ms 3: [0120_asymmetric_subscription/ 4.021s] rd_kafka_assignment(c[i], &assignment): duration 0.017ms 3: [0120_asymmetric_subscription/ 4.021s] Closing consumer c0#consumer-155 3: [0120_asymmetric_subscription/ 4.021s] CONSUMER.CLOSE: duration 0.361ms 3: [0120_asymmetric_subscription/ 4.023s] Closing consumer c2#consumer-157 3: [0120_asymmetric_subscription/ 4.023s] CONSUMER.CLOSE: duration 0.173ms 3: [0120_asymmetric_subscription/ 4.023s] [ do_test_asymmetric:71: roundrobin assignor: PASS (4.01s) ] 3: [0120_asymmetric_subscription/ 4.023s] [ do_test_asymmetric:71: range assignor ] 3: [0120_asymmetric_subscription/ 4.023s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 4.023s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 4.023s] Created kafka instance c0#consumer-161 3: [0120_asymmetric_subscription/ 4.023s] rd_kafka_subscribe(c[i], tlist): duration 0.069ms 3: [0120_asymmetric_subscription/ 4.024s] Created kafka instance c1#consumer-162 3: [0120_asymmetric_subscription/ 4.024s] rd_kafka_subscribe(c[i], tlist): duration 0.125ms 3: [0120_asymmetric_subscription/ 4.025s] Created kafka instance c2#consumer-163 3: [0120_asymmetric_subscription/ 4.026s] rd_kafka_subscribe(c[i], tlist): duration 0.382ms 3: [0116_kafkaconsumer_close / 30.985s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=0: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 30.985s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0 ] 3: %5|1673491102.041|CONFWARN|MOCK#producer-164| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 30.985s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 40.010s] abort: duration 1203.949ms 3: [0105_transactions_mock / 40.010s] Scenario #1 abort succeeded 3: [0105_transactions_mock / 40.010s] Testing scenario #1 abort&flush with 2 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 40.010s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock / 40.010s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 40.012s] FLUSH: duration 2.140ms 3: [0105_transactions_mock / 40.012s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.117ms 3: [0106_cgrp_sess_timeout / 35.370s] Rebalance #1: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.370s] ASSIGN.PARTITIONS: duration 0.084ms 3: [0106_cgrp_sess_timeout / 35.370s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 35.370s] consume: consume 10 messages 3: [0106_cgrp_sess_timeout / 35.471s] CONSUME: duration 100.855ms 3: [0106_cgrp_sess_timeout / 35.471s] consume: consumed 10/10 messages (0/-1 EOFs) 3: [0106_cgrp_sess_timeout / 35.471s] Waiting for session timeout revoke (_REVOKE_PARTITIONS) for 9s 3: [0105_transactions_mock / 41.217s] abort&flush: duration 1204.558ms 3: [0105_transactions_mock / 41.217s] Scenario #1 abort&flush succeeded 3: [0105_transactions_mock / 41.217s] Testing scenario #2 
commit with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.217s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock / 41.219s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.175ms 3: [0105_transactions_mock / 41.320s] commit: duration 100.785ms 3: [0105_transactions_mock / 41.320s] Scenario #2 commit succeeded 3: [0105_transactions_mock / 41.320s] Testing scenario #2 commit&flush with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.320s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock / 41.320s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 41.322s] FLUSH: duration 2.150ms 3: [0105_transactions_mock / 41.322s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.121ms 3: [0105_transactions_mock / 41.423s] commit&flush: duration 100.612ms 3: [0105_transactions_mock / 41.423s] Scenario #2 commit&flush succeeded 3: [0105_transactions_mock / 41.423s] Testing scenario #2 abort with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.423s] rd_kafka_begin_transaction(rk): duration 0.048ms 3: [0105_transactions_mock / 41.425s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.170ms 3: [0105_transactions_mock / 41.526s] abort: duration 100.727ms 3: [0105_transactions_mock / 41.526s] Scenario #2 abort succeeded 3: [0105_transactions_mock / 41.526s] Testing scenario #2 abort&flush with 1 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.526s] rd_kafka_begin_transaction(rk): duration 0.050ms 3: [0105_transactions_mock / 41.526s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 41.528s] FLUSH: duration 2.145ms 3: [0105_transactions_mock / 41.528s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.118ms 3: [0105_transactions_mock / 41.629s] abort&flush: duration 100.614ms 3: [0105_transactions_mock / 41.629s] Scenario #2 abort&flush succeeded 3: [0105_transactions_mock / 41.629s] Testing scenario #3 commit with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.629s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock / 41.631s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.092ms 3: [0105_transactions_mock / 41.933s] commit: duration 301.804ms 3: [0105_transactions_mock / 41.933s] Scenario #3 commit succeeded 3: [0105_transactions_mock / 41.933s] Testing scenario #3 commit&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 41.933s] rd_kafka_begin_transaction(rk): duration 0.048ms 3: [0105_transactions_mock / 41.933s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 41.935s] FLUSH: duration 2.150ms 3: [0105_transactions_mock / 41.935s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.120ms 3: [0105_transactions_mock / 42.237s] commit&flush: duration 301.604ms 3: [0105_transactions_mock / 42.237s] Scenario #3 commit&flush succeeded 3: [0105_transactions_mock / 42.237s] Testing scenario #3 abort with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 42.237s] rd_kafka_begin_transaction(rk): duration 0.048ms 3: [0105_transactions_mock / 42.239s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.173ms 3: [0105_transactions_mock / 42.541s] abort: duration 301.728ms 3: 
[0105_transactions_mock / 42.541s] Scenario #3 abort succeeded 3: [0105_transactions_mock / 42.541s] Testing scenario #3 abort&flush with 3 injected erorrs, expecting NO_ERROR 3: [0105_transactions_mock / 42.541s] rd_kafka_begin_transaction(rk): duration 0.047ms 3: [0105_transactions_mock / 42.541s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 42.543s] FLUSH: duration 2.151ms 3: [0105_transactions_mock / 42.543s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.126ms 3: [0105_transactions_mock / 42.845s] abort&flush: duration 301.567ms 3: [0105_transactions_mock / 42.845s] Scenario #3 abort&flush succeeded 3: [0105_transactions_mock / 42.845s] Testing scenario #4 commit with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.845s] rd_kafka_begin_transaction(rk): duration 0.046ms 3: [0105_transactions_mock / 42.847s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.183ms 3: %3|1673491105.167|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.847s] commit: duration 0.167ms 3: [0105_transactions_mock / 42.847s] Scenario #4 commit failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.847s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.847s] rd_kafka_abort_transaction(rk, -1): duration 0.123ms 3: [0105_transactions_mock / 42.847s] Testing scenario #4 commit&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.847s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock / 42.847s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 42.849s] FLUSH: duration 2.131ms 3: [0105_transactions_mock / 42.850s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.117ms 3: %3|1673491105.169|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.850s] commit&flush: duration 0.109ms 3: [0105_transactions_mock / 42.850s] Scenario #4 commit&flush failed: UNKNOWN_PRODUCER_ID: EndTxn commit failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.850s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.850s] rd_kafka_abort_transaction(rk, -1): duration 0.118ms 3: [0105_transactions_mock / 42.850s] Testing scenario #4 abort with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.850s] rd_kafka_begin_transaction(rk): duration 0.029ms 3: [0105_transactions_mock / 42.852s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.155ms 3: %3|1673491105.172|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.852s] abort: duration 0.134ms 3: [0105_transactions_mock / 42.852s] Scenario #4 abort failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 
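The scenario #4 and #5 entries above show the end-transaction error classification the test exercises: the injected UNKNOWN_PRODUCER_ID and INVALID_PRODUCER_ID_MAPPING failures are abortable (req_abort=true), so the test responds with rd_kafka_abort_transaction(), while the _FENCED scenarios that follow are fatal and force the producer to be destroyed. A minimal sketch of that handling pattern, assuming an already-initialized transactional producer handle rk; the helper name and return codes are illustrative, not the test's code:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

static int end_txn(rd_kafka_t *rk) {
        rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1);

        if (!error)
                return 0;                               /* committed */

        fprintf(stderr, "commit failed: %s\n", rd_kafka_error_string(error));

        if (rd_kafka_error_txn_requires_abort(error)) {
                /* Abortable (req_abort=true in the log): roll the
                 * transaction back so the caller can retry it. */
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, -1);
        }

        if (error && rd_kafka_error_is_fatal(error)) {
                /* Fatal (e.g. _FENCED): the producer handle is unusable. */
                rd_kafka_error_destroy(error);
                rd_kafka_destroy(rk);
                return -1;
        }

        if (error)
                rd_kafka_error_destroy(error);
        return 1;                                       /* aborted, retry */
}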
3: [0105_transactions_mock / 42.852s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.853s] rd_kafka_abort_transaction(rk, -1): duration 1.130ms 3: [0105_transactions_mock / 42.853s] Testing scenario #4 abort&flush with 1 injected erorrs, expecting UNKNOWN_PRODUCER_ID 3: [0105_transactions_mock / 42.853s] rd_kafka_begin_transaction(rk): duration 0.031ms 3: [0105_transactions_mock / 42.853s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 42.856s] FLUSH: duration 2.143ms 3: [0105_transactions_mock / 42.856s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.132ms 3: %3|1673491105.177|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Unknown Producer Id (UNKNOWN_PRODUCER_ID) 3: [0105_transactions_mock / 42.858s] abort&flush: duration 2.031ms 3: [0105_transactions_mock / 42.858s] Scenario #4 abort&flush failed: UNKNOWN_PRODUCER_ID: EndTxn abort failed: Broker: Unknown Producer Id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.858s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.858s] rd_kafka_abort_transaction(rk, -1): duration 0.117ms 3: [0105_transactions_mock / 42.858s] Testing scenario #5 commit with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.858s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: [0105_transactions_mock / 42.860s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.156ms 3: %3|1673491105.180|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.860s] commit: duration 0.152ms 3: [0105_transactions_mock / 42.860s] Scenario #5 commit failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.860s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.861s] rd_kafka_abort_transaction(rk, -1): duration 0.444ms 3: [0105_transactions_mock / 42.861s] Testing scenario #5 commit&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.861s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0105_transactions_mock / 42.861s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 42.863s] FLUSH: duration 2.135ms 3: [0105_transactions_mock / 42.863s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.113ms 3: %3|1673491105.183|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.863s] commit&flush: duration 0.103ms 3: [0105_transactions_mock / 42.863s] Scenario #5 commit&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn commit failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, 
fatal=false) 3: [0105_transactions_mock / 42.863s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.863s] rd_kafka_abort_transaction(rk, -1): duration 0.116ms 3: [0105_transactions_mock / 42.863s] Testing scenario #5 abort with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.863s] rd_kafka_begin_transaction(rk): duration 0.030ms 3: [0105_transactions_mock / 42.865s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.158ms 3: %3|1673491105.185|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.866s] abort: duration 0.141ms 3: [0105_transactions_mock / 42.866s] Scenario #5 abort failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.866s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.866s] rd_kafka_abort_transaction(rk, -1): duration 0.116ms 3: [0105_transactions_mock / 42.866s] Testing scenario #5 abort&flush with 1 injected erorrs, expecting INVALID_PRODUCER_ID_MAPPING 3: [0105_transactions_mock / 42.866s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0105_transactions_mock / 42.866s] 0105_transactions_mock#producer-142: Flushing 1 messages 3: [0105_transactions_mock / 42.868s] FLUSH: duration 2.126ms 3: [0105_transactions_mock / 42.868s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.111ms 3: %3|1673491105.191|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (INVALID_PRODUCER_ID_MAPPING) 3: [0105_transactions_mock / 42.872s] abort&flush: duration 3.715ms 3: [0105_transactions_mock / 42.872s] Scenario #5 abort&flush failed: INVALID_PRODUCER_ID_MAPPING: EndTxn abort failed: Broker: Producer attempted to use a producer id which is not currently assigned to its transactional id (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 42.872s] Abortable error, aborting transaction 3: [0105_transactions_mock / 42.874s] rd_kafka_abort_transaction(rk, -1): duration 2.088ms 3: [0105_transactions_mock / 42.874s] Testing scenario #6 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.874s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: [0105_transactions_mock / 42.876s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.170ms 3: %1|1673491105.201|TXNERR|0105_transactions_mock#producer-142| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491105.201|FATAL|0105_transactions_mock#producer-142| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 42.882s] commit: duration 5.742ms 3: [0105_transactions_mock / 42.882s] Scenario #6 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a 
newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 42.882s] Fatal error, destroying producer 3: [0105_transactions_mock / 42.883s] Testing scenario #6 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 42.883s] Test config file test.conf not found 3: [0105_transactions_mock / 42.883s] Setting test timeout to 60s * 2.7 3: %5|1673491105.203|MOCK|0105_transactions_mock#producer-167| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38289,127.0.0.1:37641,127.0.0.1:38533 3: [0105_transactions_mock / 42.883s] Created kafka instance 0105_transactions_mock#producer-167 3: [0105_transactions_mock / 42.895s] rd_kafka_init_transactions(rk, 5000): duration 11.783ms 3: [0105_transactions_mock / 42.895s] rd_kafka_begin_transaction(rk): duration 0.081ms 3: [0105_transactions_mock / 42.895s] 0105_transactions_mock#producer-167: Flushing 1 messages 3: [0116_kafkaconsumer_close / 34.165s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=0: PASS (3.18s) ] 3: [0116_kafkaconsumer_close / 34.165s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1 ] 3: %5|1673491105.226|CONFWARN|MOCK#producer-168| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 34.170s] Setting test timeout to 10s * 2.7 3: [0120_asymmetric_subscription/ 8.028s] c0#consumer-161: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[0], t2[1] 3: [0120_asymmetric_subscription/ 8.028s] c1#consumer-162: Assignment (6 partition(s)): t2[2], t2[3], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 8.028s] c2#consumer-163: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 8.028s] rd_kafka_assignment(c[i], &assignment): duration 0.120ms 3: [0120_asymmetric_subscription/ 8.028s] rd_kafka_assignment(c[i], &assignment): duration 0.027ms 3: [0120_asymmetric_subscription/ 8.028s] rd_kafka_assignment(c[i], &assignment): duration 0.023ms 3: [0120_asymmetric_subscription/ 8.030s] [ do_test_asymmetric:71: range assignor: PASS (4.01s) ] 3: [0120_asymmetric_subscription/ 8.030s] [ do_test_asymmetric:71: cooperative-sticky assignor ] 3: [0120_asymmetric_subscription/ 8.030s] Test config file test.conf not found 3: [0120_asymmetric_subscription/ 8.030s] Setting test timeout to 30s * 2.7 3: [0120_asymmetric_subscription/ 8.030s] Created kafka instance c0#consumer-171 3: [0120_asymmetric_subscription/ 8.031s] rd_kafka_subscribe(c[i], tlist): duration 0.029ms 3: [0120_asymmetric_subscription/ 8.033s] Created kafka instance c1#consumer-172 3: [0120_asymmetric_subscription/ 8.033s] rd_kafka_subscribe(c[i], tlist): duration 0.019ms 3: [0120_asymmetric_subscription/ 8.037s] Created kafka instance c2#consumer-173 3: [0120_asymmetric_subscription/ 8.037s] rd_kafka_subscribe(c[i], tlist): duration 0.196ms 3: [0105_transactions_mock / 43.886s] FLUSH: duration 990.454ms 3: [0105_transactions_mock / 43.886s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.432ms 3: %1|1673491106.206|TXNERR|0105_transactions_mock#producer-167| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491106.206|FATAL|0105_transactions_mock#producer-167| [thrd:main]: Fatal error: Local: This instance has been 
fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.886s] commit&flush: duration 0.106ms 3: [0105_transactions_mock / 43.886s] Scenario #6 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.886s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.887s] Testing scenario #6 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.887s] Test config file test.conf not found 3: [0105_transactions_mock / 43.887s] Setting test timeout to 60s * 2.7 3: %5|1673491106.207|MOCK|0105_transactions_mock#producer-174| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37653,127.0.0.1:44003,127.0.0.1:38715 3: [0105_transactions_mock / 43.891s] Created kafka instance 0105_transactions_mock#producer-174 3: [0105_transactions_mock / 43.915s] rd_kafka_init_transactions(rk, 5000): duration 23.789ms 3: [0105_transactions_mock / 43.915s] rd_kafka_begin_transaction(rk): duration 0.112ms 3: [0105_transactions_mock / 43.919s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 4.494ms 3: %1|1673491106.242|TXNERR|0105_transactions_mock#producer-174| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491106.242|FATAL|0105_transactions_mock#producer-174| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 43.923s] abort: duration 3.716ms 3: [0105_transactions_mock / 43.923s] Scenario #6 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 43.923s] Fatal error, destroying producer 3: [0105_transactions_mock / 43.935s] Testing scenario #6 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 43.935s] Test config file test.conf not found 3: [0105_transactions_mock / 43.935s] Setting test timeout to 60s * 2.7 3: %5|1673491106.255|MOCK|0105_transactions_mock#producer-175| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46627,127.0.0.1:37307,127.0.0.1:42083 3: [0105_transactions_mock / 43.936s] Created kafka instance 0105_transactions_mock#producer-175 3: [0105_transactions_mock / 43.950s] rd_kafka_init_transactions(rk, 5000): duration 14.452ms 3: [0105_transactions_mock / 43.950s] rd_kafka_begin_transaction(rk): duration 0.149ms 3: [0105_transactions_mock / 43.950s] 0105_transactions_mock#producer-175: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 37.579s] Rebalance #11: _ASSIGN_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 37.579s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] test2 [2] offset -1001 3: 
[0113_cooperative_rebalance_local/ 37.579s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.579s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.124ms 3: [0113_cooperative_rebalance_local/ 37.579s] assign: incremental assign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 37.580s] Closing consumer 3: [0113_cooperative_rebalance_local/ 37.580s] Closing consumer 0113_cooperative_rebalance_local#consumer-149 3: [0113_cooperative_rebalance_local/ 37.580s] Rebalance #12: _REVOKE_PARTITIONS: 8 partition(s) 3: [0113_cooperative_rebalance_local/ 37.580s] test1 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test1 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test1 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test1 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test2 [0] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test2 [1] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test2 [2] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] test2 [3] offset -1001 3: [0113_cooperative_rebalance_local/ 37.580s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.085ms 3: [0113_cooperative_rebalance_local/ 37.580s] unassign: incremental unassign of 8 partition(s) done 3: [0113_cooperative_rebalance_local/ 37.580s] CONSUMER.CLOSE: duration 0.816ms 3: [0113_cooperative_rebalance_local/ 37.580s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 37.581s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 37.581s] [ q_lost_partitions_illegal_generation_test:2770: test_joingroup_fail=1: PASS (10.66s) ] 3: [0113_cooperative_rebalance_local/ 37.581s] [ r_lost_partitions_commit_illegal_generation_test_local:2860 ] 3: %5|1673491106.699|CONFWARN|MOCK#producer-176| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0113_cooperative_rebalance_local/ 37.582s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.582s] Created kafka instance 0113_cooperative_rebalance_local#producer-177 3: [0113_cooperative_rebalance_local/ 37.582s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.582s] Produce to test [0]: messages #0..100 3: [0113_cooperative_rebalance_local/ 37.582s] SUM(POLL): duration 0.000ms 3: [0113_cooperative_rebalance_local/ 37.582s] PRODUCE: duration 0.064ms 3: [0113_cooperative_rebalance_local/ 37.636s] PRODUCE.DELIVERY.WAIT: duration 53.595ms 3: [0113_cooperative_rebalance_local/ 37.644s] Test config file test.conf not found 3: [0113_cooperative_rebalance_local/ 37.644s] Setting test timeout to 30s * 2.7 3: [0113_cooperative_rebalance_local/ 37.644s] Created kafka instance 0113_cooperative_rebalance_local#consumer-178 3: [0113_cooperative_rebalance_local/ 37.644s] r_lost_partitions_commit_illegal_generation_test_local:2883: Waiting for initial assignment (_ASSIGN_PARTITIONS) for 7s 3: [0105_transactions_mock / 44.938s] FLUSH: duration 987.909ms 3: [0105_transactions_mock / 44.939s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.294ms 3: %1|1673491107.258|TXNERR|0105_transactions_mock#producer-175| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491107.258|FATAL|0105_transactions_mock#producer-175| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a 
newer instance 3: [0105_transactions_mock / 44.939s] abort&flush: duration 0.205ms 3: [0105_transactions_mock / 44.939s] Scenario #6 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.939s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.939s] Testing scenario #7 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.940s] Test config file test.conf not found 3: [0105_transactions_mock / 44.940s] Setting test timeout to 60s * 2.7 3: %5|1673491107.259|MOCK|0105_transactions_mock#producer-179| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37929,127.0.0.1:34893,127.0.0.1:45145 3: [0105_transactions_mock / 44.941s] Created kafka instance 0105_transactions_mock#producer-179 3: [0105_transactions_mock / 44.951s] rd_kafka_init_transactions(rk, 5000): duration 10.092ms 3: [0105_transactions_mock / 44.951s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: [0105_transactions_mock / 44.952s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.392ms 3: %1|1673491107.275|TXNERR|0105_transactions_mock#producer-179| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491107.275|FATAL|0105_transactions_mock#producer-179| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 44.955s] commit: duration 2.846ms 3: [0105_transactions_mock / 44.955s] Scenario #7 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 44.955s] Fatal error, destroying producer 3: [0105_transactions_mock / 44.963s] Testing scenario #7 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 44.963s] Test config file test.conf not found 3: [0105_transactions_mock / 44.963s] Setting test timeout to 60s * 2.7 3: %5|1673491107.283|MOCK|0105_transactions_mock#producer-180| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44579,127.0.0.1:34355,127.0.0.1:41925 3: [0105_transactions_mock / 44.964s] Created kafka instance 0105_transactions_mock#producer-180 3: [0105_transactions_mock / 44.978s] rd_kafka_init_transactions(rk, 5000): duration 14.872ms 3: [0105_transactions_mock / 44.978s] rd_kafka_begin_transaction(rk): duration 0.023ms 3: [0105_transactions_mock / 44.978s] 0105_transactions_mock#producer-180: Flushing 1 messages 3: [
/ 53.441s] Too many tests running (5 >= 5): postponing 0121_clusterid start... 3: [0105_transactions_mock / 45.966s] FLUSH: duration 987.652ms 3: [0105_transactions_mock / 45.967s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.006ms 3: %1|1673491108.287|TXNERR|0105_transactions_mock#producer-180| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491108.287|FATAL|0105_transactions_mock#producer-180| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 45.967s] commit&flush: duration 0.161ms 3: [0105_transactions_mock / 45.967s] Scenario #7 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 45.967s] Fatal error, destroying producer 3: [0105_transactions_mock / 45.968s] Testing scenario #7 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 45.968s] Test config file test.conf not found 3: [0105_transactions_mock / 45.968s] Setting test timeout to 60s * 2.7 3: %5|1673491108.288|MOCK|0105_transactions_mock#producer-181| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46289,127.0.0.1:43489,127.0.0.1:42227 3: [0105_transactions_mock / 45.972s] Created kafka instance 0105_transactions_mock#producer-181 3: [0105_transactions_mock / 45.999s] rd_kafka_init_transactions(rk, 5000): duration 26.847ms 3: [0105_transactions_mock / 45.999s] rd_kafka_begin_transaction(rk): duration 0.071ms 3: [0105_transactions_mock / 46.005s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 5.612ms 3: %1|1673491108.330|TXNERR|0105_transactions_mock#producer-181| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491108.330|FATAL|0105_transactions_mock#producer-181| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 46.011s] abort: duration 5.751ms 3: [0105_transactions_mock / 46.011s] Scenario #7 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 46.011s] Fatal error, destroying producer 3: [0105_transactions_mock / 46.022s] Testing scenario #7 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 46.022s] Test config file test.conf not found 3: [0105_transactions_mock / 46.022s] Setting test timeout to 60s * 2.7 3: %5|1673491108.342|MOCK|0105_transactions_mock#producer-182| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46275,127.0.0.1:37953,127.0.0.1:40321 3: [0105_transactions_mock / 46.022s] Created kafka instance 0105_transactions_mock#producer-182 3: [0105_transactions_mock / 46.041s] rd_kafka_init_transactions(rk, 5000): duration 18.062ms 3: [0105_transactions_mock / 46.041s] rd_kafka_begin_transaction(rk): duration 0.104ms 3: [0105_transactions_mock / 46.041s] 0105_transactions_mock#producer-182: Flushing 1 messages 3: [0105_transactions_mock 
/ 47.025s] FLUSH: duration 984.421ms 3: [0105_transactions_mock / 47.025s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.177ms 3: %1|1673491109.345|TXNERR|0105_transactions_mock#producer-182| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491109.345|FATAL|0105_transactions_mock#producer-182| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 47.025s] abort&flush: duration 0.163ms 3: [0105_transactions_mock / 47.025s] Scenario #7 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 47.025s] Fatal error, destroying producer 3: [0105_transactions_mock / 47.026s] Testing scenario #8 commit with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 47.026s] Test config file test.conf not found 3: [0105_transactions_mock / 47.026s] Setting test timeout to 60s * 2.7 3: %5|1673491109.346|MOCK|0105_transactions_mock#producer-183| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46823,127.0.0.1:37601,127.0.0.1:37673 3: [0105_transactions_mock / 47.026s] Created kafka instance 0105_transactions_mock#producer-183 3: [0105_transactions_mock / 47.049s] rd_kafka_init_transactions(rk, 5000): duration 22.010ms 3: [0105_transactions_mock / 47.049s] rd_kafka_begin_transaction(rk): duration 0.013ms 3: [0105_transactions_mock / 47.057s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 8.941ms 3: [0120_asymmetric_subscription/ 12.039s] c0#consumer-171: Assignment (6 partition(s)): t1[0], t1[1], t1[2], t1[3], t2[0], t2[2] 3: [0120_asymmetric_subscription/ 12.039s] c1#consumer-172: Assignment (6 partition(s)): t2[1], t2[3], t3[0], t3[1], t3[2], t3[3] 3: [0120_asymmetric_subscription/ 12.039s] c2#consumer-173: Assignment (4 partition(s)): t4[0], t4[1], t4[2], t4[3] 3: [0120_asymmetric_subscription/ 12.039s] rd_kafka_assignment(c[i], &assignment): duration 0.193ms 3: [0120_asymmetric_subscription/ 12.039s] rd_kafka_assignment(c[i], &assignment): duration 0.027ms 3: [0120_asymmetric_subscription/ 12.039s] rd_kafka_assignment(c[i], &assignment): duration 0.017ms 3: [0120_asymmetric_subscription/ 12.039s] Closing consumer c0#consumer-171 3: [0120_asymmetric_subscription/ 12.040s] CONSUMER.CLOSE: duration 0.300ms 3: [0120_asymmetric_subscription/ 12.041s] Closing consumer c2#consumer-173 3: [0120_asymmetric_subscription/ 12.042s] CONSUMER.CLOSE: duration 0.366ms 3: [0120_asymmetric_subscription/ 12.042s] [ do_test_asymmetric:71: cooperative-sticky assignor: PASS (4.01s) ] 3: [0120_asymmetric_subscription/ 12.042s] 0120_asymmetric_subscription: duration 12042.360ms 3: [0120_asymmetric_subscription/ 12.042s] ================= Test 0120_asymmetric_subscription PASSED ================= 3: [
/ 55.558s] Too many tests running (5 >= 5): postponing 0124_openssl_invalid_engine start... 3: [0121_clusterid / 0.000s] ================= Running test 0121_clusterid ================= 3: [0121_clusterid / 0.000s] ==== Stats written to file stats_0121_clusterid_8715404520432365087.json ==== 3: [0121_clusterid / 0.000s] Test config file test.conf not found 3: %5|1673491109.566|CONFWARN|MOCK#producer-184| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %5|1673491109.566|CONFWARN|MOCK#producer-185| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0121_clusterid / 0.002s] Test config file test.conf not found 3: [0121_clusterid / 0.002s] Setting test timeout to 10s * 2.7 3: [0121_clusterid / 0.002s] Created kafka instance 0121_clusterid#producer-186 3: %4|1673491109.641|SESSTMOUT|0106_cgrp_sess_timeout#consumer-160| [thrd:main]: Consumer group session timed out (in join-state steady) after 6004 ms without a successful response from the group coordinator (broker 1, last error was Broker: Not coordinator): revoking assignment and rejoining group 3: [0106_cgrp_sess_timeout / 42.472s] Rebalance #2: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 42.472s] UNASSIGN.PARTITIONS: duration 0.080ms 3: [0106_cgrp_sess_timeout / 42.472s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 42.472s] Waiting for second assignment (_ASSIGN_PARTITIONS) for 7s 3: %4|1673491109.738|COMMITFAIL|0106_cgrp_sess_timeout#consumer-160| [thrd:main]: Offset commit (unassigned partitions) failed for 1/4 partition(s) in join-state wait-unassign-to-complete: Broker: Unknown member: test[0]@17(Broker: Unknown member) 3: [0113_cooperative_rebalance_local/ 40.664s] Rebalance #13: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 40.664s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 40.664s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 40.664s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 40.664s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 40.664s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.075ms 3: [0113_cooperative_rebalance_local/ 40.664s] assign: incremental assign of 4 partition(s) done 3: [0116_kafkaconsumer_close / 39.203s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=0, queue=1: PASS (5.04s) ] 3: [0116_kafkaconsumer_close / 39.204s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1 ] 3: %5|1673491110.260|CONFWARN|MOCK#producer-187| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 39.204s] Setting test timeout to 10s * 2.7 3: %1|1673491110.350|TXNERR|0105_transactions_mock#producer-183| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1673491110.350|FATAL|0105_transactions_mock#producer-183| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 48.031s] commit: duration 973.299ms 3: [0105_transactions_mock / 48.031s] Scenario #8 commit failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: 
[0105_transactions_mock / 48.031s] Fatal error, destroying producer 3: [0105_transactions_mock / 48.031s] Testing scenario #8 commit&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 48.032s] Test config file test.conf not found 3: [0105_transactions_mock / 48.032s] Setting test timeout to 60s * 2.7 3: %5|1673491110.351|MOCK|0105_transactions_mock#producer-189| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:41849,127.0.0.1:44927,127.0.0.1:39885 3: [0105_transactions_mock / 48.039s] Created kafka instance 0105_transactions_mock#producer-189 3: [0113_cooperative_rebalance_local/ 41.274s] consume: consume 50 messages 3: [0113_cooperative_rebalance_local/ 41.274s] CONSUME: duration 0.018ms 3: [0113_cooperative_rebalance_local/ 41.274s] consume: consumed 50/50 messages (0/-1 EOFs) 3: [0105_transactions_mock / 48.074s] rd_kafka_init_transactions(rk, 5000): duration 34.708ms 3: [0105_transactions_mock / 48.074s] rd_kafka_begin_transaction(rk): duration 0.032ms 3: [0105_transactions_mock / 48.074s] 0105_transactions_mock#producer-189: Flushing 1 messages 3: [0113_cooperative_rebalance_local/ 41.283s] r_lost_partitions_commit_illegal_generation_test_local:2901: Waiting for lost partitions (_REVOKE_PARTITIONS) for 12s 3: [0113_cooperative_rebalance_local/ 41.283s] Rebalance #14: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 41.283s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 41.283s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 41.283s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 41.283s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 41.283s] Partitions were lost 3: [0113_cooperative_rebalance_local/ 41.291s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 7.928ms 3: [0113_cooperative_rebalance_local/ 41.291s] unassign: incremental unassign of 4 partition(s) done 3: [0105_transactions_mock / 49.054s] FLUSH: duration 979.937ms 3: [0105_transactions_mock / 49.054s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.359ms 3: %1|1673491111.374|TXNERR|0105_transactions_mock#producer-189| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1673491111.374|FATAL|0105_transactions_mock#producer-189| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 49.054s] commit&flush: duration 0.131ms 3: [0105_transactions_mock / 49.054s] Scenario #8 commit&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 49.054s] Fatal error, destroying producer 3: [0105_transactions_mock / 49.058s] Testing scenario #8 abort with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 49.058s] Test config file test.conf not found 3: [0105_transactions_mock / 49.058s] Setting test timeout to 60s * 2.7 3: %5|1673491111.378|MOCK|0105_transactions_mock#producer-191| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:46545,127.0.0.1:34023,127.0.0.1:43759 3: [0105_transactions_mock / 49.062s] Created kafka instance 
0105_transactions_mock#producer-191 3: [0105_transactions_mock / 49.071s] rd_kafka_init_transactions(rk, 5000): duration 8.011ms 3: [0105_transactions_mock / 49.071s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0113_cooperative_rebalance_local/ 42.288s] r_lost_partitions_commit_illegal_generation_test_local:2904: Waiting for rejoin group (_ASSIGN_PARTITIONS) for 22s 3: [0105_transactions_mock / 49.086s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 15.056ms 3: %1|1673491111.409|TXNERR|0105_transactions_mock#producer-191| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1673491111.409|FATAL|0105_transactions_mock#producer-191| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 49.090s] abort: duration 4.101ms 3: [0105_transactions_mock / 49.090s] Scenario #8 abort failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 49.090s] Fatal error, destroying producer 3: [0105_transactions_mock / 49.095s] Testing scenario #8 abort&flush with 1 injected erorrs, expecting TRANSACTIONAL_ID_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 49.095s] Test config file test.conf not found 3: [0105_transactions_mock / 49.095s] Setting test timeout to 60s * 2.7 3: %5|1673491111.415|MOCK|0105_transactions_mock#producer-192| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45741,127.0.0.1:43541,127.0.0.1:36973 3: [0105_transactions_mock / 49.095s] Created kafka instance 0105_transactions_mock#producer-192 3: [0105_transactions_mock / 49.116s] rd_kafka_init_transactions(rk, 5000): duration 20.631ms 3: [0105_transactions_mock / 49.116s] rd_kafka_begin_transaction(rk): duration 0.015ms 3: [0105_transactions_mock / 49.116s] 0105_transactions_mock#producer-192: Flushing 1 messages 3: [0105_transactions_mock / 50.097s] FLUSH: duration 981.099ms 3: [0105_transactions_mock / 50.098s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.964ms 3: %1|1673491112.418|TXNERR|0105_transactions_mock#producer-192| [thrd:main]: Fatal transaction error: Failed to end transaction: Broker: Transactional Id authorization failed (TRANSACTIONAL_ID_AUTHORIZATION_FAILED) 3: %0|1673491112.418|FATAL|0105_transactions_mock#producer-192| [thrd:main]: Fatal error: Broker: Transactional Id authorization failed: Failed to end transaction: Broker: Transactional Id authorization failed 3: [0105_transactions_mock / 50.098s] abort&flush: duration 0.176ms 3: [0105_transactions_mock / 50.098s] Scenario #8 abort&flush failed: TRANSACTIONAL_ID_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Transactional Id authorization failed (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 50.098s] Fatal error, destroying producer 3: [0105_transactions_mock / 50.099s] Testing scenario #9 commit with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 50.099s] Test config file test.conf not found 3: [0105_transactions_mock / 50.099s] Setting test timeout to 60s * 2.7 3: %5|1673491112.419|MOCK|0105_transactions_mock#producer-193| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol 
ignored and replaced with 127.0.0.1:45911,127.0.0.1:41569,127.0.0.1:35747 3: [0105_transactions_mock / 50.100s] Created kafka instance 0105_transactions_mock#producer-193 3: [0105_transactions_mock / 50.135s] rd_kafka_init_transactions(rk, 5000): duration 34.952ms 3: [0105_transactions_mock / 50.135s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 50.139s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 4.734ms 3: [
/ 58.562s] Log: 0121_clusterid#producer-186 level 6 fac FAIL: [thrd:127.0.0.1:38955/bootstrap]: 127.0.0.1:38955/1: Disconnected (after 3000ms in state UP) 3: [
/ 58.563s] Log: 0121_clusterid#producer-186 level 3 fac FAIL: [thrd:127.0.0.1:38955/bootstrap]: 127.0.0.1:38955/1: Connect to ipv4#127.0.0.1:38955 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 58.563s] Log: 0121_clusterid#producer-186 level 4 fac CLUSTERID: [thrd:main]: Broker 127.0.0.1:45253/bootstrap reports different ClusterId "mockCluster1ff5d2000c5c" than previously known "mockCluster1ff5d20012fc": a client must not be simultaneously connected to multiple clusters 3: %3|1673491113.421|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 51.102s] commit: duration 962.393ms 3: [0105_transactions_mock / 51.102s] Scenario #9 commit failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.102s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.102s] rd_kafka_abort_transaction(rk, -1): duration 0.170ms 3: [0105_transactions_mock / 51.102s] Testing scenario #9 commit&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 51.102s] rd_kafka_begin_transaction(rk): duration 0.030ms 3: [0105_transactions_mock / 51.102s] 0105_transactions_mock#producer-193: Flushing 1 messages 3: [0105_transactions_mock / 51.103s] FLUSH: duration 1.132ms 3: [0105_transactions_mock / 51.104s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.058ms 3: %3|1673491113.424|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 51.104s] commit&flush: duration 0.122ms 3: [0105_transactions_mock / 51.104s] Scenario #9 commit&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn commit failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.104s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.105s] rd_kafka_abort_transaction(rk, -1): duration 0.117ms 3: [0105_transactions_mock / 51.105s] Testing scenario #9 abort with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 51.105s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 51.107s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.116ms 3: %3|1673491113.426|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 51.107s] abort: duration 0.152ms 3: [0105_transactions_mock / 51.107s] Scenario #9 abort failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.107s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.107s] rd_kafka_abort_transaction(rk, -1): duration 0.116ms 3: [0105_transactions_mock / 51.107s] Testing scenario #9 abort&flush with 1 injected erorrs, expecting GROUP_AUTHORIZATION_FAILED 3: [0105_transactions_mock / 51.107s] rd_kafka_begin_transaction(rk): duration 0.022ms 3: [0105_transactions_mock / 51.107s] 0105_transactions_mock#producer-193: Flushing 1 messages 3: [0105_transactions_mock / 51.109s] FLUSH: duration 2.072ms 3: [0105_transactions_mock / 51.109s] rd_kafka_send_offsets_to_transaction( rk, offsets, 
cgmetadata, -1): duration 0.147ms 3: %3|1673491113.429|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Group authorization failed (GROUP_AUTHORIZATION_FAILED) 3: [0105_transactions_mock / 51.109s] abort&flush: duration 0.132ms 3: [0105_transactions_mock / 51.109s] Scenario #9 abort&flush failed: GROUP_AUTHORIZATION_FAILED: EndTxn abort failed: Broker: Group authorization failed (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.109s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.110s] rd_kafka_abort_transaction(rk, -1): duration 0.128ms 3: [0105_transactions_mock / 51.110s] Testing scenario #10 commit with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 51.110s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 51.115s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 5.019ms 3: %3|1673491113.435|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 51.116s] commit: duration 0.983ms 3: [0105_transactions_mock / 51.116s] Scenario #10 commit failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.116s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.116s] rd_kafka_abort_transaction(rk, -1): duration 0.129ms 3: [0105_transactions_mock / 51.116s] Testing scenario #10 commit&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 51.116s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: [0105_transactions_mock / 51.116s] 0105_transactions_mock#producer-193: Flushing 1 messages 3: [0105_transactions_mock / 51.122s] FLUSH: duration 5.665ms 3: [0105_transactions_mock / 51.122s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.163ms 3: %3|1673491113.441|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state CommittingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 51.122s] commit&flush: duration 0.109ms 3: [0105_transactions_mock / 51.122s] Scenario #10 commit&flush failed: INVALID_MSG_SIZE: EndTxn commit failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.122s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.122s] rd_kafka_abort_transaction(rk, -1): duration 0.123ms 3: [0105_transactions_mock / 51.122s] Testing scenario #10 abort with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 51.122s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock / 51.124s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.148ms 3: %3|1673491113.444|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 51.124s] abort: duration 0.127ms 3: [0105_transactions_mock / 51.124s] Scenario #10 abort failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: 
[0105_transactions_mock / 51.124s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.124s] rd_kafka_abort_transaction(rk, -1): duration 0.106ms 3: [0105_transactions_mock / 51.124s] Testing scenario #10 abort&flush with 1 injected erorrs, expecting INVALID_MSG_SIZE 3: [0105_transactions_mock / 51.124s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 51.124s] 0105_transactions_mock#producer-193: Flushing 1 messages 3: [0105_transactions_mock / 51.127s] FLUSH: duration 2.121ms 3: [0105_transactions_mock / 51.127s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.113ms 3: %3|1673491113.446|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Current transaction failed in state AbortingTransaction: Failed to end transaction: Broker: Invalid message size (INVALID_MSG_SIZE) 3: [0105_transactions_mock / 51.127s] abort&flush: duration 0.118ms 3: [0105_transactions_mock / 51.127s] Scenario #10 abort&flush failed: INVALID_MSG_SIZE: EndTxn abort failed: Broker: Invalid message size (retriable=false, req_abort=true, fatal=false) 3: [0105_transactions_mock / 51.127s] Abortable error, aborting transaction 3: [0105_transactions_mock / 51.127s] rd_kafka_abort_transaction(rk, -1): duration 0.107ms 3: [0105_transactions_mock / 51.127s] Testing scenario #11 commit with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 51.127s] rd_kafka_begin_transaction(rk): duration 0.024ms 3: [0105_transactions_mock / 51.129s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 2.147ms 3: %1|1673491113.449|TXNERR|0105_transactions_mock#producer-193| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491113.449|FATAL|0105_transactions_mock#producer-193| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 51.129s] commit: duration 0.146ms 3: [0105_transactions_mock / 51.129s] Scenario #11 commit failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 51.129s] Fatal error, destroying producer 3: [0105_transactions_mock / 51.134s] Testing scenario #11 commit&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 51.134s] Test config file test.conf not found 3: [0105_transactions_mock / 51.134s] Setting test timeout to 60s * 2.7 3: %5|1673491113.454|MOCK|0105_transactions_mock#producer-194| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37635,127.0.0.1:45139,127.0.0.1:35291 3: [0105_transactions_mock / 51.138s] Created kafka instance 0105_transactions_mock#producer-194 3: [0105_transactions_mock / 51.171s] rd_kafka_init_transactions(rk, 5000): duration 32.906ms 3: [0105_transactions_mock / 51.171s] rd_kafka_begin_transaction(rk): duration 0.014ms 3: [0105_transactions_mock / 51.171s] 0105_transactions_mock#producer-194: Flushing 1 messages 3: [0121_clusterid / 4.007s] 0121_clusterid: duration 4007.176ms 3: [0121_clusterid / 4.007s] ================= Test 0121_clusterid PASSED ================= 3: [
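The EndTxn scenarios above separate abortable failures (logged with req_abort=true, e.g. GROUP_AUTHORIZATION_FAILED or INVALID_MSG_SIZE, answered by aborting the transaction) from fatal ones (_FENCED, TRANSACTIONAL_ID_AUTHORIZATION_FAILED, answered by destroying the producer). The following is a minimal C sketch of that decision logic against the public librdkafka API; it is not taken from the test sources, and the helper name commit_or_recover() and its surrounding setup are assumed for illustration only.

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: commit a transaction and react the way the log above shows.
 * `rk` is assumed to be a producer with transactional.id configured and
 * rd_kafka_init_transactions()/rd_kafka_begin_transaction() already done. */
static int commit_or_recover(rd_kafka_t *rk) {
        rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1 /* block */);

        if (!error)
                return 0;                       /* committed */

        fprintf(stderr, "commit failed: %s\n", rd_kafka_error_string(error));

        if (rd_kafka_error_txn_requires_abort(error)) {
                /* "Abortable error, aborting transaction" in the log. */
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, -1);
                if (error) {
                        fprintf(stderr, "abort failed: %s\n",
                                rd_kafka_error_string(error));
                        rd_kafka_error_destroy(error);
                        return -1;
                }
                return 1;                       /* aborted; the work may be retried */
        }

        if (rd_kafka_error_is_fatal(error)) {
                /* "Fatal error, destroying producer" in the log,
                 * e.g. _FENCED or TRANSACTIONAL_ID_AUTHORIZATION_FAILED. */
                rd_kafka_error_destroy(error);
                rd_kafka_destroy(rk);
                return -1;
        }

        /* Retriable errors could be retried here instead. */
        rd_kafka_error_destroy(error);
        return -1;
}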
/ 59.565s] Too many tests running (5 >= 5): postponing 0128_sasl_callback_queue start... 3: [0124_openssl_invalid_engine / 0.000s] ================= Running test 0124_openssl_invalid_engine ================= 3: [0124_openssl_invalid_engine / 0.000s] ==== Stats written to file stats_0124_openssl_invalid_engine_3846607003274912959.json ==== 3: [0124_openssl_invalid_engine / 0.000s] Test config file test.conf not found 3: [0124_openssl_invalid_engine / 0.000s] Setting test timeout to 30s * 2.7 3: %3|1673491113.573|SSL|0124_openssl_invalid_engine#producer-195| [thrd:app]: error:25066067:DSO support routines:dlfcn_load:could not load the shared library: filename(libinvalid_path.so): libinvalid_path.so: cannot open shared object file: No such file or directory 3: %3|1673491113.573|SSL|0124_openssl_invalid_engine#producer-195| [thrd:app]: error:25070067:DSO support routines:DSO_load:could not load the shared library 3: [0124_openssl_invalid_engine / 0.000s] rd_kafka_new() failed (as expected): OpenSSL engine initialization failed in ENGINE_ctrl_cmd_string LOAD: error:260B6084:engine routines:dynamic_load:dso not found 3: [0124_openssl_invalid_engine / 0.000s] 0124_openssl_invalid_engine: duration 0.276ms 3: [0124_openssl_invalid_engine / 0.000s] ================= Test 0124_openssl_invalid_engine PASSED ================= 3: [
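Test 0124 above passes because pointing ssl.engine.location at a nonexistent library makes client creation fail up front. A rough sketch of that check, assuming a librdkafka build with OpenSSL engine support; "libinvalid_path.so" is the same placeholder path used in the log, and the rest of the configuration is assumed:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: an invalid OpenSSL engine path should make rd_kafka_new() fail. */
int main(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "security.protocol", "ssl",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "ssl.engine.location", "libinvalid_path.so",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
                /* Expected: "OpenSSL engine initialization failed ..." */
                printf("rd_kafka_new() failed as expected: %s\n", errstr);
                return 0;
        }

        rd_kafka_destroy(rk);
        return 1;
}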
/ 59.565s] Too many tests running (5 >= 5): postponing 0131_connect_timeout start... 3: [0128_sasl_callback_queue / 0.000s] ================= Running test 0128_sasl_callback_queue ================= 3: [0128_sasl_callback_queue / 0.000s] ==== Stats written to file stats_0128_sasl_callback_queue_498697192715025841.json ==== 3: [0128_sasl_callback_queue / 0.000s] Feature "sasl_oauthbearer" is built-in 3: [0128_sasl_callback_queue / 0.000s] [ do_test:64: Use background queue = yes ] 3: %5|1673491113.576|CONFWARN|rdkafka#producer-196| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: %3|1673491113.576|ERROR|rdkafka#producer-196| [thrd:background]: Failed to acquire SASL OAUTHBEARER token: Not implemented by this test, but that's okay 3: [
/ 59.568s] Callback called! 3: [0105_transactions_mock / 52.139s] FLUSH: duration 968.274ms 3: [0105_transactions_mock / 52.140s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 0.589ms 3: %1|1673491114.459|TXNERR|0105_transactions_mock#producer-194| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491114.459|FATAL|0105_transactions_mock#producer-194| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 52.140s] commit&flush: duration 0.094ms 3: [0105_transactions_mock / 52.140s] Scenario #11 commit&flush failed: _FENCED: EndTxn commit failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 52.140s] Fatal error, destroying producer 3: [0105_transactions_mock / 52.140s] Testing scenario #11 abort with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 52.140s] Test config file test.conf not found 3: [0105_transactions_mock / 52.140s] Setting test timeout to 60s * 2.7 3: %5|1673491114.460|MOCK|0105_transactions_mock#producer-197| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:38431,127.0.0.1:42261,127.0.0.1:36271 3: [0105_transactions_mock / 52.144s] Created kafka instance 0105_transactions_mock#producer-197 3: [0105_transactions_mock / 52.164s] rd_kafka_init_transactions(rk, 5000): duration 19.836ms 3: [0105_transactions_mock / 52.164s] rd_kafka_begin_transaction(rk): duration 0.101ms 3: [0105_transactions_mock / 52.169s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 4.840ms 3: %1|1673491114.498|TXNERR|0105_transactions_mock#producer-197| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491114.498|FATAL|0105_transactions_mock#producer-197| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 52.179s] abort: duration 10.059ms 3: [0105_transactions_mock / 52.179s] Scenario #11 abort failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 52.179s] Fatal error, destroying producer 3: [0105_transactions_mock / 52.187s] Testing scenario #11 abort&flush with 1 injected erorrs, expecting _FENCED 3: [0105_transactions_mock / 52.187s] Test config file test.conf not found 3: [0105_transactions_mock / 52.187s] Setting test timeout to 60s * 2.7 3: %5|1673491114.506|MOCK|0105_transactions_mock#producer-198| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42609,127.0.0.1:45505,127.0.0.1:35529 3: [0105_transactions_mock / 52.188s] Created kafka instance 0105_transactions_mock#producer-198 3: [0105_transactions_mock / 52.208s] rd_kafka_init_transactions(rk, 5000): duration 20.016ms 3: [0105_transactions_mock / 52.208s] rd_kafka_begin_transaction(rk): duration 0.090ms 3: [0105_transactions_mock / 52.208s] 0105_transactions_mock#producer-198: Flushing 1 messages 3: [0106_cgrp_sess_timeout / 47.388s] Rebalance #3: _ASSIGN_PARTITIONS: 4 
partition(s) 3: [0106_cgrp_sess_timeout / 47.388s] ASSIGN.PARTITIONS: duration 0.109ms 3: [0106_cgrp_sess_timeout / 47.388s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.388s] Closing consumer 0106_cgrp_sess_timeout#consumer-160 3: [0106_cgrp_sess_timeout / 47.388s] Rebalance #4: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 47.388s] UNASSIGN.PARTITIONS: duration 0.027ms 3: [0106_cgrp_sess_timeout / 47.388s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 47.388s] CONSUMER.CLOSE: duration 0.234ms 3: [0106_cgrp_sess_timeout / 47.389s] [ do_test_session_timeout:152: Test session timeout with auto commit: PASS (15.14s) ] 3: [0106_cgrp_sess_timeout / 47.389s] [ do_test_commit_on_lost:231 ] 3: %5|1673491114.656|CONFWARN|MOCK#producer-199| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0106_cgrp_sess_timeout / 47.393s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.393s] Created kafka instance 0106_cgrp_sess_timeout#producer-200 3: [0106_cgrp_sess_timeout / 47.393s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.393s] Produce to test [0]: messages #0..100 3: [0106_cgrp_sess_timeout / 47.393s] SUM(POLL): duration 0.000ms 3: [0106_cgrp_sess_timeout / 47.393s] PRODUCE: duration 0.065ms 3: [0106_cgrp_sess_timeout / 47.467s] PRODUCE.DELIVERY.WAIT: duration 73.594ms 3: [0106_cgrp_sess_timeout / 47.471s] Test config file test.conf not found 3: [0106_cgrp_sess_timeout / 47.471s] Setting test timeout to 30s * 2.7 3: [0106_cgrp_sess_timeout / 47.471s] Created kafka instance 0106_cgrp_sess_timeout#consumer-201 3: [0106_cgrp_sess_timeout / 47.471s] consume: consume 10 messages 3: [0116_kafkaconsumer_close / 44.213s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=0, queue=1: PASS (5.01s) ] 3: [0116_kafkaconsumer_close / 44.213s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1 ] 3: %5|1673491115.269|CONFWARN|MOCK#producer-202| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 44.213s] Setting test timeout to 10s * 2.7 3: [0113_cooperative_rebalance_local/ 46.291s] Rebalance #15: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 46.291s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.034ms 3: [0113_cooperative_rebalance_local/ 46.291s] assign: incremental assign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 46.291s] Closing consumer 3: [0113_cooperative_rebalance_local/ 46.291s] Closing consumer 0113_cooperative_rebalance_local#consumer-178 3: [0113_cooperative_rebalance_local/ 46.291s] Rebalance #16: _REVOKE_PARTITIONS: 4 partition(s) 3: [0113_cooperative_rebalance_local/ 46.291s] test [0] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [1] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [2] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] test [3] offset -1001 3: [0113_cooperative_rebalance_local/ 46.291s] INCREMENTAL.UNASSIGN.PARTITIONS: duration 0.035ms 3: [0113_cooperative_rebalance_local/ 46.291s] unassign: incremental 
unassign of 4 partition(s) done 3: [0113_cooperative_rebalance_local/ 46.291s] CONSUMER.CLOSE: duration 0.181ms 3: [0113_cooperative_rebalance_local/ 46.291s] Destroying consumer 3: [0113_cooperative_rebalance_local/ 46.294s] Destroying mock cluster 3: [0113_cooperative_rebalance_local/ 46.294s] 0113_cooperative_rebalance_local: duration 46294.281ms 3: [0113_cooperative_rebalance_local/ 46.294s] ================= Test 0113_cooperative_rebalance_local PASSED ================= 3: [0105_transactions_mock / 53.189s] FLUSH: duration 980.981ms 3: [0105_transactions_mock / 53.190s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, -1): duration 1.382ms 3: %1|1673491115.510|TXNERR|0105_transactions_mock#producer-198| [thrd:main]: Fatal transaction error: Failed to end transaction: Local: This instance has been fenced by a newer instance (_FENCED) 3: %0|1673491115.510|FATAL|0105_transactions_mock#producer-198| [thrd:main]: Fatal error: Local: This instance has been fenced by a newer instance: Failed to end transaction: Local: This instance has been fenced by a newer instance 3: [0105_transactions_mock / 53.190s] abort&flush: duration 0.182ms 3: [0105_transactions_mock / 53.190s] Scenario #11 abort&flush failed: _FENCED: EndTxn abort failed: Local: This instance has been fenced by a newer instance (retriable=false, req_abort=false, fatal=true) 3: [0105_transactions_mock / 53.190s] Fatal error, destroying producer 3: [0105_transactions_mock / 53.192s] [ do_test_txn_endtxn_errors:705: PASS (21.49s) ] 3: [0105_transactions_mock / 53.192s] [ do_test_txn_endtxn_infinite:901 ] 3: [0105_transactions_mock / 53.192s] Test config file test.conf not found 3: [0105_transactions_mock / 53.192s] Setting test timeout to 60s * 2.7 3: %5|1673491115.511|MOCK|0105_transactions_mock#producer-205| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44295,127.0.0.1:46509,127.0.0.1:37369 3: [0105_transactions_mock / 53.193s] Created kafka instance 0105_transactions_mock#producer-205 3: [
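Test 0113, which just passed, drives the INCREMENTAL.ASSIGN.PARTITIONS / INCREMENTAL.UNASSIGN.PARTITIONS steps and the lost-partitions case seen above. A rough C sketch of a rebalance callback covering both the classic and cooperative protocols; it mirrors the logged behaviour but is not the test's own code, and error handling is omitted:

#include <string.h>
#include <librdkafka/rdkafka.h>

/* Sketch of a rebalance callback registered with
 * rd_kafka_conf_set_rebalance_cb(). */
static void rebalance_cb(rd_kafka_t *rk, rd_kafka_resp_err_t err,
                         rd_kafka_topic_partition_list_t *parts, void *opaque) {
        int cooperative =
                !strcmp(rd_kafka_rebalance_protocol(rk), "COOPERATIVE");
        (void)opaque;

        if (err == RD_KAFKA_RESP_ERR__ASSIGN_PARTITIONS) {
                if (cooperative)
                        rd_kafka_incremental_assign(rk, parts);
                else
                        rd_kafka_assign(rk, parts);

        } else if (err == RD_KAFKA_RESP_ERR__REVOKE_PARTITIONS) {
                if (rd_kafka_assignment_lost(rk)) {
                        /* "Partitions were lost" in the log: the partitions are
                         * already gone, so do not commit their offsets. */
                }
                if (cooperative)
                        rd_kafka_incremental_unassign(rk, parts);
                else
                        rd_kafka_assign(rk, NULL);
        }
}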
/ 61.507s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0131_connect_timeout / 0.000s] ================= Running test 0131_connect_timeout ================= 3: [0131_connect_timeout / 0.000s] ==== Stats written to file stats_0131_connect_timeout_6877028542629644355.json ==== 3: [0131_connect_timeout / 0.000s] Test config file test.conf not found 3: [0131_connect_timeout / 0.000s] Setting test timeout to 20s * 2.7 3: [0131_connect_timeout / 0.001s] Created kafka instance 0131_connect_timeout#producer-206 3: [0105_transactions_mock / 53.201s] rd_kafka_init_transactions(rk, 5000): duration 8.002ms 3: [0105_transactions_mock / 53.201s] rd_kafka_begin_transaction(rk): duration 0.107ms 3: [0105_transactions_mock / 53.201s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [
/ 62.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 63.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0106_cgrp_sess_timeout / 50.488s] 0106_cgrp_sess_timeout#consumer-201: Rebalance: _ASSIGN_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 50.488s] ASSIGN.PARTITIONS: duration 0.058ms 3: [0106_cgrp_sess_timeout / 50.488s] assign: assigned 4 partition(s) 3: [0106_cgrp_sess_timeout / 51.091s] CONSUME: duration 3619.508ms 3: [0106_cgrp_sess_timeout / 51.091s] consume: consumed 10/10 messages (0/-1 EOFs) 3: %6|1673491118.357|FAIL|0106_cgrp_sess_timeout#consumer-201| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:43497: Disconnected (after 3608ms in state UP) 3: %3|1673491118.358|FAIL|0106_cgrp_sess_timeout#consumer-201| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:43497: Connect to ipv4#127.0.0.1:43497 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 51.091s] Waiting for assignment to be lost... 3: [
/ 64.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: %3|1673491118.519|FAIL|0106_cgrp_sess_timeout#consumer-201| [thrd:GroupCoordinator]: GroupCoordinator: 127.0.0.1:43497: Connect to ipv4#127.0.0.1:43497 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0128_sasl_callback_queue / 5.003s] [ do_test:64: Use background queue = yes: PASS (5.00s) ] 3: [0128_sasl_callback_queue / 5.003s] [ do_test:64: Use background queue = no ] 3: %5|1673491118.577|CONFWARN|rdkafka#producer-207| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [
/ 65.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0116_kafkaconsumer_close / 49.269s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=0, queue=1: PASS (5.06s) ] 3: [0116_kafkaconsumer_close / 49.269s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1 ] 3: %5|1673491120.326|CONFWARN|MOCK#producer-208| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 49.270s] Setting test timeout to 10s * 2.7 3: [
/ 66.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0105_transactions_mock / 59.112s] commit_transaction(): duration 4911.660ms 3: [0105_transactions_mock / 59.112s] commit returned success 3: [0105_transactions_mock / 59.112s] rd_kafka_begin_transaction(rk): duration 0.056ms 3: [0105_transactions_mock / 59.112s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.009ms 3: [
/ 67.508s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 67.541s] Log: 0131_connect_timeout#producer-206 level 7 fac FAIL: [thrd:127.0.0.1:45173/bootstrap]: 127.0.0.1:45173/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 67.541s] Log: 0131_connect_timeout#producer-206 level 4 fac FAIL: [thrd:127.0.0.1:45173/bootstrap]: 127.0.0.1:45173/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: [
/ 68.509s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [
/ 68.538s] Log: 0131_connect_timeout#producer-206 level 7 fac FAIL: [thrd:127.0.0.1:40897/bootstrap]: 127.0.0.1:40897/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) (_TRANSPORT) 3: [
/ 68.538s] Log: 0131_connect_timeout#producer-206 level 4 fac FAIL: [thrd:127.0.0.1:40897/bootstrap]: 127.0.0.1:40897/bootstrap: Connection setup timed out in state APIVERSION_QUERY (after 6029ms in state APIVERSION_QUERY) 3: [0116_kafkaconsumer_close / 52.419s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=0, queue=1: PASS (3.15s) ] 3: [0116_kafkaconsumer_close / 52.419s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1 ] 3: %5|1673491123.475|CONFWARN|MOCK#producer-211| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 52.419s] Setting test timeout to 10s * 2.7 3: [
/ 69.509s] 5 test(s) running: 0105_transactions_mock 0106_cgrp_sess_timeout 0116_kafkaconsumer_close 0128_sasl_callback_queue 0131_connect_timeout 3: [0131_connect_timeout / 8.006s] 0131_connect_timeout: duration 8005.903ms 3: [0131_connect_timeout / 8.006s] ================= Test 0131_connect_timeout PASSED ================= 3: [0128_sasl_callback_queue / 10.011s] [ do_test:64: Use background queue = no: PASS (5.01s) ] 3: [0128_sasl_callback_queue / 10.011s] 0128_sasl_callback_queue: duration 10010.837ms 3: [0128_sasl_callback_queue / 10.011s] ================= Test 0128_sasl_callback_queue PASSED ================= 3: %4|1673491123.755|SESSTMOUT|0106_cgrp_sess_timeout#consumer-201| [thrd:main]: Consumer group session timed out (in join-state steady) after 6000 ms without a successful response from the group coordinator (broker 1, last error was Success): revoking assignment and rejoining group 3: [0106_cgrp_sess_timeout / 57.092s] Assignment is lost, committing 3: [0106_cgrp_sess_timeout / 57.092s] commit() returned: _ASSIGNMENT_LOST 3: [0106_cgrp_sess_timeout / 57.092s] Closing consumer 0106_cgrp_sess_timeout#consumer-201 3: [0106_cgrp_sess_timeout / 57.092s] 0106_cgrp_sess_timeout#consumer-201 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:43497: Disconnected (after 3608ms in state UP) 3: [0106_cgrp_sess_timeout / 57.092s] 0106_cgrp_sess_timeout#consumer-201 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:43497: Connect to ipv4#127.0.0.1:43497 failed: Connection refused (after 0ms in state CONNECT) 3: [0106_cgrp_sess_timeout / 57.092s] 0106_cgrp_sess_timeout#consumer-201 rdkafka error (non-testfatal): Local: Broker transport failure: GroupCoordinator: 127.0.0.1:43497: Connect to ipv4#127.0.0.1:43497 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0106_cgrp_sess_timeout / 57.092s] 0106_cgrp_sess_timeout#consumer-201: Rebalance: _REVOKE_PARTITIONS: 4 partition(s) 3: [0106_cgrp_sess_timeout / 57.092s] UNASSIGN.PARTITIONS: duration 0.051ms 3: [0106_cgrp_sess_timeout / 57.092s] unassign: unassigned current partitions 3: [0106_cgrp_sess_timeout / 57.092s] CONSUMER.CLOSE: duration 0.113ms 3: [0106_cgrp_sess_timeout / 57.093s] [ do_test_commit_on_lost:231: PASS (9.70s) ] 3: [0106_cgrp_sess_timeout / 57.093s] 0106_cgrp_sess_timeout: duration 57092.637ms 3: [0106_cgrp_sess_timeout / 57.093s] ================= Test 0106_cgrp_sess_timeout PASSED ================= 3: [
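Test 0106, which just passed, shows that once the group session times out and the assignment is lost, committing the current assignment returns _ASSIGNMENT_LOST. A minimal sketch of checking for that condition on a synchronous commit; `rk` is assumed to be an already-subscribed consumer and the helper name is illustrative:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: commit the current assignment and detect a lost assignment. */
static void commit_current_assignment(rd_kafka_t *rk) {
        rd_kafka_resp_err_t err =
                rd_kafka_commit(rk, NULL /* current assignment */, 0 /* sync */);

        if (err == RD_KAFKA_RESP_ERR__ASSIGNMENT_LOST)
                fprintf(stderr, "assignment lost, offsets not committed\n");
        else if (err)
                fprintf(stderr, "commit failed: %s\n", rd_kafka_err2str(err));
}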
/ 70.509s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 71.509s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 72.509s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 64.719s] abort_transaction(): duration 4606.095ms 3: [0105_transactions_mock / 64.719s] abort returned success 3: [0105_transactions_mock / 64.719s] [ do_test_txn_endtxn_infinite:901: PASS (11.53s) ] 3: [0105_transactions_mock / 64.719s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down ] 3: [0105_transactions_mock / 64.719s] Test config file test.conf not found 3: [0105_transactions_mock / 64.719s] Setting test timeout to 60s * 2.7 3: %5|1673491127.039|MOCK|0105_transactions_mock#producer-214| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:34519,127.0.0.1:36025,127.0.0.1:36311 3: [0105_transactions_mock / 64.720s] Created kafka instance 0105_transactions_mock#producer-214 3: [0105_transactions_mock / 64.724s] Starting transaction 3: [0105_transactions_mock / 64.761s] rd_kafka_init_transactions(rk, 5000): duration 37.093ms 3: [0105_transactions_mock / 64.761s] rd_kafka_begin_transaction(rk): duration 0.198ms 3: [0105_transactions_mock / 64.761s] Test config file test.conf not found 3: [0105_transactions_mock / 64.761s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 64.761s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock / 64.761s] PRODUCE: duration 0.403ms 3: [0105_transactions_mock / 64.761s] Bringing down coordinator 1 3: %6|1673491127.081|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:34519: Disconnected (after 14ms in state UP) 3: [0105_transactions_mock / 64.762s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:34519: Disconnected (after 14ms in state UP) 3: [0105_transactions_mock / 64.762s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:34519: Disconnected (after 14ms in state UP) 3: %3|1673491127.206|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 64.887s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 64.887s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT) 3: [
/ 73.509s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: %3|1673491127.748|FAIL|0105_transactions_mock#producer-214| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 65.428s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 65.428s] 0105_transactions_mock#producer-214 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:34519: Connect to ipv4#127.0.0.1:34519 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0116_kafkaconsumer_close / 57.437s] Closing with queue 3: [0116_kafkaconsumer_close / 57.437s] Attempting second close 3: [0116_kafkaconsumer_close / 57.438s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=0, close=1, queue=1: PASS (5.02s) ] 3: [0116_kafkaconsumer_close / 57.438s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1 ] 3: %5|1673491128.494|CONFWARN|MOCK#producer-215| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 57.438s] Setting test timeout to 10s * 2.7 3: [
/ 74.510s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 75.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 67.762s] Test config file test.conf not found 3: [0105_transactions_mock / 67.762s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 67.763s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 67.763s] PRODUCE: duration 0.687ms 3: [
/ 76.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 77.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0116_kafkaconsumer_close / 60.582s] Closing with queue 3: [0116_kafkaconsumer_close / 60.583s] Attempting second close 3: [0116_kafkaconsumer_close / 60.583s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=0, close=1, queue=1: PASS (3.14s) ] 3: [0116_kafkaconsumer_close / 60.583s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1 ] 3: %5|1673491131.639|CONFWARN|MOCK#producer-218| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 60.583s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 69.763s] Bringing up coordinator 1 3: [
/ 78.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 79.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 71.236s] rd_kafka_commit_transaction(rk, -1): duration 1473.157ms 3: [0105_transactions_mock / 71.237s] [ do_test_txn_broker_down_in_txn:1280: Test coordinator down: PASS (6.52s) ] 3: [0105_transactions_mock / 71.237s] [ do_test_txn_broker_down_in_txn:1280: Test leader down ] 3: [0105_transactions_mock / 71.237s] Test config file test.conf not found 3: [0105_transactions_mock / 71.237s] Setting test timeout to 60s * 2.7 3: %5|1673491133.557|MOCK|0105_transactions_mock#producer-221| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42535,127.0.0.1:35617,127.0.0.1:45861 3: [0105_transactions_mock / 71.238s] Created kafka instance 0105_transactions_mock#producer-221 3: [0105_transactions_mock / 71.241s] Starting transaction 3: [0105_transactions_mock / 71.262s] rd_kafka_init_transactions(rk, 5000): duration 20.981ms 3: [0105_transactions_mock / 71.262s] rd_kafka_begin_transaction(rk): duration 0.011ms 3: [0105_transactions_mock / 71.262s] Test config file test.conf not found 3: [0105_transactions_mock / 71.262s] Produce to test [-1]: messages #0..500 3: [0105_transactions_mock / 71.262s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock / 71.262s] PRODUCE: duration 0.294ms 3: [0105_transactions_mock / 71.262s] Bringing down leader 2 3: %6|1673491133.582|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:35617/bootstrap]: 127.0.0.1:35617/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 17ms in state UP) 3: [0105_transactions_mock / 71.262s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:35617/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 17ms in state UP) 3: [0105_transactions_mock / 71.262s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:35617/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 17ms in state UP) 3: %3|1673491133.698|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:35617/bootstrap]: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 71.379s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock / 71.379s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1673491134.257|FAIL|0105_transactions_mock#producer-221| [thrd:127.0.0.1:35617/bootstrap]: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 71.937s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock / 71.937s] 0105_transactions_mock#producer-221 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:35617/2: Connect to ipv4#127.0.0.1:35617 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) 
suppressed) 3: [
/ 80.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 81.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 82.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 74.263s] Test config file test.conf not found 3: [0105_transactions_mock / 74.263s] Produce to test [-1]: messages #500..1000 3: [0105_transactions_mock / 74.263s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock / 74.263s] PRODUCE: duration 0.566ms 3: [0116_kafkaconsumer_close / 65.635s] Closing with queue 3: [0116_kafkaconsumer_close / 65.635s] Attempting second close 3: [0116_kafkaconsumer_close / 65.636s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=0, unsubscribe=1, close=1, queue=1: PASS (5.05s) ] 3: [0116_kafkaconsumer_close / 65.636s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1 ] 3: %5|1673491136.693|CONFWARN|MOCK#producer-222| [thrd:app]: No `bootstrap.servers` configured: client will not be able to connect to Kafka cluster 3: [0116_kafkaconsumer_close / 65.637s] Setting test timeout to 10s * 2.7 3: [
/ 83.511s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [
/ 84.512s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 76.264s] Bringing up leader 2 3: [
/ 85.512s] 2 test(s) running: 0105_transactions_mock 0116_kafkaconsumer_close 3: [0105_transactions_mock / 77.766s] rd_kafka_commit_transaction(rk, -1): duration 1502.537ms 3: [0105_transactions_mock / 77.767s] [ do_test_txn_broker_down_in_txn:1280: Test leader down: PASS (6.53s) ] 3: [0105_transactions_mock / 77.767s] [ do_test_txns_not_supported:1492 ] 3: [0105_transactions_mock / 77.767s] Test config file test.conf not found 3: [0105_transactions_mock / 77.767s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 77.767s] Created kafka instance 0105_transactions_mock#producer-225 3: %1|1673491140.094|TXNERR|0105_transactions_mock#producer-225| [thrd:main]: Fatal transaction error: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 (_UNSUPPORTED_FEATURE) 3: %0|1673491140.094|FATAL|0105_transactions_mock#producer-225| [thrd:main]: Fatal error: Local: Required feature not supported by broker: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: [0105_transactions_mock / 77.775s] init_transactions() returned _UNSUPPORTED_FEATURE: Transactions not supported by any of the 1 connected broker(s): requires Apache Kafka broker version >= 0.11.0 3: %6|1673491140.094|FAIL|0105_transactions_mock#producer-225| [thrd:127.0.0.1:35865/bootstrap]: 127.0.0.1:35865/2: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 2ms in state UP) 3: [0105_transactions_mock / 77.783s] [ do_test_txns_not_supported:1492: PASS (0.02s) ] 3: [0105_transactions_mock / 77.783s] [ do_test_txns_send_offsets_concurrent_is_retried:1551 ] 3: [0105_transactions_mock / 77.783s] Test config file test.conf not found 3: [0105_transactions_mock / 77.783s] Setting test timeout to 60s * 2.7 3: %5|1673491140.102|MOCK|0105_transactions_mock#producer-226| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35139,127.0.0.1:37975,127.0.0.1:41301 3: [0105_transactions_mock / 77.783s] Created kafka instance 0105_transactions_mock#producer-226 3: [0105_transactions_mock / 77.791s] rd_kafka_init_transactions(rk, 5000): duration 7.927ms 3: [0105_transactions_mock / 77.791s] rd_kafka_begin_transaction(rk): duration 0.167ms 3: [0105_transactions_mock / 77.791s] 0105_transactions_mock#producer-226: Flushing 1 messages 3: [0116_kafkaconsumer_close / 69.315s] Closing with queue 3: [0116_kafkaconsumer_close / 69.315s] Attempting second close 3: [0116_kafkaconsumer_close / 69.316s] [ do_test_consumer_close:89: Test C++ KafkaConsumer close subscribe=1, unsubscribe=1, close=1, queue=1: PASS (3.68s) ] 3: [0116_kafkaconsumer_close / 69.316s] 0116_kafkaconsumer_close: duration 69316.234ms 3: [0116_kafkaconsumer_close / 69.316s] ================= Test 0116_kafkaconsumer_close PASSED ================= 3: [
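[editor's note] The passages above (do_test_txn_broker_down_in_txn, do_test_txns_not_supported) exercise librdkafka's public transactional producer API: rd_kafka_init_transactions(), rd_kafka_begin_transaction() and rd_kafka_commit_transaction(). A minimal illustrative sketch of that call sequence follows; the broker address, topic and transactional.id are placeholders, not values from this build, and this is not the test code itself.

    /* Minimal transactional-producer sketch (illustrative only; broker address,
     * topic and transactional.id are placeholders, not from this build). */
    #include <librdkafka/rdkafka.h>
    #include <stdio.h>

    int main(void) {
            char errstr[512];
            rd_kafka_conf_t *conf = rd_kafka_conf_new();

            /* Placeholder settings; the test suite replaces these with a mock cluster. */
            rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                              errstr, sizeof(errstr));
            rd_kafka_conf_set(conf, "transactional.id", "example-txn-id",
                              errstr, sizeof(errstr));

            rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
            if (!rk) {
                    fprintf(stderr, "rd_kafka_new failed: %s\n", errstr);
                    return 1;
            }

            /* Acquire the producer id / fence older instances; blocks up to 5s,
             * as in the rd_kafka_init_transactions(rk, 5000) calls logged above. */
            rd_kafka_error_t *error = rd_kafka_init_transactions(rk, 5000);
            if (error) {
                    fprintf(stderr, "init_transactions: %s\n", rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
                    rd_kafka_destroy(rk);
                    return 1;
            }

            error = rd_kafka_begin_transaction(rk);
            if (!error) {
                    rd_kafka_producev(rk,
                                      RD_KAFKA_V_TOPIC("test"),
                                      RD_KAFKA_V_VALUE("hello", 5),
                                      RD_KAFKA_V_END);
                    /* Waits for outstanding deliveries; -1 = block, bounded by
                     * transaction.timeout.ms, matching commit_transaction(rk, -1) above. */
                    error = rd_kafka_commit_transaction(rk, -1);
            }
            if (error) {
                    fprintf(stderr, "transaction failed: %s\n", rd_kafka_error_string(error));
                    rd_kafka_error_destroy(error);
            }
            rd_kafka_destroy(rk);
            return 0;
    }

[end note]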
/ 86.512s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 78.786s] FLUSH: duration 994.529ms 3: [
/ 87.512s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 79.390s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 603.948ms 3: [0105_transactions_mock / 79.390s] rd_kafka_commit_transaction(rk, 5000): duration 0.252ms 3: [0105_transactions_mock / 79.391s] [ do_test_txns_send_offsets_concurrent_is_retried:1551: PASS (1.61s) ] 3: [0105_transactions_mock / 79.391s] [ do_test_txn_coord_req_destroy:1881 ] 3: [0105_transactions_mock / 79.391s] Test config file test.conf not found 3: [0105_transactions_mock / 79.391s] Setting test timeout to 60s * 2.7 3: %5|1673491141.711|MOCK|0105_transactions_mock#producer-227| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:34555,127.0.0.1:45517,127.0.0.1:40707 3: [0105_transactions_mock / 79.399s] Created kafka instance 0105_transactions_mock#producer-227 3: [0105_transactions_mock / 79.407s] rd_kafka_init_transactions(rk, 5000): duration 7.897ms 3: [0105_transactions_mock / 79.407s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 79.407s] rd_kafka_begin_transaction(rk): duration 0.214ms 3: [0105_transactions_mock / 79.508s] send_offsets_to_transaction() #0: 3: %3|1673491141.970|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:34555/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 1 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:72892000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491141.970|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:34555/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
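[editor's note] do_test_txns_send_offsets_concurrent_is_retried above drives rd_kafka_send_offsets_to_transaction(), which attaches consumed offsets to the producer's open transaction. A hedged sketch of that call follows; the producer handle rk, consumer handle consumer, topic name and offset value are assumptions for illustration.

    /* Sketch: attaching consumed offsets to an open transaction.
     * 'rk' is a transactional producer with an open transaction and
     * 'consumer' is the consumer the offsets came from (both assumed to exist). */
    rd_kafka_topic_partition_list_t *offsets = rd_kafka_topic_partition_list_new(1);
    rd_kafka_topic_partition_list_add(offsets, "test", 0)->offset = 42; /* next offset to consume */

    rd_kafka_consumer_group_metadata_t *cgmd = rd_kafka_consumer_group_metadata(consumer);

    rd_kafka_error_t *error = rd_kafka_send_offsets_to_transaction(rk, offsets, cgmd, -1);
    if (error) {
            fprintf(stderr, "send_offsets_to_transaction: %s\n", rd_kafka_error_string(error));
            rd_kafka_error_destroy(error);
    }

    rd_kafka_consumer_group_metadata_destroy(cgmd);
    rd_kafka_topic_partition_list_destroy(offsets);

[end note]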
/ 88.512s] 1 test(s) running: 0105_transactions_mock 3: [
/ 89.512s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 81.509s] rd_kafka_abort_transaction(rk, 5000): duration 0.407ms 3: [0105_transactions_mock / 81.509s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 81.509s] rd_kafka_begin_transaction(rk): duration 0.235ms 3: [0105_transactions_mock / 81.611s] send_offsets_to_transaction() #1: 3: %3|1673491144.073|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491144.073|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 90.512s] 1 test(s) running: 0105_transactions_mock 3: [
/ 91.512s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 83.611s] rd_kafka_abort_transaction(rk, 5000): duration 0.355ms 3: [0105_transactions_mock / 83.611s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 83.611s] rd_kafka_begin_transaction(rk): duration 0.217ms 3: %3|1673491145.974|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 1): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491145.974|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 83.713s] send_offsets_to_transaction() #2: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 83.713s] send_offsets_to_transaction() #2 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 92.512s] 1 test(s) running: 0105_transactions_mock 3: [
/ 93.513s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 85.713s] rd_kafka_abort_transaction(rk, 5000): duration 0.355ms 3: [0105_transactions_mock / 85.713s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 85.713s] rd_kafka_begin_transaction(rk): duration 0.222ms 3: [0105_transactions_mock / 85.815s] send_offsets_to_transaction() #3: 3: %3|1673491148.277|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491148.277|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 94.513s] 1 test(s) running: 0105_transactions_mock 3: [
/ 95.513s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 87.815s] rd_kafka_abort_transaction(rk, 5000): duration 0.335ms 3: [0105_transactions_mock / 87.815s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 87.816s] rd_kafka_begin_transaction(rk): duration 0.223ms 3: %3|1673491150.177|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:72892000,Epoch:0}, base seq 0): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491150.177|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 87.917s] send_offsets_to_transaction() #4: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 87.917s] send_offsets_to_transaction() #4 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/ 96.513s] 1 test(s) running: 0105_transactions_mock 3: [
/ 97.513s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 89.917s] rd_kafka_abort_transaction(rk, 5000): duration 0.329ms 3: [0105_transactions_mock / 89.917s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 89.917s] rd_kafka_begin_transaction(rk): duration 0.210ms 3: [0105_transactions_mock / 90.019s] send_offsets_to_transaction() #5: 3: %3|1673491152.480|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:72892000,Epoch:0}, base seq 2): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491152.480|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/ 98.513s] 1 test(s) running: 0105_transactions_mock 3: [
/ 99.513s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 92.019s] rd_kafka_abort_transaction(rk, 5000): duration 0.396ms 3: [0105_transactions_mock / 92.019s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 92.020s] rd_kafka_begin_transaction(rk): duration 0.218ms 3: %3|1673491154.381|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491154.381|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 92.121s] send_offsets_to_transaction() #6: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 92.121s] send_offsets_to_transaction() #6 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
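[editor's note] The repeating pattern above, where send_offsets_to_transaction() "failed (expectedly)" after a TOPIC_AUTHORIZATION_FAILED produce error and the test then calls rd_kafka_abort_transaction() before beginning a new transaction, reflects librdkafka's documented transactional error-handling contract. A sketch of that classification follows; it assumes error is a non-NULL rd_kafka_error_t returned by a transactional call on producer rk.

    /* Sketch of the transactional error-handling contract exercised above.
     * 'error' is assumed to be a non-NULL rd_kafka_error_t returned by a
     * transactional API call on producer 'rk'. */
    if (rd_kafka_error_txn_requires_abort(error)) {
            /* Abortable (e.g. TOPIC_AUTHORIZATION_FAILED in the log):
             * the transaction must be aborted before new work can begin. */
            rd_kafka_error_destroy(error);
            rd_kafka_error_t *abrt = rd_kafka_abort_transaction(rk, 5000);
            if (abrt)
                    rd_kafka_error_destroy(abrt);
            /* ... then rd_kafka_begin_transaction(rk) and redo the work ... */
    } else if (rd_kafka_error_is_retriable(error)) {
            /* Transient (e.g. a timed-out coordinator request): retry the same call. */
            rd_kafka_error_destroy(error);
    } else if (rd_kafka_error_is_fatal(error)) {
            /* Unrecoverable (e.g. fenced or unauthorized producer): tear down. */
            rd_kafka_error_destroy(error);
            rd_kafka_destroy(rk);
    } else {
            rd_kafka_error_destroy(error);
    }

[end note]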
/100.513s] 1 test(s) running: 0105_transactions_mock 3: [
/101.513s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 94.121s] rd_kafka_abort_transaction(rk, 5000): duration 0.480ms 3: [0105_transactions_mock / 94.121s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 94.122s] rd_kafka_begin_transaction(rk): duration 0.208ms 3: [
/102.514s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 94.224s] send_offsets_to_transaction() #7: 3: %3|1673491156.684|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 6): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491156.684|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/103.514s] 1 test(s) running: 0105_transactions_mock 3: [
/104.514s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 96.224s] rd_kafka_abort_transaction(rk, 5000): duration 0.462ms 3: [0105_transactions_mock / 96.224s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 96.225s] rd_kafka_begin_transaction(rk): duration 0.025ms 3: %3|1673491158.586|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 3): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491158.586|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock / 96.327s] send_offsets_to_transaction() #8: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock / 96.327s] send_offsets_to_transaction() #8 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/105.514s] 1 test(s) running: 0105_transactions_mock 3: [
/106.514s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock / 98.327s] rd_kafka_abort_transaction(rk, 5000): duration 0.433ms 3: [0105_transactions_mock / 98.327s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock / 98.327s] rd_kafka_begin_transaction(rk): duration 0.018ms 3: [0105_transactions_mock / 98.430s] send_offsets_to_transaction() #9: 3: %3|1673491160.890|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:34555/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [3] with 2 message(s) failed: Broker: Topic authorization failed (broker 1 PID{Id:72892000,Epoch:0}, base seq 1): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491160.890|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:34555/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/107.514s] 1 test(s) running: 0105_transactions_mock 3: [
/108.514s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /100.430s] rd_kafka_abort_transaction(rk, 5000): duration 0.427ms 3: [0105_transactions_mock /100.430s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /100.430s] rd_kafka_begin_transaction(rk): duration 0.020ms 3: %3|1673491162.792|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 5): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491162.792|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock /100.533s] send_offsets_to_transaction() #10: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /100.533s] send_offsets_to_transaction() #10 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/109.514s] 1 test(s) running: 0105_transactions_mock 3: [
/110.514s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /102.533s] rd_kafka_abort_transaction(rk, 5000): duration 0.443ms 3: [0105_transactions_mock /102.533s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /102.533s] rd_kafka_begin_transaction(rk): duration 0.019ms 3: [0105_transactions_mock /102.636s] send_offsets_to_transaction() #11: 3: %3|1673491165.096|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [2] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 8): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491165.096|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/111.514s] 1 test(s) running: 0105_transactions_mock 3: [
/112.515s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /104.636s] rd_kafka_abort_transaction(rk, 5000): duration 0.435ms 3: [0105_transactions_mock /104.636s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /104.636s] rd_kafka_begin_transaction(rk): duration 0.034ms 3: %3|1673491166.998|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 7): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491166.998|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock /104.739s] send_offsets_to_transaction() #12: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /104.739s] send_offsets_to_transaction() #12 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/113.515s] 1 test(s) running: 0105_transactions_mock 3: [
/114.515s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /106.739s] rd_kafka_abort_transaction(rk, 5000): duration 0.467ms 3: [0105_transactions_mock /106.739s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /106.739s] rd_kafka_begin_transaction(rk): duration 0.031ms 3: [0105_transactions_mock /106.842s] send_offsets_to_transaction() #13: 3: %3|1673491169.302|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [1] with 2 message(s) failed: Broker: Topic authorization failed (broker 2 PID{Id:72892000,Epoch:0}, base seq 9): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491169.302|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:45517/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [
/115.515s] 1 test(s) running: 0105_transactions_mock 3: [
/116.515s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /108.842s] rd_kafka_abort_transaction(rk, 5000): duration 0.433ms 3: [0105_transactions_mock /108.842s] Setting test timeout to 10s * 2.7 3: [0105_transactions_mock /108.842s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: %3|1673491171.204|TXNERR|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Current transaction failed in state InTransaction: ProduceRequest for mytopic [0] with 2 message(s) failed: Broker: Topic authorization failed (broker 3 PID{Id:72892000,Epoch:0}, base seq 4): current transaction must be aborted (TOPIC_AUTHORIZATION_FAILED) 3: %5|1673491171.204|PARTCNT|0105_transactions_mock#producer-227| [thrd:127.0.0.1:40707/bootstrap]: Topic mytopic partition count changed from 4 to 0 3: [0105_transactions_mock /108.945s] send_offsets_to_transaction() #14: Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [0105_transactions_mock /108.945s] send_offsets_to_transaction() #14 failed (expectedly): Failed to commit offsets to transaction on broker (none): Local: Erroneous state (after 0 ms) 3: [
/117.515s] 1 test(s) running: 0105_transactions_mock 3: [
/118.515s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /110.945s] rd_kafka_abort_transaction(rk, 5000): duration 0.432ms 3: [0105_transactions_mock /110.947s] [ do_test_txn_coord_req_multi_find:2064 ] 3: [0105_transactions_mock /110.947s] Test config file test.conf not found 3: [0105_transactions_mock /110.947s] Setting test timeout to 60s * 2.7 3: %5|1673491173.266|MOCK|0105_transactions_mock#producer-228| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40653,127.0.0.1:46709,127.0.0.1:39347 3: [0105_transactions_mock /110.948s] Created kafka instance 0105_transactions_mock#producer-228 3: [0105_transactions_mock /110.959s] rd_kafka_init_transactions(rk, 5000): duration 8.185ms 3: [0105_transactions_mock /110.959s] rd_kafka_begin_transaction(rk): duration 0.017ms 3: [0105_transactions_mock /110.959s] 0105_transactions_mock#producer-228: Flushing 3 messages 3: [
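[editor's note] The recurring "MOCK ... Mock cluster enabled: original bootstrap.servers and security.protocol ignored" lines come from librdkafka's built-in mock broker, which the test suite enables through configuration. A sketch of switching it on follows; it assumes the test.mock.num.brokers property available in librdkafka 1.x, and is intended for testing only.

    /* Sketch: enabling librdkafka's built-in mock cluster (testing only).
     * Assumes the "test.mock.num.brokers" property of librdkafka 1.x; when set,
     * bootstrap.servers and security.protocol are ignored, as the log notes. */
    char errstr[512];
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    rd_kafka_conf_set(conf, "test.mock.num.brokers", "3", errstr, sizeof(errstr));
    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    /* Fault injection (broker down/up, RTT delays, injected errors), as used by
     * these tests, goes through the APIs declared in rdkafka_mock.h. */

[end note]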
/119.515s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /111.950s] FLUSH: duration 990.713ms 3: [
/120.515s] 1 test(s) running: 0105_transactions_mock 3: [
/121.516s] 1 test(s) running: 0105_transactions_mock 3: [
/122.516s] 1 test(s) running: 0105_transactions_mock 3: [
/123.516s] 1 test(s) running: 0105_transactions_mock 3: [
/124.266s] on_response_received_cb: 0105_transactions_mock#producer-228: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.63ms, not done yet: NO_ERROR 3: %6|1673491178.274|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:46709/bootstrap]: 127.0.0.1:46709/2: Disconnected (after 4995ms in state UP) 3: [
/124.267s] on_response_received_cb: 0105_transactions_mock#producer-228: TxnCoordinator/1: brokerid 1, ApiKey 25, CorrId 0, rtt 4004.63ms, not done yet: NO_ERROR 3: %6|1673491178.275|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:39347/bootstrap]: 127.0.0.1:39347/3: Disconnected (after 5000ms in state UP) 3: %3|1673491178.275|FAIL|0105_transactions_mock#producer-228| [thrd:127.0.0.1:39347/bootstrap]: 127.0.0.1:39347/3: Connect to ipv4#127.0.0.1:39347 failed: Connection refused (after 0ms in state CONNECT) 3: [
/124.516s] 1 test(s) running: 0105_transactions_mock 3: [
/125.516s] 1 test(s) running: 0105_transactions_mock 3: [
/126.516s] 1 test(s) running: 0105_transactions_mock 3: [
/127.516s] 1 test(s) running: 0105_transactions_mock 3: [
/128.516s] 1 test(s) running: 0105_transactions_mock 3: [
/129.516s] 1 test(s) running: 0105_transactions_mock 3: [
/130.517s] 1 test(s) running: 0105_transactions_mock 3: [
/131.517s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /123.964s] send_offsets_to_transaction() 3: [
/132.517s] 1 test(s) running: 0105_transactions_mock 3: [
/133.517s] 1 test(s) running: 0105_transactions_mock 3: [
/134.517s] 1 test(s) running: 0105_transactions_mock 3: [
/135.517s] 1 test(s) running: 0105_transactions_mock 3: [
/136.517s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /128.964s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:46709/2: Disconnected (after 4995ms in state UP) 3: [0105_transactions_mock /128.964s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:46709/2: Disconnected (after 4995ms in state UP) 3: [0105_transactions_mock /128.964s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:39347/3: Disconnected (after 5000ms in state UP) 3: [0105_transactions_mock /128.964s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:39347/3: Disconnected (after 5000ms in state UP) 3: [0105_transactions_mock /128.964s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:39347/3: Connect to ipv4#127.0.0.1:39347 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /128.964s] 0105_transactions_mock#producer-228 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:39347/3: Connect to ipv4#127.0.0.1:39347 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /128.964s] rd_kafka_commit_transaction(rk, 5000): duration 0.384ms 3: [0105_transactions_mock /128.965s] [ do_test_txn_coord_req_multi_find:2064: PASS (18.02s) ] 3: [0105_transactions_mock /128.965s] [ do_test_txn_addparts_req_multi:2209 ] 3: [0105_transactions_mock /128.965s] Test config file test.conf not found 3: [0105_transactions_mock /128.965s] Setting test timeout to 60s * 2.7 3: %5|1673491191.285|MOCK|0105_transactions_mock#producer-229| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:36295,127.0.0.1:43261,127.0.0.1:45899 3: [0105_transactions_mock /128.966s] Created kafka instance 0105_transactions_mock#producer-229 3: [0105_transactions_mock /128.990s] rd_kafka_init_transactions(rk, 5000): duration 21.033ms 3: [0105_transactions_mock /128.990s] Running seed transaction 3: [0105_transactions_mock /128.990s] rd_kafka_begin_transaction(rk): duration 0.016ms 3: [0105_transactions_mock /128.990s] rd_kafka_producev(rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { void * __t __attribute__((unused)) = ("seed"); size_t __t2 __attribute__((unused)) = (4); } RD_KAFKA_VTYPE_VALUE; }), (void *)"seed", (size_t)4, RD_KAFKA_VTYPE_END): duration 0.016ms 3: [
/137.517s] 1 test(s) running: 0105_transactions_mock 3: [
/138.279s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.04ms, count 0: NO_ERROR 3: [
/138.279s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 0.04ms, count 1: NO_ERROR 3: [0105_transactions_mock /129.968s] rd_kafka_commit_transaction(rk, 5000): duration 978.466ms 3: [0105_transactions_mock /129.968s] Running test transaction 3: [0105_transactions_mock /129.968s] rd_kafka_begin_transaction(rk): duration 0.026ms 3: [0105_transactions_mock /129.968s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.006ms 3: [
/138.517s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /130.468s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (1); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)1, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.035ms 3: [0105_transactions_mock /130.468s] Waiting for two AddPartitionsToTxnResponse 3: [
/139.285s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.37ms, count 0: NO_ERROR 3: [
/139.285s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.37ms, count 1: NO_ERROR 3: [0105_transactions_mock /130.982s] 2 AddPartitionsToTxnResponses seen 3: [0105_transactions_mock /130.982s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)2, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.026ms 3: [
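[editor's note] The "({ if (0) { ... } RD_KAFKA_VTYPE_TOPIC; })" blobs logged above are simply the preprocessor expansion of the RD_KAFKA_V_*() convenience macros passed to rd_kafka_producev(). Before expansion, the call the test logs would read roughly as below; the topic variable is assumed.

    /* The same producev() call before macro expansion (sketch; 'topic' assumed). */
    rd_kafka_resp_err_t err = rd_kafka_producev(rk,
            RD_KAFKA_V_TOPIC(topic),
            RD_KAFKA_V_PARTITION(2),
            RD_KAFKA_V_VALUE("hi", 2),
            RD_KAFKA_V_END);
    if (err)
            fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));

[end note]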
/139.518s] 1 test(s) running: 0105_transactions_mock 3: [
/140.291s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.89ms, count 2: NO_ERROR 3: [
/140.291s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.89ms, count 3: NO_ERROR 3: [
/140.518s] 1 test(s) running: 0105_transactions_mock 3: [
/141.297s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.83ms, count 4: NO_ERROR 3: [
/141.297s] on_response_received_cb: 0105_transactions_mock#producer-229: TxnCoordinator/2: brokerid 2, ApiKey 24, CorrId 0, rtt 1004.83ms, count 5: NO_ERROR 3: [
/141.518s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /133.992s] rd_kafka_commit_transaction(rk, 10 * 1000): duration 2009.675ms 3: [0105_transactions_mock /133.992s] [ do_test_txn_addparts_req_multi:2209: PASS (5.03s) ] 3: [0105_transactions_mock /133.992s] [ do_test_txns_no_timeout_crash:1615 ] 3: [0105_transactions_mock /133.993s] Test config file test.conf not found 3: [0105_transactions_mock /133.993s] Setting test timeout to 60s * 2.7 3: %5|1673491196.312|MOCK|0105_transactions_mock#producer-230| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43017,127.0.0.1:46701,127.0.0.1:38045 3: [0105_transactions_mock /133.993s] Created kafka instance 0105_transactions_mock#producer-230 3: [0105_transactions_mock /134.037s] rd_kafka_init_transactions(rk, 5000): duration 43.450ms 3: [0105_transactions_mock /134.037s] rd_kafka_begin_transaction(rk): duration 0.078ms 3: [0105_transactions_mock /134.037s] 0105_transactions_mock#producer-230: Flushing 1 messages 3: [
/142.518s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /134.996s] FLUSH: duration 959.410ms 3: [
/143.518s] 1 test(s) running: 0105_transactions_mock 3: %5|1673491198.320|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out AddOffsetsToTxnRequest in flight (after 1004ms, timeout #0) 3: %4|1673491198.320|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1673491198.320|FAIL|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43017: 1 request(s) timed out: disconnect (after 1967ms in state UP) 3: [
/144.518s] 1 test(s) running: 0105_transactions_mock 3: %5|1673491199.325|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %5|1673491199.325|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: Timed out MetadataRequest in flight (after 1004ms, timeout #0) 3: %4|1673491199.325|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %4|1673491199.325|FAIL|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43017: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1673491199.325|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1673491199.325|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: 1 request(s) timed out: disconnect (after 2992ms in state UP) 3: [
/145.518s] 1 test(s) running: 0105_transactions_mock 3: %5|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out FindCoordinatorRequest in flight (after 1959ms, timeout #0) 3: %5|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1673491200.330|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:43017/bootstrap]: 127.0.0.1:43017/1: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %5|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out MetadataRequest in flight (after 1004ms, timeout #1) 3: %5|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out MetadataRequest in flight (after 1004ms, timeout #2) 3: %4|1673491200.330|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out 3 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %3|1673491200.330|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: 3 request(s) timed out: disconnect (after 3014ms in state UP) 3: [
/146.518s] 1 test(s) running: 0105_transactions_mock 3: %5|1673491201.335|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %5|1673491201.335|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:38045/bootstrap]: 127.0.0.1:38045/3: Timed out ApiVersionRequest in flight (after 1004ms, timeout #0) 3: %4|1673491201.335|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1673491201.335|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:46701/bootstrap]: 127.0.0.1:46701/2: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %4|1673491201.335|FAIL|0105_transactions_mock#producer-230| [thrd:127.0.0.1:38045/bootstrap]: 127.0.0.1:38045/3: ApiVersionRequest failed: Local: Timed out: probably due to broker version < 0.10 (see api.version.request configuration) (after 1004ms in state APIVERSION_QUERY) 3: %4|1673491201.335|REQTMOUT|0105_transactions_mock#producer-230| [thrd:127.0.0.1:38045/bootstrap]: 127.0.0.1:38045/3: Timed out 1 in-flight, 0 retry-queued, 0 out-queue, 0 partially-sent requests 3: %4|1673491201.448|REQTMOUT|0105_transactions_mock#producer-230| [thrd:TxnCoordinator]: TxnCoordinator: Timed out 0 in-flight, 0 retry-queued, 1 out-queue, 0 partially-sent requests 3: [
/147.518s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /139.996s] send_offsets..() failed with retriable error: Transactional operation timed out 3: [0105_transactions_mock /139.997s] Retrying send_offsets..() 3: %3|1673491202.331|ADDOFFSETS|0105_transactions_mock#producer-230| [thrd:main]: TxnCoordinator/1: Failed to add offsets to transaction on broker TxnCoordinator/1: Local: Outdated 3: [0105_transactions_mock /140.114s] [ do_test_txns_no_timeout_crash:1615: PASS (6.12s) ] 3: [0105_transactions_mock /140.114s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /140.114s] Test config file test.conf not found 3: [0105_transactions_mock /140.114s] Setting test timeout to 60s * 2.7 3: %5|1673491202.434|MOCK|0105_transactions_mock#producer-231| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:44659,127.0.0.1:46813,127.0.0.1:39651 3: [0105_transactions_mock /140.118s] Created kafka instance 0105_transactions_mock#producer-231 3: %1|1673491202.458|TXNERR|0105_transactions_mock#producer-231| [thrd:main]: Fatal transaction error: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1673491202.458|FATAL|0105_transactions_mock#producer-231| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /140.139s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to acquire transactional PID from broker TxnCoordinator/1: Broker: Cluster authorization failed 3: [0105_transactions_mock /140.151s] [ do_test_txn_auth_failure:1690: ApiKey=InitProducerId ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.04s) ] 3: [0105_transactions_mock /140.151s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED ] 3: [0105_transactions_mock /140.151s] Test config file test.conf not found 3: [0105_transactions_mock /140.151s] Setting test timeout to 60s * 2.7 3: %5|1673491202.470|MOCK|0105_transactions_mock#producer-232| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:42423,127.0.0.1:37559,127.0.0.1:35093 3: [0105_transactions_mock /140.151s] Created kafka instance 0105_transactions_mock#producer-232 3: %1|1673491202.495|TXNERR|0105_transactions_mock#producer-232| [thrd:main]: Fatal transaction error: Failed to find transaction coordinator: 127.0.0.1:35093/3: Broker: Cluster authorization failed: Broker: Cluster authorization failed (CLUSTER_AUTHORIZATION_FAILED) 3: %0|1673491202.495|FATAL|0105_transactions_mock#producer-232| [thrd:main]: Fatal error: Broker: Cluster authorization failed: Failed to find transaction coordinator: 127.0.0.1:35093/3: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /140.176s] init_transactions() failed: CLUSTER_AUTHORIZATION_FAILED: Failed to find transaction coordinator: 127.0.0.1:35093/3: Broker: Cluster authorization failed: Broker: Cluster authorization failed 3: [0105_transactions_mock /140.187s] [ do_test_txn_auth_failure:1690: ApiKey=FindCoordinator ErrorCode=CLUSTER_AUTHORIZATION_FAILED: PASS (0.04s) ] 3: [0105_transactions_mock /140.187s] [ do_test_txn_flush_timeout:1737 ] 3: [0105_transactions_mock /140.187s] Test config 
file test.conf not found 3: [0105_transactions_mock /140.187s] Setting test timeout to 60s * 2.7 3: %5|1673491202.507|MOCK|0105_transactions_mock#producer-233| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:40795,127.0.0.1:43471,127.0.0.1:43121 3: [0105_transactions_mock /140.188s] Created kafka instance 0105_transactions_mock#producer-233 3: [
/148.523s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /140.236s] rd_kafka_init_transactions(rk, 5000): duration 45.281ms 3: [0105_transactions_mock /140.241s] rd_kafka_begin_transaction(rk): duration 1.614ms 3: [0105_transactions_mock /140.241s] Test config file test.conf not found 3: [0105_transactions_mock /140.241s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /140.241s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /140.241s] PRODUCE: duration 0.071ms 3: [0105_transactions_mock /140.241s] Test config file test.conf not found 3: [0105_transactions_mock /140.241s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /140.241s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /140.241s] PRODUCE: duration 0.061ms 3: [0105_transactions_mock /140.241s] Test config file test.conf not found 3: [0105_transactions_mock /140.241s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /140.241s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /140.241s] PRODUCE: duration 0.077ms 3: [0105_transactions_mock /140.258s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 16.078ms 3: [
/149.523s] 1 test(s) running: 0105_transactions_mock 3: [
/150.523s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /142.258s] Disconnecting transaction coordinator 2 3: %6|1673491204.577|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43471: Disconnected (after 2022ms in state UP) 3: %3|1673491204.578|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /142.258s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43471: Disconnected (after 2022ms in state UP) 3: [0105_transactions_mock /142.258s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43471: Disconnected (after 2022ms in state UP) 3: [0105_transactions_mock /142.258s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /142.258s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1673491204.753|FAIL|0105_transactions_mock#producer-233| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /142.433s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /142.433s] 0105_transactions_mock#producer-233 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43471: Connect to ipv4#127.0.0.1:43471 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/151.523s] 1 test(s) running: 0105_transactions_mock 3: [
/152.523s] 1 test(s) running: 0105_transactions_mock 3: [
/153.524s] 1 test(s) running: 0105_transactions_mock 3: [
/154.524s] 1 test(s) running: 0105_transactions_mock 3: [
/155.524s] 1 test(s) running: 0105_transactions_mock 3: [
/156.524s] 1 test(s) running: 0105_transactions_mock 3: [
/157.524s] 1 test(s) running: 0105_transactions_mock 3: [
/158.524s] 1 test(s) running: 0105_transactions_mock 3: %3|1673491212.561|TXNERR|0105_transactions_mock#producer-233| [thrd:main]: Current transaction failed in state BeginCommit: 300 message(s) failed delivery (see individual delivery reports) (_INCONSISTENT) 3: [0105_transactions_mock /150.242s] commit_transaction() failed (expectedly): 300 message(s) failed delivery (see individual delivery reports) 3: [
/159.524s] 1 test(s) running: 0105_transactions_mock 3: [
/160.524s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /152.242s] Aborting and retrying 3: [0105_transactions_mock /152.242s] rd_kafka_abort_transaction(rk, 60000): duration 0.329ms 3: [0105_transactions_mock /152.242s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock /152.242s] Test config file test.conf not found 3: [0105_transactions_mock /152.242s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /152.242s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /152.242s] PRODUCE: duration 0.089ms 3: [0105_transactions_mock /152.242s] Test config file test.conf not found 3: [0105_transactions_mock /152.242s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /152.242s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /152.242s] PRODUCE: duration 0.074ms 3: [0105_transactions_mock /152.242s] Test config file test.conf not found 3: [0105_transactions_mock /152.242s] Produce to myTopic [0]: messages #0..100 3: [0105_transactions_mock /152.242s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /152.242s] PRODUCE: duration 0.085ms 3: [0105_transactions_mock /152.244s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 1.824ms 3: [
/161.524s] 1 test(s) running: 0105_transactions_mock 3: [
/162.525s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /154.245s] [ do_test_txn_flush_timeout:1737: PASS (14.06s) ] 3: [0105_transactions_mock /154.245s] [ do_test_unstable_offset_commit:2320 ] 3: [0105_transactions_mock /154.245s] Test config file test.conf not found 3: [0105_transactions_mock /154.245s] Setting test timeout to 60s * 2.7 3: %5|1673491216.565|MOCK|0105_transactions_mock#producer-234| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:37905,127.0.0.1:36379,127.0.0.1:40393 3: [0105_transactions_mock /154.246s] Created kafka instance 0105_transactions_mock#producer-234 3: [0105_transactions_mock /154.246s] Test config file test.conf not found 3: [0105_transactions_mock /154.246s] Created kafka instance 0105_transactions_mock#consumer-235 3: [0105_transactions_mock /154.263s] rd_kafka_init_transactions(rk, -1): duration 16.913ms 3: [0105_transactions_mock /154.263s] rd_kafka_begin_transaction(rk): duration 0.010ms 3: [0105_transactions_mock /154.263s] Test config file test.conf not found 3: [0105_transactions_mock /154.263s] Produce to mytopic [0]: messages #0..100 3: [0105_transactions_mock /154.263s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /154.263s] PRODUCE: duration 0.067ms 3: [0105_transactions_mock /154.274s] rd_kafka_commit_transaction(rk, -1): duration 10.096ms 3: [0105_transactions_mock /154.275s] rd_kafka_commit(c, offsets, 0 ): duration 1.109ms 3: [0105_transactions_mock /154.878s] #0: committed() returned NO_ERROR (expected NO_ERROR) 3: [0105_transactions_mock /155.079s] #1: committed() returned _TIMED_OUT (expected _TIMED_OUT) 3: [0105_transactions_mock /155.079s] Phase 2: OffsetFetch lookup through assignment 3: [0105_transactions_mock /155.079s] INCREMENTAL.ASSIGN.PARTITIONS: duration 0.058ms 3: [0105_transactions_mock /155.079s] assign: incremental assign of 1 partition(s) done 3: [0105_transactions_mock /155.079s] consume: consume exactly 50 messages 3: [
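[editor's note] The "#0: committed() returned NO_ERROR / #1: committed() returned _TIMED_OUT" lines above query committed offsets with different timeouts while transactional offsets are still pending (the "unstable offset commit" case). A sketch of that query follows; the consumer handle c, topic name and timeout are assumptions for illustration.

    /* Sketch: querying committed offsets on consumer 'c' (topic/partition assumed).
     * With a short timeout and still-pending transactional offsets this can return
     * RD_KAFKA_RESP_ERR__TIMED_OUT, which is what test 0105 expects in case #1. */
    rd_kafka_topic_partition_list_t *parts = rd_kafka_topic_partition_list_new(1);
    rd_kafka_topic_partition_list_add(parts, "mytopic", 0);

    rd_kafka_resp_err_t err = rd_kafka_committed(c, parts, 100 /* ms */);
    if (err)
            fprintf(stderr, "committed(): %s\n", rd_kafka_err2str(err));

    rd_kafka_topic_partition_list_destroy(parts);

[end note]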
/163.525s] 1 test(s) running: 0105_transactions_mock 3: [
/164.525s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /156.487s] mytopic [0] reached EOF at offset 100 3: [0105_transactions_mock /156.487s] CONSUME: duration 1408.045ms 3: [0105_transactions_mock /156.487s] consume: consumed 50/50 messages (1/1 EOFs) 3: [0105_transactions_mock /156.488s] [ do_test_unstable_offset_commit:2320: PASS (2.24s) ] 3: [0105_transactions_mock /156.488s] [ do_test_commit_after_msg_timeout:2447 ] 3: [0105_transactions_mock /156.488s] Test config file test.conf not found 3: [0105_transactions_mock /156.488s] Setting test timeout to 60s * 2.7 3: %5|1673491218.808|MOCK|0105_transactions_mock#producer-236| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43165,127.0.0.1:43623,127.0.0.1:41201 3: [0105_transactions_mock /156.489s] Created kafka instance 0105_transactions_mock#producer-236 3: [0105_transactions_mock /156.492s] Starting transaction 3: [0105_transactions_mock /156.500s] rd_kafka_init_transactions(rk, -1): duration 8.161ms 3: [0105_transactions_mock /156.500s] rd_kafka_begin_transaction(rk): duration 0.023ms 3: [0105_transactions_mock /156.500s] Bringing down 2 3: %6|1673491218.820|FAIL|0105_transactions_mock#producer-236| [thrd:127.0.0.1:43165/bootstrap]: 127.0.0.1:43165/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 8ms in state UP) 3: [0105_transactions_mock /156.500s] Test config file test.conf not found 3: [0105_transactions_mock /156.500s] Produce to test [0]: messages #0..1 3: [0105_transactions_mock /156.500s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:43165/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 8ms in state UP) 3: [0105_transactions_mock /156.500s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:43165/1: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 8ms in state UP) 3: [0105_transactions_mock /156.500s] SUM(POLL): duration 0.009ms 3: [0105_transactions_mock /156.500s] PRODUCE: duration 0.016ms 3: %6|1673491218.820|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43165: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock /156.500s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43165: Disconnected (after 0ms in state UP) 3: [0105_transactions_mock /156.500s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43165: Disconnected (after 0ms in state UP) 3: %3|1673491218.941|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43165: Connect to ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /156.621s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43165: Connect to ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /156.621s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43165: Connect to ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1673491219.407|FAIL|0105_transactions_mock#producer-236| [thrd:TxnCoordinator]: TxnCoordinator: 127.0.0.1:43165: Connect to 
ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /157.088s] Ignoring allowed error: _TRANSPORT: TxnCoordinator: 127.0.0.1:43165: Connect to ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /157.088s] 0105_transactions_mock#producer-236 rdkafka error (non-testfatal): Local: Broker transport failure: TxnCoordinator: 127.0.0.1:43165: Connect to ipv4#127.0.0.1:43165 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/165.525s] 1 test(s) running: 0105_transactions_mock 3: [
/166.525s] 1 test(s) running: 0105_transactions_mock 3: [
/167.525s] 1 test(s) running: 0105_transactions_mock 3: [
/168.525s] 1 test(s) running: 0105_transactions_mock 3: [
/169.525s] 1 test(s) running: 0105_transactions_mock 3: [
/170.525s] 1 test(s) running: 0105_transactions_mock 3: [
/171.526s] 1 test(s) running: 0105_transactions_mock 3: [
/172.526s] 1 test(s) running: 0105_transactions_mock 3: [
/173.526s] 1 test(s) running: 0105_transactions_mock 3: [
/174.526s] 1 test(s) running: 0105_transactions_mock 3: %3|1673491229.319|TXNERR|0105_transactions_mock#producer-236| [thrd:127.0.0.1:43623/bootstrap]: Current transaction failed in state BeginCommit: 1 message(s) timed out on test [0] (_TIMED_OUT, requires epoch bump) 3: [0105_transactions_mock /166.999s] commit_transaction() failed (as expected): 1 message(s) timed out on test [0] 3: [0105_transactions_mock /166.999s] Aborting transaction 3: [
/175.526s] 1 test(s) running: 0105_transactions_mock 3: [
/176.526s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /168.748s] rd_kafka_abort_transaction(rk, -1): duration 1748.694ms 3: [0105_transactions_mock /168.748s] Attempting second transaction, which should succeed 3: [0105_transactions_mock /168.748s] rd_kafka_begin_transaction(rk): duration 0.028ms 3: [0105_transactions_mock /168.748s] Test config file test.conf not found 3: [0105_transactions_mock /168.748s] Produce to test [0]: messages #0..1 3: [0105_transactions_mock /168.748s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.748s] PRODUCE: duration 0.012ms 3: [0105_transactions_mock /168.751s] rd_kafka_commit_transaction(rk, -1): duration 2.466ms 3: [0105_transactions_mock /168.751s] [ do_test_commit_after_msg_timeout:2447: PASS (12.26s) ] 3: [0105_transactions_mock /168.751s] Setting test timeout to 200s * 2.7 3: [0105_transactions_mock /168.751s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators ] 3: [0105_transactions_mock /168.751s] Test config file test.conf not found 3: [0105_transactions_mock /168.751s] Setting test timeout to 60s * 2.7 3: %5|1673491231.071|MOCK|0105_transactions_mock#producer-237| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:39719,127.0.0.1:33897,127.0.0.1:44149,127.0.0.1:38707,127.0.0.1:34661 3: [0105_transactions_mock /168.755s] Created kafka instance 0105_transactions_mock#producer-237 3: [0105_transactions_mock /168.762s] Starting transaction 3: [0105_transactions_mock /168.785s] rd_kafka_init_transactions(rk, 5000): duration 22.958ms 3: [0105_transactions_mock /168.785s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /168.789s] rd_kafka_begin_transaction(rk): duration 2.427ms 3: [0105_transactions_mock /168.789s] Test config file test.conf not found 3: [0105_transactions_mock /168.789s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /168.789s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /168.789s] PRODUCE: duration 0.043ms 3: [
/177.526s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /169.294s] PRODUCE.DELIVERY.WAIT: duration 505.471ms 3: [0105_transactions_mock /169.294s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /169.294s] Test config file test.conf not found 3: [0105_transactions_mock /169.294s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /169.294s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.294s] PRODUCE: duration 0.083ms 3: [0105_transactions_mock /169.294s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /169.878s] rd_kafka_commit_transaction(rk, -1): duration 583.395ms 3: [0105_transactions_mock /169.878s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /169.878s] rd_kafka_begin_transaction(rk): duration 0.118ms 3: [0105_transactions_mock /169.878s] Test config file test.conf not found 3: [0105_transactions_mock /169.878s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /169.878s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /169.878s] PRODUCE: duration 0.093ms 3: [
/178.526s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /170.382s] PRODUCE.DELIVERY.WAIT: duration 503.969ms 3: [0105_transactions_mock /170.382s] Test config file test.conf not found 3: [0105_transactions_mock /170.382s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /170.382s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /170.382s] PRODUCE: duration 0.082ms 3: [0105_transactions_mock /170.382s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /170.879s] rd_kafka_abort_transaction(rk, -1): duration 496.274ms 3: [0105_transactions_mock /170.879s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /170.879s] rd_kafka_begin_transaction(rk): duration 0.105ms 3: [0105_transactions_mock /170.879s] Test config file test.conf not found 3: [0105_transactions_mock /170.879s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /170.879s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /170.879s] PRODUCE: duration 0.082ms 3: [
/179.526s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /171.382s] PRODUCE.DELIVERY.WAIT: duration 502.872ms 3: [0105_transactions_mock /171.382s] Test config file test.conf not found 3: [0105_transactions_mock /171.382s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /171.382s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /171.382s] PRODUCE: duration 0.075ms 3: [0105_transactions_mock /171.382s] rd_kafka_abort_transaction(rk, -1): duration 0.237ms 3: [0105_transactions_mock /171.382s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /171.382s] rd_kafka_begin_transaction(rk): duration 0.027ms 3: [0105_transactions_mock /171.382s] Test config file test.conf not found 3: [0105_transactions_mock /171.382s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /171.382s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /171.382s] PRODUCE: duration 0.070ms 3: [0105_transactions_mock /171.885s] PRODUCE.DELIVERY.WAIT: duration 502.431ms 3: [0105_transactions_mock /171.885s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /171.885s] Test config file test.conf not found 3: [0105_transactions_mock /171.885s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /171.885s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /171.885s] PRODUCE: duration 0.090ms 3: [0105_transactions_mock /171.885s] Changing transaction coordinator from 4 to 5 3: [
/180.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /172.379s] rd_kafka_abort_transaction(rk, -1): duration 494.432ms 3: [0105_transactions_mock /172.379s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /172.380s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock /172.380s] Test config file test.conf not found 3: [0105_transactions_mock /172.380s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /172.380s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /172.380s] PRODUCE: duration 0.091ms 3: [0105_transactions_mock /172.882s] PRODUCE.DELIVERY.WAIT: duration 502.274ms 3: [0105_transactions_mock /172.882s] Test config file test.conf not found 3: [0105_transactions_mock /172.882s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /172.882s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /172.882s] PRODUCE: duration 0.073ms 3: [0105_transactions_mock /172.882s] rd_kafka_abort_transaction(rk, -1): duration 0.233ms 3: [0105_transactions_mock /172.882s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /172.882s] rd_kafka_begin_transaction(rk): duration 0.033ms 3: [0105_transactions_mock /172.882s] Test config file test.conf not found 3: [0105_transactions_mock /172.882s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /172.883s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /172.883s] PRODUCE: duration 0.068ms 3: [
/181.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /173.385s] PRODUCE.DELIVERY.WAIT: duration 502.264ms 3: [0105_transactions_mock /173.385s] Test config file test.conf not found 3: [0105_transactions_mock /173.385s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /173.385s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /173.385s] PRODUCE: duration 0.071ms 3: [0105_transactions_mock /173.385s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /173.881s] rd_kafka_commit_transaction(rk, -1): duration 495.623ms 3: [0105_transactions_mock /173.881s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /173.881s] rd_kafka_begin_transaction(rk): duration 0.086ms 3: [0105_transactions_mock /173.881s] Test config file test.conf not found 3: [0105_transactions_mock /173.881s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /173.881s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /173.881s] PRODUCE: duration 0.097ms 3: [
/182.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /174.383s] PRODUCE.DELIVERY.WAIT: duration 502.391ms 3: [0105_transactions_mock /174.383s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /174.383s] Test config file test.conf not found 3: [0105_transactions_mock /174.383s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /174.383s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /174.383s] PRODUCE: duration 0.096ms 3: [0105_transactions_mock /174.882s] rd_kafka_abort_transaction(rk, -1): duration 498.125ms 3: [0105_transactions_mock /174.882s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /174.882s] rd_kafka_begin_transaction(rk): duration 0.161ms 3: [0105_transactions_mock /174.882s] Test config file test.conf not found 3: [0105_transactions_mock /174.882s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /174.882s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /174.882s] PRODUCE: duration 0.083ms 3: [
/183.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /175.384s] PRODUCE.DELIVERY.WAIT: duration 502.401ms 3: [0105_transactions_mock /175.384s] Test config file test.conf not found 3: [0105_transactions_mock /175.384s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /175.385s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /175.385s] PRODUCE: duration 0.085ms 3: [0105_transactions_mock /175.385s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /175.883s] rd_kafka_abort_transaction(rk, -1): duration 498.040ms 3: [0105_transactions_mock /175.883s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /175.883s] rd_kafka_begin_transaction(rk): duration 0.116ms 3: [0105_transactions_mock /175.883s] Test config file test.conf not found 3: [0105_transactions_mock /175.883s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /175.883s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /175.883s] PRODUCE: duration 0.097ms 3: [
/184.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /176.385s] PRODUCE.DELIVERY.WAIT: duration 502.348ms 3: [0105_transactions_mock /176.385s] Test config file test.conf not found 3: [0105_transactions_mock /176.385s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /176.385s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /176.385s] PRODUCE: duration 0.091ms 3: [0105_transactions_mock /176.385s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /176.885s] rd_kafka_abort_transaction(rk, -1): duration 499.020ms 3: [0105_transactions_mock /176.885s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /176.885s] rd_kafka_begin_transaction(rk): duration 0.036ms 3: [0105_transactions_mock /176.885s] Test config file test.conf not found 3: [0105_transactions_mock /176.885s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /176.885s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /176.885s] PRODUCE: duration 0.085ms 3: [
/185.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /177.389s] PRODUCE.DELIVERY.WAIT: duration 503.784ms 3: [0105_transactions_mock /177.389s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /177.389s] Test config file test.conf not found 3: [0105_transactions_mock /177.389s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /177.389s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /177.389s] PRODUCE: duration 0.084ms 3: [0105_transactions_mock /177.389s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /177.991s] rd_kafka_abort_transaction(rk, -1): duration 602.415ms 3: [0105_transactions_mock /177.991s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /177.991s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock /177.992s] Test config file test.conf not found 3: [0105_transactions_mock /177.992s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /177.992s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /177.992s] PRODUCE: duration 0.091ms 3: [
/186.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /178.495s] PRODUCE.DELIVERY.WAIT: duration 503.761ms 3: [0105_transactions_mock /178.495s] Test config file test.conf not found 3: [0105_transactions_mock /178.495s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /178.495s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /178.495s] PRODUCE: duration 0.078ms 3: [0105_transactions_mock /178.496s] rd_kafka_commit_transaction(rk, -1): duration 0.358ms 3: [0105_transactions_mock /178.496s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /178.496s] rd_kafka_begin_transaction(rk): duration 0.038ms 3: [0105_transactions_mock /178.496s] Test config file test.conf not found 3: [0105_transactions_mock /178.496s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /178.496s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /178.496s] PRODUCE: duration 0.073ms 3: [0105_transactions_mock /179.000s] PRODUCE.DELIVERY.WAIT: duration 503.731ms 3: [0105_transactions_mock /179.000s] Test config file test.conf not found 3: [0105_transactions_mock /179.000s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /179.000s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /179.000s] PRODUCE: duration 0.078ms 3: [0105_transactions_mock /179.000s] Changing transaction coordinator from 4 to 5 3: [
/187.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /179.602s] rd_kafka_abort_transaction(rk, -1): duration 601.744ms 3: [0105_transactions_mock /179.602s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /179.602s] rd_kafka_begin_transaction(rk): duration 0.044ms 3: [0105_transactions_mock /179.602s] Test config file test.conf not found 3: [0105_transactions_mock /179.602s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /179.602s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /179.602s] PRODUCE: duration 0.073ms 3: [0105_transactions_mock /180.105s] PRODUCE.DELIVERY.WAIT: duration 502.536ms 3: [0105_transactions_mock /180.105s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /180.105s] Test config file test.conf not found 3: [0105_transactions_mock /180.105s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /180.105s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /180.105s] PRODUCE: duration 0.088ms 3: [
/188.527s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /180.603s] rd_kafka_abort_transaction(rk, -1): duration 498.060ms 3: [0105_transactions_mock /180.603s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /180.603s] rd_kafka_begin_transaction(rk): duration 0.056ms 3: [0105_transactions_mock /180.603s] Test config file test.conf not found 3: [0105_transactions_mock /180.603s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /180.603s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /180.603s] PRODUCE: duration 0.075ms 3: [0105_transactions_mock /181.105s] PRODUCE.DELIVERY.WAIT: duration 502.405ms 3: [0105_transactions_mock /181.106s] Test config file test.conf not found 3: [0105_transactions_mock /181.106s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /181.106s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /181.106s] PRODUCE: duration 0.076ms 3: [0105_transactions_mock /181.106s] Changing transaction coordinator from 3 to 4 3: [
/189.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /181.604s] rd_kafka_abort_transaction(rk, -1): duration 498.029ms 3: [0105_transactions_mock /181.604s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /181.604s] rd_kafka_begin_transaction(rk): duration 0.111ms 3: [0105_transactions_mock /181.604s] Test config file test.conf not found 3: [0105_transactions_mock /181.604s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /181.604s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /181.604s] PRODUCE: duration 0.084ms 3: [0105_transactions_mock /182.107s] PRODUCE.DELIVERY.WAIT: duration 502.541ms 3: [0105_transactions_mock /182.107s] Test config file test.conf not found 3: [0105_transactions_mock /182.107s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /182.107s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /182.107s] PRODUCE: duration 0.073ms 3: [0105_transactions_mock /182.107s] rd_kafka_abort_transaction(rk, -1): duration 0.309ms 3: [0105_transactions_mock /182.107s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /182.107s] rd_kafka_begin_transaction(rk): duration 0.032ms 3: [0105_transactions_mock /182.107s] Test config file test.conf not found 3: [0105_transactions_mock /182.107s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /182.107s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /182.107s] PRODUCE: duration 0.075ms 3: [
/190.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /182.610s] PRODUCE.DELIVERY.WAIT: duration 502.580ms 3: [0105_transactions_mock /182.610s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /182.610s] Test config file test.conf not found 3: [0105_transactions_mock /182.610s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /182.610s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /182.610s] PRODUCE: duration 0.088ms 3: [0105_transactions_mock /182.610s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /183.106s] rd_kafka_commit_transaction(rk, -1): duration 495.629ms 3: [0105_transactions_mock /183.106s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /183.106s] rd_kafka_begin_transaction(rk): duration 0.078ms 3: [0105_transactions_mock /183.106s] Test config file test.conf not found 3: [0105_transactions_mock /183.106s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /183.106s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /183.106s] PRODUCE: duration 0.085ms 3: [
/191.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /183.609s] PRODUCE.DELIVERY.WAIT: duration 502.559ms 3: [0105_transactions_mock /183.609s] Test config file test.conf not found 3: [0105_transactions_mock /183.609s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /183.609s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /183.609s] PRODUCE: duration 0.076ms 3: [0105_transactions_mock /183.609s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /184.109s] rd_kafka_abort_transaction(rk, -1): duration 500.362ms 3: [0105_transactions_mock /184.109s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /184.109s] rd_kafka_begin_transaction(rk): duration 0.052ms 3: [0105_transactions_mock /184.109s] Test config file test.conf not found 3: [0105_transactions_mock /184.109s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /184.109s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /184.109s] PRODUCE: duration 0.091ms 3: [
/192.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /184.612s] PRODUCE.DELIVERY.WAIT: duration 502.512ms 3: [0105_transactions_mock /184.612s] Test config file test.conf not found 3: [0105_transactions_mock /184.612s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /184.612s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /184.612s] PRODUCE: duration 0.075ms 3: [0105_transactions_mock /184.612s] Changing transaction coordinator from 1 to 2 3: [0105_transactions_mock /185.108s] rd_kafka_abort_transaction(rk, -1): duration 495.622ms 3: [0105_transactions_mock /185.108s] Changing transaction coordinator from 2 to 3 3: [0105_transactions_mock /185.108s] rd_kafka_begin_transaction(rk): duration 0.052ms 3: [0105_transactions_mock /185.108s] Test config file test.conf not found 3: [0105_transactions_mock /185.108s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /185.108s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /185.108s] PRODUCE: duration 0.078ms 3: [
/193.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /185.611s] PRODUCE.DELIVERY.WAIT: duration 502.545ms 3: [0105_transactions_mock /185.611s] Changing transaction coordinator from 3 to 4 3: [0105_transactions_mock /185.611s] Test config file test.conf not found 3: [0105_transactions_mock /185.611s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /185.611s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /185.611s] PRODUCE: duration 0.075ms 3: [0105_transactions_mock /186.109s] rd_kafka_abort_transaction(rk, -1): duration 498.067ms 3: [0105_transactions_mock /186.109s] Changing transaction coordinator from 4 to 5 3: [0105_transactions_mock /186.109s] rd_kafka_begin_transaction(rk): duration 0.084ms 3: [0105_transactions_mock /186.109s] Test config file test.conf not found 3: [0105_transactions_mock /186.109s] Produce to test [-1]: messages #0..50 3: [0105_transactions_mock /186.109s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /186.109s] PRODUCE: duration 0.079ms 3: [
/194.528s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /186.612s] PRODUCE.DELIVERY.WAIT: duration 502.546ms 3: [0105_transactions_mock /186.612s] Test config file test.conf not found 3: [0105_transactions_mock /186.612s] Produce to test [-1]: messages #50..100 3: [0105_transactions_mock /186.612s] SUM(POLL): duration 0.001ms 3: [0105_transactions_mock /186.612s] PRODUCE: duration 0.076ms 3: [0105_transactions_mock /186.612s] Changing transaction coordinator from 5 to 1 3: [0105_transactions_mock /187.110s] rd_kafka_abort_transaction(rk, -1): duration 497.905ms 3: [0105_transactions_mock /187.111s] [ do_test_txn_switch_coordinator:1366: Test switching coordinators: PASS (18.36s) ] 3: [0105_transactions_mock /187.111s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh) ] 3: [0105_transactions_mock /187.111s] Test config file test.conf not found 3: [0105_transactions_mock /187.111s] Setting test timeout to 60s * 2.7 3: %5|1673491249.430|MOCK|0105_transactions_mock#producer-238| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:35507,127.0.0.1:34235,127.0.0.1:37027 3: [0105_transactions_mock /187.111s] Created kafka instance 0105_transactions_mock#producer-238 3: [0105_transactions_mock /187.115s] Starting transaction 3: [0105_transactions_mock /187.130s] rd_kafka_init_transactions(rk, 5000): duration 14.949ms 3: [0105_transactions_mock /187.130s] rd_kafka_begin_transaction(rk): duration 0.117ms 3: [0105_transactions_mock /187.130s] Switching to coordinator 2 3: [
/195.531s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /187.633s] rd_kafka_send_offsets_to_transaction( rk, offsets, cgmetadata, 20 * 1000): duration 503.711ms 3: [0105_transactions_mock /187.633s] Test config file test.conf not found 3: [0105_transactions_mock /187.633s] Produce to test [-1]: messages #0..10 3: [0105_transactions_mock /187.633s] SUM(POLL): duration 0.000ms 3: [0105_transactions_mock /187.633s] PRODUCE: duration 0.031ms 3: [0105_transactions_mock /187.639s] PRODUCE.DELIVERY.WAIT: duration 5.728ms 3: [0105_transactions_mock /187.639s] rd_kafka_commit_transaction(rk, -1): duration 0.210ms 3: [0105_transactions_mock /187.640s] [ do_test_txn_switch_coordinator_refresh:1433: Test switching coordinators (refresh): PASS (0.53s) ] 3: [0105_transactions_mock /187.640s] [ do_test_out_of_order_seq:2532 ] 3: [0105_transactions_mock /187.640s] Test config file test.conf not found 3: [0105_transactions_mock /187.640s] Setting test timeout to 60s * 2.7 3: %5|1673491249.960|MOCK|0105_transactions_mock#producer-239| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:45527,127.0.0.1:39811,127.0.0.1:38471 3: [0105_transactions_mock /187.645s] Created kafka instance 0105_transactions_mock#producer-239 3: [0105_transactions_mock /187.657s] rd_kafka_init_transactions(rk, -1): duration 11.224ms 3: [0105_transactions_mock /187.657s] rd_kafka_begin_transaction(rk): duration 0.111ms 3: [0105_transactions_mock /187.657s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.015ms 3: [0105_transactions_mock /187.657s] 0105_transactions_mock#producer-239: Flushing 1 messages 3: [
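The entries above trace librdkafka's transactional producer API end to end: rd_kafka_init_transactions(), rd_kafka_begin_transaction(), rd_kafka_send_offsets_to_transaction(), rd_kafka_producev() and rd_kafka_commit_transaction(). As a minimal sketch (not the test suite's own code), the same call sequence looks roughly like this; the "txn-demo" transactional.id, the broker address and the topic name are placeholders, and error handling is reduced to printing the error string:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Minimal transactional-producer sketch; "txn-demo", "localhost:9092" and
 * "mytopic" are placeholder values, not taken from the test suite. */
static int run_one_transaction(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        if (rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK ||
            rd_kafka_conf_set(conf, "transactional.id", "txn-demo",
                              errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
                fprintf(stderr, "conf: %s\n", errstr);
                return -1;
        }

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk) {
                fprintf(stderr, "producer: %s\n", errstr);
                return -1;
        }

        rd_kafka_error_t *error;

        /* Register the transactional.id and fence older instances (up to 5s). */
        if ((error = rd_kafka_init_transactions(rk, 5000)))
                goto fail;
        if ((error = rd_kafka_begin_transaction(rk)))
                goto fail;

        /* Produce one message inside the transaction. */
        rd_kafka_producev(rk,
                          RD_KAFKA_V_TOPIC("mytopic"),
                          RD_KAFKA_V_VALUE("hi", 2),
                          RD_KAFKA_V_END);

        /* Commit, waiting indefinitely (-1) as the test does; this also
         * flushes any outstanding messages. */
        if ((error = rd_kafka_commit_transaction(rk, -1)))
                goto fail;

        rd_kafka_destroy(rk);
        return 0;

fail:
        fprintf(stderr, "txn: %s\n", rd_kafka_error_string(error));
        rd_kafka_error_destroy(error);
        rd_kafka_destroy(rk);
        return -1;
}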
/196.531s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /188.643s] FLUSH: duration 986.192ms 3: [0105_transactions_mock /188.643s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.004ms 3: [0105_transactions_mock /188.643s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /188.643s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /188.643s] Sleeping.. 3: [
/197.531s] 1 test(s) running: 0105_transactions_mock 3: [
/198.531s] 1 test(s) running: 0105_transactions_mock 3: %3|1673491253.009|TXNERR|0105_transactions_mock#producer-239| [thrd:127.0.0.1:39811/bootstrap]: Current transaction failed in state InTransaction: skipped sequence numbers (OUT_OF_ORDER_SEQUENCE_NUMBER, requires epoch bump) 3: [
/199.532s] 1 test(s) running: 0105_transactions_mock 3: [
/200.532s] 1 test(s) running: 0105_transactions_mock 3: [
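The TXNERR entry above reports OUT_OF_ORDER_SEQUENCE_NUMBER, which puts the transaction into an abortable error state; the log that follows shows the expected recovery: commit_transaction() fails, the application aborts the transaction and retries the work in a new one. A hedged sketch of that commit-or-abort pattern, using librdkafka's error-classification helper (the helper name and the surrounding retry policy are illustrative, not the test's code):

/* If the commit fails with an abortable error (e.g. skipped sequence
 * numbers), abort the transaction so the caller can start a fresh one.
 * `rk` is assumed to be an initialized transactional producer. */
static rd_kafka_error_t *commit_or_abort(rd_kafka_t *rk) {
        rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1);
        if (!error)
                return NULL;                    /* committed */

        if (rd_kafka_error_txn_requires_abort(error)) {
                rd_kafka_error_destroy(error);
                return rd_kafka_abort_transaction(rk, -1);
        }

        return error;                           /* fatal or retriable: caller decides */
}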
/201.532s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /193.643s] produce() failed as expected: Local: Erroneous state 3: [0105_transactions_mock /193.643s] commit_transaction(-1): duration 0.178ms 3: [0105_transactions_mock /193.643s] commit_transaction() failed (expectedly): skipped sequence numbers 3: [0105_transactions_mock /193.644s] rd_kafka_abort_transaction(rk, -1): duration 0.122ms 3: [0105_transactions_mock /193.644s] rd_kafka_begin_transaction(rk): duration 0.049ms 3: [0105_transactions_mock /193.644s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = ("mytopic"); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)"mytopic", ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.005ms 3: [0105_transactions_mock /193.687s] rd_kafka_commit_transaction(rk, -1): duration 43.058ms 3: [0105_transactions_mock /193.687s] [ do_test_out_of_order_seq:2532: PASS (6.05s) ] 3: [0105_transactions_mock /193.687s] [ do_test_topic_disappears_for_awhile:2666 ] 3: [0105_transactions_mock /193.687s] Test config file test.conf not found 3: [0105_transactions_mock /193.687s] Setting test timeout to 60s * 2.7 3: %5|1673491256.007|MOCK|0105_transactions_mock#producer-240| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43379 3: [0105_transactions_mock /193.696s] Created kafka instance 0105_transactions_mock#producer-240 3: [0105_transactions_mock /193.716s] rd_kafka_init_transactions(rk, -1): duration 18.890ms 3: [0105_transactions_mock /193.716s] rd_kafka_begin_transaction(rk): duration 0.108ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.015ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t 
__attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } 
RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % 
partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.002ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t 
__attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 
__attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; 
}), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): 
duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] 
rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /193.716s] rd_kafka_producev( rk, ({ if (0) { const char * __t 
__attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [
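The very long rd_kafka_producev( rk, ({ if (0) { ... } RD_KAFKA_VTYPE_TOPIC; }), ...) entries above are the preprocessor expansion of librdkafka's type-checked vararg macros, logged verbatim by the test harness; in source the same call is written compactly with the RD_KAFKA_V_* macros. A sketch of the unexpanded form, with topic, cnt and partition_cnt standing in for the test's loop variables:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Unexpanded form of the producev() calls logged above; rk, topic, cnt and
 * partition_cnt mirror the test's variables and are passed in here. */
static void produce_one(rd_kafka_t *rk, const char *topic,
                        int cnt, int partition_cnt) {
        rd_kafka_resp_err_t err =
                rd_kafka_producev(rk,
                                  RD_KAFKA_V_TOPIC(topic),
                                  RD_KAFKA_V_PARTITION(cnt % partition_cnt),
                                  RD_KAFKA_V_VALUE("hi", 2),
                                  RD_KAFKA_V_END);
        if (err)
                fprintf(stderr, "producev: %s\n", rd_kafka_err2str(err));
}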
/202.532s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /194.731s] rd_kafka_commit_transaction(rk, -1): duration 1014.895ms 3: [0105_transactions_mock /194.731s] commit_transaction(-1): duration 1014.912ms 3: [0105_transactions_mock /194.731s] Marking topic as non-existent 3: %5|1673491257.050|PARTCNT|0105_transactions_mock#producer-240| [thrd:main]: Topic mytopic partition count changed from 10 to 0 3: [0105_transactions_mock /194.731s] rd_kafka_metadata(rk, 0, ((void *)0), &md, tmout_multip(5000)): duration 0.119ms 3: [
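The rd_kafka_metadata(rk, 0, ((void *)0), &md, ...) call in the entry above asks the (mock) cluster for metadata on the producer's already-known topics, which is how the test observes the partition count dropping from 10 to 0. A small sketch of such a query, assuming a live rd_kafka_t *rk; the 5-second timeout is a placeholder:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Query metadata for the topics the client already knows about
 * (all_topics=0) and print each topic's partition count. */
static void dump_partition_counts(rd_kafka_t *rk) {
        const struct rd_kafka_metadata *md;
        rd_kafka_resp_err_t err =
                rd_kafka_metadata(rk, 0 /* known topics only */, NULL,
                                  &md, 5000 /* ms */);
        if (err) {
                fprintf(stderr, "metadata: %s\n", rd_kafka_err2str(err));
                return;
        }

        for (int i = 0; i < md->topic_cnt; i++)
                printf("%s: %d partition(s)\n",
                       md->topics[i].topic, md->topics[i].partition_cnt);

        rd_kafka_metadata_destroy(md);
}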
/203.532s] 1 test(s) running: 0105_transactions_mock 3: [
/204.532s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /196.731s] Bringing topic back to life 3: [0105_transactions_mock /196.731s] rd_kafka_begin_transaction(rk): duration 0.039ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.005ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t 
__attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 
__attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; 
}), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): 
duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] 
rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t 
__attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } 
RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if 
(0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.000ms 3: [0105_transactions_mock /196.731s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (cnt % partition_cnt); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)cnt % partition_cnt, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.001ms 3: [
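The repeated entries above are the preprocessor-expanded form of a single rd_kafka_producev() call in the test: each ({ if (0) {...} RD_KAFKA_VTYPE_...; }) block is the type-checking wrapper generated by the RD_KAFKA_V_*() convenience macros. A minimal sketch of the unexpanded call, reusing the rk/topic/cnt/partition_cnt names visible in the expansion (the helper name produce_hi is illustrative, not from the test source):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Sketch: one small "hi" message per call, round-robined over the topic's
 * partitions, matching the expanded log entries above. */
static void produce_hi(rd_kafka_t *rk, const char *topic,
                       int cnt, int partition_cnt) {
        rd_kafka_resp_err_t err =
                rd_kafka_producev(rk,
                                  RD_KAFKA_V_TOPIC(topic),
                                  RD_KAFKA_V_PARTITION(cnt % partition_cnt),
                                  RD_KAFKA_V_VALUE("hi", 2),
                                  RD_KAFKA_V_END);
        if (err)
                fprintf(stderr, "producev failed: %s\n",
                        rd_kafka_err2str(err));
}
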
/205.532s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /197.733s] rd_kafka_commit_transaction(rk, -1): duration 1001.353ms 3: [0105_transactions_mock /197.733s] commit_transaction(-1): duration 1001.372ms 3: [0105_transactions_mock /197.733s] Verifying messages by consumption 3: [0105_transactions_mock /197.733s] Test config file test.conf not found 3: [0105_transactions_mock /197.733s] Created kafka instance 0105_transactions_mock#consumer-241 3: [0105_transactions_mock /197.733s] consume: consume exactly 122 messages 3: [
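The verification pass creates a plain consumer and reads back exactly 122 messages, treating the per-partition EOFs reported below as progress markers. A minimal sketch of such a loop, assuming a consumer handle configured with enable.partition.eof=true and already assigned to the topic (verify_consume is an illustrative name, not from the test source):

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Poll until the expected number of real messages has been seen; partition
 * EOF events are logged but not counted as messages. */
static int verify_consume(rd_kafka_t *consumer, int expected) {
        int seen = 0;
        while (seen < expected) {
                rd_kafka_message_t *m = rd_kafka_consumer_poll(consumer, 1000);
                if (!m)
                        continue;                    /* poll timeout */
                if (m->err == RD_KAFKA_RESP_ERR__PARTITION_EOF)
                        printf("[%d] reached EOF at offset %lld\n",
                               (int)m->partition, (long long)m->offset);
                else if (m->err)
                        fprintf(stderr, "consume error: %s\n",
                                rd_kafka_message_errstr(m));
                else
                        seen++;
                rd_kafka_message_destroy(m);
        }
        return seen;
}
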
/206.532s] 1 test(s) running: 0105_transactions_mock 3: [
/207.532s] 1 test(s) running: 0105_transactions_mock 3: [
/208.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /200.880s] mytopic [5] reached EOF at offset 12 3: [0105_transactions_mock /200.880s] mytopic [6] reached EOF at offset 12 3: [0105_transactions_mock /200.880s] mytopic [7] reached EOF at offset 12 3: [0105_transactions_mock /200.880s] mytopic [8] reached EOF at offset 13 3: [0105_transactions_mock /200.880s] mytopic [9] reached EOF at offset 13 3: [0105_transactions_mock /200.880s] mytopic [2] reached EOF at offset 12 3: [0105_transactions_mock /200.880s] mytopic [3] reached EOF at offset 12 3: [0105_transactions_mock /200.880s] mytopic [4] reached EOF at offset 12 3: [
/209.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /201.383s] mytopic [0] reached EOF at offset 12 3: [0105_transactions_mock /201.383s] mytopic [1] reached EOF at offset 12 3: [0105_transactions_mock /201.383s] CONSUME: duration 3649.884ms 3: [0105_transactions_mock /201.383s] consume: consumed 122/122 messages (10/10 EOFs) 3: [0105_transactions_mock /201.384s] [ do_test_topic_disappears_for_awhile:2666: PASS (7.70s) ] 3: [0105_transactions_mock /201.384s] [ do_test_disconnected_group_coord:2802: switch_coord=false ] 3: [0105_transactions_mock /201.385s] Test config file test.conf not found 3: [0105_transactions_mock /201.385s] Setting test timeout to 60s * 2.7 3: %5|1673491263.704|MOCK|0105_transactions_mock#producer-242| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:43413,127.0.0.1:39227,127.0.0.1:43695 3: [0105_transactions_mock /201.385s] Created kafka instance 0105_transactions_mock#producer-242 3: [0105_transactions_mock /201.405s] rd_kafka_init_transactions(rk, -1): duration 19.782ms 3: [0105_transactions_mock /201.405s] rd_kafka_begin_transaction(rk): duration 0.116ms 3: [0105_transactions_mock /201.405s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.019ms 3: [0105_transactions_mock /201.405s] 0105_transactions_mock#producer-242: Flushing 1 messages 3: [
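The calls timed here (rd_kafka_init_transactions, rd_kafka_begin_transaction, rd_kafka_producev, then a flush) are the standard transactional-producer sequence. A minimal sketch, assuming rk was created with a transactional.id set; run_one_txn is an illustrative name and error handling is reduced to printing:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* init -> begin -> produce -> flush -> commit, as exercised by the test. */
static void run_one_txn(rd_kafka_t *rk, const char *topic) {
        rd_kafka_error_t *err;

        if ((err = rd_kafka_init_transactions(rk, -1)))
                goto fail;
        if ((err = rd_kafka_begin_transaction(rk)))
                goto fail;

        rd_kafka_producev(rk,
                          RD_KAFKA_V_TOPIC(topic),
                          RD_KAFKA_V_PARTITION(0),
                          RD_KAFKA_V_VALUE("hi", 2),
                          RD_KAFKA_V_END);
        rd_kafka_flush(rk, 10 * 1000);      /* wait for delivery reports */

        if ((err = rd_kafka_commit_transaction(rk, -1)))
                goto fail;
        return;

fail:
        fprintf(stderr, "transaction failed: %s\n",
                rd_kafka_error_string(err));
        rd_kafka_error_destroy(err);
}
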
/210.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /202.388s] FLUSH: duration 981.435ms 3: [
/211.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /203.388s] Calling send_offsets_to_transaction() 3: %3|1673491265.708|FAIL|0105_transactions_mock#producer-242| [thrd:127.0.0.1:39227/bootstrap]: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT) 3: %3|1673491265.864|FAIL|0105_transactions_mock#producer-242| [thrd:127.0.0.1:39227/bootstrap]: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
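The connection-refused errors are intentional: do_test_disconnected_group_coord runs against librdkafka's built-in mock cluster and takes the group coordinator's broker down before calling send_offsets_to_transaction(), then brings it back up (the "Bringing up group coordinator 2" line below). A sketch of how such a setup is typically wired, assuming the test.mock.num.brokers property and the rdkafka_mock.h helpers from the librdkafka sources; broker id 2 is taken from the log and new_mock_txn_producer is an illustrative name:

#include <librdkafka/rdkafka.h>
#include <librdkafka/rdkafka_mock.h>   /* mock cluster control API */

/* Create a transactional producer backed by a 3-broker mock cluster and
 * simulate a coordinator outage, roughly as the test does. */
static rd_kafka_t *new_mock_txn_producer(void) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();

        rd_kafka_conf_set(conf, "test.mock.num.brokers", "3",
                          errstr, sizeof(errstr));
        rd_kafka_conf_set(conf, "transactional.id", "myTxnId",
                          errstr, sizeof(errstr));

        rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf,
                                      errstr, sizeof(errstr));
        if (!rk)
                return NULL;

        /* Take broker 2 down: subsequent requests to it fail with
         * "Connection refused" until it is brought back up. */
        rd_kafka_mock_cluster_t *mcluster = rd_kafka_handle_mock_cluster(rk);
        rd_kafka_mock_broker_set_down(mcluster, 2);
        /* ... run the transactional calls here, then restore it: */
        rd_kafka_mock_broker_set_up(mcluster, 2);

        return rk;
}
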
/212.533s] 1 test(s) running: 0105_transactions_mock 3: [
/213.533s] 1 test(s) running: 0105_transactions_mock 3: [
/214.533s] 1 test(s) running: 0105_transactions_mock 3: [
/214.699s] Bringing up group coordinator 2.. 3: [0105_transactions_mock /206.726s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 3337.993ms 3: [0105_transactions_mock /206.726s] send_offsets_to_transaction(-1): duration 3338.017ms 3: [0105_transactions_mock /206.726s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /206.726s] 0105_transactions_mock#producer-242 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /206.726s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /206.726s] 0105_transactions_mock#producer-242 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:39227/2: Connect to ipv4#127.0.0.1:39227 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /206.726s] rd_kafka_commit_transaction(rk, -1): duration 0.292ms 3: [0105_transactions_mock /206.726s] commit_transaction(-1): duration 0.300ms 3: [0105_transactions_mock /206.727s] [ do_test_disconnected_group_coord:2802: switch_coord=false: PASS (5.34s) ] 3: [0105_transactions_mock /206.727s] [ do_test_disconnected_group_coord:2802: switch_coord=true ] 3: [0105_transactions_mock /206.727s] Test config file test.conf not found 3: [0105_transactions_mock /206.727s] Setting test timeout to 60s * 2.7 3: %5|1673491269.046|MOCK|0105_transactions_mock#producer-243| [thrd:app]: Mock cluster enabled: original bootstrap.servers and security.protocol ignored and replaced with 127.0.0.1:36121,127.0.0.1:44067,127.0.0.1:38035 3: [0105_transactions_mock /206.731s] Created kafka instance 0105_transactions_mock#producer-243 3: %6|1673491269.051|FAIL|0105_transactions_mock#producer-243| [thrd:127.0.0.1:44067/bootstrap]: 127.0.0.1:44067/bootstrap: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 4ms in state UP) 3: %3|1673491269.171|FAIL|0105_transactions_mock#producer-243| [thrd:127.0.0.1:44067/bootstrap]: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT) 3: [
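rd_kafka_send_offsets_to_transaction() attaches the consumer's positions and group metadata to the open transaction; the ~3.3 s duration above is the call blocking until the coordinator is reachable again. A minimal sketch with placeholder topic/partition/offset values (send_consumed_offsets is an illustrative name):

#include <librdkafka/rdkafka.h>

/* Attach consumed offsets plus the consumer's group metadata to the
 * producer's ongoing transaction. Returns NULL on success. */
static rd_kafka_error_t *send_consumed_offsets(rd_kafka_t *producer,
                                               rd_kafka_t *consumer) {
        rd_kafka_topic_partition_list_t *offsets =
                rd_kafka_topic_partition_list_new(1);
        rd_kafka_topic_partition_list_add(offsets, "mytopic", 0)->offset = 12;

        rd_kafka_consumer_group_metadata_t *cgmetadata =
                rd_kafka_consumer_group_metadata(consumer);

        rd_kafka_error_t *error = rd_kafka_send_offsets_to_transaction(
                producer, offsets, cgmetadata, -1 /* timeout as in the log */);

        rd_kafka_consumer_group_metadata_destroy(cgmetadata);
        rd_kafka_topic_partition_list_destroy(offsets);
        return error;
}
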
/215.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /207.241s] rd_kafka_init_transactions(rk, -1): duration 509.479ms 3: [0105_transactions_mock /207.241s] rd_kafka_begin_transaction(rk): duration 0.119ms 3: [0105_transactions_mock /207.242s] rd_kafka_producev( rk, ({ if (0) { const char * __t __attribute__((unused)) = (topic); } RD_KAFKA_VTYPE_TOPIC; }), (const char *)topic, ({ if (0) { int32_t __t __attribute__((unused)) = (0); } RD_KAFKA_VTYPE_PARTITION; }), (int32_t)0, ({ if (0) { void * __t __attribute__((unused)) = ("hi"); size_t __t2 __attribute__((unused)) = (2); } RD_KAFKA_VTYPE_VALUE; }), (void *)"hi", (size_t)2, RD_KAFKA_VTYPE_END): duration 0.013ms 3: [0105_transactions_mock /207.242s] 0105_transactions_mock#producer-243: Flushing 3 messages 3: [0105_transactions_mock /207.242s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44067/bootstrap: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 4ms in state UP) 3: [0105_transactions_mock /207.242s] 0105_transactions_mock#producer-243 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44067/bootstrap: Disconnected: verify that security.protocol is correctly configured, broker might require SASL authentication (after 4ms in state UP) 3: [0105_transactions_mock /207.242s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /207.242s] 0105_transactions_mock#producer-243 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT) 3: [0105_transactions_mock /207.732s] FLUSH: duration 490.413ms 3: [
/216.533s] 1 test(s) running: 0105_transactions_mock 3: [0105_transactions_mock /208.732s] Calling send_offsets_to_transaction() 3: %3|1673491271.052|FAIL|0105_transactions_mock#producer-243| [thrd:127.0.0.1:44067/bootstrap]: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [
/217.534s] 1 test(s) running: 0105_transactions_mock 3: [
/218.534s] 1 test(s) running: 0105_transactions_mock 3: [
/219.534s] 1 test(s) running: 0105_transactions_mock 3: [
/220.044s] Switching group coordinator to 3 3: [0105_transactions_mock /211.872s] rd_kafka_send_offsets_to_transaction(rk, offsets, cgmetadata, -1): duration 3139.596ms 3: [0105_transactions_mock /211.872s] send_offsets_to_transaction(-1): duration 3139.613ms 3: [0105_transactions_mock /211.872s] Ignoring allowed error: _TRANSPORT: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /211.872s] 0105_transactions_mock#producer-243 rdkafka error (non-testfatal): Local: Broker transport failure: 127.0.0.1:44067/2: Connect to ipv4#127.0.0.1:44067 failed: Connection refused (after 0ms in state CONNECT, 1 identical error(s) suppressed) 3: [0105_transactions_mock /211.872s] rd_kafka_commit_transaction(rk, -1): duration 0.244ms 3: [0105_transactions_mock /211.872s] commit_transaction(-1): duration 0.250ms 3: [0105_transactions_mock /211.873s] [ do_test_disconnected_group_coord:2802: switch_coord=true: PASS (5.15s) ] 3: [0105_transactions_mock /211.873s] 0105_transactions_mock: duration 211872.531ms 3: [0105_transactions_mock /211.873s] ================= Test 0105_transactions_mock PASSED ================= 3: [
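The transport errors above are listed as allowed and the commit still succeeds once the coordinator has moved to broker 3. In application code the error returned by rd_kafka_commit_transaction() is usually classified before deciding what to do next; a sketch, not taken from the test source:

#include <stdio.h>
#include <librdkafka/rdkafka.h>

/* Commit the current transaction and classify a failure: abortable errors
 * require rd_kafka_abort_transaction(), retriable ones may simply be retried,
 * anything else should be treated as fatal. */
static void finish_txn(rd_kafka_t *rk) {
        rd_kafka_error_t *error = rd_kafka_commit_transaction(rk, -1);
        if (!error)
                return;                             /* committed */

        if (rd_kafka_error_txn_requires_abort(error)) {
                rd_kafka_error_destroy(error);
                error = rd_kafka_abort_transaction(rk, -1);
        } else if (rd_kafka_error_is_retriable(error)) {
                /* e.g. a timeout: the commit call may be retried */
        }

        if (error) {
                fprintf(stderr, "transaction error: %s\n",
                        rd_kafka_error_string(error));
                rd_kafka_error_destroy(error);
        }
}
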
/220.534s] ALL-TESTS: duration 220533.906ms 3: [
/220.534s] 10 thread(s) in use by librdkafka, waiting... 3: [
/221.534s] 10 thread(s) in use by librdkafka 3: [
/221.534s] TEST FAILURE 3: ### Test "
" failed at /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:1581:test_wait_exit() at Thu Jan 12 02:41:15 2023: ### 3: 10 thread(s) still active in librdkafka 3: test-runner: /usr/src/RPM/BUILD/librdkafka-1.9.2/tests/test.c:6629: test_fail0: Assertion `0' failed. 1/1 Test #3: RdKafkaTestBrokerLess ............Subprocess aborted***Exception: 221.54 sec 0% tests passed, 1 tests failed out of 1 Total Test time (real) = 221.55 sec The following tests FAILED: 3 - RdKafkaTestBrokerLess (Subprocess aborted) Errors while running CTest Output from these tests are in: /usr/src/RPM/BUILD/librdkafka-1.9.2/Testing/Temporary/LastTest.log Use "--rerun-failed --output-on-failure" to re-run the failed cases verbosely. error: Bad exit status from /usr/src/tmp/rpm-tmp.90595 (%check) RPM build errors: Bad exit status from /usr/src/tmp/rpm-tmp.90595 (%check) Command exited with non-zero status 1 142.54user 15.78system 4:50.65elapsed 54%CPU (0avgtext+0avgdata 786476maxresident)k 0inputs+0outputs (0major+4413117minor)pagefaults 0swaps hsh-rebuild: rebuild of `librdkafka-1.9.2-alt1.src.rpm' failed. Command exited with non-zero status 1 3.00user 1.83system 5:04.24elapsed 1%CPU (0avgtext+0avgdata 108512maxresident)k 88inputs+0outputs (32272major+163524minor)pagefaults 0swaps