<86>Nov 8 03:06:55 userdel[101345]: delete user 'rooter'
<86>Nov 8 03:06:55 userdel[101345]: removed group 'rooter' owned by 'rooter'
<86>Nov 8 03:06:55 userdel[101345]: removed shadow group 'rooter' owned by 'rooter'
<86>Nov 8 03:06:55 groupadd[101386]: group added to /etc/group: name=rooter, GID=570
<86>Nov 8 03:06:55 groupadd[101386]: group added to /etc/gshadow: name=rooter
<86>Nov 8 03:06:55 groupadd[101386]: new group: name=rooter, GID=570
<86>Nov 8 03:06:55 useradd[101420]: new user: name=rooter, UID=570, GID=570, home=/root, shell=/bin/bash
<86>Nov 8 03:06:55 userdel[101503]: delete user 'builder'
<86>Nov 8 03:06:55 userdel[101503]: removed group 'builder' owned by 'builder'
<86>Nov 8 03:06:55 userdel[101503]: removed shadow group 'builder' owned by 'builder'
<86>Nov 8 03:06:55 groupadd[101557]: group added to /etc/group: name=builder, GID=571
<86>Nov 8 03:06:55 groupadd[101557]: group added to /etc/gshadow: name=builder
<86>Nov 8 03:06:55 groupadd[101557]: new group: name=builder, GID=571
<86>Nov 8 03:06:55 useradd[101590]: new user: name=builder, UID=571, GID=571, home=/usr/src, shell=/bin/bash
<13>Nov 8 03:06:59 rpmi: libyaml2-0.2.1-alt1 sisyphus.214707.100 1539464409 installed
<13>Nov 8 03:06:59 rpmi: libverto-0.3.0-alt1_5 1525957716 installed
<13>Nov 8 03:06:59 rpmi: libruby-2.5.1-alt4 sisyphus.209945.120 1537061429 installed
<13>Nov 8 03:06:59 rpmi: libkeyutils-1.5.10-alt1 1489994069 installed
<13>Nov 8 03:06:59 rpmi: libgdbm-1.8.3-alt10 1454943313 installed
<13>Nov 8 03:06:59 rpmi: libcom_err-1.44.3-alt1 1532134713 installed
<13>Nov 8 03:06:59 rpmi: libtasn1-4.13-alt2 1521133848 installed
<13>Nov 8 03:06:59 rpmi: libp11-kit-0.23.9-alt5 1525798241 installed
<13>Nov 8 03:06:59 rpmi: rpm-macros-alternatives-0.4.5-alt1.1 1404382149 installed
<13>Nov 8 03:06:59 rpmi: alternatives-0.4.5-alt1.1 1404382149 installed
<13>Nov 8 03:06:59 rpmi: ca-certificates-2018.09.09-alt1 sisyphus.212781.100 1536518628 installed
<13>Nov 8 03:06:59 rpmi: ca-trust-0.1.1-alt2 1515595785 installed
<13>Nov 8 03:06:59 rpmi: p11-kit-trust-0.23.9-alt5 1525798241 installed
<13>Nov 8 03:06:59 rpmi: libcrypto1.1-1.1.0i-alt1 1535471288 installed
<13>Nov 8 03:06:59 rpmi: libcrypto10-1.0.2p-alt2 1535474143 installed
<13>Nov 8 03:06:59 rpmi: libssl1.1-1.1.0i-alt1 1535471288 installed
<86>Nov 8 03:06:59 groupadd[122150]: group added to /etc/group: name=_keytab, GID=499
<86>Nov 8 03:06:59 groupadd[122150]: group added to /etc/gshadow: name=_keytab
<86>Nov 8 03:06:59 groupadd[122150]: new group: name=_keytab, GID=499
<13>Nov 8 03:07:00 rpmi: libkrb5-1.16.2-alt1 sisyphus.216047.100 1541159108 installed
<13>Nov 8 03:07:00 rpmi: libssl10-1.0.2p-alt2 1535474143 installed
<13>Nov 8 03:07:00 rpmi: ruby-stdlibs-2.5.1-alt4 sisyphus.209945.120 1537061429 installed
<13>Nov 8 03:07:00 rpmi: ruby-2.5.1-alt4 sisyphus.209945.120 1537061429 installed
Building target platforms: x86_64
Building for target x86_64
Wrote: /usr/src/in/nosrpm/auto-nng-1.7-alt2_3.1.nosrc.rpm
Installing auto-nng-1.7-alt2_3.1.src.rpm
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.77552
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf auto-nng.v1.7
+ echo 'Source #0 (auto-nng.v1.7.tar.gz):'
Source #0 (auto-nng.v1.7.tar.gz):
+ /bin/gzip -dc /usr/src/RPM/SOURCES/auto-nng.v1.7.tar.gz
+ /bin/tar -xf -
+ cd auto-nng.v1.7
+ /bin/chmod -c -Rf u+rwX,go-w .
+ echo 'Patch #0 (auto-nng-cflags.patch):'
Patch #0 (auto-nng-cflags.patch):
+ /usr/bin/patch -p1 -b --suffix .cflags
patching file Makefile
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.60861
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd auto-nng.v1.7
+ make -j8 'CFLAGS=-pipe -frecord-gcc-switches -Wall -g -O2'
make: Entering directory '/usr/src/RPM/BUILD/auto-nng.v1.7'
cc -pipe -frecord-gcc-switches -Wall -g -O2 -o auto-nng auto-nng.c -lm
auto-nng.c: In function 'main_generate':
auto-nng.c:681:54: warning: 'continuous_input_stddevs' may be used uninitialized in this function [-Wmaybe-uninitialized]
   for (i = 0; i < continuous_input_count; i++) printf("%f, %f\n", continuous_input_averages[i], continuous_input_stddevs[i]);
auto-nng.c:681:54: warning: 'continuous_input_averages' may be used uninitialized in this function [-Wmaybe-uninitialized]
auto-nng.c:686:11: warning: 'continuous_output_stddevs' may be used uninitialized in this function [-Wmaybe-uninitialized]
   printf(DOUBLE_FORMAT ", " DOUBLE_FORMAT "\n", continuous_output_averages[i], continuous_output_stddevs[i]);
auto-nng.c:686:11: warning: 'continuous_output_averages' may be used uninitialized in this function [-Wmaybe-uninitialized]
auto-nng.c: In function 'main_run':
auto-nng.c:771:11: warning: 'continuous_input_stats' may be used uninitialized in this function [-Wmaybe-uninitialized]
   double *continuous_input_stats;
auto-nng.c:910:107: warning: 'continuous_output_stats' may be used uninitialized in this function [-Wmaybe-uninitialized]
   printf(DOUBLE_FORMAT, read_indicators(low_indicator, high_indicator) * continuous_output_stats[2*j+1] + continuous_output_stats[2*j]);
auto-nng.c:904:37: warning: 'continuous_output_cols' may be used uninitialized in this function [-Wmaybe-uninitialized]
   if (continuous_output_cols[j] == i) {
auto-nng.c:897:33: warning: 'binary_output_cols' may be used uninitialized in this function [-Wmaybe-uninitialized]
   if (binary_output_cols[j] == i) {
auto-nng.c:770:8: warning: 'continuous_input_cols' may be used uninitialized in this function [-Wmaybe-uninitialized]
   int *continuous_input_cols;
auto-nng.c:872:91: warning: 'binary_input_cols' may be used uninitialized in this function [-Wmaybe-uninitialized]
   for (i = 0; i < binary_input_count; i++) input_values[i] = values[binary_input_cols[i]];
make: Leaving directory '/usr/src/RPM/BUILD/auto-nng.v1.7'
+ exit 0
Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.71861
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ /bin/chmod -Rf u+rwX -- /usr/src/tmp/auto-nng-buildroot
+ :
+ /bin/rm -rf -- /usr/src/tmp/auto-nng-buildroot
+ cd auto-nng.v1.7
+ mkdir -p /usr/src/tmp/auto-nng-buildroot//usr/bin/
+ install auto-nng /usr/src/tmp/auto-nng-buildroot//usr/bin/
+ /usr/lib/rpm/brp-alt
Cleaning files in /usr/src/tmp/auto-nng-buildroot (auto)
Verifying and fixing files in /usr/src/tmp/auto-nng-buildroot (binconfig,pkgconfig,libtool,desktop)
Checking contents of files in /usr/src/tmp/auto-nng-buildroot/ (default)
Compressing files in /usr/src/tmp/auto-nng-buildroot (auto)
Verifying ELF objects in /usr/src/tmp/auto-nng-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal)
Hardlinking identical .pyc and .pyo files
Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.29024
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd auto-nng.v1.7
+ make test
make: Entering directory '/usr/src/RPM/BUILD/auto-nng.v1.7'
ruby auto-nng-test.rb
==================================================
auto-nng v1.7
Copyright (c) 2011 Public Software Group e. V.
This software is EXPERIMENTAL and comes with ABSOLUTELY NO WARRANTY.
web: http://www.public-software-group.org/
==================================================
Training error: 0.997376 / Test error: 0.995906 / Layer sizes: 10 9 4 1
Training error: 0.995848 / Test error: 0.993390 / Layer sizes: 10 10 10 1
Training error: 0.988063 / Test error: 0.981053 / Layer sizes: 10 10 10 1
Training error: 0.986098 / Test error: 0.977381 / Layer sizes: 10 3 1
Training error: 0.977457 / Test error: 0.961950 / Layer sizes: 10 3 1
Training error: 0.976865 / Test error: 0.960376 / Layer sizes: 10 3 1
Training error: 0.995821 / Test error: 0.942066 / Layer sizes: 10 1
Training error: 0.984575 / Test error: 0.941390 / Layer sizes: 10 1
Training error: 0.975069 / Test error: 0.923421 / Layer sizes: 10 1
Training error: 0.966163 / Test error: 0.893751 / Layer sizes: 10 1
Training error: 0.933531 / Test error: 0.853150 / Layer sizes: 10 1
Training error: 0.906831 / Test error: 0.831481 / Layer sizes: 10 1
Training error: 0.878420 / Test error: 0.800355 / Layer sizes: 10 1
Training error: 0.870938 / Test error: 0.777696 / Layer sizes: 10 1
Training error: 0.868628 / Test error: 0.765873 / Layer sizes: 10 1
Training error: 0.837267 / Test error: 0.717935 / Layer sizes: 10 1
Training error: 0.812836 / Test error: 0.701856 / Layer sizes: 10 1
Training error: 0.796331 / Test error: 0.678963 / Layer sizes: 10 1
Training error: 0.794859 / Test error: 0.676915 / Layer sizes: 10 1
Training error: 0.783285 / Test error: 0.671945 / Layer sizes: 10 1
Training error: 0.778018 / Test error: 0.656336 / Layer sizes: 10 1
Training error: 0.775469 / Test error: 0.648405 / Layer sizes: 10 1
Training error: 0.760373 / Test error: 0.644281 / Layer sizes: 10 1
Training error: 0.749452 / Test error: 0.623520 / Layer sizes: 10 1
Training error: 0.725254 / Test error: 0.612666 / Layer sizes: 10 1
Training error: 0.706121 / Test error: 0.586127 / Layer sizes: 10 1
Training error: 0.690000 / Test error: 0.585852 / Layer sizes: 10 1
Training error: 0.672578 / Test error: 0.553645 / Layer sizes: 10 1
Training error: 0.551807 / Test error: 0.550935 / Layer sizes: 10 1
Training error: 0.550429 / Test error: 0.528183 / Layer sizes: 10 1
Training error: 0.541710 / Test error: 0.512349 / Layer sizes: 10 1
Training error: 0.537878 / Test error: 0.491022 / Layer sizes: 10 1
Training error: 0.533795 / Test error: 0.470387 / Layer sizes: 10 1
Training error: 0.547923 / Test error: 0.455015 / Layer sizes: 10 9 4 1
Training error: 0.539699 / Test error: 0.451931 / Layer sizes: 10 9 4 1
Training error: 0.497645 / Test error: 0.444779 / Layer sizes: 10 9 4 1
Training error: 0.497509 / Test error: 0.438845 / Layer sizes: 10 9 4 1
Training error: 0.475107 / Test error: 0.423792 / Layer sizes: 10 9 4 1
Training error: 0.469702 / Test error: 0.418800 / Layer sizes: 10 9 4 1
Training error: 0.466840 / Test error: 0.404335 / Layer sizes: 10 9 4 1
Training error: 0.464629 / Test error: 0.398599 / Layer sizes: 10 9 4 1
Training error: 0.463302 / Test error: 0.368524 / Layer sizes: 10 9 4 1
Training error: 0.458891 / Test error: 0.357185 / Layer sizes: 10 1
Training error: 0.456776 / Test error: 0.354698 / Layer sizes: 10 1
Training error: 0.456149 / Test error: 0.346670 / Layer sizes: 10 1
Training error: 0.456022 / Test error: 0.339127 / Layer sizes: 10 1
Training error: 0.425314 / Test error: 0.331863 / Layer sizes: 10 10 1
Training error: 0.422908 / Test error: 0.328964 / Layer sizes: 10 10 1
Training error: 0.421545 / Test error: 0.326991 / Layer sizes: 10 10 1
Training error: 0.414492 / Test error: 0.326444 / Layer sizes: 10 10 1
Training error: 0.403189 / Test error: 0.323507 / Layer sizes: 10 10 1
Training error: 0.401171 / Test error: 0.316643 / Layer sizes: 10 10 1
Training error: 0.396827 / Test error: 0.300179 / Layer sizes: 10 10 1
Training error: 0.393403 / Test error: 0.286452 / Layer sizes: 10 10 1
Training error: 0.393016 / Test error: 0.279499 / Layer sizes: 10 10 1
Training error: 0.348292 / Test error: 0.274329 / Layer sizes: 10 10 10 1
Training error: 0.347494 / Test error: 0.270245 / Layer sizes: 10 10 10 1
Training error: 0.330355 / Test error: 0.269017 / Layer sizes: 10 10 10 1
Training error: 0.055094 / Test error: 0.261034 / Layer sizes: 10 7 1
Training error: 0.054846 / Test error: 0.248175 / Layer sizes: 10 7 1
Training error: 0.046698 / Test error: 0.240380 / Layer sizes: 10 7 1
Training error: 0.039233 / Test error: 0.237082 / Layer sizes: 10 7 1
Training error: 0.039216 / Test error: 0.234783 / Layer sizes: 10 7 1
Training error: 0.025913 / Test error: 0.229334 / Layer sizes: 10 7 1
Training error: 0.025894 / Test error: 0.220740 / Layer sizes: 10 7 1
Training error: 0.023350 / Test error: 0.211677 / Layer sizes: 10 7 1
Training error: 0.022902 / Test error: 0.190500 / Layer sizes: 10 7 1
Training error: 0.022868 / Test error: 0.185211 / Layer sizes: 10 7 1
Training error: 0.021305 / Test error: 0.184689 / Layer sizes: 10 7 1
Training error: 0.017560 / Test error: 0.172079 / Layer sizes: 10 7 1
Training error: 0.006318 / Test error: 0.165806 / Layer sizes: 10 7 1
Training error: 0.002862 / Test error: 0.165177 / Layer sizes: 10 7 1
Training error: 0.002819 / Test error: 0.162007 / Layer sizes: 10 7 1
Training error: 0.008041 / Test error: 0.159232 / Layer sizes: 10 3 3 1
Training error: 0.007244 / Test error: 0.143358 / Layer sizes: 10 3 3 1
Training error: 0.007174 / Test error: 0.129996 / Layer sizes: 10 3 3 1
Training error: 0.007048 / Test error: 0.118283 / Layer sizes: 10 3 3 1
Training error: 0.005957 / Test error: 0.117384 / Layer sizes: 10 3 3 1
Training error: 0.005883 / Test error: 0.115952 / Layer sizes: 10 3 3 1
Training error: 0.005845 / Test error: 0.107757 / Layer sizes: 10 3 3 1
Training error: 0.005732 / Test error: 0.097293 / Layer sizes: 10 3 3 1
Training error: 0.005655 / Test error: 0.089309 / Layer sizes: 10 3 3 1
Training error: 0.005573 / Test error: 0.086295 / Layer sizes: 10 3 3 1
Training error: 0.005429 / Test error: 0.084272 / Layer sizes: 10 3 3 1
Training error: 0.004952 / Test error: 0.078128 / Layer sizes: 10 3 3 1
Training error: 0.003472 / Test error: 0.065413 / Layer sizes: 10 3 3 1
Training error: 0.000584 / Test error: 0.041901 / Layer sizes: 10 3 3 1
Training error: 0.000580 / Test error: 0.041420 / Layer sizes: 10 3 3 1
Training error: 0.000577 / Test error: 0.041356 / Layer sizes: 10 3 3 1
Training error: 0.000577 / Test error: 0.041284 / Layer sizes: 10 3 3 1
Training error: 0.000447 / Test error: 0.041265 / Layer sizes: 10 3 3 1
Training error: 0.000267 / Test error: 0.041262 / Layer sizes: 10 3 3 1
Training error: 0.000240 / Test error: 0.041261 / Layer sizes: 10 3 3 1
Training error: 0.000237 / Test error: 0.041254 / Layer sizes: 10 3 3 1
Training error: 0.000236 / Test error: 0.041254 / Layer sizes: 10 3 3 1
Training error: 0.000236 / Test error: 0.041238 / Layer sizes: 10 3 3 1
Training error: 0.000172 / Test error: 0.041236 / Layer sizes: 10 3 3 1
Training error: 0.000170 / Test error: 0.041225 / Layer sizes: 10 3 3 1
Training error: 0.000160 / Test error: 0.041161 / Layer sizes: 10 3 3 1
Training error: 0.000011 / Test error: 0.040470 / Layer sizes: 10 3 3 1
Training error: 0.000011 / Test error: 0.000012 / Layer sizes: 10 3 3 1
Training error: 0.000011 / Test error: 0.000011 / Layer sizes: 10 3 3 1
Training error: 0.000010 / Test error: 0.000011 / Layer sizes: 10 3 3 1
Training error: 0.000010 / Test error: 0.000010 / Layer sizes: 10 3 3 1
Training error: 0.000010 / Test error: 0.000010 / Layer sizes: 10 3 3 1
Training error: 0.000009 / Test error: 0.000009 / Layer sizes: 10 3 3 1
Training error: 0.000009 / Test error: 0.000009 / Layer sizes: 10 3 3 1
Training error: 0.000009 / Test error: 0.000009 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 3 3 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Training error: 0.000000 / Test error: 0.000000 / Layer sizes: 10 10 10 1
Finishing...
==================================================
auto-nng v1.7
Copyright (c) 2011 Public Software Group e. V.
This software is EXPERIMENTAL and comes with ABSOLUTELY NO WARRANTY.
web: http://www.public-software-group.org/
==================================================
Loading neuronal network from file "test.network.nn".
Network loaded, processing data.
96.58 % correct.
3.42 % wrong.
Limit: 17.60 %.
Test passed.
make: Leaving directory '/usr/src/RPM/BUILD/auto-nng.v1.7'
+ exit 0
Processing files: auto-nng-1.7-alt2_3.1
Executing(%doc): /bin/sh -e /usr/src/tmp/rpm-tmp.69479
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd auto-nng.v1.7
+ DOCDIR=/usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ export DOCDIR
+ rm -rf /usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ /bin/mkdir -p /usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ cp -prL LICENSE README /usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ chmod -R go-w /usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ chmod -R a+rX /usr/src/tmp/auto-nng-buildroot/usr/share/doc/auto-nng-1.7
+ exit 0
Finding Provides (using /usr/lib/rpm/find-provides)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.5Sxcll
find-provides: running scripts (alternatives,debuginfo,lib,pam,perl,pkgconfig,python,shell)
Finding Requires (using /usr/lib/rpm/find-requires)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.jeN0nB
find-requires: running scripts (cpp,debuginfo,files,lib,pam,perl,pkgconfig,pkgconfiglib,python,rpmlib,shebang,shell,static,symlinks)
Requires: /lib64/ld-linux-x86-64.so.2, libc.so.6(GLIBC_2.14)(64bit), libc.so.6(GLIBC_2.2.5)(64bit), libc.so.6(GLIBC_2.3.4)(64bit), libc.so.6(GLIBC_2.4)(64bit), libm.so.6(GLIBC_2.2.5)(64bit), rtld(GNU_HASH)
Finding debuginfo files (using /usr/lib/rpm/find-debuginfo-files)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.dZKqfV
Creating auto-nng-debuginfo package
Processing files: auto-nng-debuginfo-1.7-alt2_3.1
Finding Provides (using /usr/lib/rpm/find-provides)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.FN7WIi
find-provides: running scripts (debuginfo)
Finding Requires (using /usr/lib/rpm/find-requires)
Executing: /bin/sh -e /usr/src/tmp/rpm-tmp.lRCS0J
find-requires: running scripts (debuginfo)
Requires: auto-nng = 1.7-alt2_3.1, /usr/lib/debug/lib64/ld-linux-x86-64.so.2.debug, debug64(libc.so.6), debug64(libm.so.6)
Wrote: /usr/src/RPM/RPMS/x86_64/auto-nng-1.7-alt2_3.1.x86_64.rpm
Wrote: /usr/src/RPM/RPMS/x86_64/auto-nng-debuginfo-1.7-alt2_3.1.x86_64.rpm
59.61user 0.39system 1:41.12elapsed 59%CPU (0avgtext+0avgdata 43972maxresident)k
0inputs+0outputs (0major+177291minor)pagefaults 0swaps
68.98user 2.70system 1:58.29elapsed 60%CPU (0avgtext+0avgdata 122960maxresident)k
0inputs+0outputs (0major+600433minor)pagefaults 0swaps
--- auto-nng-1.7-alt2_3.1.x86_64.rpm.repo	2013-04-03 05:20:55.000000000 +0000
+++ auto-nng-1.7-alt2_3.1.x86_64.rpm.hasher	2018-11-08 03:08:46.446859292 +0000
@@ -5,2 +5,3 @@
 Requires: /lib64/ld-linux-x86-64.so.2
+Requires: libc.so.6(GLIBC_2.14)(64bit)
 Requires: libc.so.6(GLIBC_2.2.5)(64bit)