<86>Jun 17 04:57:34 userdel[3216858]: delete user 'rooter'
<86>Jun 17 04:57:34 userdel[3216858]: removed group 'rooter' owned by 'rooter'
<86>Jun 17 04:57:34 userdel[3216858]: removed shadow group 'rooter' owned by 'rooter'
<86>Jun 17 04:57:34 groupadd[3216889]: group added to /etc/group: name=rooter, GID=1859
<86>Jun 17 04:57:34 groupadd[3216889]: group added to /etc/gshadow: name=rooter
<86>Jun 17 04:57:34 groupadd[3216889]: new group: name=rooter, GID=1859
<86>Jun 17 04:57:34 useradd[3216906]: new user: name=rooter, UID=1859, GID=1859, home=/root, shell=/bin/bash, from=none
<86>Jun 17 04:57:34 userdel[3216927]: delete user 'builder'
<86>Jun 17 04:57:34 userdel[3216927]: removed group 'builder' owned by 'builder'
<86>Jun 17 04:57:34 userdel[3216927]: removed shadow group 'builder' owned by 'builder'
<86>Jun 17 04:57:34 groupadd[3216949]: group added to /etc/group: name=builder, GID=1860
<86>Jun 17 04:57:34 groupadd[3216949]: group added to /etc/gshadow: name=builder
<86>Jun 17 04:57:34 groupadd[3216949]: new group: name=builder, GID=1860
<86>Jun 17 04:57:34 useradd[3216962]: new user: name=builder, UID=1860, GID=1860, home=/usr/src, shell=/bin/bash, from=none
<13>Jun 17 04:57:36 rpmi: libmpdec3-2.5.1-alt3 sisyphus+314490.500.5.1 1675432033 installed
<13>Jun 17 04:57:36 rpmi: libgdbm-1.8.3-alt10 sisyphus+278100.1600.1.1 1626059138 installed
<13>Jun 17 04:57:36 rpmi: libexpat-2.5.0-alt1 sisyphus+309227.100.1.1 1667075766 installed
<13>Jun 17 04:57:36 rpmi: libb2-0.98.1-alt1_1 sisyphus+291614.100.1.1 1638962878 installed
<13>Jun 17 04:57:36 rpmi: libp11-kit-0.24.1-alt1 sisyphus+293720.100.1.1 1642535281 installed
<13>Jun 17 04:57:36 rpmi: libtasn1-4.19.0-alt1 sisyphus+305700.100.1.1 1661359628 installed
<13>Jun 17 04:57:36 rpmi: rpm-macros-alternatives-0.5.2-alt2 sisyphus+315270.200.2.1 1676457367 installed
<13>Jun 17 04:57:36 rpmi: alternatives-0.5.2-alt2 sisyphus+315270.200.2.1 1676457367 installed
<13>Jun 17 04:57:36 rpmi: ca-certificates-2022.12.14-alt1 sisyphus+311754.200.1.1 1671046143 installed
<13>Jun 17 04:57:36 rpmi: ca-trust-0.1.4-alt1 sisyphus+308690.100.1.1 1666182992 installed
<13>Jun 17 04:57:36 rpmi: p11-kit-trust-0.24.1-alt1 sisyphus+293720.100.1.1 1642535281 installed
<13>Jun 17 04:57:36 rpmi: libcrypto1.1-1.1.1u-alt1 sisyphus+322200.100.1.1 1685474790 installed
<13>Jun 17 04:57:36 rpmi: libssl1.1-1.1.1u-alt1 sisyphus+322200.100.1.1 1685474790 installed
<13>Jun 17 04:57:36 rpmi: python3-3.11.0-alt1 sisyphus+311250.40.175.1 1685626775 installed
<13>Jun 17 04:57:37 rpmi: python3-base-3.11.0-alt1 sisyphus+311250.40.175.1 1685626775 installed
<13>Jun 17 04:57:37 rpmi: tests-for-installed-python3-pkgs-0.1.22-alt1 sisyphus+319076.100.3.1 1682536051 installed
<13>Jun 17 04:57:37 rpmi: rpm-build-python3-0.1.22-alt1 sisyphus+319076.100.3.1 1682536051 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
<13>Jun 17 04:57:39 rpmi: libjpeg-2:2.1.2-alt1.2 sisyphus+300827.100.2.1 1653916654 installed
<13>Jun 17 04:57:39 rpmi: libpng16-1.6.39-alt1 sisyphus+310490.100.1.1 1669195208 installed
<13>Jun 17 04:57:39 rpmi: libwayland-client-1.22.0-alt1.1 sisyphus+318010.100.1.1 1680606300 installed
<13>Jun 17 04:57:39 rpmi: python3-module-six-1.16.0-alt1 sisyphus+283489.100.2.1 1629527308 installed
<13>Jun 17 04:57:39 rpmi: libfribidi-1.0.13-alt1 sisyphus+320646.100.1.1 1684307083 installed
<13>Jun 17 04:57:39 rpmi: python3-module-attrs-23.1.0-alt1 sisyphus+321859.100.1.1 1685027490 installed
<13>Jun 17 04:57:39 rpmi: python3-module-idna-3.4-alt1 sisyphus+307942.100.1.1 1665051373 installed
<13>Jun 17 04:57:39 rpmi: libatk-2.48.3-alt1 sisyphus+322046.200.1.1 1685216221 installed
<13>Jun 17 04:57:39 rpmi: python3-module-zope.interface-5.4.0-alt2 sisyphus+311250.23000.175.1 1685634802 installed
<13>Jun 17 04:57:39 rpmi: python3-module-pkg_resources-1:67.8.0-alt1 sisyphus+321626.100.2.1 1684831109 installed
<13>Jun 17 04:57:39 rpmi: liblcms2-2.15-alt1 sisyphus+316039.100.1.1 1677666336 installed
<13>Jun 17 04:57:40 rpmi: libwebp7-1.3.0-alt1 sisyphus+313492.100.1.1 1673749692 installed
<13>Jun 17 04:57:40 rpmi: libogg-1.3.5-alt1 sisyphus+278100.3400.1.1 1626059695 installed
<13>Jun 17 04:57:40 rpmi: libusb-1.0.26-alt2 sisyphus+305525.100.1.1 1660924428 installed
<13>Jun 17 04:57:40 rpmi: libgomp1-12.2.1-alt3 sisyphus+322898.100.1.1 1686662557 installed
<13>Jun 17 04:57:40 rpmi: libgudev-1:237-alt1 sisyphus+282754.100.1.1 1629006690 installed
<13>Jun 17 04:57:40 rpmi: libopenjpeg2.0-2.5.0-alt1 sisyphus+299926.300.3.1 1652478844 installed
<13>Jun 17 04:57:40 rpmi: libwayland-egl-4:18.1.0-alt1.1 sisyphus+318010.100.1.1 1680606300 installed
<13>Jun 17 04:57:40 rpmi: python3-module-greenlet-2.0.1-alt1 sisyphus+311250.54100.178.1 1685787730 installed
<13>Jun 17 04:57:40 rpmi: libtcl-8.6.13-alt1 sisyphus+310696.100.1.1 1669548256 installed
<13>Jun 17 04:57:40 rpmi: libserd-0.30.10-alt1_3 sisyphus+288138.100.1.1 1635175832 installed
<13>Jun 17 04:57:40 rpmi: libsqlite3-3.42.0-alt1 sisyphus+321513.100.1.1 1684603061 installed
<13>Jun 17 04:57:40 rpmi: perl-HTTP-Date-6.05-alt1 sisyphus+258981.100.1.1 1601542386 installed
<13>Jun 17 04:57:40 rpmi: perl-XML-NamespaceSupport-1.12-alt1 1491296348 installed
<13>Jun 17 04:57:40 rpmi: libopus-1.4-alt1 sisyphus+319474.100.1.1 1682669971 installed
<13>Jun 17 04:57:40 rpmi: libgsm-1.0.17-alt1 sisyphus+275359.100.2.1 1624907612 installed
<13>Jun 17 04:57:40 rpmi: libidn2-2.3.4-alt1 sisyphus+309023.100.1.1 1666791089 installed
<13>Jun 17 04:57:40 rpmi: libnettle8-3.9.1-alt1 sisyphus+322548.100.1.2 1686176897 installed
<13>Jun 17 04:57:40 rpmi: libsord-0.16.8-alt1_2 sisyphus+286960.100.1.1 1634067443 installed
<13>Jun 17 04:57:40 rpmi: libvorbis-1.3.7-alt1 sisyphus+275738.100.1.1 1624751609 installed
<13>Jun 17 04:57:40 rpmi: libwayland-cursor-1.22.0-alt1.1 sisyphus+318010.100.1.1 1680606300 installed
<13>Jun 17 04:57:40 rpmi: libv4l-1.24.1-alt1 sisyphus+315985.200.2.1 1677588862 installed
<13>Jun 17 04:57:40 rpmi: python3-module-packaging-23.1-alt1 sisyphus+318906.100.2.1 1683015285 installed
<13>Jun 17 04:57:40 rpmi: python3-module-multidict-6.0.4-alt1 sisyphus+311250.32340.176.1 1685738354 installed
<13>Jun 17 04:57:40 rpmi: python3-module-frozenlist-1.3.3-alt1 sisyphus+311250.30540.176.1 1685737850 installed
<13>Jun 17 04:57:40 rpmi: python3-module-charset-normalizer-2.1.1-alt1 sisyphus+311047.100.1.1 1669992940 installed
<13>Jun 17 04:57:40 rpmi: python3-module-Pygments-2.15.1-alt1 sisyphus+321159.100.2.1 1684486730 installed
<13>Jun 17 04:57:40 rpmi: libnspr-1:4.35-alt1 sisyphus+308164.100.1.1 1665397042 installed
<13>Jun 17 04:57:40 rpmi: libgraphene-1.10.8-alt1 sisyphus+296855.100.1.1 1647633387 installed
<13>Jun 17 04:57:40 rpmi: libcares-1.19.0-alt2 sisyphus+318319.40.19.1 1681776144 installed
<13>Jun 17 04:57:40 rpmi: libzeromq-4.3.4-alt2 sisyphus+305424.100.1.1 1660736892 installed
<13>Jun 17 04:57:40 rpmi: libopenal1-1.22.2-alt1 sisyphus+303860.40.2.1 1658091313 installed
<13>Jun 17 04:57:40 rpmi: libxslt-1.1.37-alt1 sisyphus+307481.100.1.1 1664360525 installed
<13>Jun 17 04:57:40 rpmi: libxkbcommon-1.5.0-alt1 sisyphus+312911.100.1.1 1673035570 installed
<13>Jun 17 04:57:40 rpmi: libepoxy-1.5.10-alt1 sisyphus+296853.200.2.1 1647631868 installed
<13>Jun 17 04:57:40 rpmi: perl-LWP-MediaTypes-6.04-alt1 sisyphus+225468.100.1.1 1553186684 installed
<13>Jun 17 04:57:40 rpmi: perl-Compress-Raw-Zlib-2.204-alt1 sisyphus+314931.100.1.1 1675930919 installed
<13>Jun 17 04:57:40 rpmi: perl-libnet-1:3.15-alt1 sisyphus+317310.100.1.1 1679580208 installed
<13>Jun 17 04:57:40 rpmi: perl-XML-SAX-Base-1.09-alt1 1494364363 installed
<13>Jun 17 04:57:40 rpmi: libjack-1:1.9.22-alt1 sisyphus+316501.200.2.1 1678463952 installed
<13>Jun 17 04:57:40 rpmi: liblame-3.100-alt1 sisyphus+276241.100.1.2 1624925655 installed
<13>Jun 17 04:57:40 rpmi: libSDL2-2.26.5-alt1 sisyphus+318116.100.1.1 1680766841 installed
<13>Jun 17 04:57:40 rpmi: libdvdread8-6.1.3-alt1 sisyphus+302642.100.1.1 1656163149 installed
<13>Jun 17 04:57:40 rpmi: libbs2b-3.1.0-alt1.4 sisyphus+284589.100.1.1 1630750816 installed
<13>Jun 17 04:57:40 rpmi: libaom3-3.5.0-alt1 sisyphus+312085.100.1.1 1671551781 installed
<13>Jun 17 04:57:40 rpmi: libimath29-3.1.6-alt4 sisyphus+311250.106000.178.1 1685823482 installed
<13>Jun 17 04:57:40 rpmi: libiex30-3.1.5-alt2.1 sisyphus+316817.100.1.1 1678959009 installed
<13>Jun 17 04:57:40 rpmi: liborc-0.4.34-alt1 sisyphus+322149.200.2.1 1685451298 installed
<13>Jun 17 04:57:40 rpmi: libglvnd-7:1.6.0-alt2 sisyphus+321612.100.1.1 1684749008 installed
<13>Jun 17 04:57:40 rpmi: libwayland-server-1.22.0-alt1.1 sisyphus+318010.100.1.1 1680606300 installed
<13>Jun 17 04:57:40 rpmi: libkmod-30-alt1 sisyphus+307195.200.2.1 1663842346 installed
<13>Jun 17 04:57:40 rpmi: kmod-30-alt1 sisyphus+307195.200.2.1 1663842346 installed
<13>Jun 17 04:57:40 rpmi: libilmthread30-3.1.5-alt2.1 sisyphus+316817.100.1.1 1678959009 installed
<13>Jun 17 04:57:40 rpmi: libopenexr30-3.1.5-alt2.1 sisyphus+316817.100.1.1 1678959009 installed
<13>Jun 17 04:57:40 rpmi: libdvdnav-6.1.1-alt1 sisyphus+279367.100.1.1 1626478897 installed
<13>Jun 17 04:57:40 rpmi: libnss-3.90.0-alt2 sisyphus+323138.200.2.1 1686907446 installed
<13>Jun 17 04:57:40 rpmi: python3-module-alabaster-0.7.6-alt4 sisyphus+281697.200.1.1 1627919931 installed
<13>Jun 17 04:57:40 rpmi: python3-module-aiosignal-1.3.1-alt1 sisyphus+314057.100.1.1 1674561191 installed
<13>Jun 17 04:57:40 rpmi: python3-module-yarl-1.9.2-alt1 sisyphus+311250.34340.176.1 1685738974 installed
<13>Jun 17 04:57:40 rpmi: libsratom-0.6.6-alt1_1 sisyphus+278712.200.4.2 1626176350 installed
<13>Jun 17 04:57:40 rpmi: liblilv-0.24.12-alt1_3 sisyphus+295914.100.1.1 1645726271 installed
<13>Jun 17 04:57:40 rpmi: libhogweed6-3.9.1-alt1 sisyphus+322548.100.1.2 1686176897 installed
<13>Jun 17 04:57:40 rpmi: libgnutls30-3.7.9-alt1 sisyphus+315353.100.1.1 1676639387 installed
<13>Jun 17 04:57:40 rpmi: libngtcp2-0.15.0-alt1 sisyphus+321126.100.1.1 1684414016 installed
<13>Jun 17 04:57:40 rpmi: perl-File-Listing-6.15-alt1 sisyphus+298606.100.1.1 1650203737 installed
<13>Jun 17 04:57:41 rpmi: tcl-8.6.13-alt1 sisyphus+310696.100.1.1 1669548256 installed
<13>Jun 17 04:57:41 rpmi: libsoundtouch-2.3.2-alt1 sisyphus+314961.100.1.1 1675954594 installed
<13>Jun 17 04:57:41 rpmi: libvidstab-1.1.0-alt2.1 sisyphus+279558.100.1.2 1626596086 installed
<13>Jun 17 04:57:41 rpmi: libhidapi-0.12.0-alt1_1 sisyphus+303213.100.1.1 1657034193 installed
<13>Jun 17 04:57:41 rpmi: python3-module-cython-hidapi-0.13.1-alt1 sisyphus+311250.66240.178.1 1685802734 installed
<13>Jun 17 04:57:41 rpmi: python3-module-serial-3.5-alt2 sisyphus+281995.100.1.1 1628172783 installed
<13>Jun 17 04:57:41 rpmi: libflac8-1.3.3.0.79.37d1-alt2 sisyphus+278100.1400.1.1 1626058888 installed
<13>Jun 17 04:57:41 rpmi: libtheora-2:1.1.1-alt6 sisyphus+277967.100.1.1 1625928124 installed
<13>Jun 17 04:57:41 rpmi: python3-module-idna_ssl-1.1.0-alt2 sisyphus+272418.100.1.1 1621876529 installed
<13>Jun 17 04:57:41 rpmi: python3-module-hyperlink-21.0.0-alt1.1 sisyphus+304836.100.1.1 1659710964 installed
<13>Jun 17 04:57:41 rpmi: python3-module-outcome-1.2.0-alt2 sisyphus+318894.1400.3.1 1682420792 installed
<13>Jun 17 04:57:41 rpmi: python3-module-genshi-0.7.7-alt1 sisyphus+311250.13200.175.1 1685632570 installed
<13>Jun 17 04:57:41 rpmi: libslang2-2.3.3-alt1 sisyphus+314492.100.1.1 1675240397 installed
<13>Jun 17 04:57:41 rpmi: python3-module-snowballstemmer-2.2.0-alt1 sisyphus+319215.100.1.1 1682346633 installed
<13>Jun 17 04:57:41 rpmi: python3-module-pluggy-1.0.0-alt1 sisyphus+284853.100.1.1 1631109373 installed
<13>Jun 17 04:57:41 rpmi: python3-module-markupsafe-1:2.1.2-alt1 sisyphus+311250.14740.175.1 1685632983 installed
<13>Jun 17 04:57:41 rpmi: python3-module-jinja2-3.1.2-alt1 sisyphus+303664.100.1.1 1657809843 installed
<13>Jun 17 04:57:41 rpmi: python3-module-iniconfig-2.0.0-alt1 sisyphus+314076.200.3.1 1674737275 installed
<13>Jun 17 04:57:41 rpmi: python3-module-imagesize-1.4.1-alt1 sisyphus+318084.100.1.1 1680697673 installed
<13>Jun 17 04:57:41 rpmi: python3-module-httptools-0.1.1-alt1 sisyphus+311250.31100.176.1 1685737908 installed
<13>Jun 17 04:57:41 rpmi: python3-module-webencodings-0.5.1-alt2 sisyphus+276020.100.1.1 1624812421 installed
<13>Jun 17 04:57:41 rpmi: python3-module-html5lib-1:1.1-alt1 sisyphus+278096.120.5.1 1626086978 installed
<13>Jun 17 04:57:41 rpmi: python3-module-lxml-4.9.2-alt2 sisyphus+311250.25230.176.1 1685735409 installed
<13>Jun 17 04:57:41 rpmi: python3-module-docutils-0.18.1-alt2 sisyphus+298475.100.1.1 1650019614 installed
<13>Jun 17 04:57:41 rpmi: python3-module-babel-1:2.12.1-alt1 sisyphus+317409.100.1.1 1679678193 installed
<13>Jun 17 04:57:41 rpmi: python3-module-click-8.1.3-alt2 sisyphus+318894.300.3.1 1682420471 installed
<13>Jun 17 04:57:41 rpmi: python3-module-incremental-22.10.0-alt1 sisyphus+312706.100.1.1 1672404273 installed
<13>Jun 17 04:57:41 rpmi: python3-module-constantly-15.1.0-alt6 sisyphus+284854.100.1.1 1631108193 installed
<13>Jun 17 04:57:41 rpmi: python3-module-typing_extensions-4.6.3-alt1 sisyphus+322816.100.1.1 1686499213 installed
<13>Jun 17 04:57:41 rpmi: python3-module-pygobject-2.28.6-alt13 sisyphus+311250.56700.178.1 1685789723 installed
<13>Jun 17 04:57:41 rpmi: python3-module-appdirs-1.4.4-alt1 sisyphus+267613.300.2.1 1620039159 installed
<13>Jun 17 04:57:41 rpmi: python3-module-certifi-2023.5.7-alt1 sisyphus+322622.100.1.1 1686217855 installed
<13>Jun 17 04:57:41 rpmi: python3-module-z3c-3.0.0-alt4 sisyphus+284857.200.1.1 1631109149 installed
<13>Jun 17 04:57:41 rpmi: python3-module-zc-1.0.0-alt7 sisyphus+284857.100.1.1 1631109117 installed
<13>Jun 17 04:57:41 rpmi: python3-module-zope-3.3.0-alt9 sisyphus+281937.200.4.1 1628175910 installed
<13>Jun 17 04:57:41 rpmi: python3-module-zope.event-4.6-alt1 sisyphus+321169.240.3.1 1684590441 installed
<13>Jun 17 04:57:41 rpmi: python3-module-pycparser-2.21-alt1.1 sisyphus+309935.7300.4.1 1668527005 installed
<13>Jun 17 04:57:41 rpmi: python3-module-cffi-1.15.1-alt2 sisyphus+311250.35200.176.1 1685739676 installed
<13>Jun 17 04:57:42 rpmi: python3-module-cryptography-40.0.2-alt1 sisyphus+311250.41604.176.1 1685741792 installed
<13>Jun 17 04:57:42 rpmi: python3-module-openssl-23.1.1-alt2 sisyphus+319053.1700.6.1 1682668601 installed
<13>Jun 17 04:57:42 rpmi: python3-module-urllib3-2:1.26.14-alt2 sisyphus+318352.100.1.1 1681194106 installed
<13>Jun 17 04:57:42 rpmi: python3-module-requests-2.31.0-alt1 sisyphus+321663.100.2.1 1684917021 installed
<13>Jun 17 04:57:42 rpmi: python3-module-pycares-4.1.2-alt1 sisyphus+311250.45300.178.1 1685783642 installed
<13>Jun 17 04:57:42 rpmi: python3-module-astor-0.8.1-alt1.1 sisyphus+315877.100.1.1 1677481862 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sortedcontainers-2.4.0-alt1 sisyphus+272042.100.1.1 1621262424 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sniffio-1.2.0-alt1 sisyphus+295017.1600.2.1 1644498020 installed
<13>Jun 17 04:57:42 rpmi: python3-module-exceptiongroup-1.1.1-alt3 sisyphus+323085.100.1.1 1686832149 installed
<13>Jun 17 04:57:42 rpmi: python3-module-async_generator-1.10-alt3 sisyphus+319053.1600.6.1 1682668582 installed
<13>Jun 17 04:57:42 rpmi: python3-module-trio-0.22.0-alt3 sisyphus+319053.2100.6.1 1682668663 installed
<13>Jun 17 04:57:42 rpmi: python3-module-dns-1:2.2.0-alt2 sisyphus+320065.60.1.1 1683366881 installed
<13>Jun 17 04:57:42 rpmi: python3-module-async-timeout-4.0.2-alt1 sisyphus+295017.1100.2.1 1644497909 installed
<13>Jun 17 04:57:42 rpmi: python3-module-openid-3.2.0-alt1 sisyphus+278049.100.2.1 1625998936 installed
<13>Jun 17 04:57:42 rpmi: python3-module-Cheetah-3.3.1-alt3 sisyphus+323086.100.1.2 1686870555 installed
<13>Jun 17 04:57:42 rpmi: python3-module-paste-3.5.0-alt1.1 sisyphus+309935.6500.4.1 1668526794 installed
<13>Jun 17 04:57:42 rpmi: python3-module-PasteDeploy-1:3.0.1-alt1 sisyphus+308592.100.1.1 1666070463 installed
<13>Jun 17 04:57:42 rpmi: python3-module-PasteScript-1:2.0.2-alt2 sisyphus+272468.100.1.1 1621939313 installed
<13>Jun 17 04:57:42 rpmi: python-sphinx-objects.inv-1:2.3.13.20230612-alt1 sisyphus+322875.100.1.1 1686615131 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-applehelp-1.0.4-alt1 sisyphus+315438.100.1.1 1676719519 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-devhelp-1.0.2-alt1 sisyphus+276003.100.2.2 1624879024 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-jquery-4.1-alt2 sisyphus+317619.100.1.1 1680000409 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-jsmath-1.0.1-alt1 sisyphus+276004.100.1.1 1624811634 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-htmlhelp-2.0.0-alt2 sisyphus+298571.100.1.1 1650103344 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-qthelp-1.0.3-alt2 sisyphus+304787.100.1.1 1659628584 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinxcontrib-serializinghtml-1.1.5-alt2 sisyphus+298572.100.1.1 1650104574 installed
<13>Jun 17 04:57:42 rpmi: python3-module-sphinx-1:6.1.3-alt2 sisyphus+311250.42010.176.1 1685741409 installed
<13>Jun 17 04:57:42 rpmi: libuv-1.45.0-alt1 sisyphus+322120.40.2.1 1685405027 installed
<13>Jun 17 04:57:42 rpmi: poppler-data-0.4.12-alt1 sisyphus+322151.100.1.1 1685438511 installed
<13>Jun 17 04:57:42 rpmi: libmaxminddb-1.7.1-alt1 sisyphus+310839.100.1.1 1669722011 installed
<13>Jun 17 04:57:43 rpmi: libicu73-1:7.3.2-alt1 sisyphus+323036.100.1.1 1686763401 installed
<13>Jun 17 04:57:43 rpmi: libev4-4.33-alt2 sisyphus+286828.100.2.3 1634005210 installed
<13>Jun 17 04:57:43 rpmi: python3-module-gevent-22.10.2-alt1 sisyphus+311250.61340.178.1 1685796502 installed
<13>Jun 17 04:57:43 rpmi: libcdio-2.1.0-alt1 sisyphus+275238.100.2.1 1624562774 installed
<13>Jun 17 04:57:43 rpmi: libcdio-paranoia-10.2.2.0.1-alt1 sisyphus+277999.100.1.3 1625972088 installed
<13>Jun 17 04:57:43 rpmi: iso-codes-4.15.0-alt1 sisyphus+319467.100.1.1 1682666958 installed
<13>Jun 17 04:57:43 rpmi: libwebrtc-0.3-alt2 sisyphus+277616.100.1.1 1625719136 installed
<13>Jun 17 04:57:43 rpmi: libvo-amrwbenc-0.1.3-alt1 sisyphus+275410.100.1.2 1624504622 installed
<13>Jun 17 04:57:43 rpmi: libvo-aacenc-0.1.3-alt2 sisyphus+285812.100.1.1 1632410892 installed
<13>Jun 17 04:57:44 rpmi: libsrtp2-2.5.0-alt1 sisyphus+315000.100.1.1 1676038240 installed
<13>Jun 17 04:57:44 rpmi: libsbc1-2.0-alt2 sisyphus+322708.100.1.1 1686327806 installed
<13>Jun 17 04:57:44 rpmi: libopenh264-2.3.1-alt1.1 sisyphus+311295.100.1.1 1670335733 installed
<13>Jun 17 04:57:44 rpmi: libtbb-2021.5.0-alt1 sisyphus+311250.24500.175.1 1685636419 installed
<13>Jun 17 04:57:44 rpmi: libprotobuf32-3.21.12-alt2 sisyphus+311250.102500.178.1 1685820882 installed
<13>Jun 17 04:57:44 rpmi: libglog-0.5.0-alt1 sisyphus+291409.100.1.1 1638655129 installed
<13>Jun 17 04:57:44 rpmi: librabbitmq-c4-0.13.0-alt1 sisyphus+314793.100.2.1 1678110227 installed
<13>Jun 17 04:57:44 rpmi: libgme-0.6.3-alt2 sisyphus+293048.100.1.1 1641452309 installed
<13>Jun 17 04:57:44 rpmi: libudfread-1.1.2-alt2 sisyphus+286325.100.1.1 1633263314 installed
<13>Jun 17 04:57:44 rpmi: vulkan-filesystem-1.3.250-alt1 sisyphus+321834.400.1.1 1685013995 installed
<13>Jun 17 04:57:44 rpmi: libvulkan1-1.3.250-alt1 sisyphus+321834.400.1.1 1685013995 installed
<13>Jun 17 04:57:44 rpmi: libsoxr-0.1.3-alt1.1 sisyphus+317908.100.1.1 1680415071 installed
<13>Jun 17 04:57:44 rpmi: libxvid-1.3.7-alt1 sisyphus+292833.100.1.1 1640949611 installed
<13>Jun 17 04:57:44 rpmi: libnuma-2.0.14-alt2 sisyphus+278485.100.1.1 1626104243 installed
<13>Jun 17 04:57:44 rpmi: libx265-199-3.5-alt1.1 sisyphus+277560.100.1.1 1625696944 installed
<13>Jun 17 04:57:44 rpmi: libx264-164-alt1.1 sisyphus+322210.100.1.1 1685494886 installed
<13>Jun 17 04:57:44 rpmi: libvpx6-1.11.0-alt2 sisyphus+294379.500.5.1 1644490615 installed
<13>Jun 17 04:57:44 rpmi: libtwolame-0.4.0-alt1 sisyphus+277783.100.1.1 1625798479 installed
<13>Jun 17 04:57:44 rpmi: libspeex-1.2-alt2 sisyphus+287335.100.1.1 1634381376 installed
<13>Jun 17 04:57:44 rpmi: libsnappy-1.1.7-alt1 sisyphus+276400.100.1.2 1625016400 installed
<13>Jun 17 04:57:44 rpmi: libopencore-amrwb0-0.1.6-alt1 sisyphus+307923.100.1.1 1665032485 installed
<13>Jun 17 04:57:44 rpmi: libopencore-amrnb0-0.1.6-alt1 sisyphus+307923.100.1.1 1665032485 installed
<13>Jun 17 04:57:44 rpmi: libdav1d5-0.9.2-alt2 sisyphus+319138.140.3.1 1682251164 installed
<13>Jun 17 04:57:44 rpmi: libcodec2-1.0.5-alt1 sisyphus+307919.100.1.1 1665032101 installed
<13>Jun 17 04:57:44 rpmi: libmodplug-0.8.9.0-alt1 sisyphus+275375.100.2.2 1624495419 installed
<13>Jun 17 04:57:44 rpmi: libdv-1.0.0-alt6 sisyphus+286805.100.2.1 1633968017 installed
<13>Jun 17 04:57:44 rpmi: libyajl-2.1.0-alt3 sisyphus+322972.100.1.1 1686724493 installed
<13>Jun 17 04:57:44 rpmi: libkate-0.4.1-alt1.6 sisyphus+282639.100.1.1 1628935992 installed
<13>Jun 17 04:57:44 rpmi: libdatrie-0.2.13-alt1_2 sisyphus+285649.100.1.1 1632260805 installed
<13>Jun 17 04:57:44 rpmi: libthai-0.1.29-alt1_1 sisyphus+292947.100.1.1 1641111918 installed
<13>Jun 17 04:57:44 rpmi: usbids-20230519-alt1 sisyphus+321221.100.1.1 1684538206 installed
<13>Jun 17 04:57:44 rpmi: pciids-20230526-alt1 sisyphus+321954.100.1.1 1685142491 installed
<13>Jun 17 04:57:44 rpmi: hwdata-0.370-alt1 sisyphus+319911.1.10.1 1684006762 installed
<13>Jun 17 04:57:44 rpmi: gtk+3-themes-incompatible-3.20-alt3 1461944560 installed
<13>Jun 17 04:57:44 rpmi: perl-Try-Tiny-0.31-alt1 sisyphus+290597.100.1.1 1637915507 installed
<13>Jun 17 04:57:44 rpmi: perl-IO-Socket-IP-0.41-alt1 sisyphus+259012.100.1.2 1601553446 installed
<13>Jun 17 04:57:44 rpmi: perl-Compress-Raw-Bzip2-2.204-alt1 sisyphus+314930.100.1.1 1675930902 installed
<13>Jun 17 04:57:44 rpmi: perl-IO-Compress-Brotli-2:0.004001-alt3 sisyphus+302124.100.1.1 1655283098 installed
<13>Jun 17 04:57:44 rpmi: perl-Clone-0.46-alt1 sisyphus+308850.100.1.3 1666451819 installed
<13>Jun 17 04:57:44 rpmi: perl-HTML-Tagset-3.20-alt2 1317725093 installed
<13>Jun 17 04:57:44 rpmi: perl-Term-ANSIColor-5.01-alt1 sisyphus+244783.100.1.2 1579747505 installed
<13>Jun 17 04:57:44 rpmi: perl-Data-Dump-1.25-alt1 sisyphus+276551.100.1.1 1625126880 installed
<13>Jun 17 04:57:44 rpmi: perl-Filter-1.64-alt1 sisyphus+305464.100.1.1 1660815328 installed
<13>Jun 17 04:57:44 rpmi: perl-Encode-3.19-alt1 sisyphus+304776.100.1.1 1659623414 installed
<13>Jun 17 04:57:44 rpmi: perl-URI-5.19-alt1 sisyphus+321559.100.1.1 1684663596 installed
<13>Jun 17 04:57:44 rpmi: perl-IO-Compress-2.204-alt1 sisyphus+315114.100.1.1 1676225908 installed
<13>Jun 17 04:57:44 rpmi: perl-Net-HTTP-6.22-alt1 sisyphus+294185.100.1.1 1643275428 installed
<13>Jun 17 04:57:44 rpmi: perl-HTML-Parser-3.81-alt1 sisyphus+314462.100.1.1 1675193797 installed
<13>Jun 17 04:57:44 rpmi: perl-WWW-RobotRules-6.02-alt1 1329756211 installed
<13>Jun 17 04:57:44 rpmi: perl-Encode-Locale-1.05-alt1 1444608613 installed
<13>Jun 17 04:57:44 rpmi: perl-IO-HTML-1.004-alt1 sisyphus+258983.100.1.1 1601542619 installed
<13>Jun 17 04:57:44 rpmi: perl-HTTP-Message-6.44-alt1 sisyphus+309106.100.1.1 1666913573 installed
<13>Jun 17 04:57:44 rpmi: perl-HTTP-Negotiate-6.01-alt1 1329760563 installed
<13>Jun 17 04:57:44 rpmi: perl-libwww-6.70-alt1 sisyphus+321571.100.1.1 1684677769 installed
<13>Jun 17 04:57:44 rpmi: perl-XML-LibXML-2.0208-alt3 sisyphus+319374.100.1.1 1682520584 installed
<13>Jun 17 04:57:44 rpmi: perl-XML-SAX-1.02-alt1 sisyphus+232322.100.1.1 1560758406 installed
<13>Jun 17 04:57:44 rpmi: perl-XML-Simple-2.25-alt2 sisyphus+257498.100.1.1 1599324034 installed
<13>Jun 17 04:57:44 rpmi: icon-naming-utils-0.8.90-alt1 sisyphus+276851.100.1.1 1625243947 installed
<13>Jun 17 04:57:44 rpmi: icon-theme-adwaita-44.0-alt1 sisyphus+317076.600.1.1 1679344664 installed
<13>Jun 17 04:57:44 rpmi: libdeflate-1.18-alt1 sisyphus+317484.100.1.1 1679768614 installed
<13>Jun 17 04:57:44 rpmi: libtiff5-4.4.0-alt4 sisyphus+322581.100.1.2 1686180093 installed
<13>Jun 17 04:57:44 rpmi: libgdk-pixbuf-locales-2.42.10-alt1 sisyphus+308991.100.1.1 1666721198 installed
<13>Jun 17 04:57:44 rpmi: libfreeaptx-0.1.1-alt1 sisyphus+282022.100.1.1 1628182684 installed
<13>Jun 17 04:57:44 rpmi: libasyncns-0.8-alt2.qa1 sisyphus+275091.100.1.1 1624402242 installed
<13>Jun 17 04:57:44 rpmi: liblash-1:0.5.4-alt1_51 sisyphus+318973.100.1.1 1682020747 installed
<13>Jun 17 04:57:44 rpmi: libmpg123-1.31.3-alt1 sisyphus+317036.100.1.1 1679313685 installed
<13>Jun 17 04:57:44 rpmi: libsndfile-1.1.0-alt1 sisyphus+306371.40.3.1 1662942490 installed
<13>Jun 17 04:57:44 rpmi: libinstpatch-1.1.6-alt1.1 sisyphus+279572.100.1.2 1626605112 installed
<13>Jun 17 04:57:44 rpmi: libsamplerate-0.2.2-alt1 sisyphus+284642.100.1.1 1630906257 installed
<13>Jun 17 04:57:44 rpmi: libfaad2-2.10.1-alt1 sisyphus+317706.100.1.1 1680093778 installed
<13>Jun 17 04:57:44 rpmi: libfaac0-1.28-alt2 sisyphus+275719.100.1.1 1624749514 installed
<13>Jun 17 04:57:45 rpmi: libdca0-0.0.5-alt4.qa1 sisyphus+275349.100.1.2 1624487409 installed
<13>Jun 17 04:57:45 rpmi: libraw1394-11-2.1.2-alt1 sisyphus+278262.100.1.1 1626081867 installed
<13>Jun 17 04:57:45 rpmi: libdc1394-22-2.2.5-alt1.1 sisyphus+311484.100.1.1 1670647148 installed
<13>Jun 17 04:57:45 rpmi: libssh2-1.10.0-alt1 sisyphus+289470.100.1.1 1636752294 installed
<13>Jun 17 04:57:45 rpmi: publicsuffix-list-dafsa-20230404-alt1 sisyphus+318117.100.1.1 1680769734 installed
<13>Jun 17 04:57:45 rpmi: libpsl-0.21.2-alt1 sisyphus+312536.100.1.1 1672131180 installed
<13>Jun 17 04:57:45 rpmi: libnghttp3-0.9.0-alt1 sisyphus+317166.100.1.1 1679409333 installed
<13>Jun 17 04:57:45 rpmi: libnghttp2-1.53.0-alt1 sisyphus+320325.100.1.1 1683872096 installed
<13>Jun 17 04:57:45 rpmi: openldap-common-2.6.4-alt1 sisyphus+321176.240.10.2 1684802269 installed
<13>Jun 17 04:57:45 rpmi: libntlm-1.5-alt1 sisyphus+278100.3300.1.1 1626059663 installed
<13>Jun 17 04:57:45 rpmi: libidn-1.37-alt2 sisyphus+300849.100.1.1 1653769693 installed
<13>Jun 17 04:57:45 rpmi: libverto-0.3.2-alt1_1 sisyphus+321176.2200.10.2 1684806164 installed
<13>Jun 17 04:57:45 rpmi: liblmdb-0.9.29-alt1.1 sisyphus+306630.100.1.1 1663072361 installed
<13>Jun 17 04:57:45 rpmi: libkeyutils-1.6.3-alt1 sisyphus+266061.100.1.1 1612919567 installed
<13>Jun 17 04:57:45 rpmi: libcom_err-1.46.4.0.5.4cda-alt1 sisyphus+283826.100.1.1 1629975361 installed
<86>Jun 17 04:57:45 groupadd[3272707]: group added to /etc/group: name=_keytab, GID=999
<86>Jun 17 04:57:45 groupadd[3272707]: group added to /etc/gshadow: name=_keytab
<86>Jun 17 04:57:45 groupadd[3272707]: new group: name=_keytab, GID=999
<13>Jun 17 04:57:45 rpmi: libkrb5-1.21-alt1 sisyphus+323024.100.1.1 1686749989 installed
<13>Jun 17 04:57:45 rpmi: libgsasl-1.8.0-alt3 sisyphus+275307.100.1.2 1624478553 installed
<86>Jun 17 04:57:45 groupadd[3273102]: group added to /etc/group: name=sasl, GID=998
<86>Jun 17 04:57:45 groupadd[3273102]: group added to /etc/gshadow: name=sasl
<86>Jun 17 04:57:45 groupadd[3273102]: new group: name=sasl, GID=998
<13>Jun 17 04:57:45 rpmi: libsasl2-3-2.1.27-alt2.2 sisyphus+306372.1000.8.1 1663097332 installed
<13>Jun 17 04:57:45 rpmi: libldap2-2.6.4-alt1 sisyphus+321176.240.10.2 1684803685 installed
<13>Jun 17 04:57:45 rpmi: libpq5-15.3-alt1 sisyphus+311250.7602.175.1 1685631767 installed
<13>Jun 17 04:57:45 rpmi: python3-module-psycopg2-2.9.5-alt1 sisyphus+311250.16300.175.1 1685633601 installed
<13>Jun 17 04:57:45 rpmi: python3-module-eventlet-0.33.3-alt2 sisyphus+318403.100.1.1 1681290339 installed
<13>Jun 17 04:57:45 rpmi: libneon-0.32.2-alt1 sisyphus+302406.100.1.1 1655812503 installed
<13>Jun 17 04:57:45 rpmi: libssh-0.9.6-alt1 sisyphus+284392.100.1.1 1630573058 installed
<13>Jun 17 04:57:45 rpmi: libfftw3-common-3.3.8-alt2 sisyphus+278100.1300.1.1 1626057222 installed
<13>Jun 17 04:57:45 rpmi: libfftw3-3.3.8-alt2 sisyphus+278100.1300.1.1 1626057647 installed
<13>Jun 17 04:57:45 rpmi: libchromaprint1-1.5.0-alt2 sisyphus+286097.100.1.1 1632900907 installed
<13>Jun 17 04:57:45 rpmi: librubberband-3.1.2-alt1 sisyphus+311187.100.1.1 1670229535 installed
<13>Jun 17 04:57:45 rpmi: libpixman-3:0.42.2-alt1 sisyphus+309549.100.1.1 1667649379 installed
<13>Jun 17 04:57:45 rpmi: libbrotlicommon-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Jun 17 04:57:45 rpmi: libbrotlidec-1.0.9-alt2 sisyphus+278430.100.1.2 1626213212 installed
<13>Jun 17 04:57:45 rpmi: libcurl-8.1.2-alt1 sisyphus+322142.100.1.1 1685433996 installed
<13>Jun 17 04:57:45 rpmi: libraptor2-2.0.15-alt1 sisyphus+275177.100.1.2 1624430744 installed
<13>Jun 17 04:57:45 rpmi: liblrdf-0.6.1-alt1 sisyphus+278205.100.1.1 1626077043 installed
<13>Jun 17 04:57:45 rpmi: libgraphite2-1.3.14-alt2.1 sisyphus+279571.100.1.2 1626605157 installed
<13>Jun 17 04:57:45 rpmi: libharfbuzz-7.0.1-alt1 sisyphus+318888.200.2.1 1682012447 installed
<13>Jun 17 04:57:45 rpmi: libfreetype-2.13.0-alt1 sisyphus+315092.100.1.1 1676198645 installed
<13>Jun 17 04:57:45 rpmi: libfontconfig1-2.14.2-alt7 sisyphus+319291.100.1.1 1682426301 installed
<13>Jun 17 04:57:45 rpmi: libass9-0.17.1-alt1 sisyphus+315891.100.1.1 1677487409 installed
<13>Jun 17 04:57:45 rpmi: libbluray-1.3.4-alt1 sisyphus+310888.100.1.1 1669800491 installed
<13>Jun 17 04:57:45 rpmi: libpoppler126-23.01.0-alt1 sisyphus+313777.100.1.1 1674212828 installed
<13>Jun 17 04:57:45 rpmi: libharfbuzz-gobject-7.0.1-alt1 sisyphus+318888.200.2.1 1682012447 installed
<13>Jun 17 04:57:45 rpmi: libxshmfence-1.3.2-alt1 sisyphus+311428.400.1.1 1670577529 installed
<13>Jun 17 04:57:45 rpmi: libpciaccess-1:0.17-alt1 sisyphus+308663.300.1.1 1666168262 installed
<13>Jun 17 04:57:45 rpmi: libdrm-1:2.4.115-alt1 sisyphus+315873.100.1.1 1677481746 installed
<13>Jun 17 04:57:45 rpmi: libgbm-4:23.0.4-alt1 sisyphus+322784.2300.12.3 1686809120 installed
<13>Jun 17 04:57:45 rpmi: libproxy-0.4.18-alt1 sisyphus+307602.100.1.1 1664480605 installed
<13>Jun 17 04:57:45 rpmi: libjson-c5-0.15-alt1.1 sisyphus+279547.100.1.2 1626594467 installed
<13>Jun 17 04:57:45 rpmi: libCharLS2-2.0.0-alt1_3 sisyphus+276376.100.1.1 1624988381 installed
<13>Jun 17 04:57:45 rpmi: gdcm-3.0.12-alt3 sisyphus+311250.25640.176.1 1685735892 installed
<13>Jun 17 04:57:45 rpmi: libdevmapper-1.02.193-alt1 sisyphus+317421.100.1.1 1679684422 installed
<13>Jun 17 04:57:45 rpmi: mount-2.38.1-alt1 sisyphus+308470.100.1.1 1665845352 installed
<13>Jun 17 04:57:45 rpmi: losetup-2.38.1-alt1 sisyphus+308470.100.1.1 1665845352 installed
<13>Jun 17 04:57:45 rpmi: lsblk-2.38.1-alt1 sisyphus+308470.100.1.1 1665845352 installed
<86>Jun 17 04:57:45 groupadd[3276870]: group added to /etc/group: name=tape, GID=997
<86>Jun 17 04:57:45 groupadd[3276870]: group added to /etc/gshadow: name=tape
<86>Jun 17 04:57:45 groupadd[3276870]: new group: name=tape, GID=997
<86>Jun 17 04:57:45 groupadd[3276888]: group added to /etc/group: name=dialout, GID=996
<86>Jun 17 04:57:45 groupadd[3276888]: group added to /etc/gshadow: name=dialout
<86>Jun 17 04:57:45 groupadd[3276888]: new group: name=dialout, GID=996
<86>Jun 17 04:57:45 groupadd[3276901]: group added to /etc/group: name=input, GID=995
<86>Jun 17 04:57:45 groupadd[3276901]: group added to /etc/gshadow: name=input
<86>Jun 17 04:57:45 groupadd[3276901]: new group: name=input, GID=995
<86>Jun 17 04:57:45 groupadd[3276931]: group added to /etc/group: name=video, GID=994
<86>Jun 17 04:57:45 groupadd[3276931]: group added to /etc/gshadow: name=video
<86>Jun 17 04:57:45 groupadd[3276931]: new group: name=video, GID=994
<86>Jun 17 04:57:45 groupadd[3276949]: group added to /etc/group: name=render, GID=993
<86>Jun 17 04:57:45 groupadd[3276949]: group added to /etc/gshadow: name=render
<86>Jun 17 04:57:45 groupadd[3276949]: new group: name=render, GID=993
<86>Jun 17 04:57:45 groupadd[3276976]: group added to /etc/group: name=sgx, GID=992
<86>Jun 17 04:57:45 groupadd[3276976]: group added to /etc/gshadow: name=sgx
<86>Jun 17 04:57:45 groupadd[3276976]: new group: name=sgx, GID=992
<13>Jun 17 04:57:45 rpmi: udev-1:252.7-alt1 sisyphus+316321.100.1.1 1678130459 installed
<13>Jun 17 04:57:46 rpmi: dmsetup-1.02.193-alt1 sisyphus+317421.100.1.1 1679684422 installed
<13>Jun 17 04:57:46 rpmi: desktop-file-utils-0.26-alt3 sisyphus+297027.100.1.1 1648023316 installed
<13>Jun 17 04:57:46 rpmi: shared-mime-info-2.2-alt1 sisyphus+297388.100.1.1 1648466617 installed
<13>Jun 17 04:57:46 rpmi: gsettings-desktop-schemas-data-44.0-alt1 sisyphus+317076.300.1.1 1679344430 installed
<13>Jun 17 04:57:46 rpmi: libgio-2.76.3-alt1 sisyphus+311250.25703.176.1 1685736268 installed
<13>Jun 17 04:57:46 rpmi: gsettings-desktop-schemas-44.0-alt1 sisyphus+317076.300.1.1 1679344454 installed
<13>Jun 17 04:57:46 rpmi: gobject-introspection-1.76.1-alt1 sisyphus+311250.42520.176.1 1685742148 installed
<13>Jun 17 04:57:46 rpmi: libgdk-pixbuf-2.42.10-alt1 sisyphus+308991.100.1.1 1666721201 installed
<13>Jun 17 04:57:46 rpmi: libgstreamer1.0-1.22.3-alt1 sisyphus+321216.100.1.1 1684520503 installed
<13>Jun 17 04:57:46 rpmi: libgdk-pixbuf-gir-2.42.10-alt1 sisyphus+308991.100.1.1 1666721201 installed
<13>Jun 17 04:57:46 rpmi: gobject-introspection-x11-1.76.1-alt1 sisyphus+311250.42520.176.1 1685742148 installed
<13>Jun 17 04:57:46 rpmi: libgstreamer1.0-gir-1.22.3-alt1 sisyphus+321216.100.1.1 1684520503 installed
<13>Jun 17 04:57:46 rpmi: gtk4-update-icon-cache-4.10.4-alt1 sisyphus+322490.40.2.3 1686184424 installed
<13>Jun 17 04:57:46 rpmi: libatk-gir-2.48.3-alt1 sisyphus+322046.200.1.1 1685216221 installed
<13>Jun 17 04:57:46 rpmi: libgraphene-gir-1.10.8-alt1 sisyphus+296855.100.1.1 1647633387 installed
<13>Jun 17 04:57:46 rpmi: gstreamer1.0-1.22.3-alt1 sisyphus+321216.100.1.1 1684520503 installed
<13>Jun 17 04:57:46 rpmi: libharfbuzz-gir-7.0.1-alt1 sisyphus+318888.200.2.1 1682012447 installed
<13>Jun 17 04:57:46 rpmi: libgudev-gir-1:237-alt1 sisyphus+282754.100.1.1 1629006690 installed
<13>Jun 17 04:57:46 rpmi: glib-networking-2.76.0-alt1 sisyphus+311250.1140.175.1 1685627498 installed
<13>Jun 17 04:57:46 rpmi: libsoup-2.74.3-alt1.1 sisyphus+318260.100.1.1 1680994690 installed
<13>Jun 17 04:57:46 rpmi: libgssdp1.2-1.4.0.1-alt1 sisyphus+285462.100.1.1 1632035584 installed
<13>Jun 17 04:57:46 rpmi: libgupnp1.2-1.4.3-alt1 sisyphus+293449.100.1.1 1642101427 installed
<13>Jun 17 04:57:46 rpmi: libgupnp-igd-1.2.0-alt1.2 sisyphus+285462.600.1.1 1632035951 installed
<13>Jun 17 04:57:46 rpmi: libnice-0.1.21-alt1 sisyphus+313049.100.1.1 1673271413 installed
<13>Jun 17 04:57:46 rpmi: libdconf-0.40.0-alt1 sisyphus+279299.100.1.2 1626495975 installed
<13>Jun 17 04:57:46 rpmi: libjson-glib-1.6.6-alt1 sisyphus+299768.200.2.1 1652264095 installed
<13>Jun 17 04:57:46 rpmi: libgusb-0.4.6-alt1 sisyphus+322791.100.1.1 1686422573 installed
<13>Jun 17 04:57:46 rpmi: libcolord-1.4.6-alt1 sisyphus+296000.100.1.1 1646050423 installed
<13>Jun 17 04:57:46 rpmi: libcloudproviders-0.3.1-alt1 sisyphus+278834.100.1.2 1626283455 installed
<13>Jun 17 04:57:46 rpmi: libX11-locales-3:1.8.6-alt1 sisyphus+323114.100.1.1 1686850054 installed
<13>Jun 17 04:57:46 rpmi: libXdmcp-1.1.4-alt1 sisyphus+311188.1000.1.1 1670233860 installed
<13>Jun 17 04:57:46 rpmi: libXau-1.0.11-alt1 sisyphus+311428.100.1.1 1670577440 installed
<13>Jun 17 04:57:46 rpmi: libxcb-1.15-alt1 sisyphus+299436.300.1.1 1651655490 installed
<13>Jun 17 04:57:46 rpmi: libX11-3:1.8.6-alt1 sisyphus+323114.100.1.1 1686850057 installed
<13>Jun 17 04:57:46 rpmi: libXext-1.3.5-alt1 sisyphus+309285.100.1.1 1667212413 installed
<13>Jun 17 04:57:46 rpmi: libXfixes-6.0.0-alt1 sisyphus+284644.300.1.1 1630910333 installed
<13>Jun 17 04:57:46 rpmi: libXrender-0.9.11-alt1 sisyphus+308841.100.1.1 1666436131 installed
<13>Jun 17 04:57:46 rpmi: libcairo-1:1.16.0-alt2 sisyphus+312186.100.1.1 1671693945 installed
<13>Jun 17 04:57:46 rpmi: libcairo-gobject-1:1.16.0-alt2 sisyphus+312186.100.1.1 1671693945 installed
<13>Jun 17 04:57:46 rpmi: libXrandr-1.5.3-alt1 sisyphus+310375.100.1.1 1669010698 installed
<13>Jun 17 04:57:46 rpmi: libXi-1.8-alt1 sisyphus+285490.200.1.1 1632124180 installed
<13>Jun 17 04:57:46 rpmi: libXcursor-1.2.1-alt1 sisyphus+297765.200.1.1 1649053934 installed
<13>Jun 17 04:57:46 rpmi: libXft-2.3.7-alt1 sisyphus+310164.100.1.1 1668680609 installed
<13>Jun 17 04:57:46 rpmi: libpango-1.50.14-alt1 sisyphus+316117.100.1.1 1677780554 installed
<13>Jun 17 04:57:46 rpmi: libpango-gir-1.50.14-alt1 sisyphus+316117.100.1.1 1677780554 installed
<13>Jun 17 04:57:46 rpmi: librsvg-1:2.56.1-alt1 sisyphus+322219.100.1.1 1685523353 installed
<13>Jun 17 04:57:46 rpmi: libtk-8.6.13-alt1 sisyphus+310696.200.1.1 1669548528 installed
<13>Jun 17 04:57:46 rpmi: tk-8.6.13-alt1 sisyphus+310696.200.1.1 1669548528 installed
<13>Jun 17 04:57:46 rpmi: libXinerama-1.1.5-alt1 sisyphus+309287.100.1.1 1667213209 installed
<13>Jun 17 04:57:46 rpmi: libXv-1.0.12-alt1 sisyphus+311188.600.1.1 1670233801 installed
<13>Jun 17 04:57:46 rpmi: libXdamage-1.1.6-alt1 sisyphus+311188.300.1.1 1670233713 installed
<13>Jun 17 04:57:46 rpmi: libzvbi-0.2.35-alt2 sisyphus+275416.100.1.3 1624507130 installed
<13>Jun 17 04:57:47 rpmi: tcl-tix-8.4.3-alt4 sisyphus+277292.300.2.1 1625442551 installed
<13>Jun 17 04:57:47 rpmi: python3-modules-tkinter-3.11.0-alt1 sisyphus+311250.40.175.1 1685626775 installed
<13>Jun 17 04:57:47 rpmi: libpoppler8-glib-23.01.0-alt1 sisyphus+313777.100.1.1 1674212828 installed
<13>Jun 17 04:57:47 rpmi: libpoppler-gir-23.01.0-alt1 sisyphus+313777.100.1.1 1674212828 installed
<13>Jun 17 04:57:47 rpmi: libXxf86vm-1.1.5-alt1 sisyphus+308663.1400.1.1 1666168534 installed
<13>Jun 17 04:57:47 rpmi: libGLX-mesa-4:23.0.4-alt1 sisyphus+322784.2300.12.3 1686809120 installed
<13>Jun 17 04:57:47 rpmi: libGLX-7:1.6.0-alt2 sisyphus+321612.100.1.1 1684749008 installed
<13>Jun 17 04:57:47 rpmi: libGL-7:1.6.0-alt2 sisyphus+321612.100.1.1 1684749008 installed
<13>Jun 17 04:57:47 rpmi: libva-2.18.0-alt1 sisyphus+321624.100.1.1 1684777615 installed
<13>Jun 17 04:57:47 rpmi: libEGL-mesa-4:23.0.4-alt1 sisyphus+322784.2300.12.3 1686809120 installed
<13>Jun 17 04:57:47 rpmi: libEGL-7:1.6.0-alt2 sisyphus+321612.100.1.1 1684749008 installed
<13>Jun 17 04:57:47 rpmi: libgst-plugins1.0-1.22.3-alt1 sisyphus+321216.200.1.1 1684520622 installed
<13>Jun 17 04:57:47 rpmi: libgst-plugins1.0-gir-1.22.3-alt1 sisyphus+321216.200.1.1 1684520622 installed
<13>Jun 17 04:57:47 rpmi: libvdpau-1:1.5-alt1 sisyphus+298034.100.1.1 1649336827 installed
<13>Jun 17 04:57:47 rpmi: libavutil56-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libswscale5-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libswresample3-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libavcodec58-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libavformat58-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libpostproc55-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libavresample4-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libavfilter7-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:47 rpmi: libXcomposite-0.4.6-alt1 sisyphus+311188.200.1.1 1670233684 installed
<13>Jun 17 04:57:47 rpmi: libcaca-0.99-alt23 sisyphus+309378.400.4.1 1667524216 installed
<13>Jun 17 04:57:47 rpmi: liblz4-1:1.9.4-alt1 sisyphus+309416.100.1.1 1667413000 installed
<13>Jun 17 04:57:47 rpmi: libsystemd-1:252.7-alt1 sisyphus+316321.100.1.1 1678130459 installed
<13>Jun 17 04:57:47 rpmi: libdbus-1.14.6-alt2 sisyphus+321145.100.1.1 1684402689 installed
<13>Jun 17 04:57:47 rpmi: libpulseaudio-16.1-alt1 sisyphus+321919.1100.6.2 1686174619 installed
<13>Jun 17 04:57:47 rpmi: libat-spi2-core-2.48.3-alt1 sisyphus+322046.200.1.1 1685216221 installed
<13>Jun 17 04:57:47 rpmi: at-spi2-atk-2.48.3-alt1 sisyphus+322046.200.1.1 1685216221 installed
<13>Jun 17 04:57:47 rpmi: dbus-tools-1.14.6-alt2 sisyphus+321145.100.1.1 1684402689 installed
<86>Jun 17 04:57:47 groupadd[3287981]: group added to /etc/group: name=messagebus, GID=991
<86>Jun 17 04:57:47 groupadd[3287981]: group added to /etc/gshadow: name=messagebus
<86>Jun 17 04:57:47 groupadd[3287981]: new group: name=messagebus, GID=991
<86>Jun 17 04:57:47 useradd[3288013]: new user: name=messagebus, UID=999, GID=991, home=/run/dbus, shell=/dev/null, from=none
<13>Jun 17 04:57:47 rpmi: dbus-1.14.6-alt2 sisyphus+321145.100.1.1 1684402689 installed
<13>Jun 17 04:57:47 rpmi: dconf-0.40.0-alt1 sisyphus+279299.100.1.2 1626495975 installed
<13>Jun 17 04:57:47 rpmi: libgtk+3-schemas-3.24.38-alt1 sisyphus+321614.100.1.1 1684749819 installed
<13>Jun 17 04:57:47 rpmi: libavahi-0.8-alt2 sisyphus+321176.300.10.2 1684803998 installed
<13>Jun 17 04:57:47 rpmi: libcups-2.4.2-alt3 sisyphus+322076.100.1.1 1685347724 installed
<13>Jun 17 04:57:47 rpmi: libzbar-0.23.92-alt4 sisyphus+311250.100600.178.1 1685818158 installed
<13>Jun 17 04:57:47 rpmi: libpolkit-0.120-alt1.qa2 sisyphus+296007.100.1.1 1646053433 installed
<86>Jun 17 04:57:47 groupadd[3288521]: group added to /etc/group: name=colord, GID=990
<86>Jun 17 04:57:47 groupadd[3288521]: group added to /etc/gshadow: name=colord
<86>Jun 17 04:57:47 groupadd[3288521]: new group: name=colord, GID=990
<86>Jun 17 04:57:47 useradd[3288551]: new user: name=colord, UID=998, GID=990, home=/var/colord, shell=/dev/null, from=none
<13>Jun 17 04:57:48 rpmi: colord-1.4.6-alt1 sisyphus+296000.100.1.1 1646050423 installed
<13>Jun 17 04:57:48 rpmi: libgtk+3-3.24.38-alt1 sisyphus+321614.100.1.1 1684749848 installed
<13>Jun 17 04:57:48 rpmi: libgtk+3-gir-3.24.38-alt1 sisyphus+321614.100.1.1 1684749848 installed
<13>Jun 17 04:57:48 rpmi: python3-module-pygobject3-3.44.1-alt1 sisyphus+311250.46320.178.1 1685784221 installed
<13>Jun 17 04:57:49 rpmi: libopencv4.5-1:4.5.5-alt3 sisyphus+311250.63040.178.1 1685800682 installed
<13>Jun 17 04:57:49 rpmi: libgoocanvas3-3.0.0-alt1 sisyphus+264776.100.1.1 1610808975 installed
<13>Jun 17 04:57:49 rpmi: libgoocanvas3-gir-3.0.0-alt1 sisyphus+264776.100.1.1 1610808975 installed
<13>Jun 17 04:57:49 rpmi: alsa-ucm-conf-1.2.8-alt7 sisyphus+323155.100.1.1 1686925085 installed
<13>Jun 17 04:57:49 rpmi: alsa-topology-conf-1.2.5.1-alt1 sisyphus+274777.100.1.1 1624089141 installed
<13>Jun 17 04:57:49 rpmi: libalsa-1:1.2.8-alt1 sisyphus+310349.200.1.1 1668964322 installed
<13>Jun 17 04:57:49 rpmi: libfluidsynth-2.3.3-alt1 sisyphus+323083.200.1.2 1686858226 installed
<13>Jun 17 04:57:49 rpmi: libSDL-1.2.14-alt10 sisyphus+284852.100.1.1 1631107844 installed
<13>Jun 17 04:57:49 rpmi: libmjpegtools2.1-2.2.1-alt1 sisyphus+286352.100.1.1 1633334595 installed
<13>Jun 17 04:57:49 rpmi: gst-plugins-bad1.0-1.22.3-alt1 sisyphus+321216.400.1.1 1684520902 installed
<13>Jun 17 04:57:49 rpmi: libavdevice58-2:4.4.3-alt2 sisyphus+314308.100.1.1 1674910686 installed
<13>Jun 17 04:57:50 rpmi: libgtk4-4.10.4-alt1 sisyphus+322490.40.2.3 1686184424 installed
<13>Jun 17 04:57:50 rpmi: libgtk4-gir-4.10.4-alt1 sisyphus+322490.40.2.3 1686184424 installed
<13>Jun 17 04:57:50 rpmi: libvte3-0.72.2-alt1 sisyphus+322338.100.1.1 1685709125 installed
<13>Jun 17 04:57:50 rpmi: libvte3-gir-0.72.2-alt1 sisyphus+322338.100.1.1 1685709125 installed
<13>Jun 17 04:57:50 rpmi: python3-module-pygobject3-pygtkcompat-3.44.1-alt1 sisyphus+311250.46320.178.1 1685784221 installed
<13>Jun 17 04:57:50 rpmi: python3-module-automat-22.10.0-alt1 sisyphus+322927.200.2.1 1686736914 installed
<13>Jun 17 04:57:50 rpmi: python3-module-twisted-logger-22.10.0-alt1.1 sisyphus+322927.100.2.1 1686736857 installed
<13>Jun 17 04:57:50 rpmi: python3-module-twisted-core-22.10.0-alt1.1 sisyphus+322927.100.2.1 1686736857 installed
<13>Jun 17 04:57:50 rpmi: python3-module-twisted-names-22.10.0-alt1.1 sisyphus+322927.100.2.1 1686736857 installed
<13>Jun 17 04:57:50 rpmi: python3-module-tornado-6.3.2-alt1 sisyphus+311250.21520.175.1 1685634507 installed
<13>Jun 17 04:57:50 rpmi: python3-module-gunicorn-20.1.0-alt2 sisyphus+297766.100.1.1 1649054912 installed
<13>Jun 17 04:57:50 rpmi: python3-module-aiohttp-3.8.4-alt1 sisyphus+311250.43420.178.1 1685782025 installed
<13>Jun 17 04:57:50 rpmi: python3-module-MaxMindDB-2.3.0-alt1 sisyphus+311250.44340.178.1 1685783359 installed
<13>Jun 17 04:57:50 rpmi: python3-module-mocket-3.11.1-alt1 sisyphus+320564.100.1.2 1684174216 installed
<13>Jun 17 04:57:50 rpmi: python3-module-pytest-7.3.2-alt1 sisyphus+322897.100.1.1 1686653468 installed
<13>Jun 17 04:57:50 rpmi: python3-module-http-parser-0.9.0-alt2 sisyphus+311250.40000.176.1 1685741115 installed
<13>Jun 17 04:57:50 rpmi: python3-module-decorator-4.4.2-alt2 sisyphus+280713.100.1.1 1627266028 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
Building target platforms: i586
Building for target i586
Wrote: /usr/src/in/nosrpm/python3-module-GeoIP2-4.7.0-alt1.nosrc.rpm (w1.gzdio)
<13>Jun 17 04:57:53 rpmi: libpython3-3.11.0-alt1 sisyphus+311250.40.175.1 1685626775 installed
<13>Jun 17 04:57:53 rpmi: libncurses-6.3.20220618-alt1 sisyphus+302449.100.1.1 1655835262 installed
<13>Jun 17 04:57:53 rpmi: libtinfo-devel-6.3.20220618-alt1 sisyphus+302449.100.1.1 1655835262 installed
<13>Jun 17 04:57:53 rpmi: libncurses-devel-6.3.20220618-alt1 sisyphus+302449.100.1.1 1655835262 installed
<13>Jun 17 04:57:54 rpmi: python3-dev-3.11.0-alt1 sisyphus+311250.40.175.1 1685626775 installed
<13>Jun 17 04:57:54 rpmi: python3-module-setuptools-1:67.8.0-alt1 sisyphus+321626.100.2.1 1684831109 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
Installing python3-module-GeoIP2-4.7.0-alt1.src.rpm
Building target platforms: i586
Building for target i586
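For orientation: the package being rebuilt below is MaxMind's GeoIP2 client library for Python. A minimal usage sketch of the database API the finished module exposes; the .mmdb path and the address are placeholders, since no database file ships with this package (note the "no files found matching 'tests/data/test-data/*.mmdb'" warning later in the build):

    import geoip2.database

    # Placeholder path: a GeoLite2/GeoIP2 database must be obtained separately.
    with geoip2.database.Reader("/var/lib/GeoIP/GeoLite2-City.mmdb") as reader:
        response = reader.city("203.0.113.1")  # look up one IP address
        print(response.country.iso_code, response.city.name)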
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.15137
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf python3-module-GeoIP2-4.7.0
+ echo 'Source #0 (python3-module-GeoIP2-4.7.0.tar):'
Source #0 (python3-module-GeoIP2-4.7.0.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/python3-module-GeoIP2-4.7.0.tar
+ cd python3-module-GeoIP2-4.7.0
+ /bin/chmod -c -Rf u+rwX,go-w .
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.15137
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd python3-module-GeoIP2-4.7.0
+ CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CFLAGS
+ CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CXXFLAGS
+ FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export FFLAGS
+ /usr/bin/python3 setup.py build
running build
running build_py
creating build
creating build/lib
creating build/lib/geoip2
copying geoip2/webservice.py -> build/lib/geoip2
copying geoip2/types.py -> build/lib/geoip2
copying geoip2/records.py -> build/lib/geoip2
copying geoip2/models.py -> build/lib/geoip2
copying geoip2/mixins.py -> build/lib/geoip2
copying geoip2/errors.py -> build/lib/geoip2
copying geoip2/database.py -> build/lib/geoip2
copying geoip2/__init__.py -> build/lib/geoip2
running egg_info
creating geoip2.egg-info
writing geoip2.egg-info/PKG-INFO
writing dependency_links to geoip2.egg-info/dependency_links.txt
writing requirements to geoip2.egg-info/requires.txt
writing top-level names to geoip2.egg-info/top_level.txt
writing manifest file 'geoip2.egg-info/SOURCES.txt'
reading manifest file 'geoip2.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'tests/data/test-data/*.mmdb'
warning: no directories found matching 'docs/html'
adding license file 'LICENSE'
writing manifest file 'geoip2.egg-info/SOURCES.txt'
copying geoip2/py.typed -> build/lib/geoip2
+ sphinx-build-3 -b html docs html
Running Sphinx v6.1.3
making output directory... done
WARNING: html_static_path entry '_static' does not exist
loading intersphinx inventory from http://docs.python.org/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'http://docs.python.org/objects.inv' not fetchable due to : HTTPConnectionPool(host='docs.python.org', port=80): Max retries exceeded with url: /objects.inv (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
building [mo]: targets for 0 po files that are out of date
writing output...
building [html]: targets for 1 source files that are out of date
updating environment: [new config] 1 added, 0 changed, 0 removed
reading sources... [100%] index
/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0/geoip2/records.py:docstring of geoip2.records.Traits.network:1: WARNING: duplicate object description of geoip2.records.Traits.network, other instance in index, use :noindex: for one of them
/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0/geoip2/errors.py:docstring of geoip2.errors.AddressNotFoundError.network:1: WARNING: duplicate object description of geoip2.errors.AddressNotFoundError.network, other instance in index, use :noindex: for one of them
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] index
generating indices... genindex py-modindex done
writing additional pages... search done
copying static files... done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded, 4 warnings.

The HTML pages are in html.
+ rm -rf html/.buildinfo html/.doctrees
+ exit 0
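The intersphinx fetch above fails because the build runs without network access; the failure is non-fatal, it only drops cross-references to the Python docs. A hedged sketch of how docs/conf.py could point intersphinx at a local inventory instead (Sphinx accepts a local objects.inv path as the second tuple element); the file path below is an assumption, not taken from this log, though the python-sphinx-objects.inv package installed earlier provides such an inventory:

    # Hypothetical docs/conf.py override for fully offline doc builds.
    intersphinx_mapping = {
        # Second element: local objects.inv instead of fetching over HTTP.
        # This path is an assumption for illustration only.
        "python": ("https://docs.python.org/3", "/usr/share/python-sphinx/objects.inv"),
    }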
Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.52279
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ /bin/chmod -Rf u+rwX -- /usr/src/tmp/python3-module-GeoIP2-buildroot
+ :
+ /bin/rm -rf -- /usr/src/tmp/python3-module-GeoIP2-buildroot
+ PATH=/usr/libexec/rpm-build:/usr/src/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/games
+ cd python3-module-GeoIP2-4.7.0
+ CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CFLAGS
+ CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CXXFLAGS
+ FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export FFLAGS
+ /usr/bin/python3 setup.py install --skip-build --root=/usr/src/tmp/python3-module-GeoIP2-buildroot --force
running install
/usr/lib/python3/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!

        ********************************************************************************
        Please avoid running ``setup.py`` directly.
        Instead, use pypa/build, pypa/installer or other
        standards-based tools.

        See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
        ********************************************************************************

!!
  self.initialize_options()
running install_lib
creating /usr/src/tmp/python3-module-GeoIP2-buildroot
creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr
creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib
creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3
creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages
creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/py.typed -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/__init__.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/database.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/errors.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/mixins.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/models.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/records.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/types.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
copying build/lib/geoip2/webservice.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py to __init__.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py to database.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py to errors.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py to mixins.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py to models.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py to records.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py to types.cpython-311.pyc
byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py to webservice.cpython-311.pyc
running install_egg_info
running egg_info
writing geoip2.egg-info/PKG-INFO
writing dependency_links to geoip2.egg-info/dependency_links.txt
writing requirements to geoip2.egg-info/requires.txt
writing top-level names to geoip2.egg-info/top_level.txt
reading manifest file 'geoip2.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'tests/data/test-data/*.mmdb'
warning: no directories found matching 'docs/html'
adding license file 'LICENSE'
writing manifest file 'geoip2.egg-info/SOURCES.txt'
Copying geoip2.egg-info to /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2-4.7.0-py3.11.egg-info
running install_scripts
+ /usr/lib/rpm/brp-alt
Cleaning files in /usr/src/tmp/python3-module-GeoIP2-buildroot (auto)
Verifying and fixing files in /usr/src/tmp/python3-module-GeoIP2-buildroot (binconfig,pkgconfig,libtool,desktop,gnuconfig)
Checking contents of files in /usr/src/tmp/python3-module-GeoIP2-buildroot/ (default)
Compressing files in /usr/src/tmp/python3-module-GeoIP2-buildroot (auto)
Adjusting library links in /usr/src/tmp/python3-module-GeoIP2-buildroot
./usr/lib: (from :0)
Verifying ELF objects in /usr/src/tmp/python3-module-GeoIP2-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal)
Bytecompiling python3 modules in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.pyc
unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.pyc
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py
Bytecompiling python3 modules with optimization in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3 -O
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py
Bytecompiling python3 modules with optimization-2 in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3 -OO
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py
compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py
Hardlinking identical .pyc and .opt-?.pyc files
'./usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-2.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-1.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.pyc'
'./usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.pyc'
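The three byte-compilation passes above correspond to CPython's optimization levels (plain, -O, -OO), after which identical outputs are hardlinked. A rough standard-library stand-in for what those passes do, not the actual brp-alt script:

    import compileall

    BUILDROOT_SITE = ("/usr/src/tmp/python3-module-GeoIP2-buildroot"
                      "/usr/lib/python3/site-packages")

    # One pass per optimization level, mirroring the python3 / python3 -O /
    # python3 -OO runs in the log: level 0 writes .pyc, 1 writes .opt-1.pyc,
    # 2 writes .opt-2.pyc under __pycache__.
    for level in (0, 1, 2):
        compileall.compile_dir(BUILDROOT_SITE, optimize=level, quiet=1)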
Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.52279
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd python3-module-GeoIP2-4.7.0
+ export PYTHONPATH=/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages
+ PYTHONPATH=/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages
+ py.test-3 --ignore tests/database_test.py
============================= test session starts ==============================
platform linux -- Python 3.11.0, pytest-7.3.2, pluggy-1.0.0
rootdir: /usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0
collected 61 items

tests/models_test.py .........                                           [ 14%]
tests/webservice_test.py ..........................FFFFFFF.FFFFFFFF..FFF [ 91%]
FFFFF                                                                    [100%]
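For reference, the %check step is reproducible outside of rpm. A rough in-process equivalent of the py.test-3 invocation above (a sketch, assuming the same buildroot path; the sys.path insertion stands in for the exported PYTHONPATH):

    import sys

    sys.path.insert(
        0,
        "/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages",
    )

    import pytest

    # Same selection as the command line above: skip the tests that need .mmdb files.
    raise SystemExit(pytest.main(["--ignore", "tests/database_test.py"]))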
=================================== FAILURES ===================================
________________________ TestAsyncClient.test_200_error ________________________

    async def _request(
        self,
        method: str,
        str_or_url: StrOrURL,
        *,
        params: Optional[Mapping[str, str]] = None,
        data: Any = None,
        json: Any = None,
        cookies: Optional[LooseCookies] = None,
        headers: Optional[LooseHeaders] = None,
        skip_auto_headers: Optional[Iterable[str]] = None,
        auth: Optional[BasicAuth] = None,
        allow_redirects: bool = True,
        max_redirects: int = 10,
        compress: Optional[str] = None,
        chunked: Optional[bool] = None,
        expect100: bool = False,
        raise_for_status: Optional[bool] = None,
        read_until_eof: bool = True,
        proxy: Optional[StrOrURL] = None,
        proxy_auth: Optional[BasicAuth] = None,
        timeout: Union[ClientTimeout, object] = sentinel,
        verify_ssl: Optional[bool] = None,
        fingerprint: Optional[bytes] = None,
        ssl_context: Optional[SSLContext] = None,
        ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None,
        proxy_headers: Optional[LooseHeaders] = None,
        trace_request_ctx: Optional[SimpleNamespace] = None,
        read_bufsize: Optional[int] = None,
    ) -> ClientResponse:
        # NOTE: timeout clamps existing connect and read timeouts. We cannot
        # set the default to None because we need to detect if the user wants
        # to use the existing timeouts by setting timeout to None.

        if self.closed:
            raise RuntimeError("Session is closed")

        ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint)

        if data is not None and json is not None:
            raise ValueError(
                "data and json parameters can not be used at the same time"
            )
        elif json is not None:
            data = payload.JsonPayload(json, dumps=self._json_serialize)

        if not isinstance(chunked, bool) and chunked is not None:
            warnings.warn("Chunk size is deprecated #1615", DeprecationWarning)

        redirects = 0
        history = []
        version = self._version

        # Merge with default headers and transform to CIMultiDict
        headers = self._prepare_headers(headers)
        proxy_headers = self._prepare_headers(proxy_headers)

        try:
            url = self._build_url(str_or_url)
        except ValueError as e:
            raise InvalidURL(str_or_url) from e

        skip_headers = set(self._skip_auto_headers)
        if skip_auto_headers is not None:
            for i in skip_auto_headers:
                skip_headers.add(istr(i))

        if proxy is not None:
            try:
                proxy = URL(proxy)
            except ValueError as e:
                raise InvalidURL(proxy) from e

        if timeout is sentinel:
            real_timeout: ClientTimeout = self._timeout
        else:
            if not isinstance(timeout, ClientTimeout):
                real_timeout = ClientTimeout(total=timeout)  # type: ignore[arg-type]
            else:
                real_timeout = timeout
        # timeout is cumulative for all request operations
        # (request, redirects, responses, data consuming)
        tm = TimeoutHandle(self._loop, real_timeout.total)
        handle = tm.start()

        if read_bufsize is None:
            read_bufsize = self._read_bufsize

        traces = [
            Trace(
                self,
                trace_config,
                trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx),
            )
            for trace_config in self._trace_configs
        ]

        for trace in traces:
            await trace.send_request_start(method, url.update_query(params), headers)

        timer = tm.timer()
        try:
            with timer:
                while True:
                    url, auth_from_url = strip_auth_from_url(url)
                    if auth and auth_from_url:
                        raise ValueError(
                            "Cannot combine AUTH argument with "
                            "credentials encoded in URL"
                        )

                    if auth is None:
                        auth = auth_from_url
                    if auth is None:
                        auth = self._default_auth
                    # It would be confusing if we support explicit
                    # Authorization header with auth argument
                    if (
                        headers is not None
                        and auth is not None
                        and hdrs.AUTHORIZATION in headers
                    ):
                        raise ValueError(
                            "Cannot combine AUTHORIZATION header "
                            "with AUTH argument or credentials "
                            "encoded in URL"
                        )

                    all_cookies = self._cookie_jar.filter_cookies(url)
                    if cookies is not None:
                        tmp_cookie_jar = CookieJar()
                        tmp_cookie_jar.update_cookies(cookies)
                        req_cookies = tmp_cookie_jar.filter_cookies(url)
                        if req_cookies:
                            all_cookies.load(req_cookies)

                    if proxy is not None:
                        proxy = URL(proxy)
                    elif self._trust_env:
                        with suppress(LookupError):
                            proxy, proxy_auth = get_env_proxy_for_url(url)

                    req = self._request_class(
                        method,
                        url,
                        params=params,
                        headers=headers,
                        skip_auto_headers=skip_headers,
                        data=data,
                        cookies=all_cookies,
                        auth=auth,
                        version=version,
                        compress=compress,
                        chunked=chunked,
                        expect100=expect100,
                        loop=self._loop,
                        response_class=self._response_class,
                        proxy=proxy,
                        proxy_auth=proxy_auth,
                        timer=timer,
                        session=self,
                        ssl=ssl,
                        proxy_headers=proxy_headers,
                        traces=traces,
                    )

                    # connection timeout
                    try:
                        async with ceil_timeout(real_timeout.connect):
                            assert self._connector is not None
                            conn = await self._connector.connect(
                                req, traces=traces, timeout=real_timeout
                            )
                    except asyncio.TimeoutError as exc:
                        raise ServerTimeoutError(
                            "Connection timeout " "to host {}".format(url)
                        ) from exc

                    assert conn.transport is not None
                    assert conn.protocol is not None
                    conn.protocol.set_response_params(
                        timer=timer,
                        skip_payload=method.upper() == "HEAD",
                        read_until_eof=read_until_eof,
                        auto_decompress=self._auto_decompress,
                        read_timeout=real_timeout.sock_read,
                        read_bufsize=read_bufsize,
                    )

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn = Connection

    async def send(self, conn: "Connection") -> "ClientResponse":
        # Specify request target:
        # - CONNECT request must send authority form URI
        # - not CONNECT proxy must send absolute form URI
        # - most common is origin form URI
        if self.method == hdrs.METH_CONNECT:
            connect_host = self.url.raw_host
            assert connect_host is not None
            if helpers.is_ipv6_address(connect_host):
                connect_host = f"[{connect_host}]"
            path = f"{connect_host}:{self.url.port}"
        elif self.proxy and not self.is_ssl():
            path = str(self.url)
        else:
            path = self.url.raw_path
            if self.url.raw_query_string:
                path += "?" + self.url.raw_query_string

        protocol = conn.protocol
        assert protocol is not None
        writer = StreamWriter(
            protocol,
            self.loop,
            on_chunk_sent=functools.partial(
                self._on_chunk_request_sent, self.method, self.url
            ),
            on_headers_sent=functools.partial(
                self._on_headers_request_sent, self.method, self.url
            ),
        )

        if self.compress:
            writer.enable_compression(self.compress)

        if self.chunked is not None:
            writer.enable_chunking()

        # set default content-type
        if (
            self.method in self.POST_METHODS
            and hdrs.CONTENT_TYPE not in self.skip_auto_headers
            and hdrs.CONTENT_TYPE not in self.headers
        ):
            self.headers[hdrs.CONTENT_TYPE] = "application/octet-stream"

        # set the connection header
        connection = self.headers.get(hdrs.CONNECTION)
        if not connection:
            if self.keep_alive():
                if self.version == HttpVersion10:
                    connection = "keep-alive"
            else:
                if self.version == HttpVersion11:
                    connection = "close"

        if connection is not None:
            self.headers[hdrs.CONNECTION] = connection

        # status + headers
        status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format(
            self.method, path, self.version
        )
>       await writer.write_headers(status_line, self.headers)

/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
status_line = 'GET /geoip/v2.1/country/1.1.1.1 HTTP/1.1'
headers =

    async def write_headers(
        self, status_line: str, headers: "CIMultiDict[str]"
    ) -> None:
        """Write request/response status and headers."""
        if self._on_headers_sent is not None:
            await self._on_headers_sent(headers)

        # status + headers
        buf = _serialize_headers(status_line, headers)
>       self._write(buf)

/usr/lib/python3/site-packages/aiohttp/http_writer.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.1.1.1 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    def _write(self, chunk: bytes) -> None:
        size = len(chunk)
        self.buffer_size += size
        self.output_size += size
        transport = self.transport
        if not self._protocol.connected or transport is None or transport.is_closing():
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:
self =

    @httprettified
    def test_200_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/1.1.1.1",
            body="",
            status=200,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            GeoIP2Error, "could not decode the response as JSON"
        ):
>           self.run_client(self.client.country("1.1.1.1"))

tests/webservice_test.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[pytest repeats the full aiohttp ClientSession._request listing here, identical to the one at the top of this failure]
                        auto_decompress=self._auto_decompress,
                        read_timeout=real_timeout.sock_read,
                        read_bufsize=read_bufsize,
                    )

                    try:
                        try:
                            resp = await req.send(conn)
                            try:
                                await resp.start(conn)
                            except BaseException:
                                resp.close()
                                raise
                        except BaseException:
                            conn.close()
                            raise
                    except ClientError:
                        raise
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
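Every failure in this run dies at the same spot: httpretty monkey-patches the socket layer, and under aiohttp 3.8 the faked transport is already closing when the request headers are written, so _write raises before any registered response can be served. A hedged sketch of the same scenario mocked at the ClientSession level with aioresponses, which never touches a transport (aioresponses is not what this test suite uses; it is shown only as a contrast, and the credentials match the Basic header decoded below):

    import asyncio

    from aioresponses import aioresponses
    from geoip2.errors import GeoIP2Error
    from geoip2.webservice import AsyncClient


    async def main() -> None:
        async with AsyncClient(42, "abcdef123456") as client:
            with aioresponses() as mocked:
                # An empty 200 body should make the client raise
                # "could not decode the response as JSON", as asserted above.
                mocked.get(
                    "https://geoip.maxmind.com/geoip/v2.1/country/1.1.1.1",
                    status=200,
                    body="",
                    content_type="application/vnd.maxmind.com-country+json",
                )
                try:
                    await client.country("1.1.1.1")
                except GeoIP2Error as exc:
                    print(exc)


    asyncio.run(main())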
________________________ TestAsyncClient.test_300_error ________________________

[same aiohttp ClientSession._request listing as in test_200_error above]
[req.send() / write_headers() / _write() frames identical to test_200_error above, differing only in the request being written:]

status_line = 'GET /geoip/v2.1/country/1.2.3.11 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.11 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_300_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.11",
            status=300,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            HTTPError, r"Received a very surprising HTTP status \(300\) for"
        ):
>           self.run_client(self.client.country("1.2.3.11"))

tests/webservice_test.py:212:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[aiohttp ClientSession._request listing repeated again, identical to the one in test_200_error]
[re-raise through aiohttp client.py identical to test_200_error above]
E       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
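Condensed from the assertions visible in this run, the status-to-exception mapping the webservice tests expect from the client (a summary sketch, not code from the test suite; the exception classes are the ones the tests import from geoip2.errors):

    from geoip2.errors import AuthenticationError, GeoIP2Error, HTTPError

    # (HTTP status, condition, expected exception) as asserted by the tests above
    EXPECTED = [
        (200, "empty body",          GeoIP2Error),         # "could not decode the response as JSON"
        (300, "surprising status",   HTTPError),           # "very surprising HTTP status (300)"
        (500, "server error",        HTTPError),           # "server error (500)"
        (401, "ACCOUNT_ID_REQUIRED", AuthenticationError), # authentication failures
    ]

    for status, condition, exc_class in EXPECTED:
        print(f"{status} ({condition}) -> {exc_class.__name__}")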
________________________ TestAsyncClient.test_500_error ________________________

[same aiohttp ClientSession._request listing as above]
[req.send() / write_headers() / _write() frames identical to test_200_error above; the request being written:]

status_line = 'GET /geoip/v2.1/country/1.2.3.10 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.10 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_500_error(self):
        httpretty.register_uri(
            httpretty.GET, self.base_uri + "country/" + "1.2.3.10", status=500
        )
        with self.assertRaisesRegex(HTTPError, r"Received a server error \(500\) for"):
>           self.run_client(self.client.country("1.2.3.10"))

tests/webservice_test.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[aiohttp ClientSession._request listing repeated again, identical to the one in test_200_error]
[re-raise through aiohttp client.py identical to test_200_error above]
E       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
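One detail worth noting from the request chunks above: the Basic credentials are test fixtures, not real ones. The header decodes to account ID 42 and license key abcdef123456, confirming the client builds HTTP Basic auth from the constructor arguments:

    # The Authorization header captured in every failing request above.
    import base64

    print(base64.b64decode("NDI6YWJjZGVmMTIzNDU2"))  # b'42:abcdef123456'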
___________________ TestAsyncClient.test_account_id_required ___________________

[same aiohttp ClientSession._request listing as above]
[req.send() / write_headers() / _write() frames identical to test_200_error above; the request being written:]

status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_account_id_required(self):
>       self._test_error(401, "ACCOUNT_ID_REQUIRED", AuthenticationError)

tests/webservice_test.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[aiohttp ClientSession._request listing repeated again, identical to the one in test_200_error]
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: if exc.errno is None and isinstance(exc, asyncio.TimeoutError): raise > raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError ____________________ TestAsyncClient.test_account_id_unkown ____________________ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
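Every failure in this run dies the same way: the tests are decorated with @httprettified, and httpretty works by monkey-patching the blocking socket module, which appears to leave the asyncio transport aiohttp sits on looking already closed, so the writer's guard (http_writer.py:75 above) fires before any bytes go out. The sketch below is illustrative, not aiohttp's real classes; it reproduces just that guard plus the OSError-to-ClientOSError wrapping from client.py:572.

class ClientOSError(OSError):
    """Stand-in for aiohttp.client_exceptions.ClientOSError."""

class SketchWriter:
    def __init__(self, transport) -> None:
        self.transport = transport

    def write(self, chunk: bytes) -> None:
        # Same condition as StreamWriter._write above: refuse a missing or
        # closing transport instead of silently dropping the bytes.
        if self.transport is None or self.transport.is_closing():
            raise ConnectionResetError("Cannot write to closing transport")
        self.transport.write(chunk)

class ClosingTransport:
    """Pretends to be an asyncio transport that is shutting down."""
    def is_closing(self) -> bool:
        return True
    def write(self, chunk: bytes) -> None:
        raise AssertionError("never reached")

def send_request(writer: SketchWriter, payload: bytes) -> None:
    try:
        writer.write(payload)
    except OSError as exc:
        # client.py:572: low-level OSErrors are re-raised as ClientOSError,
        # chained "from exc" -- hence "direct cause" in the tracebacks here.
        raise ClientOSError(*exc.args) from exc

try:
    send_request(SketchWriter(ClosingTransport()), b"GET / HTTP/1.1\r\n\r\n")
except ClientOSError as exc:
    print(f"{type(exc).__name__}: {exc}")  # Cannot write to closing transport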
____________________ TestAsyncClient.test_account_id_unkown ____________________

[ClientSession._request() source context and body elided -- identical to the first failure]
>                       resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
conn = Connection

[ClientRequest.send() source context elided: it builds the request target, wraps
the connection's protocol in a StreamWriter, fills in default Content-Type and
Connection headers, then fails while writing the status line and headers]
        status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format(
            self.method, path, self.version
        )
>       await writer.write_headers(status_line, self.headers)

/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1'
headers =

    async def write_headers(
        self, status_line: str, headers: "CIMultiDict[str]"
    ) -> None:
        """Write request/response status and headers."""
        if self._on_headers_sent is not None:
            await self._on_headers_sent(headers)
        # status + headers
        buf = _serialize_headers(status_line, headers)
>       self._write(buf)

/usr/lib/python3/site-packages/aiohttp/http_writer.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    def _write(self, chunk: bytes) -> None:
        size = len(chunk)
        self.buffer_size += size
        self.output_size += size
        transport = self.transport
        if not self._protocol.connected or transport is None or transport.is_closing():
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_account_id_unkown(self):
>       self._test_error(401, "ACCOUNT_ID_UNKNOWN", AuthenticationError)

tests/webservice_test.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[ClientSession._request() source context and body elided]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
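Since all of these failures share the one transport-level cause, an asyncio-aware mock would address them all at once. A hypothetical rework using the aioresponses package, which stubs aiohttp at the session level rather than patching the socket module the way httpretty does; the fetch helper, URL, and payload below are illustrative stand-ins, not the suite's real fixtures.

import asyncio
import aiohttp
from aioresponses import aioresponses

async def fetch_country(ip: str) -> dict:
    # Illustrative stand-in for client.country(); the real client in
    # geoip2/webservice.py also adds auth, User-Agent, and error handling.
    url = f"https://geoip.maxmind.com/geoip/v2.1/country/{ip}"
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            return await resp.json()

def test_country_mocked() -> None:
    with aioresponses() as mocked:
        # The mock intercepts the request inside aiohttp, so no transport
        # is ever opened and nothing can be "closing".
        mocked.get(
            "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4",
            status=200,
            payload={"country": {"iso_code": "US"}},
        )
        body = asyncio.run(fetch_country("1.2.3.4"))
        assert body["country"]["iso_code"] == "US"

test_country_mocked()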
______________________ TestAsyncClient.test_auth_invalid _______________________

[ClientSession._request() source context and body elided]
>                       resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client.py:558:
[ClientRequest.send() -> StreamWriter.write_headers() -> StreamWriter._write()
frames elided -- identical to the failure above, again for
status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1']
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_auth_invalid(self):
>       self._test_error(400, "AUTHORIZATION_INVALID", AuthenticationError)

tests/webservice_test.py:232:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[ClientSession._request() source context and body elided]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
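test_account_id_unkown and test_auth_invalid both exercise the client's mapping from the web service's machine-readable error codes to exception types. A simplified illustration of such a dispatch, using only the codes and class names visible in this log (geoip2's real mapping lives in geoip2/webservice.py and geoip2/errors.py; this is not its actual code):

class HTTPError(Exception):
    """Stand-in for geoip2.errors.HTTPError."""

class AuthenticationError(HTTPError):
    """Stand-in for geoip2.errors.AuthenticationError."""

# Codes seen in this log; the real client recognizes more.
AUTH_CODES = {"ACCOUNT_ID_UNKNOWN", "AUTHORIZATION_INVALID"}

def raise_for_error(status: int, body: dict) -> None:
    # Dispatch on the error document's "code", falling back to a
    # generic HTTPError for any other 4xx/5xx.
    code = body.get("code", "")
    if code in AUTH_CODES:
        raise AuthenticationError(body.get("error", code))
    if 400 <= status < 600:
        raise HTTPError(f"unexpected {status} with code {code or '(none)'}")

try:
    raise_for_error(401, {"code": "ACCOUNT_ID_UNKNOWN", "error": "Account ID unknown"})
except AuthenticationError as exc:
    print("AuthenticationError:", exc)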
_____________________ TestAsyncClient.test_bad_body_error ______________________

[ClientSession._request() source context and body elided]
>                       resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client.py:558:
[ClientRequest.send() -> StreamWriter.write_headers() -> StreamWriter._write()
frames elided, this time for
status_line = 'GET /geoip/v2.1/country/1.2.3.9 HTTP/1.1']
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_bad_body_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.9",
            body="bad body",
            status=400,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            HTTPError, "it did not include the expected JSON body"
        ):
>           self.run_client(self.client.country("1.2.3.9"))

tests/webservice_test.py:191:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[ClientSession._request() source context and body elided]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
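test_bad_body_error asserts how the client handles a 4xx response whose body is not JSON. A minimal sketch of that behavior, assuming the error body is decoded with json.loads; the helper name and the exact message framing are illustrative, though the phrase matches the regex the test asserts on above.

import json

class HTTPError(Exception):
    """Stand-in for geoip2.errors.HTTPError."""

def decode_error_body(status: int, raw: str) -> dict:
    # A 4xx from the web service is expected to carry a JSON error document;
    # anything else is reported with the message the test matches on.
    try:
        return json.loads(raw)
    except ValueError as exc:
        raise HTTPError(
            f"Received a {status} error but it did not include the expected JSON body"
        ) from exc

try:
    decode_error_body(400, "bad body")
except HTTPError as exc:
    print(exc)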
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = Connection async def send(self, conn: "Connection") -> "ClientResponse": # Specify request target: # - CONNECT request must send authority form URI # - not CONNECT proxy must send absolute form URI # - most common is origin form URI if self.method == hdrs.METH_CONNECT: connect_host = self.url.raw_host assert connect_host is not None if helpers.is_ipv6_address(connect_host): connect_host = f"[{connect_host}]" path = f"{connect_host}:{self.url.port}" elif self.proxy and not self.is_ssl(): path = str(self.url) else: path = self.url.raw_path if self.url.raw_query_string: path += "?" + self.url.raw_query_string protocol = conn.protocol assert protocol is not None writer = StreamWriter( protocol, self.loop, on_chunk_sent=functools.partial( self._on_chunk_request_sent, self.method, self.url ), on_headers_sent=functools.partial( self._on_headers_request_sent, self.method, self.url ), ) if self.compress: writer.enable_compression(self.compress) if self.chunked is not None: writer.enable_chunking() # set default content-type if ( self.method in self.POST_METHODS and hdrs.CONTENT_TYPE not in self.skip_auto_headers and hdrs.CONTENT_TYPE not in self.headers ): self.headers[hdrs.CONTENT_TYPE] = "application/octet-stream" # set the connection header connection = self.headers.get(hdrs.CONNECTION) if not connection: if self.keep_alive(): if self.version == HttpVersion10: connection = "keep-alive" else: if self.version == HttpVersion11: connection = "close" if connection is not None: self.headers[hdrs.CONNECTION] = connection # status + headers status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format( self.method, path, self.version ) > await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = status_line = 'GET /geoip/v2.1/city/1.2.3.4 HTTP/1.1' headers = async def write_headers( self, status_line: str, headers: "CIMultiDict[str]" ) -> None: """Write request/response status and headers.""" if self._on_headers_sent is not None: await self._on_headers_sent(headers) # status + headers buf = _serialize_headers(status_line, headers) > self._write(buf) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/city/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP2-P...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_city_ok(self): httpretty.register_uri( httpretty.GET, self.base_uri + "city/" + "1.2.3.4", body=json.dumps(self.country), status=200, 
content_type=self._content_type("city"), ) > city = self.run_client(self.client.city("1.2.3.4")) tests/webservice_test.py:326: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete return future.result() geoip2/webservice.py:289: in city City, await self._response_for("city", geoip2.models.City, ip_address) geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , method = 'GET' str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/city/1.2.3.4' async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
                    try:
                        try:
                            resp = await req.send(conn)
                            try:
                                await resp.start(conn)
                            except BaseException:
                                resp.close()
                                raise
                        except BaseException:
                            conn.close()
                            raise
                    except ClientError:
                        raise
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
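Note on the paired tracebacks in each failure below: the "raise
ClientOSError(*exc.args) from exc" shown above is why pytest prints two
stacked tracebacks per test, joined by "The above exception was the direct
cause of the following exception". A minimal, self-contained illustration of
that chaining (names taken from the log; not part of the test suite):

    # 'raise ... from' stores the original error in __cause__, which pytest
    # renders as the "direct cause" banner between the two tracebacks.
    from aiohttp import ClientOSError

    try:
        try:
            raise ConnectionResetError("Cannot write to closing transport")
        except OSError as exc:
            raise ClientOSError(*exc.args) from exc
    except ClientOSError as wrapped:
        assert isinstance(wrapped.__cause__, ConnectionResetError)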
_______________________ TestAsyncClient.test_country_ok ________________________

self = <aiohttp.client.ClientSession object>, method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.client_reqrep.ClientRequest object>, conn = <Connection>

    [ClientRequest.send source elided — it builds the request status line and
    then fails at]

>       await writer.write_headers(status_line, self.headers)

/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.http_writer.StreamWriter object>
status_line = 'GET /geoip/v2.1/country/1.2.3.4 HTTP/1.1'
headers = <CIMultiDict(...)>

    async def write_headers(
        self, status_line: str, headers: "CIMultiDict[str]"
    ) -> None:
        """Write request/response status and headers."""
        if self._on_headers_sent is not None:
            await self._on_headers_sent(headers)
        # status + headers
        buf = _serialize_headers(status_line, headers)
>       self._write(buf)

/usr/lib/python3/site-packages/aiohttp/http_writer.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.http_writer.StreamWriter object>
chunk = b'GET /geoip/v2.1/country/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    def _write(self, chunk: bytes) -> None:
        size = len(chunk)
        self.buffer_size += size
        self.output_size += size
        transport = self.transport
        if not self._protocol.connected or transport is None or transport.is_closing():
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError
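The request dump above also shows what the client sends: the Authorization
header is plain HTTP Basic auth built from the dummy credentials the test
client is constructed with. Decoding it (a one-liner, not part of the suite)
shows the account id and license key the tests use:

    import base64

    # Decodes the Basic credential visible in the chunk above.
    print(base64.b64decode("NDI6YWJjZGVmMTIzNDU2"))  # b'42:abcdef123456'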
The above exception was the direct cause of the following exception:

self = <tests.webservice_test.TestAsyncClient testMethod=test_country_ok>

    @httprettified
    def test_country_ok(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/1.2.3.4",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("country"),
        )
>       country = self.run_client(self.client.country("1.2.3.4"))

tests/webservice_test.py:72:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.client.ClientSession object>, method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
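Every TestAsyncClient failure in this run has the same shape: httpretty mocks
HTTP by monkey-patching the blocking socket layer, while aiohttp 3.8.4 writes
through an asyncio transport, and by the time the request headers are written
the patched socket already looks closed — hence "Cannot write to closing
transport". This appears to be an incompatibility between the test doubles
and this aiohttp build rather than a bug in geoip2 itself. An asyncio-native
mock avoids the socket layer entirely; a hypothetical sketch using
aioresponses, which this suite does not use:

    # Hypothetical alternative: intercept aiohttp requests inside the event
    # loop instead of patching sockets. The payload is a stand-in, not the
    # real fixture document.
    from aioresponses import aioresponses

    async def fetch_country(client):
        with aioresponses() as mock:
            mock.get(
                "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4",
                status=200,
                payload={"country": {"iso_code": "US"}},
            )
            return await client.country("1.2.3.4")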
_______________________ TestAsyncClient.test_insights_ok _______________________

self = <aiohttp.client.ClientSession object>, method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/insights/1.2.3.4'

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:

    [ClientRequest.send / StreamWriter.write_headers / StreamWriter._write
    frames elided — identical to the first ConnectionResetError traceback
    above, with status_line = 'GET /geoip/v2.1/insights/1.2.3.4 HTTP/1.1']

E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:
self = <tests.webservice_test.TestAsyncClient testMethod=test_insights_ok>

    @httprettified
    def test_insights_ok(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "insights/1.2.3.4",
            body=json.dumps(self.insights),
            status=200,
            content_type=self._content_type("country"),
        )
>       insights = self.run_client(self.client.insights("1.2.3.4"))

tests/webservice_test.py:341:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:322: in insights
    await self._response_for("insights", geoip2.models.Insights, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <aiohttp.client.ClientSession object>, method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/insights/1.2.3.4'

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
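The run_client helper at tests/webservice_test.py:386 in each of these
tracebacks drives the async client from synchronous unittest code. A minimal
sketch of how such a helper is typically wired up (the loop setup here is an
assumption; only the run_client body is taken from the traceback):

    import asyncio
    import unittest

    class TestAsyncClient(unittest.TestCase):
        def setUp(self):
            # assumption: one private event loop per test
            self._loop = asyncio.new_event_loop()

        def tearDown(self):
            self._loop.close()

        def run_client(self, v):
            # tests/webservice_test.py:386, as shown above
            return self._loop.run_until_complete(v)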
__________________ TestAsyncClient.test_ip_address_not_found ___________________

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:

    [ClientRequest.send / StreamWriter.write_headers / StreamWriter._write
    frames elided — identical to the first ConnectionResetError traceback
    above, with status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1']

E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self = <tests.webservice_test.TestAsyncClient testMethod=test_ip_address_not_found>

    @httprettified
    def test_ip_address_not_found(self):
>       self._test_error(404, "IP_ADDRESS_NOT_FOUND", AddressNotFoundError)

tests/webservice_test.py:220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>           raise ClientOSError(*exc.args) from exc
E           aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
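The error-path tests funnel through the _test_error helper visible at
tests/webservice_test.py:269 above: it registers an error document for
country/1.2.3.18 and asserts that the client raises the matching
geoip2.errors exception. A hedged sketch of what that helper presumably looks
like (its real body is not shown in this log; the body fields and the
content_type argument are assumptions):

    import json
    import httpretty

    def _test_error(self, status, error_code, error_class):
        # assumption: minimal error document with the code under test
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/1.2.3.18",
            body=json.dumps({"code": error_code, "error": "hypothetical message"}),
            status=status,
            content_type=self._content_type("error"),  # hypothetical argument
        )
        with self.assertRaises(error_class):
            self.run_client(self.client.country("1.2.3.18"))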
___________________ TestAsyncClient.test_ip_address_required ___________________

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:

    [ClientRequest.send / StreamWriter.write_headers / StreamWriter._write
    frames elided — identical to the first ConnectionResetError traceback
    above, with status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1']

E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self = <tests.webservice_test.TestAsyncClient testMethod=test_ip_address_required>

    @httprettified
    def test_ip_address_required(self):
>       self._test_error(400, "IP_ADDRESS_REQUIRED", InvalidRequestError)

tests/webservice_test.py:216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [aiohttp.ClientSession._request source elided — identical to the first
    listing above]
                        auto_decompress=self._auto_decompress,
                        read_timeout=real_timeout.sock_read,
                        read_bufsize=read_bufsize,
                    )
                    try:
                        try:
                            resp = await req.send(conn)
                            try:
                                await resp.start(conn)
                            except BaseException:
                                resp.close()
                                raise
                        except BaseException:
                            conn.close()
                            raise
                    except ClientError:
                        raise
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
___________________ TestAsyncClient.test_ip_address_reserved ___________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn = Connection

    async def send(self, conn: "Connection") -> "ClientResponse":
        # Specify request target:
        # - CONNECT request must send authority form URI
        # - not CONNECT proxy must send absolute form URI
        # - most common is origin form URI
        if self.method == hdrs.METH_CONNECT:
            connect_host = self.url.raw_host
            assert connect_host is not None
            if helpers.is_ipv6_address(connect_host):
                connect_host = f"[{connect_host}]"
            path = f"{connect_host}:{self.url.port}"
        elif self.proxy and not self.is_ssl():
            path = str(self.url)
        else:
            path = self.url.raw_path
            if self.url.raw_query_string:
                path += "?" + self.url.raw_query_string
        protocol = conn.protocol
        assert protocol is not None
        writer = StreamWriter(
            protocol,
            self.loop,
            on_chunk_sent=functools.partial(
                self._on_chunk_request_sent, self.method, self.url
            ),
            on_headers_sent=functools.partial(
                self._on_headers_request_sent, self.method, self.url
            ),
        )
        if self.compress:
            writer.enable_compression(self.compress)
        if self.chunked is not None:
            writer.enable_chunking()
        # set default content-type
        if (
            self.method in self.POST_METHODS
            and hdrs.CONTENT_TYPE not in self.skip_auto_headers
            and hdrs.CONTENT_TYPE not in self.headers
        ):
            self.headers[hdrs.CONTENT_TYPE] = "application/octet-stream"
        # set the connection header
        connection = self.headers.get(hdrs.CONNECTION)
        if not connection:
            if self.keep_alive():
                if self.version == HttpVersion10:
                    connection = "keep-alive"
            else:
                if self.version == HttpVersion11:
                    connection = "close"
        if connection is not None:
            self.headers[hdrs.CONNECTION] = connection
        # status + headers
        status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format(
            self.method, path, self.version
        )
>       await writer.write_headers(status_line, self.headers)

/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1'
headers =

    async def write_headers(
        self, status_line: str, headers: "CIMultiDict[str]"
    ) -> None:
        """Write request/response status and headers."""
        if self._on_headers_sent is not None:
            await self._on_headers_sent(headers)
        # status + headers
        buf = _serialize_headers(status_line, headers)
>       self._write(buf)

/usr/lib/python3/site-packages/aiohttp/http_writer.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    def _write(self, chunk: bytes) -> None:
        size = len(chunk)
        self.buffer_size += size
        self.output_size += size
        transport = self.transport
        if not self._protocol.connected or transport is None or transport.is_closing():
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_ip_address_reserved(self):
>       self._test_error(400, "IP_ADDRESS_RESERVED", AddressNotFoundError)

tests/webservice_test.py:224:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

__________________ TestAsyncClient.test_license_key_required ___________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_license_key_required(self):
>       self._test_error(401, "LICENSE_KEY_REQUIRED", AuthenticationError)

tests/webservice_test.py:236:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

___________________________ TestAsyncClient.test_me ____________________________

self =
method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/me'
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

status_line = 'GET /geoip/v2.1/country/me HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/me HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP2-Pyt...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_me(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/me",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("country"),
        )
>       implicit_me = self.run_client(self.client.country())

tests/webservice_test.py:118:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

______________________ TestAsyncClient.test_no_body_error ______________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

status_line = 'GET /geoip/v2.1/country/1.2.3.7 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.7 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_no_body_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.7",
            body="",
            status=400,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            HTTPError, "Received a 400 error for .* with no body"
        ):
>           self.run_client(self.client.country("1.2.3.7"))

tests/webservice_test.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
                    auto_decompress=self._auto_decompress,
                    read_timeout=real_timeout.sock_read,
                    read_bufsize=read_bufsize,
                )

                try:
                    try:
                        resp = await req.send(conn)
                        try:
                            await resp.start(conn)
                        except BaseException:
                            resp.close()
                            raise
                    except BaseException:
                        conn.close()
                        raise
                except ClientError:
                    raise
                except OSError as exc:
                    if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                        raise
>                   raise ClientOSError(*exc.args) from exc
E                   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
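Every TestAsyncClient failure in this section hits the same guard: httpretty mocks the blocking socket layer, and on this stack (Python 3.11, aiohttp 3.8.4) the transport aiohttp ends up with already reports is_closing() by the time the request headers are written, so each mocked request dies with "Cannot write to closing transport". A minimal standalone sketch of the same combination, assuming only that httpretty and aiohttp are installed (the URL and body are illustrative; this is not code from the geoip2 suite):

import asyncio

import aiohttp
import httpretty


@httpretty.activate
def main() -> None:
    # Serve a canned JSON body for the mocked endpoint.
    httpretty.register_uri(
        httpretty.GET,
        "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4",
        body="{}",
        status=200,
        content_type="application/json",
    )

    async def fetch() -> int:
        async with aiohttp.ClientSession() as session:
            async with session.get(
                "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4"
            ) as resp:
                return resp.status

    # On the stack used in this build, this is expected to raise
    # aiohttp.client_exceptions.ClientOSError: "Cannot write to closing
    # transport" instead of returning 200, matching the failures below.
    print(asyncio.run(fetch()))


if __name__ == "__main__":
    main()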
__________________ TestAsyncClient.test_out_of_queries_error ___________________

[ClientSession._request listing identical to the one shown above elided]
                try:
                    try:
>                       resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = conn = Connection

async def send(self, conn: "Connection") -> "ClientResponse":
    # Specify request target:
    # - CONNECT request must send authority form URI
    # - not CONNECT proxy must send absolute form URI
    # - most common is origin form URI
    if self.method == hdrs.METH_CONNECT:
        connect_host = self.url.raw_host
        assert connect_host is not None
        if helpers.is_ipv6_address(connect_host):
            connect_host = f"[{connect_host}]"
        path = f"{connect_host}:{self.url.port}"
    elif self.proxy and not self.is_ssl():
        path = str(self.url)
    else:
        path = self.url.raw_path
        if self.url.raw_query_string:
            path += "?" + self.url.raw_query_string

    protocol = conn.protocol
    assert protocol is not None
    writer = StreamWriter(
        protocol,
        self.loop,
        on_chunk_sent=functools.partial(
            self._on_chunk_request_sent, self.method, self.url
        ),
        on_headers_sent=functools.partial(
            self._on_headers_request_sent, self.method, self.url
        ),
    )

    if self.compress:
        writer.enable_compression(self.compress)

    if self.chunked is not None:
        writer.enable_chunking()

    # set default content-type
    if (
        self.method in self.POST_METHODS
        and hdrs.CONTENT_TYPE not in self.skip_auto_headers
        and hdrs.CONTENT_TYPE not in self.headers
    ):
        self.headers[hdrs.CONTENT_TYPE] = "application/octet-stream"

    # set the connection header
    connection = self.headers.get(hdrs.CONNECTION)
    if not connection:
        if self.keep_alive():
            if self.version == HttpVersion10:
                connection = "keep-alive"
        else:
            if self.version == HttpVersion11:
                connection = "close"

    if connection is not None:
        self.headers[hdrs.CONNECTION] = connection

    # status + headers
    status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format(
        self.method, path, self.version
    )
>   await writer.write_headers(status_line, self.headers)

/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1'
headers =

async def write_headers(
    self, status_line: str, headers: "CIMultiDict[str]"
) -> None:
    """Write request/response status and headers."""
    if self._on_headers_sent is not None:
        await self._on_headers_sent(headers)
    # status + headers
    buf = _serialize_headers(status_line, headers)
>   self._write(buf)

/usr/lib/python3/site-packages/aiohttp/http_writer.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

def _write(self, chunk: bytes) -> None:
    size = len(chunk)
    self.buffer_size += size
    self.output_size += size
    transport = self.transport
    if not self._protocol.connected or transport is None or transport.is_closing():
>       raise ConnectionResetError("Cannot write to closing transport")
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

@httprettified
def test_out_of_queries_error(self):
>   self._test_error(402, "OUT_OF_QUERIES", OutOfQueriesError)

tests/webservice_test.py:256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[ClientSession._request listing identical to the one shown above elided]
>                   raise ClientOSError(*exc.args) from exc
E                   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
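For orientation, the suite's _test_error helper (visible in the frames above) registers a MaxMind-style JSON error body and asserts the client maps it to the matching exception. The same pattern, as a runnable sketch that uses urllib instead of the geoip2 client so it stands alone (URL, body text and status are illustrative):

import json
import urllib.error
import urllib.request

import httpretty


@httpretty.activate
def main() -> None:
    # Register a 402 error body the way _test_error does for OUT_OF_QUERIES.
    url = "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.18"
    body = {"error": "Out of queries", "code": "OUT_OF_QUERIES"}
    httpretty.register_uri(
        httpretty.GET,
        url,
        body=json.dumps(body),
        status=402,
        content_type="application/json",
    )
    try:
        urllib.request.urlopen(url)
    except urllib.error.HTTPError as exc:
        # urllib surfaces the status; the JSON body carries the error code
        # that geoip2 would map to OutOfQueriesError.
        payload = json.load(exc)
        assert payload["code"] == "OUT_OF_QUERIES"
        print(exc.code, payload["error"])


if __name__ == "__main__":
    main()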
___________________ TestAsyncClient.test_permission_required ___________________

[ClientSession._request listing identical to the one shown above elided]
                try:
                    try:
>                       resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
[ClientRequest.send / write_headers / _write listings identical to the ones shown above elided]

status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
>       raise ConnectionResetError("Cannot write to closing transport")
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

@httprettified
def test_permission_required(self):
>   self._test_error(403, "PERMISSION_REQUIRED", PermissionRequiredError)

tests/webservice_test.py:228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[ClientSession._request listing identical to the one shown above elided]
>                   raise ClientOSError(*exc.args) from exc
E                   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
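The innermost raise comes from the three-way guard in StreamWriter._write quoted above. A self-contained model of the same check (FakeTransport and write_or_reset are stand-ins invented here, not aiohttp names; the condition line mirrors the log):

class FakeTransport:
    """Stand-in for an asyncio transport; only is_closing()/write() matter here."""

    def __init__(self, closing: bool) -> None:
        self._closing = closing

    def is_closing(self) -> bool:
        return self._closing

    def write(self, chunk: bytes) -> None:
        print(f"wrote {len(chunk)} bytes")


def write_or_reset(transport, connected: bool, chunk: bytes) -> None:
    # Same condition as _write above: a missing, disconnected or closing
    # transport means the request headers can no longer be sent.
    if not connected or transport is None or transport.is_closing():
        raise ConnectionResetError("Cannot write to closing transport")
    transport.write(chunk)


write_or_reset(FakeTransport(closing=False), True, b"GET / HTTP/1.1\r\n\r\n")

try:
    # This is the state httpretty leaves aiohttp's transport in.
    write_or_reset(FakeTransport(closing=True), True, b"GET / HTTP/1.1\r\n\r\n")
except ConnectionResetError as err:
    print(err)  # Cannot write to closing transport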
_________________________ TestAsyncClient.test_request _________________________

self = , method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

[ClientSession._request listing identical to the one shown above elided]
                try:
                    try:
>                       resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
[ClientRequest.send / write_headers / _write listings identical to the ones shown above elided]

status_line = 'GET /geoip/v2.1/country/1.2.3.4 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
>       raise ConnectionResetError("Cannot write to closing transport")
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

@httprettified
def test_request(self):
    httpretty.register_uri(
        httpretty.GET,
        self.base_uri + "country/" + "1.2.3.4",
        body=json.dumps(self.country),
        status=200,
        content_type=self._content_type("country"),
    )
>   self.run_client(self.client.country("1.2.3.4"))

tests/webservice_test.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

[ClientSession._request listing identical to the one shown above elided]
>                   raise ClientOSError(*exc.args) from exc
E                   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
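The "direct cause" banner separating the two tracebacks in each failure is ordinary PEP 3134 exception chaining: client.py:572 re-raises the writer's ConnectionResetError as ClientOSError with "raise ... from exc", which sets __cause__. A runnable illustration (the ClientOSError class here is a stub standing in for aiohttp.client_exceptions.ClientOSError):

class ClientOSError(OSError):
    """Stub for aiohttp.client_exceptions.ClientOSError."""


try:
    try:
        # The inner error raised by StreamWriter._write in the log.
        raise ConnectionResetError("Cannot write to closing transport")
    except OSError as exc:
        # The re-raise seen at aiohttp/client.py:572: same args, new type,
        # with __cause__ pointing at the original error.
        raise ClientOSError(*exc.args) from exc
except ClientOSError as err:
    assert isinstance(err.__cause__, ConnectionResetError)
    print(err)  # Cannot write to closing transport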
______________________ TestAsyncClient.test_unknown_error ______________________

[ClientSession._request listing identical to the one shown above elided]
                try:
                    try:
>                       resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
[ClientRequest.send / write_headers / _write listings identical to the ones shown above elided]

status_line = 'GET /geoip/v2.1/country/1.2.3.19 HTTP/1.1'
chunk = b'GET /geoip/v2.1/country/1.2.3.19 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'
>       raise ConnectionResetError("Cannot write to closing transport")
E       ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

@httprettified
def test_unknown_error(self):
    msg = "Unknown error type"
    ip = "1.2.3.19"
    body = {"error": msg, "code": "UNKNOWN_TYPE"}
    httpretty.register_uri(
        httpretty.GET,
        self.base_uri + "country/" + ip,
        body=json.dumps(body),
        status=400,
        content_type=self._content_type("country"),
    )
    with self.assertRaisesRegex(InvalidRequestError, msg):
>       self.run_client(self.client.country(ip))

tests/webservice_test.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[ClientSession._request listing identical to the one shown above elided]
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: if exc.errno is None and isinstance(exc, asyncio.TimeoutError): raise > raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
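Editor's note on the ClientSession._request source dumped in the traceback above: the timeout branch it shows (the sentinel meaning "use the session default", a bare number coerced into ClientTimeout(total=...)) is observable aiohttp 3.8.x behaviour. A minimal offline sketch of that coercion logic, where `sentinel` and `coerce_timeout` are illustrative stand-ins and not aiohttp names:

import aiohttp
from aiohttp import ClientTimeout

sentinel = object()  # stand-in for aiohttp's internal module-level sentinel

def coerce_timeout(timeout, session_default: ClientTimeout) -> ClientTimeout:
    # Mirrors the branch in the dumped _request source: sentinel keeps the
    # session default; a bare number becomes ClientTimeout(total=number);
    # an explicit ClientTimeout passes through unchanged.
    if timeout is sentinel:
        return session_default
    if not isinstance(timeout, ClientTimeout):
        return ClientTimeout(total=timeout)
    return timeout

default = ClientTimeout(total=300)
print(coerce_timeout(sentinel, default))               # session default
print(coerce_timeout(10, default))                     # ClientTimeout(total=10)
print(coerce_timeout(ClientTimeout(connect=5), default))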
____________________ TestAsyncClient.test_user_id_required _____________________ [aiohttp ClientSession._request signature and body repeated verbatim; elided, see the first failure above; the traceback resumes at the failing frame]
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = Connection async def send(self, conn: "Connection") -> "ClientResponse": # Specify request target: # - CONNECT request must send authority form URI # - not CONNECT proxy must send absolute form URI # - most common is origin form URI if self.method == hdrs.METH_CONNECT: connect_host = self.url.raw_host assert connect_host is not None if helpers.is_ipv6_address(connect_host): connect_host = f"[{connect_host}]" path = f"{connect_host}:{self.url.port}" elif self.proxy and not self.is_ssl(): path = str(self.url) else: path = self.url.raw_path if self.url.raw_query_string: path += "?" + self.url.raw_query_string protocol = conn.protocol assert protocol is not None writer = StreamWriter( protocol, self.loop, on_chunk_sent=functools.partial( self._on_chunk_request_sent, self.method, self.url ), on_headers_sent=functools.partial( self._on_headers_request_sent, self.method, self.url ), ) if self.compress: writer.enable_compression(self.compress) if self.chunked is not None: writer.enable_chunking() # set default content-type if ( self.method in self.POST_METHODS and hdrs.CONTENT_TYPE not in self.skip_auto_headers and hdrs.CONTENT_TYPE not in self.headers ): self.headers[hdrs.CONTENT_TYPE] = "application/octet-stream" # set the connection header connection = self.headers.get(hdrs.CONNECTION) if not connection: if self.keep_alive(): if self.version == HttpVersion10: connection = "keep-alive" else: if self.version == HttpVersion11: connection = "close" if connection is not None: self.headers[hdrs.CONNECTION] = connection # status + headers status_line = "{0} {1} HTTP/{2[0]}.{2[1]}".format( self.method, path, self.version ) > await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = status_line = 'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1' headers = async def write_headers( self, status_line: str, headers: "CIMultiDict[str]" ) -> None: """Write request/response status and headers.""" if self._on_headers_sent is not None: await self._on_headers_sent(headers) # status + headers buf = _serialize_headers(status_line, headers) > self._write(buf) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.4\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_user_id_required(self): > self._test_error(401, "USER_ID_REQUIRED", AuthenticationError) tests/webservice_test.py:244: _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:269: in _test_error self.run_client(self.client.country("1.2.3.18")) tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [aiohttp ClientSession._request source repeated verbatim; elided, see the first failure above]
> raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
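Editor's note: the ConnectionResetError chained into ClientOSError above originates in the _write guard quoted in the first traceback, where aiohttp refuses to write once the transport is missing or reports it is closing. A stripped-down illustration of that check (plain Python stand-ins, not aiohttp's actual classes):

class ClosingTransport:
    """Stand-in for an asyncio transport that is already shutting down."""
    def is_closing(self) -> bool:
        return True
    def write(self, chunk: bytes) -> None:
        pass

def write_chunk(transport, chunk: bytes) -> None:
    # Mirrors the guard in aiohttp/http_writer.py shown in the traceback:
    # writing to a missing or closing transport raises ConnectionResetError,
    # which ClientSession._request then rewraps as ClientOSError.
    if transport is None or transport.is_closing():
        raise ConnectionResetError("Cannot write to closing transport")
    transport.write(chunk)

try:
    write_chunk(ClosingTransport(), b"GET / HTTP/1.1\r\n\r\n")
except ConnectionResetError as exc:
    print(exc)  # -> Cannot write to closing transport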
_____________________ TestAsyncClient.test_user_id_unkown ______________________ [aiohttp ClientSession._request signature and body repeated verbatim; elided, see the first failure above; the traceback resumes at the failing frame]
> resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: [ClientRequest.send, StreamWriter.write_headers and _write frames repeated verbatim; elided, identical to those shown above] E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_user_id_unkown(self): > self._test_error(401, "USER_ID_UNKNOWN", AuthenticationError) tests/webservice_test.py:252: _ _ _ _ _ _ _ _ _ _ _ _ _
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:269: in _test_error self.run_client(self.client.country("1.2.3.18")) tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [aiohttp ClientSession._request source repeated verbatim; elided, see the first failure above]
> raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
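Editor's note: the recurring test frames above (run_client calling self._loop.run_until_complete(v) at tests/webservice_test.py:386) show how the suite drives the async client from synchronous unittest methods. A minimal sketch of that pattern; MyAsyncClient and the test names here are hypothetical stand-ins, not the geoip2 API:

import asyncio
import unittest

class MyAsyncClient:
    # Hypothetical stand-in for an async web-service client.
    async def country(self, ip: str) -> str:
        await asyncio.sleep(0)  # pretend to do network I/O
        return f"country-for-{ip}"

class TestAsyncClientPattern(unittest.TestCase):
    def setUp(self) -> None:
        self._loop = asyncio.new_event_loop()
        self.client = MyAsyncClient()

    def tearDown(self) -> None:
        self._loop.close()

    def run_client(self, v):
        # Same shape as the run_client helper in the traceback: block on
        # the coroutine so a synchronous test method can assert on it.
        return self._loop.run_until_complete(v)

    def test_country(self) -> None:
        self.assertEqual(self.run_client(self.client.country("1.2.3.4")),
                         "country-for-1.2.3.4")

if __name__ == "__main__":
    unittest.main()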
____________________ TestAsyncClient.test_weird_body_error _____________________ [aiohttp ClientSession._request signature and body repeated verbatim; elided, see the first failure above; the traceback resumes at the failing frame]
> resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: [ClientRequest.send, StreamWriter.write_headers and _write frames repeated verbatim; elided, identical to those shown above, here with status_line = 'GET /geoip/v2.1/country/1.2.3.8 HTTP/1.1'] E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_weird_body_error(self): httpretty.register_uri( httpretty.GET, self.base_uri + "country/" + "1.2.3.8", body='{"wierd": 42}', status=400,
content_type=self._content_type("country"), ) with self.assertRaisesRegex( HTTPError, "Response contains JSON but it does not " "specify code or error keys", ): > self.run_client(self.client.country("1.2.3.8")) tests/webservice_test.py:177: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:650: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [aiohttp ClientSession._request source repeated verbatim; elided, see the first failure above]
> raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
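Editor's note: the httpretty.register_uri call visible in test_weird_body_error above is httpretty's standard way of stubbing an HTTP endpoint at the socket layer. A self-contained sketch of the same call pattern; it uses the synchronous requests library instead of the async geoip2 client (an illustrative swap, since httpretty patches blocking sockets cleanly, which is exactly what the aiohttp tests in this log trip over), and geoip.example.test is a hypothetical host:

import httpretty
import requests

httpretty.enable(allow_net_connect=False)  # patch the socket layer
try:
    # Same registration shape as in test_weird_body_error above; the
    # content type is simplified to plain application/json here.
    httpretty.register_uri(
        httpretty.GET,
        "http://geoip.example.test/geoip/v2.1/country/1.2.3.8",
        body='{"wierd": 42}',
        status=400,
        content_type="application/json",
    )
    response = requests.get(
        "http://geoip.example.test/geoip/v2.1/country/1.2.3.8"
    )
    print(response.status_code, response.json())  # -> 400 {'wierd': 42}
finally:
    httpretty.disable()
    httpretty.reset()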
=============================== warnings summary =============================== ../../../../lib/python3/site-packages/mocket/mocket.py:46 /usr/lib/python3/site-packages/mocket/mocket.py:46: DeprecationWarning: 'urllib3.contrib.pyopenssl' module is deprecated and will be removed in a future release of urllib3 2.x. Read more in this issue: https://github.com/urllib3/urllib3/issues/2680 from urllib3.contrib.pyopenssl import extract_from_urllib3, inject_into_urllib3 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:121 /usr/lib/python3/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning) ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 /usr/lib/python3/site-packages/pkg_resources/__init__.py:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('paste')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages declare_namespace(pkg) ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 /usr/lib/python3/site-packages/pkg_resources/__init__.py:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('sphinxcontrib')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages declare_namespace(pkg) ../../../../lib/python3/site-packages/pkg_resources/__init__.py:2870 /usr/lib/python3/site-packages/pkg_resources/__init__.py:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('zope')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages declare_namespace(pkg) -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html =========================== short test summary info ============================ FAILED tests/webservice_test.py::TestAsyncClient::test_200_error - aiohttp.cl... FAILED tests/webservice_test.py::TestAsyncClient::test_300_error - aiohttp.cl... FAILED tests/webservice_test.py::TestAsyncClient::test_500_error - aiohttp.cl... FAILED tests/webservice_test.py::TestAsyncClient::test_account_id_required - ... FAILED tests/webservice_test.py::TestAsyncClient::test_account_id_unkown - ai... FAILED tests/webservice_test.py::TestAsyncClient::test_auth_invalid - aiohttp... FAILED tests/webservice_test.py::TestAsyncClient::test_bad_body_error - aioht... FAILED tests/webservice_test.py::TestAsyncClient::test_city_ok - aiohttp.clie... FAILED tests/webservice_test.py::TestAsyncClient::test_country_ok - aiohttp.c... FAILED tests/webservice_test.py::TestAsyncClient::test_insights_ok - aiohttp.... FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_not_found FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_required - ... FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_reserved - ... FAILED tests/webservice_test.py::TestAsyncClient::test_license_key_required FAILED tests/webservice_test.py::TestAsyncClient::test_me - aiohttp.client_ex... FAILED tests/webservice_test.py::TestAsyncClient::test_no_body_error - aiohtt... FAILED tests/webservice_test.py::TestAsyncClient::test_out_of_queries_error FAILED tests/webservice_test.py::TestAsyncClient::test_permission_required - ... FAILED tests/webservice_test.py::TestAsyncClient::test_request - aiohttp.clie... FAILED tests/webservice_test.py::TestAsyncClient::test_unknown_error - aiohtt... FAILED tests/webservice_test.py::TestAsyncClient::test_user_id_required - aio... FAILED tests/webservice_test.py::TestAsyncClient::test_user_id_unkown - aioht... FAILED tests/webservice_test.py::TestAsyncClient::test_weird_body_error - aio... ================== 23 failed, 38 passed, 10 warnings in 2.07s ================== error: Bad exit status from /usr/src/tmp/rpm-tmp.52279 (%check) RPM build errors: Bad exit status from /usr/src/tmp/rpm-tmp.52279 (%check) Command exited with non-zero status 1 5.13user 0.40system 0:05.62elapsed 98%CPU (0avgtext+0avgdata 59224maxresident)k 0inputs+0outputs (0major+99436minor)pagefaults 0swaps hsh-rebuild: rebuild of `python3-module-GeoIP2-4.7.0-alt1.src.rpm' failed. Command exited with non-zero status 1 3.19user 2.32system 0:27.35elapsed 20%CPU (0avgtext+0avgdata 109676maxresident)k 448inputs+0outputs (0major+429207minor)pagefaults 0swaps
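Editor's note: all 23 failures share the same proximate cause, visible in every traceback above: the httpretty-patched socket presents aiohttp with a transport that already reports is_closing(), so each TestAsyncClient request aborts with "Cannot write to closing transport" before the stubbed response is ever read. One plausible local mitigation while the aiohttp/httpretty incompatibility is investigated (my suggestion, not something this log prescribes) is to deselect the async class during %check, for example via pytest's Python entry point:

import sys
import pytest

# Run the synchronous tests but skip the aiohttp-based class that cannot
# work against httpretty's socket mock on this toolchain (Python 3.11,
# aiohttp 3.8.4); -k matches the class name in the node ids printed in
# the short test summary above.
sys.exit(pytest.main(["-k", "not TestAsyncClient", "tests/webservice_test.py"]))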