<86>Nov 4 07:00:50 userdel[1454789]: delete user 'rooter'
<86>Nov 4 07:00:50 userdel[1454789]: removed group 'rooter' owned by 'rooter'
<86>Nov 4 07:00:50 userdel[1454789]: removed shadow group 'rooter' owned by 'rooter'
<86>Nov 4 07:00:50 groupadd[1454828]: group added to /etc/group: name=rooter, GID=1835
<86>Nov 4 07:00:50 groupadd[1454828]: group added to /etc/gshadow: name=rooter
<86>Nov 4 07:00:50 groupadd[1454828]: new group: name=rooter, GID=1835
<86>Nov 4 07:00:50 useradd[1454872]: new user: name=rooter, UID=1835, GID=1835, home=/root, shell=/bin/bash, from=none
<86>Nov 4 07:00:50 userdel[1454945]: delete user 'builder'
<86>Nov 4 07:00:50 userdel[1454945]: removed group 'builder' owned by 'builder'
<86>Nov 4 07:00:50 userdel[1454945]: removed shadow group 'builder' owned by 'builder'
<86>Nov 4 07:00:50 groupadd[1454997]: group added to /etc/group: name=builder, GID=1836
<86>Nov 4 07:00:50 groupadd[1454997]: group added to /etc/gshadow: name=builder
<86>Nov 4 07:00:50 groupadd[1454997]: new group: name=builder, GID=1836
<86>Nov 4 07:00:50 useradd[1455050]: new user: name=builder, UID=1836, GID=1836, home=/usr/src, shell=/bin/bash, from=none
<13>Nov 4 07:00:53 rpmi: libmpdec3-2.5.1-alt3 sisyphus+314490.500.5.1 1675432033 installed
<13>Nov 4 07:00:53 rpmi: libgdbm-1.8.3-alt10 sisyphus+278100.1600.1.1 1626059138 installed
<13>Nov 4 07:00:53 rpmi: libexpat-2.5.0-alt1 sisyphus+309227.100.1.1 1667075766 installed
<13>Nov 4 07:00:53 rpmi: libb2-0.98.1-alt1_1 sisyphus+291614.100.1.1 1638962878 installed
<13>Nov 4 07:00:53 rpmi: libp11-kit-1:0.25.2-alt1 sisyphus+333434.100.1.1 1698941048 installed
<13>Nov 4 07:00:53 rpmi: libtasn1-4.19.0-alt3 sisyphus+327816.100.1.1 1692802618 installed
<13>Nov 4 07:00:53 rpmi: rpm-macros-alternatives-0.5.2-alt2 sisyphus+315270.200.2.1 1676457367 installed
<13>Nov 4 07:00:53 rpmi: alternatives-0.5.2-alt2 sisyphus+315270.200.2.1 1676457367 installed
<13>Nov 4 07:00:53 rpmi: ca-certificates-2023.07.31-alt1 sisyphus+326137.200.1.1 1690809798 installed
<13>Nov 4 07:00:53 rpmi: ca-trust-0.1.4-alt1 sisyphus+308690.100.1.1 1666182992 installed
<13>Nov 4 07:00:53 rpmi: p11-kit-trust-1:0.25.2-alt1 sisyphus+333434.100.1.1 1698941048 installed
<13>Nov 4 07:00:53 rpmi: libcrypto3-3.1.4-alt1 sisyphus+332632.100.2.1 1698241305 installed
<13>Nov 4 07:00:53 rpmi: libssl3-3.1.4-alt1 sisyphus+332632.100.2.1 1698241305 installed
<13>Nov 4 07:00:53 rpmi: python3-3.11.6-alt1 sisyphus+331220.100.1.1 1696668078 installed
<13>Nov 4 07:00:54 rpmi: python3-base-3.11.6-alt1 sisyphus+331220.100.1.1 1696668078 installed
<13>Nov 4 07:00:54 rpmi: python3-module-py3dephell-0.1.0-alt2 sisyphus+328191.600.5.1 1693609196 installed
<13>Nov 4 07:00:54 rpmi: tests-for-installed-python3-pkgs-0.1.24-alt2 sisyphus+331059.100.4.1 1696870033 installed
<13>Nov 4 07:00:54 rpmi: rpm-build-python3-0.1.24-alt2 sisyphus+331059.100.4.1 1696870033 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
<13>Nov 4 07:00:56 rpmi: python3-module-docutils-0.20.1-alt1 sisyphus+323438.100.1.1 1687370718 installed
<13>Nov 4 07:00:56 rpmi: python3-module-attrs-23.1.0-alt1 sisyphus+321859.100.1.1 1685027490 installed
<13>Nov 4 07:00:56 rpmi: python3-module-idna-3.4-alt1 sisyphus+307942.100.1.1 1665051373 installed
<13>Nov 4 07:00:56 rpmi: python3-module-zope.interface-6.0-alt1 sisyphus+326348.600.4.1 1691597456 installed
<13>Nov 4 07:00:56 rpmi: python3-module-pkg_resources-1:68.2.2-alt1 sisyphus+329927.100.2.1 1695208272 installed
<13>Nov 4 07:00:56 rpmi: python3-module-six-1.16.0-alt2 sisyphus+324249.100.1.1 1688484676 installed
<13>Nov 4 07:00:56 rpmi: python3-module-packaging-23.2-alt1 sisyphus+330805.100.2.1 1696324248 installed
<13>Nov 4 07:00:56 rpmi: python3-module-greenlet-2.0.2-alt1 sisyphus+326609.100.1.1 1691406111 installed
<13>Nov 4 07:00:56 rpmi: libtcl-8.6.13-alt1 sisyphus+310696.100.1.1 1669548256 installed
<13>Nov 4 07:00:56 rpmi: python3-module-multidict-6.0.4-alt1 sisyphus+311250.32340.176.1 1685738354 installed
<13>Nov 4 07:00:56 rpmi: python3-module-frozenlist-1.3.3-alt1 sisyphus+311250.30540.176.1 1685737850 installed
<13>Nov 4 07:00:56 rpmi: python3-module-charset-normalizer-2.1.1-alt1 sisyphus+311047.100.1.1 1669992940 installed
<13>Nov 4 07:00:56 rpmi: python3-module-Pygments-2.16.1-alt1 sisyphus+326610.100.1.1 1691406169 installed
<13>Nov 4 07:00:56 rpmi: libcares-1.19.1-alt1 sisyphus+324326.100.1.1 1688585817 installed
<13>Nov 4 07:00:56 rpmi: python3-module-alabaster-0.7.6-alt4 sisyphus+281697.200.1.1 1627919931 installed
<13>Nov 4 07:00:56 rpmi: python3-module-aiosignal-1.3.1-alt1 sisyphus+314057.100.1.1 1674561191 installed
<13>Nov 4 07:00:56 rpmi: python3-module-yarl-1.9.2-alt2 sisyphus+326605.100.1.1 1691404440 installed
<13>Nov 4 07:00:56 rpmi: tcl-8.6.13-alt1 sisyphus+310696.100.1.1 1669548256 installed
<13>Nov 4 07:00:56 rpmi: python3-module-idna_ssl-1.1.0-alt2 sisyphus+272418.100.1.1 1621876529 installed
<13>Nov 4 07:00:56 rpmi: python3-module-hyperlink-21.0.0-alt1.1 sisyphus+304836.100.1.1 1659710964 installed
<13>Nov 4 07:00:56 rpmi: python3-module-outcome-1.3.0-alt1 sisyphus+332382.100.1.1 1697882349 installed
<13>Nov 4 07:00:56 rpmi: python3-module-snowballstemmer-2.2.0-alt1 sisyphus+319215.100.1.1 1682346633 installed
<13>Nov 4 07:00:56 rpmi: python3-module-pluggy-1.3.0-alt1 sisyphus+330478.100.1.1 1695823310 installed
<13>Nov 4 07:00:56 rpmi: python3-module-markupsafe-1:2.1.3-alt1 sisyphus+323659.100.1.1 1687595160 installed
<13>Nov 4 07:00:56 rpmi: python3-module-jinja2-3.1.2-alt1 sisyphus+303664.100.1.1 1657809843 installed
<13>Nov 4 07:00:56 rpmi: python3-module-iniconfig-2.0.0-alt1 sisyphus+314076.200.3.1 1674737275 installed
<13>Nov 4 07:00:56 rpmi: python3-module-imagesize-1.4.1-alt1 sisyphus+318084.100.1.1 1680697673 installed
<13>Nov 4 07:00:56 rpmi: python3-module-httptools-0.1.1-alt1 sisyphus+311250.31100.176.1 1685737908 installed
<13>Nov 4 07:00:57 rpmi: python3-module-babel-1:2.12.1-alt1 sisyphus+317409.100.1.1 1679678193 installed
<13>Nov 4 07:00:57 rpmi: python3-module-click-8.1.7-alt1 sisyphus+327424.100.2.1 1695395098 installed
<13>Nov 4 07:00:57 rpmi: python3-module-incremental-22.10.0-alt1 sisyphus+312706.100.1.1 1672404273 installed
<13>Nov 4 07:00:57 rpmi: python3-module-constantly-15.1.0-alt6 sisyphus+284854.100.1.1 1631108193 installed
<13>Nov 4 07:00:57 rpmi: python3-module-typing_extensions-4.8.0-alt1 sisyphus+332392.100.1.1 1697893352 installed
<13>Nov 4 07:00:57 rpmi: python3-module-pygobject-2.28.6-alt13 sisyphus+311250.56700.178.1 1685789723 installed
<13>Nov 4 07:00:57 rpmi: python3-module-appdirs-1.4.4-alt1 sisyphus+267613.300.2.1 1620039159 installed
<13>Nov 4 07:00:57 rpmi: python3-module-certifi-2023.5.7-alt1 sisyphus+322622.100.1.1 1686217855 installed
<13>Nov 4 07:00:57 rpmi: python3-module-z3c-3.0.0-alt4 sisyphus+284857.200.1.1 1631109149 installed
<13>Nov 4 07:00:57 rpmi: python3-module-zc-1.0.0-alt7 sisyphus+284857.100.1.1 1631109117 installed
<13>Nov 4 07:00:57 rpmi: python3-module-zope-3.3.0-alt9 sisyphus+281937.200.4.1 1628175910 installed
<13>Nov 4 07:00:57 rpmi: python3-module-zope.event-5.0-alt1.1 sisyphus+325755.140.2.1 1690991538 installed
<13>Nov 4 07:00:57 rpmi: python3-module-pycparser-2.21-alt1.1 sisyphus+309935.7300.4.1 1668527005 installed
<13>Nov 4 07:00:57 rpmi: python3-module-cffi-1.16.0-alt1 sisyphus+330935.100.2.1 1696495706 installed
<13>Nov 4 07:00:57 rpmi: python3-module-cryptography-41.0.5-alt1 sisyphus+333239.200.1.1 1698754471 installed
<13>Nov 4 07:00:57 rpmi: python3-module-openssl-23.2.0-alt1 sisyphus+326014.100.1.1 1690659362 installed
<13>Nov 4 07:00:57 rpmi: python3-module-urllib3-2:2.0.7-alt1 sisyphus+332146.100.1.1 1697707902 installed
<13>Nov 4 07:00:57 rpmi: python3-module-requests-2.31.0-alt1 sisyphus+321663.100.2.1 1684917021 installed
<13>Nov 4 07:00:57 rpmi: python3-module-pycares-4.1.2-alt1 sisyphus+311250.45300.178.1 1685783642 installed
<13>Nov 4 07:00:57 rpmi: python3-module-astor-0.8.1-alt1.1 sisyphus+315877.100.1.1 1677481862 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sortedcontainers-2.4.0-alt1 sisyphus+272042.100.1.1 1621262424 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sniffio-1.2.0-alt1 sisyphus+295017.1600.2.1 1644498020 installed
<13>Nov 4 07:00:57 rpmi: python3-module-exceptiongroup-1.1.3-alt1 sisyphus+327210.100.2.1 1692096915 installed
<13>Nov 4 07:00:57 rpmi: python3-module-async_generator-1.10-alt3 sisyphus+319053.1600.6.1 1682668582 installed
<13>Nov 4 07:00:57 rpmi: python3-module-trio-0.22.0-alt3 sisyphus+319053.2100.6.1 1682668663 installed
<13>Nov 4 07:00:57 rpmi: python3-module-dns-1:2.2.0-alt2 sisyphus+320065.60.1.1 1683366881 installed
<13>Nov 4 07:00:57 rpmi: python3-module-async-timeout-4.0.3-alt1 sisyphus+329482.100.1.1 1694612611 installed
<13>Nov 4 07:00:57 rpmi: python3-module-openid-3.2.0-alt1 sisyphus+278049.100.2.1 1625998936 installed
<13>Nov 4 07:00:57 rpmi: python3-module-Cheetah-3.3.3-alt1 sisyphus+332604.100.1.1 1698150527 installed
<13>Nov 4 07:00:57 rpmi: python3-module-paste-3.7.1-alt1 sisyphus+332187.100.1.1 1697730831 installed
<13>Nov 4 07:00:57 rpmi: python3-module-PasteDeploy-1:3.0.1-alt1 sisyphus+308592.100.1.1 1666070463 installed
<13>Nov 4 07:00:57 rpmi: python3-module-PasteScript-1:2.0.2-alt2 sisyphus+272468.100.1.1 1621939313 installed
<13>Nov 4 07:00:57 rpmi: python-sphinx-objects.inv-1:2.3.13.20231027-alt1 sisyphus+333179.100.1.1 1698708965 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-applehelp-1.0.7-alt1 sisyphus+329314.100.1.1 1694447248 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-devhelp-1.0.5-alt1 sisyphus+329315.100.1.3 1694450414 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-jquery-4.1-alt2 sisyphus+317619.100.1.1 1680000409 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-jsmath-1.0.1-alt1 sisyphus+276004.100.1.1 1624811634 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-htmlhelp-2.0.0-alt2 sisyphus+298571.100.1.1 1650103344 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-serializinghtml-1.1.5-alt2 sisyphus+298572.100.1.1 1650104574 installed
<13>Nov 4 07:00:57 rpmi: python3-module-sphinxcontrib-qthelp-1.0.6-alt1 sisyphus+329316.100.1.3 1694450732 installed
<13>Nov 4 07:00:58 rpmi: python3-module-sphinx-1:7.0.1-alt1 sisyphus+323436.100.1.1 1687367037 installed
<13>Nov 4 07:00:58 rpmi: libuv-1.46.0-alt1 sisyphus+326001.100.1.1 1690655255 installed
<13>Nov 4 07:00:58 rpmi: libmaxminddb-1.7.1-alt1 sisyphus+310839.100.1.1 1669722011 installed
<13>Nov 4 07:00:58 rpmi: openldap-common-2.6.6-alt1 sisyphus+330946.100.2.1 1696432854 installed
<13>Nov 4 07:00:58 rpmi: libverto-0.3.2-alt1_1 sisyphus+321176.2200.10.2 1684806164 installed
<13>Nov 4 07:00:58 rpmi: liblmdb-0.9.31-alt1 sisyphus+330946.40.2.1 1696431544 installed
<13>Nov 4 07:00:58 rpmi: libkeyutils-1.6.3-alt1 sisyphus+266061.100.1.1 1612919567 installed
<13>Nov 4 07:00:58 rpmi: libusb-1.0.26-alt2 sisyphus+305525.100.1.1 1660924428 installed
<13>Nov 4 07:00:58 rpmi: libhidapi-0.12.0-alt1_1 sisyphus+303213.100.1.1 1657034193 installed
<13>Nov 4 07:00:58 rpmi: python3-module-cython-hidapi-0.14.0-alt1 sisyphus+326006.100.1.1 1690656874 installed
<13>Nov 4 07:00:58 rpmi: python3-module-serial-3.5-alt2 sisyphus+281995.100.1.1 1628172783 installed
<13>Nov 4 07:00:58 rpmi: libev4-4.33-alt2 sisyphus+286828.100.2.3 1634005210 installed
<13>Nov 4 07:00:58 rpmi: python3-module-gevent-22.10.2-alt1 sisyphus+311250.61340.178.1 1685796502 installed
<13>Nov 4 07:00:58 rpmi: libcom_err-1.46.4.0.5.4cda-alt1 sisyphus+283826.100.1.1 1629975361 installed
<86>Nov 4 07:00:58 groupadd[1505149]: group added to /etc/group: name=_keytab, GID=999
<86>Nov 4 07:00:58 groupadd[1505149]: group added to /etc/gshadow: name=_keytab
<86>Nov 4 07:00:58 groupadd[1505149]: new group: name=_keytab, GID=999
<13>Nov 4 07:00:58 rpmi: libkrb5-1.21.2-alt1 sisyphus+327265.100.1.1 1692185512 installed
<86>Nov 4 07:00:58 groupadd[1505252]: group added to /etc/group: name=sasl, GID=998
<86>Nov 4 07:00:58 groupadd[1505252]: group added to /etc/gshadow: name=sasl
<86>Nov 4 07:00:58 groupadd[1505252]: new group: name=sasl, GID=998
<13>Nov 4 07:00:58 rpmi: libsasl2-3-2.1.27-alt2.2 sisyphus+324359.6000.12.1 1689392231 installed
<13>Nov 4 07:00:58 rpmi: libldap2-2.6.6-alt1 sisyphus+330946.100.2.1 1696432858 installed
<13>Nov 4 07:00:58 rpmi: libpq5-16.0-alt2 sisyphus+331266.100.1.1 1696838519 installed
<13>Nov 4 07:00:58 rpmi: python3-module-psycopg2-2.9.5-alt1 sisyphus+311250.16300.175.1 1685633601 installed
<13>Nov 4 07:00:58 rpmi: python3-module-eventlet-0.33.3-alt3 sisyphus+331661.200.2.1 1697210495 installed
<13>Nov 4 07:00:58 rpmi: libpng16-1.6.40-alt1 sisyphus+323732.100.1.1 1687771859 installed
<13>Nov 4 07:00:58 rpmi: libbrotlicommon-1.1.0-alt1 sisyphus+328501.100.1.1 1693598420 installed
<13>Nov 4 07:00:58 rpmi: libbrotlidec-1.1.0-alt1 sisyphus+328501.100.1.1 1693598420 installed
<13>Nov 4 07:00:58 rpmi: libgraphite2-1.3.14-alt2.1 sisyphus+279571.100.1.2 1626605157 installed
<13>Nov 4 07:00:58 rpmi: libharfbuzz-8.2.2-alt1 sisyphus+332039.100.1.1 1697618710 installed
<13>Nov 4 07:00:58 rpmi: libfreetype-2.13.2-alt1 sisyphus+328677.100.1.1 1693834346 installed
<13>Nov 4 07:00:58 rpmi: libfontconfig1-2.14.2-alt8 sisyphus+328444.100.1.1 1693553407 installed
<13>Nov 4 07:00:58 rpmi: libXdmcp-1.1.4-alt1 sisyphus+311188.1000.1.1 1670233860 installed
<13>Nov 4 07:00:58 rpmi: libXau-1.0.11-alt1 sisyphus+311428.100.1.1 1670577440 installed
<13>Nov 4 07:00:58 rpmi: libxcb-1.16-alt1 sisyphus+327325.200.1.1 1692276267 installed
<13>Nov 4 07:00:58 rpmi: libX11-locales-3:1.8.7-alt1 sisyphus+330921.200.1.1 1696400315 installed
<13>Nov 4 07:00:58 rpmi: libX11-3:1.8.7-alt1 sisyphus+330921.200.1.1 1696400319 installed
<13>Nov 4 07:00:58 rpmi: libXrender-0.9.11-alt1 sisyphus+308841.100.1.1 1666436131 installed
<13>Nov 4 07:00:58 rpmi: libXft-2.3.8-alt1 sisyphus+331490.400.1.1 1697023273 installed
<13>Nov 4 07:00:58 rpmi: libtk-8.6.13-alt1 sisyphus+310696.200.1.1 1669548528 installed
<13>Nov 4 07:00:58 rpmi: tk-8.6.13-alt1 sisyphus+310696.200.1.1 1669548528 installed
<13>Nov 4 07:00:58 rpmi: tcl-tix-8.4.3-alt4 sisyphus+277292.300.2.1 1625442551 installed
<13>Nov 4 07:00:58 rpmi: python3-modules-tkinter-3.11.6-alt1 sisyphus+331220.100.1.1 1696668078 installed
<13>Nov 4 07:00:58 rpmi: python3-module-automat-22.10.0-alt1 sisyphus+322927.200.2.1 1686736914 installed
<13>Nov 4 07:00:58 rpmi: python3-module-twisted-logger-22.10.0-alt2 sisyphus+325754.100.1.1 1690535804 installed
<13>Nov 4 07:00:58 rpmi: python3-module-twisted-core-22.10.0-alt2 sisyphus+325754.100.1.1 1690535804 installed
<13>Nov 4 07:00:59 rpmi: python3-module-twisted-names-22.10.0-alt2 sisyphus+325754.100.1.1 1690535804 installed
<13>Nov 4 07:00:59 rpmi: python3-module-tornado-6.3.3-alt1.1 sisyphus+329433.100.1.1 1694583816 installed
<13>Nov 4 07:00:59 rpmi: python3-module-gunicorn-20.1.0-alt2 sisyphus+297766.100.1.1 1649054912 installed
<13>Nov 4 07:00:59 rpmi: python3-module-aiohttp-3.8.5-alt1 sisyphus+329480.100.1.1 1694611987 installed
<13>Nov 4 07:00:59 rpmi: python3-module-MaxMindDB-2.4.0-alt1 sisyphus+325045.100.1.1 1689589265 installed
<13>Nov 4 07:00:59 rpmi: python3-module-mocket-3.12.0-alt1 sisyphus+333156.100.1.1 1698682615 installed
<13>Nov 4 07:00:59 rpmi: python3-module-pytest-7.4.3-alt1 sisyphus+332685.100.2.1 1698314627 installed
<13>Nov 4 07:00:59 rpmi: python3-module-http-parser-0.9.0-alt2 sisyphus+311250.40000.176.1 1685741115 installed
<13>Nov 4 07:00:59 rpmi: python3-module-decorator-4.4.2-alt2 sisyphus+280713.100.1.1 1627266028 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
Building target platforms: i586
Building for target i586
Wrote: /usr/src/in/nosrpm/python3-module-GeoIP2-4.7.0-alt1.nosrc.rpm (w1.gzdio)
<13>Nov 4 07:01:00 rpmi: libpython3-3.11.6-alt1 sisyphus+331220.100.1.1 1696668078 installed
<13>Nov 4 07:01:00 rpmi: libncurses6-6.3.20220618-alt4 sisyphus+328055.40.2.1 1693213017 installed
<13>Nov 4 07:01:00 rpmi: libtinfo-devel-6.3.20220618-alt4 sisyphus+328055.40.2.1 1693213017 installed
<13>Nov 4 07:01:00 rpmi: libncurses-devel-6.3.20220618-alt4 sisyphus+328055.40.2.1 1693213017 installed
<13>Nov 4 07:01:01 rpmi: python3-dev-3.11.6-alt1 sisyphus+331220.100.1.1 1696668078 installed
<13>Nov 4 07:01:01 rpmi: python3-module-setuptools-1:68.2.2-alt1 sisyphus+329927.100.2.1 1695208272 installed
WARNING: %python3_build is deprecated and will be removed in future, please use %pyproject_build instead
WARNING: %python3_install is deprecated and will be removed in future, please use %pyproject_install instead
Installing python3-module-GeoIP2-4.7.0.src.rpm
Building target platforms: i586
Building for target i586
Executing(%prep): /bin/sh -e /usr/src/tmp/rpm-tmp.98554
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ rm -rf python3-module-GeoIP2-4.7.0
+ echo 'Source #0 (python3-module-GeoIP2-4.7.0.tar):'
Source #0 (python3-module-GeoIP2-4.7.0.tar):
+ /bin/tar -xf /usr/src/RPM/SOURCES/python3-module-GeoIP2-4.7.0.tar
+ cd python3-module-GeoIP2-4.7.0
+ /bin/chmod -c -Rf u+rwX,go-w .
+ exit 0
Executing(%build): /bin/sh -e /usr/src/tmp/rpm-tmp.98554
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ cd python3-module-GeoIP2-4.7.0
+ CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CFLAGS
+ CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CXXFLAGS
+ FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export FFLAGS
+ /usr/bin/python3 setup.py build
running build
running build_py
creating build
creating build/lib
creating build/lib/geoip2
copying geoip2/webservice.py -> build/lib/geoip2
copying geoip2/types.py -> build/lib/geoip2
copying geoip2/records.py -> build/lib/geoip2
copying geoip2/models.py -> build/lib/geoip2
copying geoip2/mixins.py -> build/lib/geoip2
copying geoip2/errors.py -> build/lib/geoip2
copying geoip2/database.py -> build/lib/geoip2
copying geoip2/__init__.py -> build/lib/geoip2
running egg_info
creating geoip2.egg-info
writing geoip2.egg-info/PKG-INFO
writing dependency_links to geoip2.egg-info/dependency_links.txt
writing requirements to geoip2.egg-info/requires.txt
writing top-level names to geoip2.egg-info/top_level.txt
writing manifest file 'geoip2.egg-info/SOURCES.txt'
reading manifest file 'geoip2.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'tests/data/test-data/*.mmdb'
warning: no directories found matching 'docs/html'
adding license file 'LICENSE'
writing manifest file 'geoip2.egg-info/SOURCES.txt'
copying geoip2/py.typed -> build/lib/geoip2
+ sphinx-build-3 -b html docs html
Running Sphinx v7.0.1
/usr/lib/python3/site-packages/sphinxcontrib/htmlhelp/__init__.py:26: RemovedInSphinx80Warning: The alias 'sphinx.util.progress_message' is deprecated, use 'sphinx.util.display.progress_message' instead. Check CHANGES for Sphinx API modifications.
  from sphinx.util import progress_message
making output directory... done
WARNING: html_static_path entry '_static' does not exist
WARNING: The pre-Sphinx 1.0 'intersphinx_mapping' format is deprecated and will be removed in Sphinx 8. Update to the current format as described in the documentation. Hint: "intersphinx_mapping = {'<name>': ('http://docs.python.org/', None)}". https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html#confval-intersphinx_mapping
loading intersphinx inventory from http://docs.python.org/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'http://docs.python.org/objects.inv' not fetchable due to <...>: HTTPConnectionPool(host='docs.python.org', port=80): Max retries exceeded with url: /objects.inv (Caused by NameResolutionError("<...>: Failed to resolve 'docs.python.org' ([Errno -3] Temporary failure in name resolution)"))
building [mo]: targets for 0 po files that are out of date
writing output...
building [html]: targets for 1 source files that are out of date
updating environment: [new config] 1 added, 0 changed, 0 removed
reading sources...
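The two WARNING lines above point at docs/conf.py: '_static' is listed in html_static_path but absent from the tarball, and intersphinx_mapping still uses the pre-1.0 unnamed form. The second is mechanical to fix, since current Sphinx only asks that each inventory be keyed by a name. A minimal sketch of the updated setting, based solely on the hint in the warning (the key name "python" and the https URL are our choices, not something this log confirms):

    # docs/conf.py -- current-format intersphinx mapping: each inventory gets a name.
    # "python" is an assumed key; the URL follows the hint printed above.
    intersphinx_mapping = {
        "python": ("https://docs.python.org/3", None),
    }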
[100%] index
/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0/geoip2/records.py:docstring of geoip2.records.Traits.network:1: WARNING: duplicate object description of geoip2.records.Traits.network, other instance in index, use :noindex: for one of them
/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0/geoip2/errors.py:docstring of geoip2.errors.AddressNotFoundError.network:1: WARNING: duplicate object description of geoip2.errors.AddressNotFoundError.network, other instance in index, use :noindex: for one of them
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] index
generating indices... genindex py-modindex done
writing additional pages... search done
copying static files... done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded, 5 warnings.

The HTML pages are in html.
+ rm -rf html/.buildinfo html/.doctrees
+ exit 0
Executing(%install): /bin/sh -e /usr/src/tmp/rpm-tmp.20103
+ umask 022
+ /bin/mkdir -p /usr/src/RPM/BUILD
+ cd /usr/src/RPM/BUILD
+ /bin/chmod -Rf u+rwX -- /usr/src/tmp/python3-module-GeoIP2-buildroot
+ :
+ /bin/rm -rf -- /usr/src/tmp/python3-module-GeoIP2-buildroot
+ PATH=/usr/libexec/rpm-build:/usr/src/bin:/bin:/usr/bin:/usr/X11R6/bin:/usr/games
+ cd python3-module-GeoIP2-4.7.0
+ CFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CFLAGS
+ CXXFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export CXXFLAGS
+ FFLAGS='-pipe -frecord-gcc-switches -Wall -g -O2 -flto=auto -march=i586 -mtune=generic'
+ export FFLAGS
+ /usr/bin/python3 setup.py install --skip-build --root=/usr/src/tmp/python3-module-GeoIP2-buildroot --force
running install
/usr/lib/python3/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!

        ********************************************************************************
        Please avoid running ``setup.py`` directly.
        Instead, use pypa/build, pypa/installer or other
        standards-based tools.

        See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
        ********************************************************************************

!!
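The SetuptoolsDeprecationWarning above and the %python3_build/%python3_install macro warnings earlier in the log say the same thing from two directions: stop driving setup.py directly and build through PEP 517 tooling, which is what the %pyproject_* macros arrange. A minimal sketch of that path using pypa/build's Python API (the source path is this build's tree, but invoking it this way is our illustration, not what rpm-build-python3 actually executes):

    # Sketch: build a wheel via PEP 517 instead of `setup.py install`.
    # Assumes the pypa/build package is available in the build chroot.
    from build import ProjectBuilder

    builder = ProjectBuilder("/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0")
    # Build a wheel into a dist/ directory; returns the path to the built wheel,
    # which a standards-based installer can then unpack into the buildroot.
    wheel = builder.build("wheel", "/usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0/dist")
    print(wheel)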
self.initialize_options() running install_lib creating /usr/src/tmp/python3-module-GeoIP2-buildroot creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3 creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages creating /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/py.typed -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/__init__.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/database.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/errors.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/mixins.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/models.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/records.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/types.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 copying build/lib/geoip2/webservice.py -> /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2 byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py to __init__.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py to database.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py to errors.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py to mixins.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py to models.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py to records.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py to types.cpython-311.pyc byte-compiling /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py to webservice.cpython-311.pyc running install_egg_info running egg_info writing geoip2.egg-info/PKG-INFO writing dependency_links to geoip2.egg-info/dependency_links.txt writing requirements to geoip2.egg-info/requires.txt writing top-level names to geoip2.egg-info/top_level.txt reading manifest file 'geoip2.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching 'tests/data/test-data/*.mmdb' warning: no directories found matching 'docs/html' adding license file 'LICENSE' writing manifest file 'geoip2.egg-info/SOURCES.txt' Copying geoip2.egg-info to /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2-4.7.0-py3.11.egg-info running install_scripts + /usr/lib/rpm/brp-alt Cleaning files in /usr/src/tmp/python3-module-GeoIP2-buildroot (auto) Verifying and fixing files in /usr/src/tmp/python3-module-GeoIP2-buildroot (binconfig,pkgconfig,libtool,desktop,gnuconfig) Checking 
contents of files in /usr/src/tmp/python3-module-GeoIP2-buildroot/ (default) Compressing files in /usr/src/tmp/python3-module-GeoIP2-buildroot (auto) Adjusting library links in /usr/src/tmp/python3-module-GeoIP2-buildroot ./usr/lib: (from :0) Verifying ELF objects in /usr/src/tmp/python3-module-GeoIP2-buildroot (arch=normal,fhs=normal,lfs=relaxed,lint=relaxed,rpath=normal,stack=normal,textrel=normal,unresolved=normal) Bytecompiling python3 modules in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3 unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.pyc unlink /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.pyc compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py Bytecompiling python3 modules with optimization in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3 -O compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py Bytecompiling python3 modules with optimization-2 in /usr/src/tmp/python3-module-GeoIP2-buildroot using /usr/bin/python3 -OO compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/__init__.py compile 
/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/database.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/errors.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/mixins.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/models.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/records.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/types.py compile /usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages/geoip2/webservice.py Hardlinking identical .pyc and .opt-?.pyc files './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-2.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/__init__.cpython-311.opt-1.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/database.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/errors.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/mixins.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/models.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/records.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/types.cpython-311.pyc' './usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.opt-1.pyc' => './usr/lib/python3/site-packages/geoip2/__pycache__/webservice.cpython-311.pyc' Executing(%check): /bin/sh -e /usr/src/tmp/rpm-tmp.43495 + umask 022 + /bin/mkdir -p /usr/src/RPM/BUILD + cd /usr/src/RPM/BUILD + cd python3-module-GeoIP2-4.7.0 + export PYTHONPATH=/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages + PYTHONPATH=/usr/src/tmp/python3-module-GeoIP2-buildroot/usr/lib/python3/site-packages + py.test-3 --ignore tests/database_test.py ============================= test session starts ============================== platform linux -- Python 3.11.6, pytest-7.4.3, pluggy-1.3.0 rootdir: /usr/src/RPM/BUILD/python3-module-GeoIP2-4.7.0 collected 61 items tests/models_test.py ......... 
[ 14%] tests/webservice_test.py ..........................FFFFFFF.FFFFFFFF..FFF [ 91%] FFFFF [100%] =================================== FAILURES =================================== ________________________ TestAsyncClient.test_200_error ________________________ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and 
hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers self._write(buf) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/country/1.1.1.1 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_200_error(self): httpretty.register_uri( httpretty.GET, self.base_uri + "country/1.1.1.1", body="", status=200, content_type=self._content_type("country"), ) with self.assertRaisesRegex( GeoIP2Error, "could not decode the response as JSON" ): > self.run_client(self.client.country("1.1.1.1")) tests/webservice_test.py:141: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as 
response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = 
self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: if exc.errno is None and isinstance(exc, asyncio.TimeoutError): raise > raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError ________________________ TestAsyncClient.test_300_error ________________________ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers self._write(buf) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/country/1.2.3.11 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_300_error(self): httpretty.register_uri( httpretty.GET, self.base_uri + "country/" + "1.2.3.11", status=300, content_type=self._content_type("country"), ) with self.assertRaisesRegex( HTTPError, r"Received a very surprising HTTP status \(300\) for" ): > self.run_client(self.client.country("1.2.3.11")) tests/webservice_test.py:212: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
                        auto_decompress=self._auto_decompress,
                        read_timeout=real_timeout.sock_read,
                        read_bufsize=read_bufsize,
                    )
                    try:
                        try:
                            resp = await req.send(conn)
                            try:
                                await resp.start(conn)
                            except BaseException:
                                resp.close()
                                raise
                        except BaseException:
                            conn.close()
                            raise
                    except ClientError:
                        raise
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
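Every failure in this run repeats the pattern above: httpretty's socket mock leaves aiohttp 3.8.5 with a transport it considers closing, StreamWriter._write raises ConnectionResetError before any bytes go out, and ClientSession._request re-raises that OSError as ClientOSError, so the tests see a transport error instead of the HTTP status they stubbed. A minimal sketch of that exception chain, using stand-in names for the aiohttp internals (only the exception types and the re-raise logic are taken from the frames above):

# failure_mode_sketch.py -- illustrative stand-ins for aiohttp internals;
# only the exception types and the OSError -> ClientOSError translation
# mirror the frames shown in this log.
import asyncio

from aiohttp.client_exceptions import ClientOSError


class ClosingTransportStub:
    """Stands in for the transport state httpretty's mock leaves behind."""

    def is_closing(self) -> bool:
        return True


def write_headers(transport) -> None:
    # The guard from StreamWriter._write in the traceback above.
    if transport is None or transport.is_closing():
        raise ConnectionResetError("Cannot write to closing transport")


def send_request(transport) -> None:
    # The OSError handling from ClientSession._request: ConnectionResetError
    # is an OSError subclass, so it is re-raised as ClientOSError.
    try:
        write_headers(transport)
    except OSError as exc:
        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
            raise
        raise ClientOSError(*exc.args) from exc


if __name__ == "__main__":
    try:
        send_request(ClosingTransportStub())
    except ClientOSError as exc:
        print("ClientOSError:", exc)  # Cannot write to closing transport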
________________________ TestAsyncClient.test_500_error ________________________

    [... ClientSession._request source context identical to the test_300_error frame above ...]

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.10 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    [... StreamWriter._write source identical to the test_300_error frame above ...]
E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_500_error(self):
        httpretty.register_uri(
            httpretty.GET, self.base_uri + "country/" + "1.2.3.10", status=500
        )
        with self.assertRaisesRegex(HTTPError, r"Received a server error \(500\) for"):
>           self.run_client(self.client.country("1.2.3.10"))

tests/webservice_test.py:199:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [... ClientSession._request source context identical to the test_300_error frame above ...]
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
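For orientation, every test in this batch funnels through the same two geoip2 frames (geoip2/webservice.py:304 and :343). A rough sketch of that call path; only the two awaited lines are verbatim from the traceback, everything else (class name, attributes, model construction) is assumed for illustration:

# call_path_sketch.py -- reconstruction of the geoip2 async call path seen in
# the frames above; attribute names and model construction are assumptions.
import aiohttp

import geoip2.models


class AsyncClientSketch:
    def __init__(self, session: aiohttp.ClientSession, proxy=None) -> None:
        self._session = session
        self._base_uri = "https://geoip.maxmind.com/geoip/v2.1"
        self._proxy = proxy

    async def country(self, ip_address: str):
        # geoip2/webservice.py:304
        return await self._response_for("country", geoip2.models.Country, ip_address)

    async def _response_for(self, path, model_class, ip_address):
        uri = f"{self._base_uri}/{path}/{ip_address}"
        # geoip2/webservice.py:343 -- the ClientOSError escapes from this
        # await, before any status-code handling can map it to HTTPError
        async with await self._session.get(uri, proxy=self._proxy) as response:
            body = await response.json()
            return model_class(body)  # assumed constructor shape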
___________________ TestAsyncClient.test_account_id_required ___________________

    [... ClientSession._request source context identical to the test_300_error frame above ...]

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    [... StreamWriter._write source identical to the test_300_error frame above ...]
E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_account_id_required(self):
>       self._test_error(401, "ACCOUNT_ID_REQUIRED", AuthenticationError)

tests/webservice_test.py:240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [... ClientSession._request source context identical to the test_300_error frame above ...]
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
____________________ TestAsyncClient.test_account_id_unkown ____________________

    [... ClientSession._request source context identical to the test_300_error frame above ...]

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    [... StreamWriter._write source identical to the test_300_error frame above ...]
E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_account_id_unkown(self):
>       self._test_error(401, "ACCOUNT_ID_UNKNOWN", AuthenticationError)

tests/webservice_test.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [... ClientSession._request source context identical to the test_300_error frame above ...]
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
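The authentication-style failures in this run (ACCOUNT_ID_REQUIRED and ACCOUNT_ID_UNKNOWN above, AUTHORIZATION_INVALID below) all route through the suite's _test_error helper at tests/webservice_test.py:269. A plausible shape for that helper, inferred only from its call sites and frames in this log; the stub body and message are assumptions, not the project's actual source:

# _test_error sketch -- inferred from the call sites and the frame at
# tests/webservice_test.py:269; the JSON body and message are assumptions.
import json

import httpretty


def _test_error(self, status, error_code, error_class):
    msg = "An error message"  # assumed; the regex check only needs a match
    httpretty.register_uri(
        httpretty.GET,
        self.base_uri + "country/" + "1.2.3.18",
        body=json.dumps({"code": error_code, "error": msg}),
        status=status,
        content_type=self._content_type("country"),
    )
    with self.assertRaisesRegex(error_class, msg):
        # the frame shown above: tests/webservice_test.py:269
        self.run_client(self.client.country("1.2.3.18"))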
______________________ TestAsyncClient.test_auth_invalid _______________________

    [... ClientSession._request source context identical to the test_300_error frame above ...]

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    [... StreamWriter._write source identical to the test_300_error frame above ...]
E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_auth_invalid(self):
>       self._test_error(400, "AUTHORIZATION_INVALID", AuthenticationError)

tests/webservice_test.py:232:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [... ClientSession._request source context identical to the test_300_error frame above ...]
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
_____________________ TestAsyncClient.test_bad_body_error ______________________

    [... ClientSession._request source context identical to the test_300_error frame above ...]

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.9 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    [... StreamWriter._write source identical to the test_300_error frame above ...]
E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_bad_body_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.9",
            body="bad body",
            status=400,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            HTTPError, "it did not include the expected JSON body"
        ):
>           self.run_client(self.client.country("1.2.3.9"))

tests/webservice_test.py:191:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [... ClientSession._request source context identical to the test_300_error frame above ...]
                    try:
                        try:
                            resp = await req.send(conn)
                            try:
                                await resp.start(conn)
                            except BaseException:
                                resp.close()
                                raise
                        except BaseException:
                            conn.close()
                            raise
                    except ClientError:
                        raise
                    except OSError as exc:
                        if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                            raise
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
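A note on the failure shape above: the ConnectionResetError raised inside aiohttp's StreamWriter surfaces to the test as ClientOSError because _request re-raises OSError subclasses with explicit chaining, which is what produces the "direct cause" sections in this log. A minimal sketch of that wrap-and-chain pattern follows; the ClientError/ClientOSError classes here are simplified stand-ins, not aiohttp's real implementations.

# Sketch of the wrap-and-chain pattern seen at client.py:572.
# ConnectionResetError is a subclass of OSError, so the writer's error
# is caught by `except OSError` and re-raised as a client-level error
# with `from exc`, preserving the original as __cause__.
class ClientError(Exception):
    pass

class ClientOSError(ClientError, OSError):
    pass

def send_over_closing_transport() -> None:
    raise ConnectionResetError("Cannot write to closing transport")

def request() -> None:
    try:
        send_over_closing_transport()
    except OSError as exc:
        raise ClientOSError(*exc.args) from exc

try:
    request()
except ClientOSError as exc:
    assert isinstance(exc.__cause__, ConnectionResetError)
    print(f"{type(exc).__name__}: {exc} (caused by {type(exc.__cause__).__name__})")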
_________________________ TestAsyncClient.test_city_ok _________________________

self = , method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/city/1.2.3.4'
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/city/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP2-P...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_city_ok(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "city/" + "1.2.3.4",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("city"),
        )
>       city = self.run_client(self.client.city("1.2.3.4"))

tests/webservice_test.py:326: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:289: in city
    City, await self._response_for("city", geoip2.models.City, ip_address)
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
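The immediate trigger in every one of these failures is the guard at http_writer.py:75: before writing the request head, the writer checks that the protocol is still connected and that the transport is not closing. The following self-contained sketch reproduces only that guard condition; FakeTransport and FakeProtocol are hypothetical stand-ins for the real asyncio objects, and the disconnected state mimics what the socket-level httpretty mock appears to leave behind.

# Sketch of the transport guard that raises in each test above.
from typing import Optional

class FakeTransport:
    def __init__(self, closing: bool) -> None:
        self._closing = closing

    def is_closing(self) -> bool:
        return self._closing

class FakeProtocol:
    def __init__(self, connected: bool, transport: Optional[FakeTransport]) -> None:
        self.connected = connected
        self.transport = transport

def write(protocol: FakeProtocol, chunk: bytes) -> None:
    transport = protocol.transport
    # same three-way check as StreamWriter._write
    if not protocol.connected or transport is None or transport.is_closing():
        raise ConnectionResetError("Cannot write to closing transport")
    # a real writer would call transport.write(chunk) here

try:
    write(FakeProtocol(connected=False, transport=FakeTransport(closing=False)),
          b"GET / HTTP/1.1\r\n\r\n")
except ConnectionResetError as exc:
    print(exc)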
_______________________ TestAsyncClient.test_country_ok ________________________

self = , method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_country_ok(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/1.2.3.4",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("country"),
        )
>       country = self.run_client(self.client.country("1.2.3.4"))

tests/webservice_test.py:72: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
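Each test in this suite reaches the async client through the same harness: a synchronous test method hands a coroutine to a private event loop via run_until_complete (webservice_test.py:386 in the frames above). A minimal reproduction of that pattern is sketched below; Harness and fetch_country are illustrative stand-ins, not the project's actual classes.

# Minimal sketch of the run_client pattern used by these tests:
# a synchronous test method submits a coroutine to a dedicated loop.
import asyncio

async def fetch_country(ip: str) -> dict:
    # stand-in for AsyncClient.country(ip); the real call awaits aiohttp
    await asyncio.sleep(0)
    return {"ip": ip, "country": "EX"}

class Harness:
    def __init__(self) -> None:
        self._loop = asyncio.new_event_loop()

    def run_client(self, coro):
        # mirrors: return self._loop.run_until_complete(v)
        return self._loop.run_until_complete(coro)

    def close(self) -> None:
        self._loop.close()

h = Harness()
try:
    print(h.run_client(fetch_country("1.2.3.4")))
finally:
    h.close()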
_______________________ TestAsyncClient.test_insights_ok _______________________

self = , method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/insights/1.2.3.4'
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/insights/1.2.3.4 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_insights_ok(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "insights/1.2.3.4",
            body=json.dumps(self.insights),
            status=200,
            content_type=self._content_type("country"),
        )
>       insights = self.run_client(self.client.insights("1.2.3.4"))

tests/webservice_test.py:341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:322: in insights
    await self._response_for("insights", geoip2.models.Insights, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
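Every failure so far has the same shape: httpretty fakes the socket layer, aiohttp's protocol ends up looking disconnected, and the request dies before the registered body is ever served. If the goal is only to stub HTTP responses for an aiohttp client, mocking at the aiohttp layer avoids the transport entirely. The sketch below uses the aioresponses package for that; this is a plausible alternative, not the project's actual approach, and it assumes aioresponses is available, which this build log does not show. The URL and payload are illustrative.

# Illustrative alternative to socket-level mocking for aiohttp clients.
# aioresponses intercepts ClientSession requests directly, so no real
# (or fake) transport is involved and the closing-transport guard never fires.
import asyncio
import aiohttp
from aioresponses import aioresponses  # assumption: not shown installed in this log

async def get_json(url: str) -> dict:
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.json()

def main() -> None:
    url = "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4"
    with aioresponses() as mocked:
        mocked.get(url, status=200, payload={"country": {"iso_code": "EX"}})
        print(asyncio.run(get_json(url)))

main()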
__________________ TestAsyncClient.test_ip_address_not_found ___________________
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

E   ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_ip_address_not_found(self):
>       self._test_error(404, "IP_ADDRESS_NOT_FOUND", AddressNotFoundError)

tests/webservice_test.py:220: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
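One detail of the _request body repeated throughout this log that is easy to misread: timeout defaults to a sentinel object rather than None, so an explicit timeout=None keeps its own meaning ("no total timeout") instead of meaning "use the session default". A minimal sketch of that sentinel-default idiom follows; resolve_timeout and SESSION_DEFAULT are illustrative names, not aiohttp's.

# Sketch of the sentinel-default idiom from _request's timeout handling:
# `timeout is sentinel` means "caller passed nothing", which is distinct
# from an explicit value such as a bare number of seconds or None.
from dataclasses import dataclass
from typing import Union

sentinel = object()

@dataclass
class ClientTimeout:
    total: Union[float, None] = None

SESSION_DEFAULT = ClientTimeout(total=300.0)

def resolve_timeout(timeout: object = sentinel) -> ClientTimeout:
    if timeout is sentinel:
        return SESSION_DEFAULT               # caller passed nothing: session default
    if not isinstance(timeout, ClientTimeout):
        return ClientTimeout(total=timeout)  # bare number: treat as total seconds
    return timeout

print(resolve_timeout())                           # ClientTimeout(total=300.0)
print(resolve_timeout(5))                          # ClientTimeout(total=5)
print(resolve_timeout(ClientTimeout(total=None)))  # explicit: no total timeout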
___________________ TestAsyncClient.test_ip_address_required ___________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

    @httprettified
    def test_ip_address_required(self):
>       self._test_error(400, "IP_ADDRESS_REQUIRED", InvalidRequestError)

tests/webservice_test.py:216:
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport
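All the failures in this run share one shape: the request never reaches httpretty's fake server, because by the time aiohttp writes the request headers the transport is already closing, http_writer refuses the write, and client.py rewraps the resulting OSError as ClientOSError. Below is a minimal standalone sketch of that guard-and-rewrap pattern, assuming nothing beyond the standard library; the class names are stand-ins, not aiohttp's real ones.

    import asyncio


    class ClientOSError(OSError):
        """Stand-in for aiohttp.client_exceptions.ClientOSError."""


    class TransportWriter:
        """Stand-in for aiohttp's StreamWriter; only the guard matters."""

        def __init__(self, transport: asyncio.Transport) -> None:
            self.transport = transport

        def write(self, chunk: bytes) -> None:
            # Same check as http_writer.py:75 above: fail fast instead of
            # queueing bytes on a transport that is shutting down.
            if self.transport is None or self.transport.is_closing():
                raise ConnectionResetError("Cannot write to closing transport")
            self.transport.write(chunk)


    def send(writer: TransportWriter, payload: bytes) -> None:
        try:
            writer.write(payload)
        except OSError as exc:
            # client.py:572 equivalent: rewrap for the public API, keeping
            # the original exception reachable as __cause__.
            raise ClientOSError(*exc.args) from exc

The "raise ... from exc" chaining is what produces the "The above exception was the direct cause of the following exception" line in each traceback.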
___________________ TestAsyncClient.test_ip_address_reserved ___________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

    @httprettified
    def test_ip_address_reserved(self):
>       self._test_error(400, "IP_ADDRESS_RESERVED", AddressNotFoundError)

tests/webservice_test.py:224:
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport
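The tests are decorated with @httprettified, and httpretty works by patching the blocking socket module; aiohttp instead drives non-blocking sockets through the event loop's transports, a combination that is known to be fragile and is the most plausible reason the mocked connection shows up here as a closing transport. Were the suite ported to aioresponses, which mocks at aiohttp's own request layer, the 400-error expectation might look like the sketch below. aioresponses itself, the error payload, and the content type are assumptions, not taken from this log; the account id 42 and license key abcdef123456 are decoded from the Basic auth header in the request chunks above.

    import asyncio

    from aioresponses import aioresponses  # assumption: not in this build's deps

    import geoip2.webservice
    from geoip2.errors import InvalidRequestError


    async def check_ip_address_required() -> None:
        client = geoip2.webservice.AsyncClient(42, "abcdef123456")
        uri = "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.18"
        with aioresponses() as mocked:
            # Hypothetical error body; the real tests build theirs inside
            # _test_error, which this log does not show.
            mocked.get(uri, status=400,
                       payload={"code": "IP_ADDRESS_REQUIRED",
                                "error": "The IP address is required"},
                       content_type="application/json")
            try:
                await client.country("1.2.3.18")
            except InvalidRequestError:
                pass  # expected mapping for a 400 IP_ADDRESS_REQUIRED response
            finally:
                await client.close()


    asyncio.run(check_ip_address_required())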
__________________ TestAsyncClient.test_license_key_required ___________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

    @httprettified
    def test_license_key_required(self):
>       self._test_error(401, "LICENSE_KEY_REQUIRED", AuthenticationError)

tests/webservice_test.py:236:
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport
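The NOTE comment at the top of every _request listing explains a detail worth keeping: timeout defaults to a sentinel object rather than None, because None is itself a meaningful user value (disable the total timeout) and "not passed" must stay distinguishable from it. The pattern, reduced to standard-library code:

    from typing import Optional, Union

    _SENTINEL = object()  # module-private marker, like aiohttp.helpers.sentinel


    def resolve_timeout(timeout: Union[float, None, object] = _SENTINEL,
                        session_default: Optional[float] = 300.0) -> Optional[float]:
        # "Not passed" keeps the session default; an explicit None disables
        # the limit; any other value overrides it.
        if timeout is _SENTINEL:
            return session_default
        return timeout  # type: ignore[return-value]


    assert resolve_timeout() == 300.0
    assert resolve_timeout(None) is None
    assert resolve_timeout(5.0) == 5.0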
___________________________ TestAsyncClient.test_me ____________________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

chunk = b'GET /geoip/v2.1/country/me HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP2-Pyt...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

    @httprettified
    def test_me(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/me",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("country"),
        )
>       implicit_me = self.run_client(self.client.country())

tests/webservice_test.py:118:
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport
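The Authorization header visible in each request chunk (Basic NDI6YWJjZGVmMTIzNDU2, which decodes to 42:abcdef123456) comes out of the precedence rules in the _request source: credentials embedded in the URL, the auth argument, and an explicit Authorization header are mutually exclusive, and URL credentials are stripped into a BasicAuth object. A condensed restatement follows; pick_auth is an illustrative helper, not aiohttp API, and the error messages are quoted from the listing.

    from typing import Mapping, Optional, Tuple

    from aiohttp import BasicAuth
    from aiohttp.helpers import strip_auth_from_url
    from yarl import URL


    def pick_auth(url: URL, auth: Optional[BasicAuth],
                  headers: Mapping[str, str]) -> Tuple[URL, Optional[BasicAuth]]:
        url, auth_from_url = strip_auth_from_url(url)
        if auth and auth_from_url:
            raise ValueError(
                "Cannot combine AUTH argument with credentials encoded in URL")
        if auth is None:
            auth = auth_from_url
        if auth is not None and "Authorization" in headers:
            raise ValueError(
                "Cannot combine AUTHORIZATION header with AUTH argument "
                "or credentials encoded in URL")
        return url, auth


    # URL-embedded credentials are stripped and returned as a BasicAuth pair:
    url, auth = pick_auth(URL("https://42:abcdef123456@geoip.maxmind.com/"), None, {})
    assert auth == BasicAuth("42", "abcdef123456")
    assert str(url) == "https://geoip.maxmind.com/"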
______________________ TestAsyncClient.test_no_body_error ______________________
/usr/lib/python3/site-packages/aiohttp/client.py:558: in _request
    resp = await req.send(conn)
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)

chunk = b'GET /geoip/v2.1/country/1.2.3.7 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: in _write
    raise ConnectionResetError("Cannot write to closing transport")
E   ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

    @httprettified
    def test_no_body_error(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.7",
            body="",
            status=400,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(
            HTTPError, "Received a 400 error for .* with no body"
        ):
>           self.run_client(self.client.country("1.2.3.7"))

tests/webservice_test.py:162:
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
/usr/lib/python3/site-packages/aiohttp/client.py:572: in _request
    raise ClientOSError(*exc.args) from exc
E   aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport
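Each listing also shows how the connect phase is bounded: connector.connect runs under ceil_timeout(real_timeout.connect), and a bare asyncio.TimeoutError is converted into ServerTimeoutError naming the host. The same shape can be sketched with the standard library's asyncio.timeout from Python 3.11, the interpreter this build runs on; ServerTimeoutError here is a stand-in, and ceil_timeout's rounding-up of the deadline is omitted.

    import asyncio


    class ServerTimeoutError(asyncio.TimeoutError):
        """Stand-in for aiohttp.ServerTimeoutError."""


    async def open_connection_with_timeout(host: str, port: int,
                                           connect_timeout: float):
        try:
            # asyncio.timeout (3.11+) plays the role of ceil_timeout here.
            async with asyncio.timeout(connect_timeout):
                return await asyncio.open_connection(host, port)
        except asyncio.TimeoutError as exc:
            # Mirrors the conversion in the listing: a generic timeout
            # becomes an error that names the unreachable host.
            raise ServerTimeoutError(
                "Connection timeout to host {}".format(host)) from exc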
__________________ TestAsyncClient.test_out_of_queries_error ___________________
__________________ TestAsyncClient.test_out_of_queries_error ___________________

[first aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
                        auto_decompress=self._auto_decompress,
                        read_timeout=real_timeout.sock_read,
                        read_bufsize=read_bufsize,
                    )

                    try:
                        try:
>                           resp = await req.send(conn)

/usr/lib/python3/site-packages/aiohttp/client.py:558:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send
    await writer.write_headers(status_line, self.headers)
/usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers
    self._write(buf)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n'

    def _write(self, chunk: bytes) -> None:
        size = len(chunk)
        self.buffer_size += size
        self.output_size += size
        transport = self.transport
        if not self._protocol.connected or transport is None or transport.is_closing():
>           raise ConnectionResetError("Cannot write to closing transport")
E           ConnectionResetError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_out_of_queries_error(self):
>       self._test_error(402, "OUT_OF_QUERIES", OutOfQueriesError)

tests/webservice_test.py:256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[second aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[same req.send() / ClientOSError source context as shown above; omitted]
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
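Each failure above bottoms out in the _write() check quoted earlier: once the event loop has begun tearing the connection down, aiohttp refuses to enqueue more bytes. A minimal sketch of that guard, assuming only the asyncio transport API (GuardedWriter is illustrative, not an aiohttp class):

    import asyncio
    from typing import Optional

    class GuardedWriter:
        # Illustrative stand-in for the transport handling in
        # aiohttp.http_writer.StreamWriter._write quoted in the trace above.
        def __init__(self, transport: Optional[asyncio.Transport]) -> None:
            self.transport = transport

        def write(self, chunk: bytes) -> None:
            # Failing fast here is what surfaces as ConnectionResetError in
            # the log: writing to a closing transport would silently drop
            # the bytes instead of delivering them.
            if self.transport is None or self.transport.is_closing():
                raise ConnectionResetError("Cannot write to closing transport")
            self.transport.write(chunk)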
___________________ TestAsyncClient.test_permission_required ___________________

[first aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[traceback identical to the one above: req.send() -> write_headers() -> _write() raises ConnectionResetError("Cannot write to closing transport"); request chunk targets /geoip/v2.1/country/1.2.3.18]

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_permission_required(self):
>       self._test_error(403, "PERMISSION_REQUIRED", PermissionRequiredError)

tests/webservice_test.py:228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[second aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[same req.send() / ClientOSError source context as shown above; omitted]
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
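The two-part layout of every failure, separated by "The above exception was the direct cause of the following exception:", comes from `raise ClientOSError(*exc.args) from exc` in the frame above: `from exc` stores the original error as __cause__. A self-contained sketch (WrappedOSError is an illustrative name):

    class WrappedOSError(OSError):
        pass

    def send() -> None:
        raise ConnectionResetError("Cannot write to closing transport")

    try:
        try:
            send()
        except OSError as exc:
            # `from exc` links the new exception to the old one, which
            # pytest renders as "the direct cause of the following exception".
            raise WrappedOSError(*exc.args) from exc
    except WrappedOSError as wrapped:
        assert isinstance(wrapped.__cause__, ConnectionResetError)
        assert str(wrapped) == "Cannot write to closing transport"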
_________________________ TestAsyncClient.test_request _________________________

self =
method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

[first aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[traceback identical to the one above: req.send() -> write_headers() -> _write() raises ConnectionResetError("Cannot write to closing transport"); request chunk targets /geoip/v2.1/country/1.2.3.4]

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_request(self):
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + "1.2.3.4",
            body=json.dumps(self.country),
            status=200,
            content_type=self._content_type("country"),
        )
>       self.run_client(self.client.country("1.2.3.4"))

tests/webservice_test.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
method = 'GET'
str_or_url = 'https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4'

[second aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[same req.send() / ClientOSError source context as shown above; omitted]
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
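Every TestAsyncClient failure above has the same shape: httpretty mocks at the raw socket layer, and under Python 3.11 with aiohttp 3.8.5 the patched connection already reports a closing transport by the time write_headers() runs. One possible alternative, sketched under the assumption that the third-party aioresponses package is available, is to mock at aiohttp's connector level instead (the URL and payload below are illustrative, not the suite's fixtures):

    import asyncio
    import aiohttp
    from aioresponses import aioresponses  # assumption: package is installed

    async def fetch(url: str) -> dict:
        async with aiohttp.ClientSession() as session:
            async with session.get(url) as resp:
                return await resp.json()

    url = "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4"
    with aioresponses() as mocked:
        # Registered responses are served by a stub connector, so no real
        # transport is ever opened or torn down underneath the session.
        mocked.get(url, status=200, payload={"country": {"iso_code": "US"}})
        data = asyncio.run(fetch(url))
    assert data["country"]["iso_code"] == "US"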
______________________ TestAsyncClient.test_unknown_error ______________________

[first aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[traceback identical to the one above: req.send() -> write_headers() -> _write() raises ConnectionResetError("Cannot write to closing transport"); request chunk targets /geoip/v2.1/country/1.2.3.19]

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_unknown_error(self):
        msg = "Unknown error type"
        ip = "1.2.3.19"
        body = {"error": msg, "code": "UNKNOWN_TYPE"}
        httpretty.register_uri(
            httpretty.GET,
            self.base_uri + "country/" + ip,
            body=json.dumps(body),
            status=400,
            content_type=self._content_type("country"),
        )
        with self.assertRaisesRegex(InvalidRequestError, msg):
>           self.run_client(self.client.country(ip))

tests/webservice_test.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[second aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[same req.send() / ClientOSError source context as shown above; omitted]
>                       raise ClientOSError(*exc.args) from exc
E                       aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
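Aside: the request chunks quoted in these traces all carry "Authorization: Basic NDI6YWJjZGVmMTIzNDU2", which is standard HTTP Basic auth over the suite's test credentials; decoding confirms the account id / license key pair:

    import base64

    token = "NDI6YWJjZGVmMTIzNDU2"
    account_id, _, license_key = base64.b64decode(token).decode().partition(":")
    assert (account_id, license_key) == ("42", "abcdef123456")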
____________________ TestAsyncClient.test_user_id_required _____________________

[first aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
[traceback identical to the one above: req.send() -> write_headers() -> _write() raises ConnectionResetError("Cannot write to closing transport"); request chunk targets /geoip/v2.1/country/1.2.3.18]

The above exception was the direct cause of the following exception:

self =

    @httprettified
    def test_user_id_required(self):
>       self._test_error(401, "USER_ID_REQUIRED", AuthenticationError)

tests/webservice_test.py:244:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/webservice_test.py:269: in _test_error
    self.run_client(self.client.country("1.2.3.18"))
tests/webservice_test.py:386: in run_client
    return self._loop.run_until_complete(v)
/usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
geoip2/webservice.py:304: in country
    await self._response_for("country", geoip2.models.Country, ip_address),
geoip2/webservice.py:343: in _response_for
    async with await session.get(uri, proxy=self._proxy) as response:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[second aiohttp _request() frame: signature and source identical to the frame shown above; omitted]
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: if exc.errno is None and isinstance(exc, asyncio.TimeoutError): raise > raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError _____________________ TestAsyncClient.test_user_id_unkown ______________________ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers self._write(buf) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/country/1.2.3.18 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoI...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_user_id_unkown(self): > self._test_error(401, "USER_ID_UNKNOWN", AuthenticationError) tests/webservice_test.py:252: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:269: in _test_error self.run_client(self.client.country("1.2.3.18")) tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: if exc.errno is None and isinstance(exc, asyncio.TimeoutError): raise > raise ClientOSError(*exc.args) from exc E aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError ____________________ TestAsyncClient.test_weird_body_error _____________________ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read, read_bufsize=read_bufsize, ) try: try: > resp = await req.send(conn) /usr/lib/python3/site-packages/aiohttp/client.py:558: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3/site-packages/aiohttp/client_reqrep.py:670: in send await writer.write_headers(status_line, self.headers) /usr/lib/python3/site-packages/aiohttp/http_writer.py:130: in write_headers self._write(buf) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = chunk = b'GET /geoip/v2.1/country/1.2.3.8 HTTP/1.1\r\nHost: geoip.maxmind.com\r\nAccept: application/json\r\nUser-Agent: GeoIP...4.7.0 Python/3.11 aiohttp/3.8.5\r\nAccept-Encoding: gzip, deflate\r\nAuthorization: Basic NDI6YWJjZGVmMTIzNDU2\r\n\r\n' def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size transport = self.transport if not self._protocol.connected or transport is None or transport.is_closing(): > raise ConnectionResetError("Cannot write to closing transport") E ConnectionResetError: Cannot write to closing transport /usr/lib/python3/site-packages/aiohttp/http_writer.py:75: ConnectionResetError The above exception was the direct cause of the following exception: self = @httprettified def test_weird_body_error(self): httpretty.register_uri( httpretty.GET, self.base_uri + "country/" + "1.2.3.8", body='{"wierd": 42}', status=400, content_type=self._content_type("country"), ) with self.assertRaisesRegex( HTTPError, "Response contains JSON but it does not " "specify code or error keys", ): > self.run_client(self.client.country("1.2.3.8")) tests/webservice_test.py:177: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/webservice_test.py:386: in run_client return self._loop.run_until_complete(v) /usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete return future.result() geoip2/webservice.py:304: in country await self._response_for("country", geoip2.models.Country, ip_address), geoip2/webservice.py:343: in _response_for async with await session.get(uri, proxy=self._proxy) as response: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, cookies: Optional[LooseCookies] = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, max_redirects: int = 10, compress: Optional[str] = None, chunked: Optional[bool] = None, expect100: bool = False, raise_for_status: Optional[bool] = None, read_until_eof: bool = True, proxy: Optional[StrOrURL] = None, proxy_auth: Optional[BasicAuth] = None, timeout: Union[ClientTimeout, object] = sentinel, verify_ssl: Optional[bool] = None, fingerprint: Optional[bytes] = None, ssl_context: Optional[SSLContext] = None, ssl: Optional[Union[SSLContext, bool, Fingerprint]] = None, proxy_headers: Optional[LooseHeaders] = None, trace_request_ctx: Optional[SimpleNamespace] = None, read_bufsize: Optional[int] = None, ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. 
if self.closed: raise RuntimeError("Session is closed") ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( "data and json parameters can not be used at the same time" ) elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn("Chunk size is deprecated #1615", DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = self._build_url(str_or_url) except ValueError as e: raise InvalidURL(str_or_url) from e skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError as e: raise InvalidURL(proxy) from e if timeout is sentinel: real_timeout: ClientTimeout = self._timeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore[arg-type] else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() if read_bufsize is None: read_bufsize = self._read_bufsize traces = [ Trace( self, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx), ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start(method, url.update_query(params), headers) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError( "Cannot combine AUTH argument with " "credentials encoded in URL" ) if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if ( headers is not None and auth is not None and hdrs.AUTHORIZATION in headers ): raise ValueError( "Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL" ) all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: with suppress(LookupError): proxy, proxy_auth = get_env_proxy_for_url(url) req = self._request_class( method, url, params=params, headers=headers, skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces, ) # connection timeout try: async with ceil_timeout(real_timeout.connect): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( "Connection timeout " "to host {}".format(url) ) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == "HEAD", read_until_eof=read_until_eof, 
                auto_decompress=self._auto_decompress,
                read_timeout=real_timeout.sock_read,
                read_bufsize=read_bufsize,
            )
            try:
                try:
                    resp = await req.send(conn)
                    try:
                        await resp.start(conn)
                    except BaseException:
                        resp.close()
                        raise
                except BaseException:
                    conn.close()
                    raise
            except ClientError:
                raise
            except OSError as exc:
                if exc.errno is None and isinstance(exc, asyncio.TimeoutError):
                    raise
>               raise ClientOSError(*exc.args) from exc
E               aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

/usr/lib/python3/site-packages/aiohttp/client.py:572: ClientOSError
=============================== warnings summary ===============================
../../../../lib/python3/site-packages/gunicorn/util.py:25
  /usr/lib/python3/site-packages/gunicorn/util.py:25: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

../../../../lib/python3/site-packages/pkg_resources/__init__.py:2871
../../../../lib/python3/site-packages/pkg_resources/__init__.py:2871
  /usr/lib/python3/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('paste')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../../../lib/python3/site-packages/pkg_resources/__init__.py:2871
../../../../lib/python3/site-packages/pkg_resources/__init__.py:2871
../../../../lib/python3/site-packages/pkg_resources/__init__.py:2871
  /usr/lib/python3/site-packages/pkg_resources/__init__.py:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('sphinxcontrib')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/webservice_test.py::TestAsyncClient::test_200_error - aiohttp.cl...
FAILED tests/webservice_test.py::TestAsyncClient::test_300_error - aiohttp.cl...
FAILED tests/webservice_test.py::TestAsyncClient::test_500_error - aiohttp.cl...
FAILED tests/webservice_test.py::TestAsyncClient::test_account_id_required - ...
FAILED tests/webservice_test.py::TestAsyncClient::test_account_id_unkown - ai...
FAILED tests/webservice_test.py::TestAsyncClient::test_auth_invalid - aiohttp...
FAILED tests/webservice_test.py::TestAsyncClient::test_bad_body_error - aioht...
FAILED tests/webservice_test.py::TestAsyncClient::test_city_ok - aiohttp.clie...
FAILED tests/webservice_test.py::TestAsyncClient::test_country_ok - aiohttp.c...
FAILED tests/webservice_test.py::TestAsyncClient::test_insights_ok - aiohttp....
FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_not_found
FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_required - ...
FAILED tests/webservice_test.py::TestAsyncClient::test_ip_address_reserved - ...
FAILED tests/webservice_test.py::TestAsyncClient::test_license_key_required
FAILED tests/webservice_test.py::TestAsyncClient::test_me - aiohttp.client_ex...
FAILED tests/webservice_test.py::TestAsyncClient::test_no_body_error - aiohtt...
FAILED tests/webservice_test.py::TestAsyncClient::test_out_of_queries_error
FAILED tests/webservice_test.py::TestAsyncClient::test_permission_required - ...
FAILED tests/webservice_test.py::TestAsyncClient::test_request - aiohttp.clie...
FAILED tests/webservice_test.py::TestAsyncClient::test_unknown_error - aiohtt...
FAILED tests/webservice_test.py::TestAsyncClient::test_user_id_required - aio...
FAILED tests/webservice_test.py::TestAsyncClient::test_user_id_unkown - aioht...
FAILED tests/webservice_test.py::TestAsyncClient::test_weird_body_error - aio...
================== 23 failed, 38 passed, 6 warnings in 1.91s ===================
error: Bad exit status from /usr/src/tmp/rpm-tmp.43495 (%check)

RPM build errors:
    Bad exit status from /usr/src/tmp/rpm-tmp.43495 (%check)
Command exited with non-zero status 1
4.32user 0.35system 0:04.75elapsed 98%CPU (0avgtext+0avgdata 59076maxresident)k
0inputs+0outputs (0major+101189minor)pagefaults 0swaps
hsh-rebuild: rebuild of `python3-module-GeoIP2-4.7.0-alt1.src.rpm' failed.
Command exited with non-zero status 1
4.08user 1.60system 0:17.75elapsed 32%CPU (0avgtext+0avgdata 124904maxresident)k
464inputs+0outputs (0major+339816minor)pagefaults 0swaps
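
--------------------------------------------------------------------------------
Analysis note: all 23 TestAsyncClient failures are the same error. httpretty
fakes HTTP by monkey-patching blocking sockets, while these tests drive aiohttp,
whose StreamWriter._write() (visible in the tracebacks above) refuses to write
once self._protocol.connected is false or transport.is_closing() returns True;
ClientSession._request() then re-raises the ConnectionResetError as
ClientOSError("Cannot write to closing transport"). This points at an
httpretty/aiohttp-3.8.x incompatibility rather than a bug in geoip2 itself.
Below is a minimal sketch that should reproduce the failure outside the test
suite; the URL, response body, and the predicted outcome are assumptions made
for illustration, not taken from this log:

    # repro_sketch.py -- hypothetical standalone reproducer
    import asyncio

    import aiohttp
    import httpretty


    @httpretty.activate
    def main() -> None:
        # Register a fake endpoint the same way tests/webservice_test.py does.
        httpretty.register_uri(
            httpretty.GET,
            "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4",
            body='{"code": "IP_ADDRESS_NOT_FOUND", "error": "..."}',
            status=404,
            content_type="application/json",
        )

        async def fetch() -> None:
            async with aiohttp.ClientSession() as session:
                # Predicted (if the diagnosis is right): aiohttp.ClientOSError
                # "Cannot write to closing transport" on this build's aiohttp
                # 3.8.5, because httpretty's fake socket never satisfies the
                # connected/is_closing() checks on the asyncio transport.
                async with session.get(
                    "https://geoip.maxmind.com/geoip/v2.1/country/1.2.3.4"
                ) as resp:
                    print(resp.status, await resp.text())

        asyncio.run(fetch())


    if __name__ == "__main__":
        main()

If the sketch fails as predicted, the usual workarounds are pinning aiohttp
below the 3.8.3 behavior change, skipping TestAsyncClient during %check, or
porting the async tests from httpretty to an asyncio-aware mock such as
aioresponses.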