security(pip_requirements): 🛡️ minor dependency to v2.7.0 #93

Open · renovate[bot] wants to merge 2 commits into main from renovate/vulnerability-high

security(pip_requirements): 🛡️ minor dependency to v2.7.0#93
renovate[bot] wants to merge 2 commits into
mainfrom
renovate/vulnerability-high

Conversation

renovate bot (Contributor) commented May 11, 2026

This PR contains the following updates:

Package: urllib3 (changelog)
Change: ==2.6.3 → ==2.7.0

urllib3: Decompression-bomb safeguards bypassed in parts of the streaming API

CVE-2026-44432 / GHSA-mf9v-mfxr-j63j


Impact

urllib3's streaming API is designed for the efficient handling of large HTTP responses by reading the content in chunks, rather than loading the entire response body into memory at once.

urllib3 can perform decompression based on the HTTP Content-Encoding header (e.g., gzip, deflate, br, or zstd). When using the streaming API since version 2.6.0, the library decompresses only the necessary bytes, enabling partial content consumption.
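The "only the necessary bytes" behavior can be illustrated with a minimal stdlib-only sketch using zlib, which underpins the gzip/deflate paths (the data and sizes here are arbitrary placeholders): a decompressor object can cap the output of each call and keep the remaining compressed input for later.

```python
import zlib

# Bounded incremental decompression: cap each call's output via max_length
# and keep the unconsumed compressed input for the next call.
data = zlib.compress(b"a" * 100_000)

d = zlib.decompressobj()
first = d.decompress(data, 1024)          # at most 1024 decompressed bytes
rest = d.decompress(d.unconsumed_tail)    # decompress the remainder on demand

assert len(first) == 1024
assert first + rest == b"a" * 100_000
```

This is the general technique a streaming consumer relies on: decompression work is proportional to the bytes the caller actually requested, not to the full decoded size.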

However, urllib3 before version 2.7.0 could still decompress the whole response instead of the requested portion in two cases:

  1. During the second HTTPResponse.read(amt=N) call when the response was decompressed using the official Brotli library.
  2. When HTTPResponse.drain_conn() was called after the response had been read and decompressed partially (compression algorithm did not matter here).

These issues could cause urllib3 to fully decode a small amount of highly compressed data in a single operation. This could result in excessive resource consumption (high CPU usage and massive memory allocation for the decompressed data; CWE-409) on the client side.
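To see why decoding a whole response in a single operation is dangerous, here is a stdlib-only sketch of the compression asymmetry a decompression bomb exploits (the payload size is an arbitrary illustration):

```python
import gzip

# About 1 MB of zeros compresses to roughly a kilobyte; a "bomb" inverts
# this asymmetry so a tiny response body expands into huge allocations.
bomb = gzip.compress(b"\x00" * 1_000_000)

assert len(bomb) < 5_000                          # small on the wire
assert len(gzip.decompress(bomb)) == 1_000_000    # large once decoded
```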

Affected usages

Applications and libraries using urllib3 versions earlier than 2.7.0 may be affected when streaming compressed responses from untrusted sources in either of these cases, unless decompression is explicitly disabled:

  1. A response encoded with br is read incrementally with at least two HTTPResponse.read(amt=N) or HTTPResponse.stream(amt=N) calls while using the official Brotli library.
  2. HTTPResponse.drain_conn() is called after response decompression has already started.
Remediation

Upgrade to urllib3 version 2.7.0 or later, in which the library:

  1. Is more efficient for reads with Brotli.
  2. Always skips decompression for HTTPResponse.drain_conn().

If upgrading is not immediately possible, the following workarounds may reduce exposure in specific cases:

  1. For the Brotli-specific issue only, switch from brotli to brotlicffi until you can upgrade urllib3; the official Brotli package is affected because of https://github.com/google/brotli/issues/1396.
  2. If your code explicitly calls HTTPResponse.drain_conn(), call HTTPResponse.close() instead when connection reuse is not important.
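A minimal sketch of the second workaround, using a synthetic in-memory HTTPResponse (no network; the body and read sizes are placeholders) to show closing instead of draining:

```python
import io

from urllib3.response import HTTPResponse

# Synthetic response; in real code this would come from
# urlopen(..., preload_content=False).
resp = HTTPResponse(body=io.BytesIO(b"x" * 100), status=200, preload_content=False)

resp.read(10)  # partial read
# Instead of resp.drain_conn() (which, before 2.7.0, could decompress the
# whole remainder), close the response when connection reuse is not needed.
resp.close()

assert resp.closed
```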
Credits

The Brotli-specific issue was reported by @kimkou2024.
The HTTPResponse.drain_conn() inefficiency was reported by @Cycloctane.

Severity

  • CVSS Score: 8.9 / 10 (High)
  • Vector String: CVSS:4.0/AV:N/AC:L/AT:P/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:H

References

This data is provided by the GitHub Advisory Database (CC-BY 4.0).




urllib3: Sensitive headers forwarded across origins in proxied low-level redirects

CVE-2026-44431 / GHSA-qccp-gfcp-xxvc


Impact

When following cross-origin redirects for requests made using urllib3's high-level APIs, such as urllib3.request(), PoolManager.request(), and ProxyManager.request(), the sensitive headers Authorization, Cookie, and Proxy-Authorization (defined in Retry.DEFAULT_REMOVE_HEADERS_ON_REDIRECT) are stripped by default, as expected.

However, cross-origin redirects followed from the low-level API via ProxyManager.connection_from_url().urlopen(..., assert_same_host=False) still forward these sensitive headers.

Affected usage

Applications and libraries using urllib3 versions earlier than 2.7.0 may be affected if they allow cross-origin redirects while making requests through HTTPConnection.urlopen() instances created via ProxyManager.connection_from_url().

Remediation

Upgrade to urllib3 version 2.7.0 or later, in which sensitive headers are stripped from redirects followed by HTTPConnection.

If upgrading is not immediately possible, avoid using this low-level redirect flow for cross-origin redirects. If appropriate for your use case, switch to ProxyManager.request().
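A sketch of the recommended switch; the proxy and target URLs are placeholders, and the request call is shown commented out since it needs a live proxy:

```python
import urllib3

# Placeholder proxy URL; no connection is made at construction time.
proxy = urllib3.ProxyManager("http://localhost:3128")

# Preferred: the high-level request() strips Authorization, Cookie, and
# Proxy-Authorization on cross-origin redirects by default.
# resp = proxy.request("GET", "http://example.com/",
#                      headers={"Authorization": "Bearer ..."})

# Avoid: connection_from_url(...).urlopen(..., assert_same_host=False)
# forwarded those headers across origins before 2.7.0.

assert proxy.proxy.host == "localhost"
```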

Severity

  • CVSS Score: 8.2 / 10 (High)
  • Vector String: CVSS:4.0/AV:N/AC:H/AT:P/PR:N/UI:N/VC:H/VI:N/VA:N/SC:N/SI:N/SA:N

References

This data is provided by OSV and the GitHub Advisory Database (CC-BY 4.0).


Release Notes

urllib3/urllib3 (urllib3)

v2.7.0

Compare Source


Security

Addressed high-severity security issues.
Impact was limited to specific use cases detailed in the accompanying
advisories; overall user exposure was estimated to be marginal.

  • Decompression-bomb safeguards of the streaming API were bypassed:

    1. When HTTPResponse.drain_conn() was called after the response had been
       read and decompressed partially.
    2. During the second HTTPResponse.read(amt=N) or
       HTTPResponse.stream(amt=N) call when the response was decompressed
       using the official Brotli library (https://pypi.org/project/brotli/).

    See GHSA-mf9v-mfxr-j63j
    (https://github.com/urllib3/urllib3/security/advisories/GHSA-mf9v-mfxr-j63j)
    for details.

  • HTTP pools created using ProxyManager.connection_from_url did not strip
    sensitive headers specified in Retry.remove_headers_on_redirect when
    redirecting to a different host.
    (GHSA-qccp-gfcp-xxvc: https://github.com/urllib3/urllib3/security/advisories/GHSA-qccp-gfcp-xxvc)

Deprecations and Removals

  • Used FutureWarning instead of DeprecationWarning for better
    visibility of existing deprecation notices. Rescheduled the removal of
    deprecated features to version 3.0.
    (#3763: https://github.com/urllib3/urllib3/issues/3763)
  • Removed support for end-of-life Python 3.9.
    (#3720: https://github.com/urllib3/urllib3/issues/3720)
  • Removed support for end-of-life PyPy3.10.
    (#4979: https://github.com/urllib3/urllib3/issues/4979)
  • Bumped the minimum supported pyOpenSSL version to 19.0.0.
    (#3777: https://github.com/urllib3/urllib3/issues/3777)

Bugfixes

  • Fixed a bug where HTTPResponse.read(amt=None) was ignoring decompressed
    data buffered from previous partial reads.
    (#3636: https://github.com/urllib3/urllib3/issues/3636)
  • Fixed a bug where HTTPResponse.read() could cache only part of the
    response after a partial read when cache_content=True.
    (#4967: https://github.com/urllib3/urllib3/issues/4967)
  • Fixed HTTPResponse.stream() and HTTPResponse.read_chunked() to handle
    amt=0.
    (#3793: https://github.com/urllib3/urllib3/issues/3793)
  • Updated _TYPE_BODY type alias to include missing Iterable[str],
    matching the documented and runtime behavior of chunked request bodies.
    (#3798: https://github.com/urllib3/urllib3/issues/3798)
  • Fixed LocationParseError when paths resembling schemeless URIs were
    passed to HTTPConnectionPool.urlopen().
    (#3352: https://github.com/urllib3/urllib3/issues/3352)
  • Fixed BaseHTTPResponse.readinto() type annotation to accept
    memoryview in addition to bytearray, matching the
    io.RawIOBase.readinto contract and enabling use with
    io.BufferedReader without type errors.
    (#3764: https://github.com/urllib3/urllib3/issues/3764)

Configuration

📅 Schedule: (in timezone America/Chicago)

  • Branch creation
    • At any time (no schedule defined)
  • Automerge
    • At any time (no schedule defined)

🚦 Automerge: Enabled.

Rebasing: Whenever the PR falls behind its base branch, or when you tick the rebase/retry checkbox.

👻 Immortal: This PR will be recreated if closed unmerged. Get config help if that's undesired.


This PR was generated by Mend Renovate. View the repository job log.


snyk-io Bot commented May 11, 2026

Snyk checks have failed. 4 issues have been found so far.

Scan Engine            Critical  High  Medium  Low  Total (4)
Open Source Security   0         4     0       0    4 issues
Licenses               0         0     0       0    0 issues
Code Security          0         0     0       0    0 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.


renovate Bot commented May 11, 2026

Edited/Blocked Notification

Renovate will not automatically rebase this PR, because it does not recognize the last commit author and assumes somebody else may have edited the PR.

You can manually request rebase by checking the rebase/retry box above.

⚠️ Warning: custom changes will be lost.
