
renovate bot commented Nov 7, 2025

This PR contains the following updates:

| Package | Change |
| ------- | ------ |
| aiohttp | `==3.12.15` -> `==3.13.2` |

Release Notes

aio-libs/aiohttp (aiohttp)

v3.13.2

Compare Source

Bug fixes

  • Fixed the cookie parser to continue parsing subsequent cookies when it encounters a malformed cookie that fails regex validation, such as Google's `g_state` cookie with unescaped quotes -- by bdraco.

    Related issues and pull requests on GitHub:
    #11632.
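
    Before this fix, parsing stopped at the first cookie that failed regex validation, dropping everything after it. A minimal sketch of the fixed behavior, using the internal `parse_cookie_header` helper (an internal API, shown only for illustration):

    ```python
    from aiohttp._cookie_helpers import parse_cookie_header

    # Google's g_state cookie embeds unescaped quotes, which fails the
    # cookie regex. With the 3.13.2 fallback parser, the malformed cookie
    # no longer aborts parsing of the rest of the header.
    header = 'g_state={"i_l":0}; session=abc123'
    names = [name for name, morsel in parse_cookie_header(header)]
    print(names)  # 'session' is recovered; previously it was lost
    ```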

  • Fixed loading netrc credentials from the default `~/.netrc` (`~/_netrc` on Windows) location when the `NETRC` environment variable is not set -- by bdraco.

    Related issues and pull requests on GitHub:
    #11713, #11714.
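
    netrc credentials are only consulted when the session is created with `trust_env=True`. A minimal client sketch (the host is hypothetical):

    ```python
    import asyncio

    import aiohttp

    async def main() -> None:
        # With trust_env=True, aiohttp falls back to netrc credentials for
        # matching hosts. As of 3.13.2 the default ~/.netrc (~/_netrc on
        # Windows) is consulted even when the NETRC env var is unset.
        async with aiohttp.ClientSession(trust_env=True) as session:
            async with session.get("https://example.com/") as resp:
                print(resp.status)

    asyncio.run(main())
    ```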

  • Fixed WebSocket compressed sends to be cancellation safe. Tasks are now shielded during compression to prevent compressor state corruption. This ensures that the stateful compressor remains consistent even when send operations are cancelled -- by bdraco.

    Related issues and pull requests on GitHub:
    #11725.
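
    In practice this means a compressed send wrapped in a timeout can no longer corrupt the per-connection compressor. A hedged sketch (the endpoint is hypothetical):

    ```python
    import asyncio

    import aiohttp

    async def main() -> None:
        async with aiohttp.ClientSession() as session:
            async with session.ws_connect("wss://example.com/ws", compress=15) as ws:
                try:
                    # If this send is cancelled mid-compression, 3.13.2 shields
                    # the compress+send task so the stateful compressor stays
                    # consistent for subsequent frames.
                    await asyncio.wait_for(ws.send_str("payload " * 10_000), 0.5)
                except asyncio.TimeoutError:
                    pass
                await ws.send_str("later frames are still well-formed")

    asyncio.run(main())
    ```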


v3.13.1

Compare Source


Features

  • Make configuration options in `AppRunner` also available in `run_app()`
    -- by Cycloctane.

    Related issues and pull requests on GitHub:
    #11633.
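
    Judging by the diff further down, `run_app()` stops enumerating these options itself and forwards them to `AppRunner`, so `AppRunner` keywords pass straight through. A minimal sketch:

    ```python
    from aiohttp import web

    async def hello(request: web.Request) -> web.Response:
        return web.Response(text="hello")

    app = web.Application()
    app.router.add_get("/", hello)

    # keepalive_timeout and shutdown_timeout are AppRunner options; in
    # 3.13.1 run_app() forwards such keywords to AppRunner instead of
    # duplicating them in its own signature.
    web.run_app(app, keepalive_timeout=75.0, shutdown_timeout=60.0)
    ```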

Bug fixes

  • Switched to `backports.zstd` for Python <3.14 and fixed zstd decompression for chunked zstd streams -- by ZhaoMJ.

    Note: users who installed `zstandard` for zstd support on Python <3.14 will now need to install
    `backports.zstd` instead (installing `aiohttp[speedups]` will do this automatically).

    Related issues and pull requests on GitHub:
    #11623.
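
    Nothing changes at the API level; when a zstd module is importable, aiohttp advertises and decodes the encoding transparently. A sketch (the URL is hypothetical):

    ```python
    import asyncio

    import aiohttp

    async def main() -> None:
        async with aiohttp.ClientSession() as session:
            # With zstd available (compression.zstd on Python 3.14+,
            # backports.zstd below), "zstd" is added to Accept-Encoding and
            # zstd responses, including chunked streams, are decompressed
            # transparently.
            async with session.get("https://example.com/data") as resp:
                body = await resp.read()
                print(resp.headers.get("Content-Encoding"), len(body))

    asyncio.run(main())
    ```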

  • Updated `Content-Type` header parsing to return `application/octet-stream` when the header contains invalid syntax
    (see RFC 9110, section 8.3) -- by sgaist.

    Related issues and pull requests on GitHub:
    #10889.
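
    The visible effect is wherever aiohttp parses the header (for example `ClientResponse.content_type`): invalid syntax now maps to the RFC 9110 default rather than `text/plain`. A sketch using the internal helper (internal API, for illustration only):

    ```python
    from aiohttp.helpers import parse_content_type

    # A media type without exactly one "type/subtype" slash is invalid
    # per RFC 9110 section 8.3 and now falls back to the default.
    ctype, params = parse_content_type("not-a-valid-content-type")
    print(ctype)  # application/octet-stream
    ```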

  • Fixed Python 3.14 support when built without zstd support -- by JacobHenner.

    Related issues and pull requests on GitHub:
    #11603.

  • Fixed blocking I/O in the event loop when using netrc authentication by moving the netrc file lookup to an executor -- by bdraco.

    Related issues and pull requests on GitHub:
    #11634.
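
    The fix applies the standard asyncio pattern of pushing blocking file I/O into the default executor. A generic sketch of that pattern (the helper name is hypothetical):

    ```python
    import asyncio
    import netrc

    def _load_netrc_auth(host: str):
        """Blocking: reads and parses ~/.netrc, so it must not run in the loop."""
        try:
            return netrc.netrc().authenticators(host)
        except (FileNotFoundError, netrc.NetrcParseError):
            return None

    async def main() -> None:
        loop = asyncio.get_running_loop()
        # run_in_executor(None, ...) uses the default ThreadPoolExecutor,
        # keeping the event loop responsive while the file is read.
        auth = await loop.run_in_executor(None, _load_netrc_auth, "example.com")
        print(auth)

    asyncio.run(main())
    ```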

  • Fixed routing to a sub-application added via `.add_domain()` not working
    if the same path exists on the parent app -- by Dreamsorcerer.

    Related issues and pull requests on GitHub:
    #11673.
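
    `.add_domain()` mounts a sub-application keyed on the Host header; before this fix, an identical path registered on the parent app could shadow the sub-app route. A minimal sketch (the domain is hypothetical):

    ```python
    from aiohttp import web

    async def parent_status(request: web.Request) -> web.Response:
        return web.Response(text="parent")

    async def api_status(request: web.Request) -> web.Response:
        return web.Response(text="api sub-app")

    app = web.Application()
    app.router.add_get("/status", parent_status)

    subapp = web.Application()
    subapp.router.add_get("/status", api_status)

    # Requests with Host: api.example.com should hit the sub-app even
    # though the parent app also defines /status; 3.13.1 fixes the case
    # where the parent route shadowed the domain-matched sub-app.
    app.add_domain("api.example.com", subapp)

    web.run_app(app)
    ```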

Packaging updates and notes for downstreams

  • Moved core packaging metadata from `setup.cfg` to `pyproject.toml` per PEP 621
    -- by cdce8p.

    Related issues and pull requests on GitHub:
    #9951.


v3.13.0

Compare Source


Features

  • Added support for Python 3.14.

    Related issues and pull requests on GitHub:
    #10851, #10872.

  • Added support for free-threading in Python 3.14+ -- by kumaraditya303.

    Related issues and pull requests on GitHub:
    #11466, #11464.

  • Added support for Zstandard (aka Zstd) compression
    -- by KGuillaume-chaps.

    Related issues and pull requests on GitHub:
    #11161.

  • Added `StreamReader.total_raw_bytes` to check the number of bytes downloaded
    -- by robpats.

    Related issues and pull requests on GitHub:
    #11483.
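
    Per the diff below, `total_raw_bytes` returns the compressed byte count when the body was content-encoded and falls back to `total_bytes` otherwise. A sketch (the URL is hypothetical):

    ```python
    import asyncio

    import aiohttp

    async def main() -> None:
        async with aiohttp.ClientSession() as session:
            async with session.get("https://example.com/archive") as resp:
                body = await resp.read()
                # For a compressed response, total_raw_bytes is the number of
                # bytes actually downloaded, while len(body) is the size
                # after decompression.
                print(resp.content.total_raw_bytes, len(body))

    asyncio.run(main())
    ```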

Bug fixes

  • Fixed the pytest plugin to not use deprecated asyncio policy APIs.

    Related issues and pull requests on GitHub:
    #10851.

  • Updated `Content-Disposition` header parsing to handle trailing semicolons and empty parts
    -- by PLPeeters.

    Related issues and pull requests on GitHub:
    #11243.

  • Fixed a saved `CookieJar` failing to load if cookies have the `partitioned` flag when
    `http.cookies` does not support partitioned cookies -- by Cycloctane.

    Related issues and pull requests on GitHub:
    #11523.

Improved documentation

  • Added Wireup to the third-party libraries list -- by maldoinc.

    Related issues and pull requests on GitHub:
    #11233.

Packaging updates and notes for downstreams

  • The `blockbuster` test dependency is now optional; the corresponding test fixture is disabled when it is unavailable
    -- by musicinybrain.

    Related issues and pull requests on GitHub:
    #11363.

  • Added `riscv64` builds to releases -- by eshattow.

    Related issues and pull requests on GitHub:
    #11425.

Contributor-facing changes

  • Fixed `test_send_compress_text` failing when an alternative zlib implementation
    is used (zlib-ng in the Python 3.14 Windows build) -- by Cycloctane.

    Related issues and pull requests on GitHub:
    #11546.



Configuration

📅 Schedule: Branch creation - Tuesday through Thursday ( * * * * 2-4 ) (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

renovate bot force-pushed the renovate/aiohttp-3-x branch from 79750ad to a52b0d2 on November 7, 2025 at 23:57
github-actions bot commented Nov 7, 2025

[puLL-Merge] - aio-libs/[email protected]

Diff
diff --git .github/dependabot.yml .github/dependabot.yml
index 5d4bbe08db7..39cde1e004a 100644
--- .github/dependabot.yml
+++ .github/dependabot.yml
@@ -25,7 +25,7 @@ updates:
     directory: "/"
     labels:
       - dependencies
-    target-branch: "3.12"
+    target-branch: "3.13"
     schedule:
       interval: "daily"
     open-pull-requests-limit: 10
@@ -37,7 +37,7 @@ updates:
       - dependency-type: "all"
     labels:
       - dependencies
-    target-branch: "3.12"
+    target-branch: "3.13"
     schedule:
       interval: "daily"
     open-pull-requests-limit: 10
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 2f0957306cd..e8f51219d03 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -32,7 +32,7 @@ jobs:
     timeout-minutes: 5
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: >-
@@ -43,11 +43,11 @@ jobs:
         make sync-direct-runtime-deps
         git diff --exit-code -- requirements/runtime-deps.in
     - name: Setup Python
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v6
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -92,18 +92,18 @@ jobs:
     timeout-minutes: 5
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
         path:  vendor/llhttp/build
     - name: Setup NodeJS
       if: steps.cache.outputs.cache-hit != 'true'
-      uses: actions/setup-node@v4
+      uses: actions/setup-node@v5
       with:
         node-version: 18
     - name: Generate llhttp sources
@@ -122,11 +122,7 @@ jobs:
     needs: gen_llhttp
     strategy:
       matrix:
-        # Note that 3.13.4 is broken on Windows which
-        # is why 3.13.5 was rushed out. When 3.13.5 is fully
-        # available, we can remove 3.13.4 from the matrix
-        # and switch it back to 3.13
-        pyver: [3.9, '3.10', '3.11', '3.12', '3.13.3']
+        pyver: [3.9, '3.10', '3.11', '3.12', '3.13', '3.14']
         no-extensions: ['', 'Y']
         os: [ubuntu, macos, windows]
         experimental: [false]
@@ -140,21 +136,21 @@ jobs:
             no-extensions: 'Y'
             os: ubuntu
             experimental: false
-          # - os: ubuntu
-          #   pyver: "3.14"
-          #   experimental: true
-          #   no-extensions: 'Y'
+          - os: ubuntu
+            pyver: "3.14t"
+            no-extensions: ''
+            experimental: false
       fail-fast: true
     runs-on: ${{ matrix.os }}-latest
     continue-on-error: ${{ matrix.experimental }}
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Setup Python ${{ matrix.pyver }}
       id: python-install
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v6
       with:
         allow-prereleases: true
         python-version: ${{ matrix.pyver }}
@@ -164,7 +160,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -174,11 +170,16 @@ jobs:
       run: |
         python -m pip install -U pip wheel setuptools build twine
     - name: Install dependencies
+      env:
+        DEPENDENCY_GROUP: test${{ endsWith(matrix.pyver, 't') && '-ft' || '' }}
       run: |
-        python -m pip install -r requirements/test.in -c requirements/test.txt
+        python -Im pip install -r requirements/${{ env.DEPENDENCY_GROUP }}.in -c requirements/${{ env.DEPENDENCY_GROUP }}.txt
+    - name: Set PYTHON_GIL=0 for free-threading builds
+      if: ${{ endsWith(matrix.pyver, 't') }}
+      run: echo "PYTHON_GIL=0" >> $GITHUB_ENV
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v4
+      uses: actions/download-artifact@v5
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -245,15 +246,15 @@ jobs:
     needs: gen_llhttp
 
     runs-on: ubuntu-latest
-    timeout-minutes: 9
+    timeout-minutes: 12
     steps:
     - name: Checkout project
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Setup Python 3.13.2
       id: python-install
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v6
       with:
         python-version: 3.13.2
         cache: pip
@@ -265,7 +266,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v4
+      uses: actions/download-artifact@v5
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -275,8 +276,9 @@ jobs:
     - name: Install self
       run: python -m pip install -e .
     - name: Run benchmarks
-      uses: CodSpeedHQ/action@v3
+      uses: CodSpeedHQ/action@v4
       with:
+        mode: instrumentation
         token: ${{ secrets.CODSPEED_TOKEN }}
         run: python -Im pytest --no-cov --numprocesses=0 -vvvvv --codspeed
 
@@ -313,11 +315,11 @@ jobs:
     needs: pre-deploy
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Setup Python
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v6
     - name: Update pip, wheel, setuptools, build, twine
       run: |
         python -m pip install -U pip wheel setuptools build twine
@@ -326,7 +328,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v4
+      uses: actions/download-artifact@v5
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -359,6 +361,12 @@ jobs:
         - os: ubuntu-latest
           qemu: ppc64le
           musl: musllinux
+        - os: ubuntu-latest
+          qemu: riscv64
+          musl: ""
+        - os: ubuntu-latest
+          qemu: riscv64
+          musl: musllinux
         - os: ubuntu-latest
           qemu: s390x
           musl: ""
@@ -377,7 +385,7 @@ jobs:
           musl: musllinux
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Set up QEMU
@@ -399,7 +407,7 @@ jobs:
         fi
       shell: bash
     - name: Setup Python
-      uses: actions/setup-python@v5
+      uses: actions/setup-python@v6
       with:
         python-version: 3.x
     - name: Update pip, wheel, setuptools, build, twine
@@ -410,7 +418,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v4
+      uses: actions/download-artifact@v5
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -418,7 +426,7 @@ jobs:
       run: |
         make cythonize
     - name: Build wheels
-      uses: pypa/[email protected]
+      uses: pypa/[email protected]
       env:
         CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
@@ -448,14 +456,14 @@ jobs:
 
     steps:
     - name: Checkout
-      uses: actions/checkout@v4
+      uses: actions/checkout@v5
       with:
         submodules: true
     - name: Login
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v4
+      uses: actions/download-artifact@v5
       with:
         path: dist
         pattern: dist-*
@@ -481,7 +489,7 @@ jobs:
       uses: pypa/gh-action-pypi-publish@release/v1
 
     - name: Sign the dists with Sigstore
-      uses: sigstore/[email protected]
+      uses: sigstore/[email protected]
       with:
         inputs: >-
           ./dist/*.tar.gz
diff --git .github/workflows/codeql.yml .github/workflows/codeql.yml
index 601d45a35ad..be954079132 100644
--- .github/workflows/codeql.yml
+++ .github/workflows/codeql.yml
@@ -26,7 +26,7 @@ jobs:
 
     steps:
       - name: Checkout
-        uses: actions/checkout@v4
+        uses: actions/checkout@v5
 
       - name: Initialize CodeQL
         uses: github/codeql-action/init@v3
diff --git .github/workflows/labels.yml .github/workflows/labels.yml
index 8d9c0f6f4a2..e3f10214082 100644
--- .github/workflows/labels.yml
+++ .github/workflows/labels.yml
@@ -11,7 +11,7 @@ jobs:
     name: Backport label added
     if: ${{ github.event.pull_request.user.type != 'Bot' }}
     steps:
-      - uses: actions/github-script@v7
+      - uses: actions/github-script@v8
         with:
           github-token: ${{ secrets.GITHUB_TOKEN }}
           script: |
diff --git .github/workflows/stale.yml .github/workflows/stale.yml
index ef1b86cfa69..8a56a2b7b80 100644
--- .github/workflows/stale.yml
+++ .github/workflows/stale.yml
@@ -10,7 +10,7 @@ jobs:
   stale:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/stale@v9
+      - uses: actions/stale@v10
         with:
           days-before-stale: 30
           any-of-labels: needs-info
diff --git .pre-commit-config.yaml .pre-commit-config.yaml
index 0edf03d8db7..b5a67394b80 100644
--- .pre-commit-config.yaml
+++ .pre-commit-config.yaml
@@ -55,6 +55,10 @@ repos:
   rev: v1.5.0
   hooks:
   - id: yesqa
+    additional_dependencies:
+      - flake8-docstrings==1.6.0
+      - flake8-no-implicit-concat==0.3.4
+      - flake8-requirements==1.7.8
 - repo: https://github.com/PyCQA/isort
   rev: '5.13.2'
   hooks:
diff --git CHANGES.rst CHANGES.rst
index 88826347584..fd193db6959 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,243 @@
 
 .. towncrier release notes start
 
+3.13.2 (2025-10-28)
+===================
+
+Bug fixes
+---------
+
+- Fixed cookie parser to continue parsing subsequent cookies when encountering a malformed cookie that fails regex validation, such as Google's ``g_state`` cookie with unescaped quotes -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11632`.
+
+
+
+- Fixed loading netrc credentials from the default :file:`~/.netrc` (:file:`~/_netrc` on Windows) location when the :envvar:`NETRC` environment variable is not set -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11713`, :issue:`11714`.
+
+
+
+- Fixed WebSocket compressed sends to be cancellation safe. Tasks are now shielded during compression to prevent compressor state corruption. This ensures that the stateful compressor remains consistent even when send operations are cancelled -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11725`.
+
+
+
+
+----
+
+
+3.13.1 (2025-10-17)
+===================
+
+Features
+--------
+
+- Make configuration options in ``AppRunner`` also available in ``run_app()``
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11633`.
+
+
+
+Bug fixes
+---------
+
+- Switched to `backports.zstd` for Python <3.14 and fixed zstd decompression for chunked zstd streams -- by :user:`ZhaoMJ`.
+
+  Note: Users who installed ``zstandard`` for support on Python <3.14 will now need to install
+  ``backports.zstd`` instead (installing ``aiohttp[speedups]`` will do this automatically).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11623`.
+
+
+
+- Updated ``Content-Type`` header parsing to return ``application/octet-stream`` when header contains invalid syntax.
+  See :rfc:`9110#section-8.3-5`.
+
+  -- by :user:`sgaist`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10889`.
+
+
+
+- Fixed Python 3.14 support when built without ``zstd`` support -- by :user:`JacobHenner`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11603`.
+
+
+
+- Fixed blocking I/O in the event loop when using netrc authentication by moving netrc file lookup to an executor -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11634`.
+
+
+
+- Fixed routing to a sub-application added via ``.add_domain()`` not working
+  if the same path exists on the parent app. -- by :user:`Dreamsorcerer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11673`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Moved core packaging metadata from :file:`setup.cfg` to :file:`pyproject.toml` per :pep:`621`
+  -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`9951`.
+
+
+
+
+----
+
+
+3.13.0 (2025-10-06)
+===================
+
+Features
+--------
+
+- Added support for Python 3.14.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10851`, :issue:`10872`.
+
+
+
+- Added support for free-threading in Python 3.14+ -- by :user:`kumaraditya303`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11466`, :issue:`11464`.
+
+
+
+- Added support for Zstandard (aka Zstd) compression
+  -- by :user:`KGuillaume-chaps`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11161`.
+
+
+
+- Added ``StreamReader.total_raw_bytes`` to check the number of bytes downloaded
+  -- by :user:`robpats`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11483`.
+
+
+
+Bug fixes
+---------
+
+- Fixed pytest plugin to not use deprecated :py:mod:`asyncio` policy APIs.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10851`.
+
+
+
+- Updated `Content-Disposition` header parsing to handle trailing semicolons and empty parts
+  -- by :user:`PLPeeters`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11243`.
+
+
+
+- Fixed saved ``CookieJar`` failing to be loaded if cookies have ``partitioned`` flag when
+  ``http.cookie`` does not have partitioned cookies supports. -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11523`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``Wireup`` to third-party libraries -- by :user:`maldoinc`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11233`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- The `blockbuster` test dependency is now optional; the corresponding test fixture is disabled when it is unavailable
+  -- by :user:`musicinybrain`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11363`.
+
+
+
+- Added ``riscv64`` build to releases -- by :user:`eshattow`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11425`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+
+
+- Fixed ``test_send_compress_text`` failing when alternative zlib implementation
+  is used. (``zlib-ng`` in python 3.14 windows build) -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`11546`.
+
+
+
+
+----
+
+
 3.12.15 (2025-07-28)
 ====================
 
@@ -4425,7 +4662,7 @@ Bugfixes
   `#5853 <https://github.com/aio-libs/aiohttp/issues/5853>`_
 - Added ``params`` keyword argument to ``ClientSession.ws_connect``. --  :user:`hoh`.
   `#5868 <https://github.com/aio-libs/aiohttp/issues/5868>`_
-- Uses :py:class:`~asyncio.ThreadedChildWatcher` under POSIX to allow setting up test loop in non-main thread.
+- Uses ``asyncio.ThreadedChildWatcher`` under POSIX to allow setting up test loop in non-main thread.
   `#5877 <https://github.com/aio-libs/aiohttp/issues/5877>`_
 - Fix the error in handling the return value of `getaddrinfo`.
   `getaddrinfo` will return an `(int, bytes)` tuple, if CPython could not handle the address family.
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 941beea7b87..290be0205f1 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -170,6 +170,7 @@ Ivan Lakovic
 Ivan Larin
 J. Nick Koston
 Jacob Champion
+Jacob Henner
 Jaesung Lee
 Jake Davis
 Jakob Ackermann
@@ -212,9 +213,11 @@ Justin Foo
 Justin Turner Arthur
 Kay Zheng
 Kevin Samuel
+Kilian Guillaume
 Kimmo Parviainen-Jalanko
 Kirill Klenov
 Kirill Malovitsa
+Kirill Potapenko
 Konstantin Shutkin
 Konstantin Valetov
 Krzysztof Blazewicz
@@ -260,6 +263,7 @@ Mikhail Burshteyn
 Mikhail Kashkin
 Mikhail Lukyanchenko
 Mikhail Nacharov
+Mingjie Zhao
 Misha Behersky
 Mitchell Ferree
 Morgan Delahaye-Prat
@@ -277,6 +281,7 @@ Pahaz Blinov
 Panagiotis Kolokotronis
 Pankaj Pandey
 Parag Jain
+Patrick Lee
 Pau Freixes
 Paul Colomiets
 Paul J. Dorn
@@ -306,6 +311,7 @@ Roman Postnov
 Rong Zhang
 Samir Akarioh
 Samuel Colvin
+Samuel Gaist
 Sean Hunt
 Sebastian Acuna
 Sebastian Hanula
diff --git MANIFEST.in MANIFEST.in
index 64cee139a1f..ea5d39d4722 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -9,8 +9,7 @@ graft examples
 graft tests
 graft tools
 graft requirements
-recursive-include vendor *
-global-include aiohttp *.pyi
+graft vendor
 global-exclude *.pyc
 global-exclude *.pyd
 global-exclude *.so
diff --git Makefile Makefile
index c6193fea9e4..cf621705e2e 100644
--- Makefile
+++ Makefile
@@ -59,14 +59,14 @@ aiohttp/_find_header.c: $(call to-hash,aiohttp/hdrs.py ./tools/gen.py)
 # Special case for reader since we want to be able to disable
 # the extension with AIOHTTP_NO_EXTENSIONS
 aiohttp/_websocket/reader_c.c: aiohttp/_websocket/reader_c.py
-	cython -3 -o $@ $< -I aiohttp -Werror
+	cython -3 -X freethreading_compatible=True -o $@ $< -I aiohttp -Werror
 
 # _find_headers generator creates _headers.pyi as well
 aiohttp/%.c: aiohttp/%.pyx $(call to-hash,$(CYS)) aiohttp/_find_header.c
-	cython -3 -o $@ $< -I aiohttp -Werror
+	cython -3 -X freethreading_compatible=True -o $@ $< -I aiohttp -Werror
 
 aiohttp/_websocket/%.c: aiohttp/_websocket/%.pyx $(call to-hash,$(CYS))
-	cython -3 -o $@ $< -I aiohttp -Werror
+	cython -3 -X freethreading_compatible=True -o $@ $< -I aiohttp -Werror
 
 vendor/llhttp/node_modules: vendor/llhttp/package.json
 	cd vendor/llhttp; npm ci
diff --git README.rst README.rst
index 554627a42e7..e6f428640da 100644
--- README.rst
+++ README.rst
@@ -17,25 +17,21 @@ Async http client/server framework
    :target: https://codecov.io/gh/aio-libs/aiohttp
    :alt: codecov.io status for master branch
 
-.. image:: https://img.shields.io/endpoint?url=https://codspeed.io/badge.json
-   :target: https://codspeed.io/aio-libs/aiohttp
-   :alt: Codspeed.io status for aiohttp
-
 .. image:: https://badge.fury.io/py/aiohttp.svg
    :target: https://pypi.org/project/aiohttp
    :alt: Latest PyPI package version
 
+.. image:: https://img.shields.io/pypi/dm/aiohttp
+   :target: https://pypistats.org/packages/aiohttp
+   :alt: Downloads count
+
 .. image:: https://readthedocs.org/projects/aiohttp/badge/?version=latest
    :target: https://docs.aiohttp.org/
    :alt: Latest Read The Docs
 
-.. image:: https://img.shields.io/matrix/aio-libs:matrix.org?label=Discuss%20on%20Matrix%20at%20%23aio-libs%3Amatrix.org&logo=matrix&server_fqdn=matrix.org&style=flat
-   :target: https://matrix.to/#/%23aio-libs:matrix.org
-   :alt: Matrix Room — #aio-libs:matrix.org
-
-.. image:: https://img.shields.io/matrix/aio-libs-space:matrix.org?label=Discuss%20on%20Matrix%20at%20%23aio-libs-space%3Amatrix.org&logo=matrix&server_fqdn=matrix.org&style=flat
-   :target: https://matrix.to/#/%23aio-libs-space:matrix.org
-   :alt: Matrix Space — #aio-libs-space:matrix.org
+.. image:: https://img.shields.io/endpoint?url=https://codspeed.io/badge.json
+   :target: https://codspeed.io/aio-libs/aiohttp
+   :alt: Codspeed.io status for aiohttp
 
 
 Key Features
@@ -201,3 +197,17 @@ Benchmarks
 If you are interested in efficiency, the AsyncIO community maintains a
 list of benchmarks on the official wiki:
 https://github.com/python/asyncio/wiki/Benchmarks
+
+--------
+
+.. image:: https://img.shields.io/matrix/aio-libs:matrix.org?label=Discuss%20on%20Matrix%20at%20%23aio-libs%3Amatrix.org&logo=matrix&server_fqdn=matrix.org&style=flat
+   :target: https://matrix.to/#/%23aio-libs:matrix.org
+   :alt: Matrix Room — #aio-libs:matrix.org
+
+.. image:: https://img.shields.io/matrix/aio-libs-space:matrix.org?label=Discuss%20on%20Matrix%20at%20%23aio-libs-space%3Amatrix.org&logo=matrix&server_fqdn=matrix.org&style=flat
+   :target: https://matrix.to/#/%23aio-libs-space:matrix.org
+   :alt: Matrix Space — #aio-libs-space:matrix.org
+
+.. image:: https://insights.linuxfoundation.org/api/badge/health-score?project=aiohttp
+   :target: https://insights.linuxfoundation.org/project/aiohttp
+   :alt: LFX Health Score
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 0ca22003937..396d5024dd6 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.12.15"
+__version__ = "3.13.2"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_cookie_helpers.py aiohttp/_cookie_helpers.py
index 4e9fc968814..837893e5626 100644
--- aiohttp/_cookie_helpers.py
+++ aiohttp/_cookie_helpers.py
@@ -6,7 +6,6 @@
 """
 
 import re
-import sys
 from http.cookies import Morsel
 from typing import List, Optional, Sequence, Tuple, cast
 
@@ -166,7 +165,10 @@ def parse_cookie_header(header: str) -> List[Tuple[str, Morsel[str]]]:
     attribute names (like 'path' or 'secure') should be treated as cookies.
 
     This parser uses the same regex-based approach as parse_set_cookie_headers
-    to properly handle quoted values that may contain semicolons.
+    to properly handle quoted values that may contain semicolons. When the
+    regex fails to match a malformed cookie, it falls back to simple parsing
+    to ensure subsequent cookies are not lost
+    https://github.com/aio-libs/aiohttp/issues/11632
 
     Args:
         header: The Cookie header value to parse
@@ -178,6 +180,7 @@ def parse_cookie_header(header: str) -> List[Tuple[str, Morsel[str]]]:
         return []
 
     cookies: List[Tuple[str, Morsel[str]]] = []
+    morsel: Morsel[str]
     i = 0
     n = len(header)
 
@@ -185,7 +188,32 @@ def parse_cookie_header(header: str) -> List[Tuple[str, Morsel[str]]]:
         # Use the same pattern as parse_set_cookie_headers to find cookies
         match = _COOKIE_PATTERN.match(header, i)
         if not match:
-            break
+            # Fallback for malformed cookies https://github.com/aio-libs/aiohttp/issues/11632
+            # Find next semicolon to skip or attempt simple key=value parsing
+            next_semi = header.find(";", i)
+            eq_pos = header.find("=", i)
+
+            # Try to extract key=value if '=' comes before ';'
+            if eq_pos != -1 and (next_semi == -1 or eq_pos < next_semi):
+                end_pos = next_semi if next_semi != -1 else n
+                key = header[i:eq_pos].strip()
+                value = header[eq_pos + 1 : end_pos].strip()
+
+                # Validate the name (same as regex path)
+                if not _COOKIE_NAME_RE.match(key):
+                    internal_logger.warning(
+                        "Can not load cookie: Illegal cookie name %r", key
+                    )
+                else:
+                    morsel = Morsel()
+                    morsel.__setstate__(  # type: ignore[attr-defined]
+                        {"key": key, "value": _unquote(value), "coded_value": value}
+                    )
+                    cookies.append((key, morsel))
+
+            # Move to next cookie or end
+            i = next_semi + 1 if next_semi != -1 else n
+            continue
 
         key = match.group("key")
         value = match.group("val") or ""
@@ -197,7 +225,7 @@ def parse_cookie_header(header: str) -> List[Tuple[str, Morsel[str]]]:
             continue
 
         # Create new morsel
-        morsel: Morsel[str] = Morsel()
+        morsel = Morsel()
         # Preserve the original value as coded_value (with quotes if present)
         # We use __setstate__ instead of the public set() API because it allows us to
         # bypass validation and set already validated state. This is more stable than
@@ -270,11 +298,8 @@ def parse_set_cookie_headers(headers: Sequence[str]) -> List[Tuple[str, Morsel[s
                     break
                 if lower_key in _COOKIE_BOOL_ATTRS:
                     # Boolean attribute with any value should be True
-                    if current_morsel is not None:
-                        if lower_key == "partitioned" and sys.version_info < (3, 14):
-                            dict.__setitem__(current_morsel, lower_key, True)
-                        else:
-                            current_morsel[lower_key] = True
+                    if current_morsel is not None and current_morsel.isReservedKey(key):
+                        current_morsel[lower_key] = True
                 elif value is None:
                     # Invalid cookie string - non-boolean attribute without value
                     break
diff --git aiohttp/_http_parser.pyx aiohttp/_http_parser.pyx
index 16893f00e74..b2ddeb30d83 100644
--- aiohttp/_http_parser.pyx
+++ aiohttp/_http_parser.pyx
@@ -1,5 +1,3 @@
-#cython: language_level=3
-#
 # Based on https://github.com/MagicStack/httptools
 #
 
@@ -437,7 +435,7 @@ cdef class HttpParser:
         if enc is not None:
             self._content_encoding = None
             enc = enc.lower()
-            if enc in ('gzip', 'deflate', 'br'):
+            if enc in ('gzip', 'deflate', 'br', 'zstd'):
                 encoding = enc
 
         if self._cparser.type == cparser.HTTP_REQUEST:
diff --git aiohttp/_http_writer.pyx aiohttp/_http_writer.pyx
index 4a3ae1f9e68..7989c186c89 100644
--- aiohttp/_http_writer.pyx
+++ aiohttp/_http_writer.pyx
@@ -8,7 +8,6 @@ from libc.string cimport memcpy
 from multidict import istr
 
 DEF BUF_SIZE = 16 * 1024  # 16KiB
-cdef char BUFFER[BUF_SIZE]
 
 cdef object _istr = istr
 
@@ -19,16 +18,17 @@ cdef struct Writer:
     char *buf
     Py_ssize_t size
     Py_ssize_t pos
+    bint heap_allocated
 
-
-cdef inline void _init_writer(Writer* writer):
-    writer.buf = &BUFFER[0]
+cdef inline void _init_writer(Writer* writer, char *buf):
+    writer.buf = buf
     writer.size = BUF_SIZE
     writer.pos = 0
+    writer.heap_allocated = 0
 
 
 cdef inline void _release_writer(Writer* writer):
-    if writer.buf != BUFFER:
+    if writer.heap_allocated:
         PyMem_Free(writer.buf)
 
 
@@ -39,7 +39,7 @@ cdef inline int _write_byte(Writer* writer, uint8_t ch):
     if writer.pos == writer.size:
         # reallocate
         size = writer.size + BUF_SIZE
-        if writer.buf == BUFFER:
+        if not writer.heap_allocated:
             buf = <char*>PyMem_Malloc(size)
             if buf == NULL:
                 PyErr_NoMemory()
@@ -52,6 +52,7 @@ cdef inline int _write_byte(Writer* writer, uint8_t ch):
                 return -1
         writer.buf = buf
         writer.size = size
+        writer.heap_allocated = 1
     writer.buf[writer.pos] = <char>ch
     writer.pos += 1
     return 0
@@ -125,8 +126,9 @@ def _serialize_headers(str status_line, headers):
     cdef Writer writer
     cdef object key
     cdef object val
+    cdef char buf[BUF_SIZE]
 
-    _init_writer(&writer)
+    _init_writer(&writer, buf)
 
     try:
         if _write_str(&writer, status_line) < 0:
diff --git aiohttp/_websocket/writer.py aiohttp/_websocket/writer.py
index 19163f9afdf..9604202357c 100644
--- aiohttp/_websocket/writer.py
+++ aiohttp/_websocket/writer.py
@@ -2,8 +2,9 @@
 
 import asyncio
 import random
+import sys
 from functools import partial
-from typing import Any, Final, Optional, Union
+from typing import Final, Optional, Set, Union
 
 from ..base_protocol import BaseProtocol
 from ..client_exceptions import ClientConnectionResetError
@@ -22,14 +23,18 @@
 
 DEFAULT_LIMIT: Final[int] = 2**16
 
+# WebSocket opcode boundary: opcodes 0-7 are data frames, 8-15 are control frames
+# Control frames (ping, pong, close) are never compressed
+WS_CONTROL_FRAME_OPCODE: Final[int] = 8
+
 # For websockets, keeping latency low is extremely important as implementations
-# generally expect to be able to send and receive messages quickly.  We use a
-# larger chunk size than the default to reduce the number of executor calls
-# since the executor is a significant source of latency and overhead when
-# the chunks are small. A size of 5KiB was chosen because it is also the
-# same value python-zlib-ng choose to use as the threshold to release the GIL.
+# generally expect to be able to send and receive messages quickly. We use a
+# larger chunk size to reduce the number of executor calls and avoid task
+# creation overhead, since both are significant sources of latency when chunks
+# are small. A size of 16KiB was chosen as a balance between avoiding task
+# overhead and not blocking the event loop too long with synchronous compression.
 
-WEBSOCKET_MAX_SYNC_CHUNK_SIZE = 5 * 1024
+WEBSOCKET_MAX_SYNC_CHUNK_SIZE = 16 * 1024
 
 
 class WebSocketWriter:
@@ -62,7 +67,9 @@ def __init__(
         self._closing = False
         self._limit = limit
         self._output_size = 0
-        self._compressobj: Any = None  # actually compressobj
+        self._compressobj: Optional[ZLibCompressor] = None
+        self._send_lock = asyncio.Lock()
+        self._background_tasks: Set[asyncio.Task[None]] = set()
 
     async def send_frame(
         self, message: bytes, opcode: int, compress: Optional[int] = None
@@ -71,39 +78,57 @@ async def send_frame(
         if self._closing and not (opcode & WSMsgType.CLOSE):
             raise ClientConnectionResetError("Cannot write to closing transport")
 
-        # RSV are the reserved bits in the frame header. They are used to
-        # indicate that the frame is using an extension.
-        # https://datatracker.ietf.org/doc/html/rfc6455#section-5.2
-        rsv = 0
-        # Only compress larger packets (disabled)
-        # Does small packet needs to be compressed?
-        # if self.compress and opcode < 8 and len(message) > 124:
-        if (compress or self.compress) and opcode < 8:
-            # RSV1 (rsv = 0x40) is set for compressed frames
-            # https://datatracker.ietf.org/doc/html/rfc7692#section-7.2.3.1
-            rsv = 0x40
-
-            if compress:
-                # Do not set self._compress if compressing is for this frame
-                compressobj = self._make_compress_obj(compress)
-            else:  # self.compress
-                if not self._compressobj:
-                    self._compressobj = self._make_compress_obj(self.compress)
-                compressobj = self._compressobj
-
-            message = (
-                await compressobj.compress(message)
-                + compressobj.flush(
-                    ZLibBackend.Z_FULL_FLUSH
-                    if self.notakeover
-                    else ZLibBackend.Z_SYNC_FLUSH
-                )
-            ).removesuffix(WS_DEFLATE_TRAILING)
-            # Its critical that we do not return control to the event
-            # loop until we have finished sending all the compressed
-            # data. Otherwise we could end up mixing compressed frames
-            # if there are multiple coroutines compressing data.
+        if not (compress or self.compress) or opcode >= WS_CONTROL_FRAME_OPCODE:
+            # Non-compressed frames don't need lock or shield
+            self._write_websocket_frame(message, opcode, 0)
+        elif len(message) <= WEBSOCKET_MAX_SYNC_CHUNK_SIZE:
+            # Small compressed payloads - compress synchronously in event loop
+            # We need the lock even though sync compression has no await points.
+            # This prevents small frames from interleaving with large frames that
+            # compress in the executor, avoiding compressor state corruption.
+            async with self._send_lock:
+                self._send_compressed_frame_sync(message, opcode, compress)
+        else:
+            # Large compressed frames need shield to prevent corruption
+            # For large compressed frames, the entire compress+send
+            # operation must be atomic. If cancelled after compression but
+            # before send, the compressor state would be advanced but data
+            # not sent, corrupting subsequent frames.
+            # Create a task to shield from cancellation
+            # The lock is acquired inside the shielded task so the entire
+            # operation (lock + compress + send) completes atomically.
+            # Use eager_start on Python 3.12+ to avoid scheduling overhead
+            loop = asyncio.get_running_loop()
+            coro = self._send_compressed_frame_async_locked(message, opcode, compress)
+            if sys.version_info >= (3, 12):
+                send_task = asyncio.Task(coro, loop=loop, eager_start=True)
+            else:
+                send_task = loop.create_task(coro)
+            # Keep a strong reference to prevent garbage collection
+            self._background_tasks.add(send_task)
+            send_task.add_done_callback(self._background_tasks.discard)
+            await asyncio.shield(send_task)
+
+        # It is safe to return control to the event loop when using compression
+        # after this point as we have already sent or buffered all the data.
+        # Once we have written output_size up to the limit, we call the
+        # drain helper which waits for the transport to be ready to accept
+        # more data. This is a flow control mechanism to prevent the buffer
+        # from growing too large. The drain helper will return right away
+        # if the writer is not paused.
+        if self._output_size > self._limit:
+            self._output_size = 0
+            if self.protocol._paused:
+                await self.protocol._drain_helper()
 
+    def _write_websocket_frame(self, message: bytes, opcode: int, rsv: int) -> None:
+        """
+        Write a websocket frame to the transport.
+
+        This method handles frame header construction, masking, and writing to transport.
+        It does not handle compression or flow control - those are the responsibility
+        of the caller.
+        """
         msg_length = len(message)
 
         use_mask = self.use_mask
@@ -146,26 +171,85 @@ async def send_frame(
 
         self._output_size += header_len + msg_length
 
-        # It is safe to return control to the event loop when using compression
-        # after this point as we have already sent or buffered all the data.
+    def _get_compressor(self, compress: Optional[int]) -> ZLibCompressor:
+        """Get or create a compressor object for the given compression level."""
+        if compress:
+            # Do not set self._compress if compressing is for this frame
+            return ZLibCompressor(
+                level=ZLibBackend.Z_BEST_SPEED,
+                wbits=-compress,
+                max_sync_chunk_size=WEBSOCKET_MAX_SYNC_CHUNK_SIZE,
+            )
+        if not self._compressobj:
+            self._compressobj = ZLibCompressor(
+                level=ZLibBackend.Z_BEST_SPEED,
+                wbits=-self.compress,
+                max_sync_chunk_size=WEBSOCKET_MAX_SYNC_CHUNK_SIZE,
+            )
+        return self._compressobj
 
-        # Once we have written output_size up to the limit, we call the
-        # drain helper which waits for the transport to be ready to accept
-        # more data. This is a flow control mechanism to prevent the buffer
-        # from growing too large. The drain helper will return right away
-        # if the writer is not paused.
-        if self._output_size > self._limit:
-            self._output_size = 0
-            if self.protocol._paused:
-                await self.protocol._drain_helper()
+    def _send_compressed_frame_sync(
+        self, message: bytes, opcode: int, compress: Optional[int]
+    ) -> None:
+        """
+        Synchronous send for small compressed frames.
 
-    def _make_compress_obj(self, compress: int) -> ZLibCompressor:
-        return ZLibCompressor(
-            level=ZLibBackend.Z_BEST_SPEED,
-            wbits=-compress,
-            max_sync_chunk_size=WEBSOCKET_MAX_SYNC_CHUNK_SIZE,
+        This is used for small compressed payloads that compress synchronously in the event loop.
+        Since there are no await points, this is inherently cancellation-safe.
+        """
+        # RSV are the reserved bits in the frame header. They are used to
+        # indicate that the frame is using an extension.
+        # https://datatracker.ietf.org/doc/html/rfc6455#section-5.2
+        compressobj = self._get_compressor(compress)
+        # (0x40) RSV1 is set for compressed frames
+        # https://datatracker.ietf.org/doc/html/rfc7692#section-7.2.3.1
+        self._write_websocket_frame(
+            (
+                compressobj.compress_sync(message)
+                + compressobj.flush(
+                    ZLibBackend.Z_FULL_FLUSH
+                    if self.notakeover
+                    else ZLibBackend.Z_SYNC_FLUSH
+                )
+            ).removesuffix(WS_DEFLATE_TRAILING),
+            opcode,
+            0x40,
         )
 
+    async def _send_compressed_frame_async_locked(
+        self, message: bytes, opcode: int, compress: Optional[int]
+    ) -> None:
+        """
+        Async send for large compressed frames with lock.
+
+        Acquires the lock and compresses large payloads asynchronously in
+        the executor. The lock is held for the entire operation to ensure
+        the compressor state is not corrupted by concurrent sends.
+
+        MUST be run shielded from cancellation. If cancelled after
+        compression but before sending, the compressor state would be
+        advanced but data not sent, corrupting subsequent frames.
+        """
+        async with self._send_lock:
+            # RSV are the reserved bits in the frame header. They are used to
+            # indicate that the frame is using an extension.
+            # https://datatracker.ietf.org/doc/html/rfc6455#section-5.2
+            compressobj = self._get_compressor(compress)
+            # (0x40) RSV1 is set for compressed frames
+            # https://datatracker.ietf.org/doc/html/rfc7692#section-7.2.3.1
+            self._write_websocket_frame(
+                (
+                    await compressobj.compress(message)
+                    + compressobj.flush(
+                        ZLibBackend.Z_FULL_FLUSH
+                        if self.notakeover
+                        else ZLibBackend.Z_SYNC_FLUSH
+                    )
+                ).removesuffix(WS_DEFLATE_TRAILING),
+                opcode,
+                0x40,
+            )
+
     async def close(self, code: int = 1000, message: Union[bytes, str] = b"") -> None:
         """Close the websocket, sending the specified code and message."""
         if isinstance(message, str):
diff --git aiohttp/abc.py aiohttp/abc.py
index 2574ff93621..faf09575afb 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -122,7 +122,7 @@ def request(self) -> Request:
         return self._request
 
     @abstractmethod
-    def __await__(self) -> Generator[Any, None, StreamResponse]:
+    def __await__(self) -> Generator[None, None, StreamResponse]:
         """Execute the view handler."""
 
 
diff --git aiohttp/client.py aiohttp/client.py
index 0c72d5948ce..bc4ee17caf0 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -98,7 +98,9 @@
     EMPTY_BODY_METHODS,
     BasicAuth,
     TimeoutHandle,
+    basicauth_from_netrc,
     get_env_proxy_for_url,
+    netrc_from_env,
     sentinel,
     strip_auth_from_url,
 )
@@ -657,6 +659,13 @@ async def _request(
                         )
                     ):
                         auth = self._default_auth
+
+                    # Try netrc if auth is still None and trust_env is enabled.
+                    if auth is None and self._trust_env and url.host is not None:
+                        auth = await self._loop.run_in_executor(
+                            None, self._get_netrc_auth, url.host
+                        )
+
                     # It would be confusing if we support explicit
                     # Authorization header with auth argument
                     if (
@@ -1211,6 +1220,19 @@ def _prepare_headers(self, headers: Optional[LooseHeaders]) -> "CIMultiDict[str]
                     added_names.add(key)
         return result
 
+    def _get_netrc_auth(self, host: str) -> Optional[BasicAuth]:
+        """
+        Get auth from netrc for the given host.
+
+        This method is designed to be called in an executor to avoid
+        blocking I/O in the event loop.
+        """
+        netrc_obj = netrc_from_env()
+        try:
+            return basicauth_from_netrc(netrc_obj, host)
+        except LookupError:
+            return None
+
     if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
         def get(
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index 2586119b288..a9e0795893d 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -45,7 +45,7 @@
     InvalidURL,
     ServerFingerprintMismatch,
 )
-from .compression_utils import HAS_BROTLI
+from .compression_utils import HAS_BROTLI, HAS_ZSTD
 from .formdata import FormData
 from .helpers import (
     _SENTINEL,
@@ -53,10 +53,9 @@
     BasicAuth,
     HeadersMixin,
     TimerNoop,
-    basicauth_from_netrc,
-    netrc_from_env,
     noop,
     reify,
+    sentinel,
     set_exception,
     set_result,
 )
@@ -104,7 +103,15 @@
 
 
 def _gen_default_accept_encoding() -> str:
-    return "gzip, deflate, br" if HAS_BROTLI else "gzip, deflate"
+    encodings = [
+        "gzip",
+        "deflate",
+    ]
+    if HAS_BROTLI:
+        encodings.append("br")
+    if HAS_ZSTD:
+        encodings.append("zstd")
+    return ", ".join(encodings)
 
 
 @attr.s(auto_attribs=True, frozen=True, slots=True)
@@ -128,14 +135,14 @@ def __new__(
         url: URL,
         method: str,
         headers: "CIMultiDictProxy[str]",
-        real_url: URL = _SENTINEL,  # type: ignore[assignment]
+        real_url: Union[URL, _SENTINEL] = sentinel,
     ) -> "RequestInfo":
         """Create a new RequestInfo instance.
 
         For backwards compatibility, the real_url parameter is optional.
         """
         return tuple.__new__(
-            cls, (url, method, headers, url if real_url is _SENTINEL else real_url)
+            cls, (url, method, headers, url if real_url is sentinel else real_url)
         )
 
 
@@ -1155,10 +1162,6 @@ def update_auth(self, auth: Optional[BasicAuth], trust_env: bool = False) -> Non
         """Set basic auth."""
         if auth is None:
             auth = self.auth
-        if auth is None and trust_env and self.url.host is not None:
-            netrc_obj = netrc_from_env()
-            with contextlib.suppress(LookupError):
-                auth = basicauth_from_netrc(netrc_obj, self.url.host)
         if auth is None:
             return
 
diff --git aiohttp/compression_utils.py aiohttp/compression_utils.py
index f08c3d9cdff..c51fc524f98 100644
--- aiohttp/compression_utils.py
+++ aiohttp/compression_utils.py
@@ -21,6 +21,17 @@
 except ImportError:  # pragma: no cover
     HAS_BROTLI = False
 
+try:
+    if sys.version_info >= (3, 14):
+        from compression.zstd import ZstdDecompressor  # noqa: I900
+    else:  # TODO(PY314): Remove mentions of backports.zstd across codebase
+        from backports.zstd import ZstdDecompressor
+
+    HAS_ZSTD = True
+except ImportError:
+    HAS_ZSTD = False
+
+
 MAX_SYNC_CHUNK_SIZE = 1024
 
 
@@ -174,7 +185,6 @@ def __init__(
         if level is not None:
             kwargs["level"] = level
         self._compressor = self._zlib_backend.compressobj(**kwargs)
-        self._compress_lock = asyncio.Lock()
 
     def compress_sync(self, data: bytes) -> bytes:
         return self._compressor.compress(data)
@@ -187,22 +197,37 @@ async def compress(self, data: bytes) -> bytes:
         If the data size is large than the max_sync_chunk_size, the compression
         will be done in the executor. Otherwise, the compression will be done
         in the event loop.
+
+        **WARNING: This method is NOT cancellation-safe when used with flush().**
+        If this operation is cancelled, the compressor state may be corrupted.
+        The connection MUST be closed after cancellation to avoid data corruption
+        in subsequent compress operations.
+
+        For cancellation-safe compression (e.g., WebSocket), the caller MUST wrap
+        compress() + flush() + send operations in a shield and lock to ensure atomicity.
         """
-        async with self._compress_lock:
-            # To ensure the stream is consistent in the event
-            # there are multiple writers, we need to lock
-            # the compressor so that only one writer can
-            # compress at a time.
-            if (
-                self._max_sync_chunk_size is not None
-                and len(data) > self._max_sync_chunk_size
-            ):
-                return await asyncio.get_running_loop().run_in_executor(
-                    self._executor, self._compressor.compress, data
-                )
-            return self.compress_sync(data)
+        # For large payloads, offload compression to executor to avoid blocking event loop
+        should_use_executor = (
+            self._max_sync_chunk_size is not None
+            and len(data) > self._max_sync_chunk_size
+        )
+        if should_use_executor:
+            return await asyncio.get_running_loop().run_in_executor(
+                self._executor, self._compressor.compress, data
+            )
+        return self.compress_sync(data)
 
     def flush(self, mode: Optional[int] = None) -> bytes:
+        """Flush the compressor synchronously.
+
+        **WARNING: This method is NOT cancellation-safe when called after compress().**
+        The flush() operation accesses shared compressor state. If compress() was
+        cancelled, calling flush() may result in corrupted data. The connection MUST
+        be closed after compress() cancellation.
+
+        For cancellation-safe compression (e.g., WebSocket), the caller MUST wrap
+        compress() + flush() + send operations in a shield and lock to ensure atomicity.
+        """
         return self._compressor.flush(
             mode if mode is not None else self._zlib_backend.Z_FINISH
         )
@@ -276,3 +301,19 @@ def flush(self) -> bytes:
         if hasattr(self._obj, "flush"):
             return cast(bytes, self._obj.flush())
         return b""
+
+
+class ZSTDDecompressor:
+    def __init__(self) -> None:
+        if not HAS_ZSTD:
+            raise RuntimeError(
+                "The zstd decompression is not available. "
+                "Please install `backports.zstd` module"
+            )
+        self._obj = ZstdDecompressor()
+
+    def decompress_sync(self, data: bytes) -> bytes:
+        return self._obj.decompress(data)
+
+    def flush(self) -> bytes:
+        return b""
diff --git aiohttp/formdata.py aiohttp/formdata.py
index bdf591fae7a..a5a4f603e19 100644
--- aiohttp/formdata.py
+++ aiohttp/formdata.py
@@ -110,7 +110,7 @@ def add_fields(self, *fields: Any) -> None:
 
             elif isinstance(rec, (list, tuple)) and len(rec) == 2:
                 k, fp = rec
-                self.add_field(k, fp)  # type: ignore[arg-type]
+                self.add_field(k, fp)
 
             else:
                 raise TypeError(
diff --git aiohttp/helpers.py aiohttp/helpers.py
index ace4f0e9b53..dfab9877d39 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -17,7 +17,9 @@
 import weakref
 from collections import namedtuple
 from contextlib import suppress
+from email.message import EmailMessage
 from email.parser import HeaderParser
+from email.policy import HTTP
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
@@ -357,14 +359,40 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+class EnsureOctetStream(EmailMessage):
+    def __init__(self) -> None:
+        super().__init__()
+        # https://www.rfc-editor.org/rfc/rfc9110#section-8.3-5
+        self.set_default_type("application/octet-stream")
+
+    def get_content_type(self) -> str:
+        """Re-implementation from Message
+
+        Returns application/octet-stream in place of plain/text when
+        value is wrong.
+
+        The way this class is used guarantees that content-type will
+        be present so simplify the checks wrt to the base implementation.
+        """
+        value = self.get("content-type", "").lower()
+
+        # Based on the implementation of _splitparam in the standard library
+        ctype, _, _ = value.partition(";")
+        ctype = ctype.strip()
+        if ctype.count("/") != 1:
+            return self.get_default_type()
+        return ctype
+
+
 @functools.lru_cache(maxsize=56)
 def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
     """Parse Content-Type header.
 
     Returns a tuple of the parsed content type and a
-    MappingProxyType of parameters.
+    MappingProxyType of parameters. The default returned value
+    is `application/octet-stream`
     """
-    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    msg = HeaderParser(EnsureOctetStream, policy=HTTP).parsestr(f"Content-Type: {raw}")
     content_type = msg.get_content_type()
     params = msg.get_params(())
     content_dict = dict(params[1:])  # First element is content type again
diff --git aiohttp/http_parser.py aiohttp/http_parser.py
index 9f864b27876..9a2c00e6542 100644
--- aiohttp/http_parser.py
+++ aiohttp/http_parser.py
@@ -26,7 +26,13 @@
 
 from . import hdrs
 from .base_protocol import BaseProtocol
-from .compression_utils import HAS_BROTLI, BrotliDecompressor, ZLibDecompressor
+from .compression_utils import (
+    HAS_BROTLI,
+    HAS_ZSTD,
+    BrotliDecompressor,
+    ZLibDecompressor,
+    ZSTDDecompressor,
+)
 from .helpers import (
     _EXC_SENTINEL,
     DEBUG,
@@ -546,7 +552,7 @@ def parse_headers(
         enc = headers.get(hdrs.CONTENT_ENCODING)
         if enc:
             enc = enc.lower()
-            if enc in ("gzip", "deflate", "br"):
+            if enc in ("gzip", "deflate", "br", "zstd"):
                 encoding = enc
 
         # chunking
@@ -958,10 +964,11 @@ class DeflateBuffer:
     def __init__(self, out: StreamReader, encoding: Optional[str]) -> None:
         self.out = out
         self.size = 0
+        out.total_compressed_bytes = self.size
         self.encoding = encoding
         self._started_decoding = False
 
-        self.decompressor: Union[BrotliDecompressor, ZLibDecompressor]
+        self.decompressor: Union[BrotliDecompressor, ZLibDecompressor, ZSTDDecompressor]
         if encoding == "br":
             if not HAS_BROTLI:  # pragma: no cover
                 raise ContentEncodingError(
@@ -969,6 +976,13 @@ def __init__(self, out: StreamReader, encoding: Optional[str]) -> None:
                     "Please install `Brotli`"
                 )
             self.decompressor = BrotliDecompressor()
+        elif encoding == "zstd":
+            if not HAS_ZSTD:
+                raise ContentEncodingError(
+                    "Can not decode content-encoding: zstandard (zstd). "
+                    "Please install `backports.zstd`"
+                )
+            self.decompressor = ZSTDDecompressor()
         else:
             self.decompressor = ZLibDecompressor(encoding=encoding)
 
@@ -984,6 +998,7 @@ def feed_data(self, chunk: bytes, size: int) -> None:
             return
 
         self.size += size
+        self.out.total_compressed_bytes = self.size
 
         # RFC1950
         # bits 0..3 = CM = 0b1000 = 8 = "deflate"
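
With `zstd` accepted as a `Content-Encoding`, response bodies are now inflated transparently. A minimal client sketch (the URL is illustrative; on Python < 3.14 this needs `backports.zstd` installed):

```python
import asyncio

import aiohttp


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # The session advertises the codings it can decode; a response
        # with Content-Encoding: zstd is decompressed before read().
        async with session.get("https://example.com/") as resp:
            body = await resp.read()
            print(resp.headers.get("Content-Encoding"), len(body))


asyncio.run(main())
```
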
diff --git aiohttp/multipart.py aiohttp/multipart.py
index 54dfd4843b0..3464b1c2307 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -114,6 +114,10 @@ def unescape(text: str, *, chars: str = "".join(map(re.escape, CHAR))) -> str:
     while parts:
         item = parts.pop(0)
 
+        if not item:  # To handle trailing semicolons
+            warnings.warn(BadContentDispositionHeader(header))
+            continue
+
         if "=" not in item:
             warnings.warn(BadContentDispositionHeader(header))
             return None, {}
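
A sketch of the behavioural change, assuming the module-level `parse_content_disposition` helper that this loop belongs to:

```python
import warnings

from aiohttp.multipart import parse_content_disposition

header = 'attachment; filename="report.txt";'  # note the trailing ";"

with warnings.catch_warnings():
    # The empty trailing segment still triggers BadContentDispositionHeader,
    # but parsing no longer bails out with (None, {}).
    warnings.simplefilter("ignore")
    disptype, params = parse_content_disposition(header)

assert disptype == "attachment"
assert params["filename"] == "report.txt"
```
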
diff --git aiohttp/streams.py aiohttp/streams.py
index 7a3f64d1289..e2bc04dd99c 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -130,6 +130,7 @@ class StreamReader(AsyncStreamReaderMixin):
         "_eof_callbacks",
         "_eof_counter",
         "total_bytes",
+        "total_compressed_bytes",
     )
 
     def __init__(
@@ -159,6 +160,7 @@ def __init__(
         self._eof_callbacks: List[Callable[[], None]] = []
         self._eof_counter = 0
         self.total_bytes = 0
+        self.total_compressed_bytes: Optional[int] = None
 
     def __repr__(self) -> str:
         info = [self.__class__.__name__]
@@ -250,6 +252,12 @@ async def wait_eof(self) -> None:
         finally:
             self._eof_waiter = None
 
+    @property
+    def total_raw_bytes(self) -> int:
+        if self.total_compressed_bytes is None:
+            return self.total_bytes
+        return self.total_compressed_bytes
+
     def unread_data(self, data: bytes) -> None:
         """rollback reading some data from stream, inserting it to buffer head."""
         warnings.warn(
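
A sketch of what `total_raw_bytes` enables (the URL is illustrative; for identity responses the property simply mirrors `total_bytes`):

```python
import asyncio

import aiohttp


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        async with session.get("https://example.com/") as resp:
            body = await resp.read()
            # On-the-wire size: the compressed byte count when the body
            # was content-encoded, otherwise identical to total_bytes.
            print(f"raw: {resp.content.total_raw_bytes}, decoded: {len(body)}")


asyncio.run(main())
```
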
diff --git aiohttp/web.py aiohttp/web.py
index 8307ff405ca..5a1fc964172 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -309,18 +309,12 @@ async def _run_app(
     port: Optional[int] = None,
     path: Union[PathLike, TypingIterable[PathLike], None] = None,
     sock: Optional[Union[socket.socket, TypingIterable[socket.socket]]] = None,
-    shutdown_timeout: float = 60.0,
-    keepalive_timeout: float = 75.0,
     ssl_context: Optional[SSLContext] = None,
     print: Optional[Callable[..., None]] = print,
     backlog: int = 128,
-    access_log_class: Type[AbstractAccessLogger] = AccessLogger,
-    access_log_format: str = AccessLogger.LOG_FORMAT,
-    access_log: Optional[logging.Logger] = access_logger,
-    handle_signals: bool = True,
     reuse_address: Optional[bool] = None,
     reuse_port: Optional[bool] = None,
-    handler_cancellation: bool = False,
+    **kwargs: Any,  # TODO(PY311): Use Unpack
 ) -> None:
     # An internal function to actually do all dirty job for application running
     if asyncio.iscoroutine(app):
@@ -328,16 +322,7 @@ async def _run_app(
 
     app = cast(Application, app)
 
-    runner = AppRunner(
-        app,
-        handle_signals=handle_signals,
-        access_log_class=access_log_class,
-        access_log_format=access_log_format,
-        access_log=access_log,
-        keepalive_timeout=keepalive_timeout,
-        shutdown_timeout=shutdown_timeout,
-        handler_cancellation=handler_cancellation,
-    )
+    runner = AppRunner(app, **kwargs)
 
     await runner.setup()
 
@@ -484,6 +469,7 @@ def run_app(
     reuse_port: Optional[bool] = None,
     handler_cancellation: bool = False,
     loop: Optional[asyncio.AbstractEventLoop] = None,
+    **kwargs: Any,
 ) -> None:
     """Run an app locally"""
     if loop is None:
@@ -515,6 +501,7 @@ def run_app(
             reuse_address=reuse_address,
             reuse_port=reuse_port,
             handler_cancellation=handler_cancellation,
+            **kwargs,
         )
     )
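
The new `**kwargs` are handed to `AppRunner` unchanged, so runner- and protocol-level options that `run_app()` never listed explicitly can be passed straight through. A sketch, using `lingering_time` as one such option (treat the specific choice as illustrative):

```python
from aiohttp import web


async def hello(request: web.Request) -> web.Response:
    return web.Response(text="hello")


app = web.Application()
app.router.add_get("/", hello)

# lingering_time is not a named run_app() parameter; it now reaches
# AppRunner (and from there the request handler) via **kwargs.
web.run_app(app, port=8080, lingering_time=1.0)
```
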
 
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 344611cc495..26484b9483a 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -164,8 +164,8 @@ async def _not_modified(
     ) -> Optional[AbstractStreamWriter]:
         self.set_status(HTTPNotModified.status_code)
         self._length_check = False
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = last_modified  # type: ignore[assignment]
+        self.etag = etag_value
+        self.last_modified = last_modified
         # Delete any Content-Length headers provided by user. HTTP 304
         # should always have empty response body
         return await super().prepare(request)
@@ -395,8 +395,8 @@ async def _prepare_open_file(
             # compress.
             self._compression = False
 
-        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
-        self.last_modified = file_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+        self.last_modified = file_mtime
         self.content_length = count
 
         self._headers[hdrs.ACCEPT_RANGES] = "bytes"
diff --git aiohttp/web_urldispatcher.py aiohttp/web_urldispatcher.py
index 28ae2518fec..8213456c5f5 100644
--- aiohttp/web_urldispatcher.py
+++ aiohttp/web_urldispatcher.py
@@ -194,6 +194,8 @@ def __init__(
         ):
             pass
         elif inspect.isgeneratorfunction(handler):
+            if TYPE_CHECKING:
+                assert False
             warnings.warn(
                 "Bare generators are deprecated, use @coroutine wrapper",
                 DeprecationWarning,
@@ -978,7 +980,7 @@ async def _iter(self) -> StreamResponse:
         assert isinstance(ret, StreamResponse)
         return ret
 
-    def __await__(self) -> Generator[Any, None, StreamResponse]:
+    def __await__(self) -> Generator[None, None, StreamResponse]:
         return self._iter().__await__()
 
     def _raise_allowed_methods(self) -> NoReturn:
@@ -1032,6 +1034,21 @@ async def resolve(self, request: Request) -> UrlMappingMatchInfo:
         resource_index = self._resource_index
         allowed_methods: Set[str] = set()
 
+        # MatchedSubAppResource is primarily used to match on domain names
+        # (though custom rules could match on other things). This means that
+        # the traversal algorithm below can't be applied, and that we likely
+        # need to check these first so a sub app that defines the same path
+        # as a parent app will get priority if there's a domain match.
+        #
+        # For most cases we do not expect there to be many of these since
+        # currently they are only added by `.add_domain()`.
+        for resource in self._matched_sub_app_resources:
+            match_dict, allowed = await resource.resolve(request)
+            if match_dict is not None:
+                return match_dict
+            else:
+                allowed_methods |= allowed
+
         # Walk the url parts looking for candidates. We walk the url backwards
         # to ensure the most explicit match is found first. If there are multiple
         # candidates for a given url part because there are multiple resources
@@ -1049,21 +1066,6 @@ async def resolve(self, request: Request) -> UrlMappingMatchInfo:
                 break
             url_part = url_part.rpartition("/")[0] or "/"
 
-        #
-        # We didn't find any candidates, so we'll try the matched sub-app
-        # resources which we have to walk in a linear fashion because they
-        # have regex/wildcard match rules and we cannot index them.
-        #
-        # For most cases we do not expect there to be many of these since
-        # currently they are only added by `add_domain`
-        #
-        for resource in self._matched_sub_app_resources:
-            match_dict, allowed = await resource.resolve(request)
-            if match_dict is not None:
-                return match_dict
-            else:
-                allowed_methods |= allowed
-
         if allowed_methods:
             return MatchInfoError(HTTPMethodNotAllowed(request.method, allowed_methods))
 
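
Checking the domain-matched resources first is what lets a sub-application win over an identical path on the parent. A minimal sketch (hostnames are illustrative):

```python
from aiohttp import web


async def parent_index(request: web.Request) -> web.Response:
    return web.Response(text="parent")


async def api_index(request: web.Request) -> web.Response:
    return web.Response(text="api sub-app")


app = web.Application()
app.router.add_get("/", parent_index)

api = web.Application()
api.router.add_get("/", api_index)
# Same path as the parent; a request with Host: api.example.com now
# resolves to api_index instead of parent_index.
app.add_domain("api.example.com", api)
```
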
diff --git docs/client_middleware_cookbook.rst docs/client_middleware_cookbook.rst
index 33994160fba..e890b02a4bf 100644
--- docs/client_middleware_cookbook.rst
+++ docs/client_middleware_cookbook.rst
@@ -98,6 +98,29 @@ Using both of these together in a session should provide full SSRF protection.
 Best Practices
 --------------
 
+.. important::
+
+   **Request-level middlewares replace session middlewares**: When you pass ``middlewares``
+   to ``request()`` or its convenience methods (``get()``, ``post()``, etc.), it completely
+   replaces the session-level middlewares, rather than extending them. This differs from
+   other parameters like ``headers``, which are merged.
+
+   .. code-block:: python
+
+      session = ClientSession(middlewares=[middleware_session])
+
+      # Session middleware is used
+      await session.get("http://example.com")
+
+      # Session middleware is NOT used, only request middleware
+      await session.get("http://example.com", middlewares=[middleware_request])
+
+      # To use both, explicitly pass both
+      await session.get(
+          "http://example.com",
+          middlewares=[middleware_session, middleware_request]
+      )
+
 1. **Keep middleware focused**: Each middleware should have a single responsibility.
 
 2. **Order matters**: Middlewares execute in the order they're listed. Place logging first,
diff --git docs/client_quickstart.rst docs/client_quickstart.rst
index 0e03f104e90..b74c2500065 100644
--- docs/client_quickstart.rst
+++ docs/client_quickstart.rst
@@ -187,6 +187,10 @@ You can enable ``brotli`` transfer-encodings support,
 just install `Brotli <https://pypi.org/project/Brotli/>`_
 or `brotlicffi <https://pypi.org/project/brotlicffi/>`_.
 
+You can enable ``zstd`` transfer-encoding support by installing
+`backports.zstd <https://pypi.org/project/backports.zstd/>`_.
+On Python >= 3.14, no extra dependency is required.
+
 JSON Request
 ============
 
diff --git docs/client_reference.rst docs/client_reference.rst
index ab16e35aed5..d8b36b95c91 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -1566,16 +1566,14 @@ Response object
 
       .. note::
 
-         Returns value is ``'application/octet-stream'`` if no
-         Content-Type header present in HTTP headers according to
-         :rfc:`9110`. If the *Content-Type* header is invalid (e.g., ``jpg``
-         instead of ``image/jpeg``), the value is ``text/plain`` by default
-         according to :rfc:`2045`. To see the original header check
-         ``resp.headers['CONTENT-TYPE']``.
+         Returns ``'application/octet-stream'`` if no Content-Type header
+         is present or the value contains invalid syntax according to
+         :rfc:`9110`. To see the original header check
+         ``resp.headers["Content-Type"]``.
 
          To make sure Content-Type header is not present in
          the server reply, use :attr:`headers` or :attr:`raw_headers`, e.g.
-         ``'CONTENT-TYPE' not in resp.headers``.
+         ``'Content-Type' not in resp.headers``.
 
    .. attribute:: charset
 
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index 3fd6cdd00fc..0912c312f6d 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -287,6 +287,7 @@ resolvehost
 resolvers
 reusage
 reuseconn
+riscv64
 Runit
 runtime
 runtimes
@@ -303,6 +304,7 @@ SocketSocketTransport
 ssl
 SSLContext
 startup
+stateful
 subapplication
 subclassed
 subclasses
@@ -342,6 +344,7 @@ un
 unawaited
 unclosed
 undercounting
+unescaped
 unhandled
 unicode
 unittest
@@ -384,3 +387,5 @@ www
 xxx
 yarl
 zlib
+zstandard
+zstd
diff --git docs/streams.rst docs/streams.rst
index 8e4be9d5343..415ded37e64 100644
--- docs/streams.rst
+++ docs/streams.rst
@@ -20,8 +20,8 @@ Streaming API
    :attr:`aiohttp.ClientResponse.content` properties for accessing raw
    BODY data.
 
-Reading Methods
----------------
+Reading Attributes and Methods
+------------------------------
 
 .. method:: StreamReader.read(n=-1)
       :async:
@@ -109,6 +109,13 @@ Reading Methods
                                to the end of a HTTP chunk.
 
 
+.. attribute:: StreamReader.total_raw_bytes
+
+   The number of bytes of raw data downloaded (before decompression).
+
+   Read-only :class:`int` property.
+
+
 Asynchronous Iteration Support
 ------------------------------
 
diff --git docs/third_party.rst docs/third_party.rst
index 145a505a5de..c01023c1f1b 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -308,3 +308,6 @@ ask to raise the status.
 
 - `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
   An aiohttp middleware for exposing Prometheus metrics.
+
+- `wireup <https://github.com/maldoinc/wireup>`_
+  Performant, concise, and easy-to-use dependency injection container.
diff --git docs/web_reference.rst docs/web_reference.rst
index 2d1882da17c..5ae15478b4f 100644
--- docs/web_reference.rst
+++ docs/web_reference.rst
@@ -1625,6 +1625,14 @@ Application and Router
       matches the pattern *domain* then
       further resolving is passed to *subapp*.
 
+      .. warning::
+
+         Registering many domains using this method may cause performance
+         issues with handler routing. If you have a substantial number of
+         applications for different domains, you may want to consider
+         using a reverse proxy (such as Nginx) to handle routing to
+         different apps, rather than registering them as sub-applications.
+
       :param str domain: domain or mask of domain for the resource.
 
       :param Application subapp: nested application.
@@ -3060,7 +3068,8 @@ Utilities
                       handle_signals=True, \
                       reuse_address=None, \
                       reuse_port=None, \
-                      handler_cancellation=False)
+                      handler_cancellation=False, \
+                      **kwargs)
 
    A high-level function for running an application, serving it until
    keyboard interrupt and performing a
@@ -3170,6 +3179,9 @@ Utilities
                                      scalability is a concern.
                                      :ref:`aiohttp-web-peer-disconnection`
 
+   :param kwargs: additional named parameters to pass into the
+                  :class:`AppRunner` constructor.
+
    .. versionadded:: 3.0
 
       Support *access_log_class* parameter.
diff --git pyproject.toml pyproject.toml
index df8b8465348..1cbfe81138e 100644
--- pyproject.toml
+++ pyproject.toml
@@ -1,10 +1,78 @@
 [build-system]
 requires = [
     "pkgconfig",
-    "setuptools >= 46.4.0",
+    # setuptools >= 67.0 required for Python 3.12+ support
+    # Next step should be >= 77.0 for PEP 639 support
+    # Don't bump too early to give distributors time to update
+    # their setuptools version.
+    "setuptools >= 67.0",
 ]
 build-backend = "setuptools.build_meta"
 
+[project]
+name        = "aiohttp"
+# TODO: Update to just 'license = "..."' once setuptools is bumped to >=77
+license     = {text = "Apache-2.0 AND MIT"}
+description = "Async http client/server framework (asyncio)"
+readme      = "README.rst"
+classifiers = [
+  "Development Status :: 5 - Production/Stable",
+  "Framework :: AsyncIO",
+  "Intended Audience :: Developers",
+  "Operating System :: POSIX",
+  "Operating System :: MacOS :: MacOS X",
+  "Operating System :: Microsoft :: Windows",
+  "Programming Language :: Python",
+  "Programming Language :: Python :: 3",
+  "Programming Language :: Python :: 3.9",
+  "Programming Language :: Python :: 3.10",
+  "Programming Language :: Python :: 3.11",
+  "Programming Language :: Python :: 3.12",
+  "Programming Language :: Python :: 3.13",
+  "Programming Language :: Python :: 3.14",
+  "Topic :: Internet :: WWW/HTTP",
+]
+requires-python = ">= 3.9"
+dynamic = [
+  "dependencies",
+  "optional-dependencies",
+  "version",
+]
+
+[[project.maintainers]]
+name = "aiohttp team"
+email = "[email protected]"
+
+[project.urls]
+"Homepage"           = "https://github.com/aio-libs/aiohttp"
+"Chat: Matrix"       = "https://matrix.to/#/#aio-libs:matrix.org"
+"Chat: Matrix Space" = "https://matrix.to/#/#aio-libs-space:matrix.org"
+"CI: GitHub Actions" = "https://github.com/aio-libs/aiohttp/actions?query=workflow%3ACI"
+"Coverage: codecov"  = "https://codecov.io/github/aio-libs/aiohttp"
+"Docs: Changelog"    = "https://docs.aiohttp.org/en/stable/changes.html"
+"Docs: RTD"          = "https://docs.aiohttp.org"
+"GitHub: issues"     = "https://github.com/aio-libs/aiohttp/issues"
+"GitHub: repo"       = "https://github.com/aio-libs/aiohttp"
+
+[tool.setuptools]
+license-files = [
+  # TODO: Use 'project.license-files' instead once setuptools is bumped to >=77
+  "LICENSE.txt",
+  "vendor/llhttp/LICENSE",
+]
+
+[tool.setuptools.dynamic]
+version = {attr = "aiohttp.__version__"}
+
+[tool.setuptools.packages.find]
+include = [
+  "aiohttp",
+  "aiohttp.*",
+]
+
+[tool.setuptools.exclude-package-data]
+"*" = ["*.c", "*.h"]
+
 [tool.towncrier]
   package = "aiohttp"
   filename = "CHANGES.rst"
@@ -88,8 +156,3 @@ ignore-words-list = 'te,ue'
 # TODO(3.13): Remove aiohttp.helpers once https://github.com/python/cpython/pull/106771
 # is available in all supported cpython versions
 exclude-modules = "(^aiohttp\\.helpers)"
-
-[tool.black]
-# TODO: Remove when project metadata is moved here.
-# Black can read the value from [project.requires-python].
-target-version = ["py39", "py310", "py311", "py312"]
diff --git a/requirements/base-ft.in b/requirements/base-ft.in
new file mode 100644
index 00000000000..2b0cbf7d0c2
--- /dev/null
+++ requirements/base-ft.in
@@ -0,0 +1,3 @@
+-r runtime-deps.in
+
+gunicorn
diff --git a/requirements/base-ft.txt b/requirements/base-ft.txt
new file mode 100644
index 00000000000..8a8d2a15499
--- /dev/null
+++ requirements/base-ft.txt
@@ -0,0 +1,50 @@
+#
+# This file is autogenerated by pip-compile with Python 3.10
+# by the following command:
+#
+#    pip-compile --allow-unsafe --output-file=requirements/base-ft.txt --strip-extras requirements/base-ft.in
+#
+aiodns==3.5.0
+    # via -r requirements/runtime-deps.in
+aiohappyeyeballs==2.6.1
+    # via -r requirements/runtime-deps.in
+aiosignal==1.4.0
+    # via -r requirements/runtime-deps.in
+async-timeout==5.0.1 ; python_version < "3.11"
+    # via -r requirements/runtime-deps.in
+attrs==25.3.0
+    # via -r requirements/runtime-deps.in
+brotli==1.1.0 ; platform_python_implementation == "CPython"
+    # via -r requirements/runtime-deps.in
+cffi==2.0.0
+    # via pycares
+frozenlist==1.8.0
+    # via
+    #   -r requirements/runtime-deps.in
+    #   aiosignal
+gunicorn==23.0.0
+    # via -r requirements/base-ft.in
+idna==3.10
+    # via yarl
+multidict==6.6.4
+    # via
+    #   -r requirements/runtime-deps.in
+    #   yarl
+packaging==25.0
+    # via gunicorn
+propcache==0.4.0
+    # via
+    #   -r requirements/runtime-deps.in
+    #   yarl
+pycares==4.11.0
+    # via aiodns
+pycparser==2.23
+    # via cffi
+typing-extensions==4.15.0
+    # via
+    #   aiosignal
+    #   multidict
+yarl==1.21.0
+    # via -r requirements/runtime-deps.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
+    # via -r requirements/runtime-deps.in
diff --git requirements/base.txt requirements/base.txt
index 74f528d67bc..fa734658aba 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -4,7 +4,7 @@
 #
 #    pip-compile --allow-unsafe --output-file=requirements/base.txt --strip-extras requirements/base.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via -r requirements/runtime-deps.in
 aiohappyeyeballs==2.6.1
     # via -r requirements/runtime-deps.in
@@ -16,9 +16,9 @@ attrs==25.3.0
     # via -r requirements/runtime-deps.in
 brotli==1.1.0 ; platform_python_implementation == "CPython"
     # via -r requirements/runtime-deps.in
-cffi==1.17.1
+cffi==2.0.0
     # via pycares
-frozenlist==1.6.0
+frozenlist==1.8.0
     # via
     #   -r requirements/runtime-deps.in
     #   aiosignal
@@ -26,26 +26,27 @@ gunicorn==23.0.0
     # via -r requirements/base.in
 idna==3.4
     # via yarl
-multidict==6.4.4
+multidict==6.6.4
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
 packaging==25.0
     # via gunicorn
-propcache==0.3.1
+propcache==0.4.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-pycares==4.8.0
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-typing-extensions==4.14.0
+typing-extensions==4.15.0
     # via
     #   aiosignal
     #   multidict
 uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython"
-winloop==0.1.8; platform_system == "Windows" and implementation_name == "cpython"
     # via -r requirements/base.in
-yarl==1.20.0
+yarl==1.21.0
+    # via -r requirements/runtime-deps.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
     # via -r requirements/runtime-deps.in
diff --git requirements/constraints.txt requirements/constraints.txt
index 4457788efc0..ec3806c6977 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -1,10 +1,10 @@
 #
-# This file is autogenerated by pip-compile with python 3.10
-# To update, run:
+# This file is autogenerated by pip-compile with Python 3.10
+# by the following command:
 #
-#    pip-compile --allow-unsafe --output-file=requirements/constraints.txt --resolver=backtracking --strip-extras requirements/constraints.in
+#    pip-compile --allow-unsafe --output-file=requirements/constraints.txt --strip-extras requirements/constraints.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via
     #   -r requirements/lint.in
     #   -r requirements/runtime-deps.in
@@ -26,26 +26,26 @@ attrs==25.3.0
     # via -r requirements/runtime-deps.in
 babel==2.17.0
     # via sphinx
-blockbuster==1.5.24
+blockbuster==1.5.25
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 brotli==1.1.0 ; platform_python_implementation == "CPython"
     # via -r requirements/runtime-deps.in
-build==1.2.2.post1
+build==1.3.0
     # via pip-tools
-certifi==2025.4.26
+certifi==2025.10.5
     # via requests
-cffi==1.17.1
+cffi==2.0.0
     # via
     #   cryptography
     #   pycares
     #   pytest-codspeed
 cfgv==3.4.0
     # via pre-commit
-charset-normalizer==3.4.2
+charset-normalizer==3.4.3
     # via requests
-cherry-picker==2.5.0
+cherry-picker==2.6.0
     # via -r requirements/dev.in
 click==8.1.8
     # via
@@ -54,17 +54,17 @@ click==8.1.8
     #   slotscheck
     #   towncrier
     #   wait-for-it
-coverage==7.8.1
+coverage==7.10.7
     # via
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
     #   pytest-cov
-cryptography==45.0.2
+cryptography==46.0.2
     # via
     #   pyjwt
     #   trustme
-cython==3.1.1
+cython==3.1.4
     # via -r requirements/cython.in
-distlib==0.3.9
+distlib==0.4.0
     # via virtualenv
 docutils==0.21.2
     # via sphinx
@@ -72,23 +72,23 @@ exceptiongroup==1.3.0
     # via pytest
 execnet==2.1.1
     # via pytest-xdist
-filelock==3.18.0
+filelock==3.19.1
     # via virtualenv
 forbiddenfruit==0.1.4
     # via blockbuster
-freezegun==1.5.1
+freezegun==1.5.5
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-frozenlist==1.6.0
+    #   -r requirements/test-common.in
+frozenlist==1.8.0
     # via
     #   -r requirements/runtime-deps.in
     #   aiosignal
-gidgethub==5.3.0
+gidgethub==5.4.0
     # via cherry-picker
 gunicorn==23.0.0
     # via -r requirements/base.in
-identify==2.6.10
+identify==2.6.15
     # via pre-commit
 idna==3.3
     # via
@@ -97,33 +97,31 @@ idna==3.3
     #   yarl
 imagesize==1.4.1
     # via sphinx
-incremental==24.7.2
-    # via towncrier
 iniconfig==2.1.0
     # via pytest
-isal==1.7.2
+isal==1.7.2 ; python_version < "3.14"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 jinja2==3.1.6
     # via
     #   sphinx
     #   towncrier
 markdown-it-py==3.0.0
     # via rich
-markupsafe==3.0.2
+markupsafe==3.0.3
     # via jinja2
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.4.4
+multidict==6.6.4
     # via
     #   -r requirements/multidict.in
     #   -r requirements/runtime-deps.in
     #   yarl
-mypy==1.15.0 ; implementation_name == "cpython"
+mypy==1.18.2 ; implementation_name == "cpython"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 mypy-extensions==1.1.0
     # via mypy
 nodeenv==1.9.1
@@ -134,34 +132,39 @@ packaging==25.0
     #   gunicorn
     #   pytest
     #   sphinx
-pip-tools==7.4.1
+pathspec==0.12.1
+    # via mypy
+pip-tools==7.5.1
     # via -r requirements/dev.in
 pkgconfig==1.5.5
-    # via -r requirements/test.in
-platformdirs==4.3.8
+    # via -r requirements/test-common.in
+platformdirs==4.4.0
     # via virtualenv
 pluggy==1.6.0
-    # via pytest
-pre-commit==4.2.0
+    # via
+    #   pytest
+    #   pytest-cov
+pre-commit==4.3.0
     # via -r requirements/lint.in
-propcache==0.3.1
+propcache==0.4.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
 proxy-py==2.4.10
-    # via -r requirements/test.in
-pycares==4.8.0
+    # via -r requirements/test-common.in
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-pydantic==2.11.5
+pydantic==2.11.9
     # via python-on-whales
 pydantic-core==2.33.2
     # via pydantic
-pyenchant==3.2.2
+pyenchant==3.3.0
     # via sphinxcontrib-spelling
-pygments==2.19.1
+pygments==2.19.2
     # via
+    #   pytest
     #   rich
     #   sphinx
 pyjwt==2.9.0
@@ -172,47 +175,47 @@ pyproject-hooks==1.2.0
     # via
     #   build
     #   pip-tools
-pytest==8.3.5
+pytest==8.4.2
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
     #   pytest-codspeed
     #   pytest-cov
     #   pytest-mock
     #   pytest-xdist
-pytest-codspeed==3.2.0
+pytest-codspeed==4.0.0
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pytest-cov==6.1.1
-    # via -r requirements/test.in
-pytest-mock==3.14.0
+    #   -r requirements/test-common.in
+pytest-cov==7.0.0
+    # via -r requirements/test-common.in
+pytest-mock==3.15.1
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pytest-xdist==3.6.1
-    # via -r requirements/test.in
+    #   -r requirements/test-common.in
+pytest-xdist==3.8.0
+    # via -r requirements/test-common.in
 python-dateutil==2.9.0.post0
     # via freezegun
-python-on-whales==0.76.1
+python-on-whales==0.78.0
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pyyaml==6.0.2
+    #   -r requirements/test-common.in
+pyyaml==6.0.3
     # via pre-commit
 re-assert==1.1.0
-    # via -r requirements/test.in
-regex==2024.11.6
+    # via -r requirements/test-common.in
+regex==2025.9.18
     # via re-assert
-requests==2.32.3
+requests==2.32.5
     # via
     #   cherry-picker
     #   sphinx
     #   sphinxcontrib-spelling
-rich==14.0.0
+rich==14.1.0
     # via pytest-codspeed
 setuptools-git==1.2
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 six==1.17.0
     # via python-dateutil
 slotscheck==0.19.1
@@ -242,68 +245,70 @@ sphinxcontrib-towncrier==0.5.0a0
     # via -r requirements/doc.in
 stamina==25.1.0
     # via cherry-picker
-tenacity==9.0.0
+tenacity==9.1.2
     # via stamina
 tomli==2.2.1
     # via
     #   build
     #   cherry-picker
     #   coverage
-    #   incremental
     #   mypy
     #   pip-tools
     #   pytest
     #   slotscheck
     #   sphinx
     #   towncrier
-towncrier==23.11.0
+towncrier==25.8.0
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
 trustme==1.2.1 ; platform_machine != "i686"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-typing-extensions==4.13.2
+    #   -r requirements/test-common.in
+typing-extensions==4.15.0
     # via
     #   aiosignal
+    #   cryptography
     #   exceptiongroup
     #   multidict
     #   mypy
     #   pydantic
     #   pydantic-core
     #   python-on-whales
-    #   rich
     #   typing-inspection
-typing-inspection==0.4.1
+    #   virtualenv
+typing-inspection==0.4.2
     # via pydantic
-uritemplate==4.1.1
+uritemplate==4.2.0
     # via gidgethub
-urllib3==2.4.0
+urllib3==2.5.0
     # via requests
 uvloop==0.21.0 ; platform_system != "Windows"
     # via
     #   -r requirements/base.in
     #   -r requirements/lint.in
-valkey==6.1.0
+valkey==6.1.1
     # via -r requirements/lint.in
-virtualenv==20.31.2
+virtualenv==20.34.0
     # via pre-commit
 wait-for-it==2.3.0
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 wheel==0.45.1
     # via pip-tools
-yarl==1.20.0
+yarl==1.21.0
     # via -r requirements/runtime-deps.in
-zlib-ng==0.5.1
+zlib-ng==1.0.0
+    # via
+    #   -r requirements/lint.in
+    #   -r requirements/test-common.in
+backports.zstd==0.5.0 ; implementation_name == "cpython"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/runtime-deps.in
 
 # The following packages are considered to be unsafe in a requirements file:
-pip==25.1.1
+pip==25.2
+    # via pip-tools
+setuptools==80.9.0
     # via pip-tools
-setuptools==80.8.0
-    # via
-    #   incremental
-    #   pip-tools
diff --git requirements/cython.txt requirements/cython.txt
index 8d7e2dc256c..824e216600b 100644
--- requirements/cython.txt
+++ requirements/cython.txt
@@ -4,9 +4,9 @@
 #
 #    pip-compile --allow-unsafe --output-file=requirements/cython.txt --resolver=backtracking --strip-extras requirements/cython.in
 #
-cython==3.1.1
+cython==3.1.4
     # via -r requirements/cython.in
-multidict==6.4.4
+multidict==6.6.4
     # via -r requirements/multidict.in
-typing-extensions==4.13.2
+typing-extensions==4.15.0
     # via multidict
diff --git requirements/dev.txt requirements/dev.txt
index c9ab0cb822b..63cc8589262 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -1,10 +1,10 @@
 #
-# This file is autogenerated by pip-compile with python 3.10
-# To update, run:
+# This file is autogenerated by pip-compile with Python 3.10
+# by the following command:
 #
-#    pip-compile --allow-unsafe --output-file=requirements/dev.txt --resolver=backtracking --strip-extras requirements/dev.in
+#    pip-compile --allow-unsafe --output-file=requirements/dev.txt --strip-extras requirements/dev.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via
     #   -r requirements/lint.in
     #   -r requirements/runtime-deps.in
@@ -26,26 +26,26 @@ attrs==25.3.0
     # via -r requirements/runtime-deps.in
 babel==2.17.0
     # via sphinx
-blockbuster==1.5.24
+blockbuster==1.5.25
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 brotli==1.1.0 ; platform_python_implementation == "CPython"
     # via -r requirements/runtime-deps.in
-build==1.2.2.post1
+build==1.3.0
     # via pip-tools
-certifi==2025.4.26
+certifi==2025.10.5
     # via requests
-cffi==1.17.1
+cffi==2.0.0
     # via
     #   cryptography
     #   pycares
     #   pytest-codspeed
 cfgv==3.4.0
     # via pre-commit
-charset-normalizer==3.4.2
+charset-normalizer==3.4.3
     # via requests
-cherry-picker==2.5.0
+cherry-picker==2.6.0
     # via -r requirements/dev.in
 click==8.1.8
     # via
@@ -54,15 +54,15 @@ click==8.1.8
     #   slotscheck
     #   towncrier
     #   wait-for-it
-coverage==7.8.1
+coverage==7.10.7
     # via
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
     #   pytest-cov
-cryptography==45.0.2
+cryptography==46.0.2
     # via
     #   pyjwt
     #   trustme
-distlib==0.3.9
+distlib==0.4.0
     # via virtualenv
 docutils==0.21.2
     # via sphinx
@@ -70,23 +70,23 @@ exceptiongroup==1.3.0
     # via pytest
 execnet==2.1.1
     # via pytest-xdist
-filelock==3.18.0
+filelock==3.19.1
     # via virtualenv
 forbiddenfruit==0.1.4
     # via blockbuster
-freezegun==1.5.1
+freezegun==1.5.5
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-frozenlist==1.6.0
+    #   -r requirements/test-common.in
+frozenlist==1.8.0
     # via
     #   -r requirements/runtime-deps.in
     #   aiosignal
-gidgethub==5.3.0
+gidgethub==5.4.0
     # via cherry-picker
 gunicorn==23.0.0
     # via -r requirements/base.in
-identify==2.6.10
+identify==2.6.15
     # via pre-commit
 idna==3.4
     # via
@@ -95,32 +95,30 @@ idna==3.4
     #   yarl
 imagesize==1.4.1
     # via sphinx
-incremental==24.7.2
-    # via towncrier
 iniconfig==2.1.0
     # via pytest
-isal==1.7.2
+isal==1.7.2 ; python_version < "3.14"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 jinja2==3.1.6
     # via
     #   sphinx
     #   towncrier
 markdown-it-py==3.0.0
     # via rich
-markupsafe==3.0.2
+markupsafe==3.0.3
     # via jinja2
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.4.4
+multidict==6.6.4
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-mypy==1.15.0 ; implementation_name == "cpython"
+mypy==1.18.2 ; implementation_name == "cpython"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
 mypy-extensions==1.1.0
     # via mypy
 nodeenv==1.9.1
@@ -131,32 +129,37 @@ packaging==25.0
     #   gunicorn
     #   pytest
     #   sphinx
-pip-tools==7.4.1
+pathspec==0.12.1
+    # via mypy
+pip-tools==7.5.1
     # via -r requirements/dev.in
 pkgconfig==1.5.5
-    # via -r requirements/test.in
-platformdirs==4.3.8
+    # via -r requirements/test-common.in
+platformdirs==4.4.0
     # via virtualenv
 pluggy==1.6.0
-    # via pytest
-pre-commit==4.2.0
+    # via
+    #   pytest
+    #   pytest-cov
+pre-commit==4.3.0
     # via -r requirements/lint.in
-propcache==0.3.1
+propcache==0.4.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
 proxy-py==2.4.10
-    # via -r requirements/test.in
-pycares==4.8.0
+    # via -r requirements/test-common.in
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-pydantic==2.11.5
+pydantic==2.11.9
     # via python-on-whales
 pydantic-core==2.33.2
     # via pydantic
-pygments==2.19.1
+pygments==2.19.2
     # via
+    #   pytest
     #   rich
     #   sphinx
 pyjwt==2.8.0
@@ -167,46 +170,46 @@ pyproject-hooks==1.2.0
     # via
     #   build
     #   pip-tools
-pytest==8.3.5
+pytest==8.4.2
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
     #   pytest-codspeed
     #   pytest-cov
     #   pytest-mock
     #   pytest-xdist
-pytest-codspeed==3.2.0
+pytest-codspeed==4.0.0
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pytest-cov==6.1.1
-    # via -r requirements/test.in
-pytest-mock==3.14.0
+    #   -r requirements/test-common.in
+pytest-cov==7.0.0
+    # via -r requirements/test-common.in
+pytest-mock==3.15.1
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pytest-xdist==3.6.1
-    # via -r requirements/test.in
+    #   -r requirements/test-common.in
+pytest-xdist==3.8.0
+    # via -r requirements/test-common.in
 python-dateutil==2.9.0.post0
     # via freezegun
-python-on-whales==0.76.1
+python-on-whales==0.78.0
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-pyyaml==6.0.2
+    #   -r requirements/test-common.in
+pyyaml==6.0.3
     # via pre-commit
 re-assert==1.1.0
-    # via -r requirements/test.in
-regex==2024.11.6
+    # via -r requirements/test-common.in
+regex==2025.9.18
     # via re-assert
-requests==2.32.3
+requests==2.32.5
     # via
     #   cherry-picker
     #   sphinx
-rich==14.0.0
+rich==14.1.0
     # via pytest-codspeed
 setuptools-git==1.2
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 six==1.17.0
     # via python-dateutil
 slotscheck==0.19.1
@@ -233,68 +236,70 @@ sphinxcontrib-towncrier==0.5.0a0
     # via -r requirements/doc.in
 stamina==25.1.0
     # via cherry-picker
-tenacity==9.0.0
+tenacity==9.1.2
     # via stamina
 tomli==2.2.1
     # via
     #   build
     #   cherry-picker
     #   coverage
-    #   incremental
     #   mypy
     #   pip-tools
     #   pytest
     #   slotscheck
     #   sphinx
     #   towncrier
-towncrier==23.11.0
+towncrier==25.8.0
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
 trustme==1.2.1 ; platform_machine != "i686"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
-typing-extensions==4.13.2
+    #   -r requirements/test-common.in
+typing-extensions==4.15.0
     # via
     #   aiosignal
+    #   cryptography
     #   exceptiongroup
     #   multidict
     #   mypy
     #   pydantic
     #   pydantic-core
     #   python-on-whales
-    #   rich
     #   typing-inspection
-typing-inspection==0.4.1
+    #   virtualenv
+typing-inspection==0.4.2
     # via pydantic
-uritemplate==4.1.1
+uritemplate==4.2.0
     # via gidgethub
-urllib3==2.4.0
+urllib3==2.5.0
     # via requests
 uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython"
     # via
     #   -r requirements/base.in
     #   -r requirements/lint.in
-valkey==6.1.0
+valkey==6.1.1
     # via -r requirements/lint.in
-virtualenv==20.31.2
+virtualenv==20.34.0
     # via pre-commit
 wait-for-it==2.3.0
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 wheel==0.45.1
     # via pip-tools
-yarl==1.20.0
+yarl==1.21.0
     # via -r requirements/runtime-deps.in
-zlib-ng==0.5.1
+zlib-ng==1.0.0
+    # via
+    #   -r requirements/lint.in
+    #   -r requirements/test-common.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
     # via
     #   -r requirements/lint.in
-    #   -r requirements/test.in
+    #   -r requirements/runtime-deps.in
 
 # The following packages are considered to be unsafe in a requirements file:
-pip==25.1.1
+pip==25.2
+    # via pip-tools
+setuptools==80.9.0
     # via pip-tools
-setuptools==80.8.0
-    # via
-    #   incremental
-    #   pip-tools
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index 142aa6d7edb..55c618df961 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -10,9 +10,9 @@ alabaster==1.0.0
     # via sphinx
 babel==2.17.0
     # via sphinx
-certifi==2025.4.26
+certifi==2025.10.5
     # via requests
-charset-normalizer==3.4.2
+charset-normalizer==3.4.3
     # via requests
 click==8.1.8
     # via towncrier
@@ -22,21 +22,19 @@ idna==3.4
     # via requests
 imagesize==1.4.1
     # via sphinx
-incremental==24.7.2
-    # via towncrier
 jinja2==3.1.6
     # via
     #   sphinx
     #   towncrier
-markupsafe==3.0.2
+markupsafe==3.0.3
     # via jinja2
 packaging==25.0
     # via sphinx
-pyenchant==3.2.2
+pyenchant==3.3.0
     # via sphinxcontrib-spelling
-pygments==2.19.1
+pygments==2.19.2
     # via sphinx
-requests==2.32.3
+requests==2.32.5
     # via
     #   sphinx
     #   sphinxcontrib-spelling
@@ -65,16 +63,11 @@ sphinxcontrib-towncrier==0.5.0a0
     # via -r requirements/doc.in
 tomli==2.2.1
     # via
-    #   incremental
     #   sphinx
     #   towncrier
-towncrier==23.11.0
+towncrier==25.8.0
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-urllib3==2.4.0
+urllib3==2.5.0
     # via requests
-
-# The following packages are considered to be unsafe in a requirements file:
-setuptools==80.8.0
-    # via incremental
diff --git requirements/doc.txt requirements/doc.txt
index 08f24f4175a..7aa95aee161 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -10,31 +10,29 @@ alabaster==1.0.0
     # via sphinx
 babel==2.17.0
     # via sphinx
-certifi==2025.4.26
+certifi==2025.10.5
     # via requests
-charset-normalizer==3.4.2
+charset-normalizer==3.4.3
     # via requests
 click==8.1.8
     # via towncrier
 docutils==0.21.2
     # via sphinx
-idna==3.4
+idna==3.10
     # via requests
 imagesize==1.4.1
     # via sphinx
-incremental==24.7.2
-    # via towncrier
 jinja2==3.1.6
     # via
     #   sphinx
     #   towncrier
-markupsafe==3.0.2
+markupsafe==3.0.3
     # via jinja2
 packaging==25.0
     # via sphinx
-pygments==2.19.1
+pygments==2.19.2
     # via sphinx
-requests==2.32.3
+requests==2.32.5
     # via sphinx
 snowballstemmer==3.0.1
     # via sphinx
@@ -58,16 +56,11 @@ sphinxcontrib-towncrier==0.5.0a0
     # via -r requirements/doc.in
 tomli==2.2.1
     # via
-    #   incremental
     #   sphinx
     #   towncrier
-towncrier==23.11.0
+towncrier==25.8.0
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-urllib3==2.4.0
+urllib3==2.5.0
     # via requests
-
-# The following packages are considered to be unsafe in a requirements file:
-setuptools==80.8.0
-    # via incremental
diff --git requirements/lint.in requirements/lint.in
index fe996d00176..5bfd3c31c65 100644
--- requirements/lint.in
+++ requirements/lint.in
@@ -1,4 +1,5 @@
 aiodns
+backports.zstd; implementation_name == "cpython"
 blockbuster
 freezegun
 isal
diff --git requirements/lint.txt requirements/lint.txt
index 57729254937..b3b12bdd62a 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -1,18 +1,18 @@
 #
-# This file is autogenerated by pip-compile with python 3.10
+# This file is autogenerated by pip-compile with Python 3.12
 # by the following command:
 #
 #    pip-compile --allow-unsafe --output-file=requirements/lint.txt --resolver=backtracking --strip-extras requirements/lint.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via -r requirements/lint.in
 annotated-types==0.7.0
     # via pydantic
 async-timeout==5.0.1
     # via valkey
-blockbuster==1.5.24
+blockbuster==1.5.25
     # via -r requirements/lint.in
-cffi==1.17.1
+cffi==2.0.0
     # via
     #   cryptography
     #   pycares
@@ -21,19 +21,19 @@ cfgv==3.4.0
     # via pre-commit
 click==8.1.8
     # via slotscheck
-cryptography==45.0.2
+cryptography==46.0.2
     # via trustme
-distlib==0.3.9
+distlib==0.4.0
     # via virtualenv
 exceptiongroup==1.3.0
     # via pytest
-filelock==3.18.0
+filelock==3.19.1
     # via virtualenv
 forbiddenfruit==0.1.4
     # via blockbuster
-freezegun==1.5.1
+freezegun==1.5.5
     # via -r requirements/lint.in
-identify==2.6.10
+identify==2.6.15
     # via pre-commit
 idna==3.7
     # via trustme
@@ -45,7 +45,7 @@ markdown-it-py==3.0.0
     # via rich
 mdurl==0.1.2
     # via markdown-it-py
-mypy==1.15.0 ; implementation_name == "cpython"
+mypy==1.18.2 ; implementation_name == "cpython"
     # via -r requirements/lint.in
 mypy-extensions==1.1.0
     # via mypy
@@ -53,38 +53,42 @@ nodeenv==1.9.1
     # via pre-commit
 packaging==25.0
     # via pytest
-platformdirs==4.3.8
+pathspec==0.12.1
+    # via mypy
+platformdirs==4.4.0
     # via virtualenv
 pluggy==1.6.0
     # via pytest
-pre-commit==4.2.0
+pre-commit==4.3.0
     # via -r requirements/lint.in
-pycares==4.8.0
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-pydantic==2.11.5
+pydantic==2.11.9
     # via python-on-whales
 pydantic-core==2.33.2
     # via pydantic
-pygments==2.19.1
-    # via rich
-pytest==8.3.5
+pygments==2.19.2
+    # via
+    #   pytest
+    #   rich
+pytest==8.4.2
     # via
     #   -r requirements/lint.in
     #   pytest-codspeed
     #   pytest-mock
-pytest-codspeed==3.2.0
+pytest-codspeed==4.0.0
     # via -r requirements/lint.in
-pytest-mock==3.14.0
+pytest-mock==3.15.1
     # via -r requirements/lint.in
 python-dateutil==2.9.0.post0
     # via freezegun
-python-on-whales==0.76.1
+python-on-whales==0.78.0
     # via -r requirements/lint.in
-pyyaml==6.0.2
+pyyaml==6.0.3
     # via pre-commit
-rich==14.0.0
+rich==14.1.0
     # via pytest-codspeed
 six==1.17.0
     # via python-dateutil
@@ -97,22 +101,25 @@ tomli==2.2.1
     #   slotscheck
 trustme==1.2.1
     # via -r requirements/lint.in
-typing-extensions==4.13.2
+typing-extensions==4.15.0
     # via
+    #   cryptography
     #   exceptiongroup
     #   mypy
     #   pydantic
     #   pydantic-core
     #   python-on-whales
-    #   rich
     #   typing-inspection
-typing-inspection==0.4.1
+    #   virtualenv
+typing-inspection==0.4.2
     # via pydantic
 uvloop==0.21.0 ; platform_system != "Windows"
     # via -r requirements/lint.in
-valkey==6.1.0
+valkey==6.1.1
     # via -r requirements/lint.in
-virtualenv==20.31.2
+virtualenv==20.34.0
     # via pre-commit
-zlib-ng==0.5.1
+zlib-ng==1.0.0
+    # via -r requirements/lint.in
+backports.zstd==0.5.0 ; implementation_name == "cpython"
     # via -r requirements/lint.in
diff --git requirements/multidict.txt requirements/multidict.txt
index abd2e2cc9eb..04a7f1fc117 100644
--- requirements/multidict.txt
+++ requirements/multidict.txt
@@ -4,7 +4,7 @@
 #
 #    pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in
 #
-multidict==6.4.4
+multidict==6.6.4
     # via -r requirements/multidict.in
-typing-extensions==4.13.2
+typing-extensions==4.15.0
     # via multidict
diff --git requirements/runtime-deps.in requirements/runtime-deps.in
index d748eab9fac..ad8f28e750d 100644
--- requirements/runtime-deps.in
+++ requirements/runtime-deps.in
@@ -5,6 +5,7 @@ aiohappyeyeballs >= 2.5.0
 aiosignal >= 1.4.0
 async-timeout >= 4.0, < 6.0 ; python_version < "3.11"
 attrs >= 17.3.0
+backports.zstd; platform_python_implementation == 'CPython' and python_version < "3.14"
 Brotli; platform_python_implementation == 'CPython'
 brotlicffi; platform_python_implementation != 'CPython'
 frozenlist >= 1.1.1
diff --git requirements/runtime-deps.txt requirements/runtime-deps.txt
index 4dca87c1362..232ae352db1 100644
--- requirements/runtime-deps.txt
+++ requirements/runtime-deps.txt
@@ -1,10 +1,10 @@
 #
-# This file is autogenerated by pip-compile with Python 3.10
+# This file is autogenerated by pip-compile with Python 3.12
 # by the following command:
 #
 #    pip-compile --allow-unsafe --output-file=requirements/runtime-deps.txt --strip-extras requirements/runtime-deps.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via -r requirements/runtime-deps.in
 aiohappyeyeballs==2.6.1
     # via -r requirements/runtime-deps.in
@@ -16,28 +16,31 @@ attrs==25.3.0
     # via -r requirements/runtime-deps.in
 brotli==1.1.0 ; platform_python_implementation == "CPython"
     # via -r requirements/runtime-deps.in
-cffi==1.17.1
+cffi==2.0.0
     # via pycares
-frozenlist==1.6.0
+frozenlist==1.8.0
     # via
     #   -r requirements/runtime-deps.in
     #   aiosignal
-idna==3.4
+idna==3.10
     # via yarl
-multidict==6.4.4
+multidict==6.6.4
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-propcache==0.3.1
+propcache==0.4.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-pycares==4.8.0
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-typing-extensions==4.14.0
+typing-extensions==4.15.0
     # via
     #   aiosignal
     #   multidict
+yarl==1.21.0
+    # via -r requirements/runtime-deps.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
     # via -r requirements/runtime-deps.in
diff --git a/requirements/test-common.in b/requirements/test-common.in
new file mode 100644
index 00000000000..84193d7bc11
--- /dev/null
+++ requirements/test-common.in
@@ -0,0 +1,18 @@
+blockbuster
+coverage
+freezegun
+isal; python_version < "3.14" # no wheel for 3.14
+mypy; implementation_name == "cpython"
+pkgconfig
+proxy.py >= 2.4.4rc5
+pytest
+pytest-cov
+pytest-mock
+pytest-xdist
+pytest_codspeed
+python-on-whales
+re-assert
+setuptools-git
+trustme; platform_machine != "i686"  # no 32-bit wheels
+wait-for-it
+zlib_ng
diff --git a/requirements/test-common.txt b/requirements/test-common.txt
new file mode 100644
index 00000000000..8e073988a37
--- /dev/null
+++ requirements/test-common.txt
@@ -0,0 +1,117 @@
+#
+# This file is autogenerated by pip-compile with Python 3.10
+# by the following command:
+#
+#    pip-compile --allow-unsafe --output-file=requirements/test-common.txt --strip-extras requirements/test-common.in
+#
+annotated-types==0.7.0
+    # via pydantic
+blockbuster==1.5.25
+    # via -r requirements/test-common.in
+cffi==2.0.0
+    # via
+    #   cryptography
+    #   pytest-codspeed
+click==8.2.1
+    # via wait-for-it
+coverage==7.10.7
+    # via
+    #   -r requirements/test-common.in
+    #   pytest-cov
+cryptography==46.0.2
+    # via trustme
+exceptiongroup==1.3.0
+    # via pytest
+execnet==2.1.1
+    # via pytest-xdist
+forbiddenfruit==0.1.4
+    # via blockbuster
+freezegun==1.5.5
+    # via -r requirements/test-common.in
+idna==3.10
+    # via trustme
+iniconfig==2.1.0
+    # via pytest
+isal==1.8.0 ; python_version < "3.14"
+    # via -r requirements/test-common.in
+markdown-it-py==4.0.0
+    # via rich
+mdurl==0.1.2
+    # via markdown-it-py
+mypy==1.18.2 ; implementation_name == "cpython"
+    # via -r requirements/test-common.in
+mypy-extensions==1.1.0
+    # via mypy
+packaging==25.0
+    # via pytest
+pathspec==0.12.1
+    # via mypy
+pkgconfig==1.5.5
+    # via -r requirements/test-common.in
+pluggy==1.6.0
+    # via
+    #   pytest
+    #   pytest-cov
+proxy-py==2.4.10
+    # via -r requirements/test-common.in
+pycparser==2.23
+    # via cffi
+pydantic==2.12.0a1
+    # via python-on-whales
+pydantic-core==2.37.2
+    # via pydantic
+pygments==2.19.2
+    # via
+    #   pytest
+    #   rich
+pytest==8.4.2
+    # via
+    #   -r requirements/test-common.in
+    #   pytest-codspeed
+    #   pytest-cov
+    #   pytest-mock
+    #   pytest-xdist
+pytest-codspeed==4.0.0
+    # via -r requirements/test-common.in
+pytest-cov==7.0.0
+    # via -r requirements/test-common.in
+pytest-mock==3.15.1
+    # via -r requirements/test-common.in
+pytest-xdist==3.8.0
+    # via -r requirements/test-common.in
+python-dateutil==2.9.0.post0
+    # via freezegun
+python-on-whales==0.78.0
+    # via -r requirements/test-common.in
+re-assert==1.1.0
+    # via -r requirements/test-common.in
+regex==2025.9.18
+    # via re-assert
+rich==14.1.0
+    # via pytest-codspeed
+setuptools-git==1.2
+    # via -r requirements/test-common.in
+six==1.17.0
+    # via python-dateutil
+tomli==2.2.1
+    # via
+    #   coverage
+    #   mypy
+    #   pytest
+trustme==1.2.1 ; platform_machine != "i686"
+    # via -r requirements/test-common.in
+typing-extensions==4.15.0
+    # via
+    #   cryptography
+    #   exceptiongroup
+    #   mypy
+    #   pydantic
+    #   pydantic-core
+    #   python-on-whales
+    #   typing-inspection
+typing-inspection==0.4.2
+    # via pydantic
+wait-for-it==2.3.0
+    # via -r requirements/test-common.in
+zlib-ng==1.0.0
+    # via -r requirements/test-common.in
diff --git a/requirements/test-ft.in b/requirements/test-ft.in
new file mode 100644
index 00000000000..b85406e5d7b
--- /dev/null
+++ requirements/test-ft.in
@@ -0,0 +1,2 @@
+-r base-ft.in
+-r test-common.in
diff --git a/requirements/test-ft.txt b/requirements/test-ft.txt
new file mode 100644
index 00000000000..04b1dcb86e4
--- /dev/null
+++ requirements/test-ft.txt
@@ -0,0 +1,156 @@
+#
+# This file is autogenerated by pip-compile with Python 3.10
+# by the following command:
+#
+#    pip-compile --allow-unsafe --output-file=requirements/test-ft.txt --strip-extras requirements/test-ft.in
+#
+aiodns==3.5.0
+    # via -r requirements/runtime-deps.in
+aiohappyeyeballs==2.6.1
+    # via -r requirements/runtime-deps.in
+aiosignal==1.4.0
+    # via -r requirements/runtime-deps.in
+annotated-types==0.7.0
+    # via pydantic
+async-timeout==5.0.1 ; python_version < "3.11"
+    # via -r requirements/runtime-deps.in
+attrs==25.3.0
+    # via -r requirements/runtime-deps.in
+blockbuster==1.5.25
+    # via -r requirements/test-common.in
+brotli==1.1.0 ; platform_python_implementation == "CPython"
+    # via -r requirements/runtime-deps.in
+cffi==2.0.0
+    # via
+    #   cryptography
+    #   pycares
+    #   pytest-codspeed
+click==8.2.1
+    # via wait-for-it
+coverage==7.10.7
+    # via
+    #   -r requirements/test-common.in
+    #   pytest-cov
+cryptography==46.0.2
+    # via trustme
+exceptiongroup==1.3.0
+    # via pytest
+execnet==2.1.1
+    # via pytest-xdist
+forbiddenfruit==0.1.4
+    # via blockbuster
+freezegun==1.5.5
+    # via -r requirements/test-common.in
+frozenlist==1.8.0
+    # via
+    #   -r requirements/runtime-deps.in
+    #   aiosignal
+gunicorn==23.0.0
+    # via -r requirements/base-ft.in
+idna==3.10
+    # via
+    #   trustme
+    #   yarl
+iniconfig==2.1.0
+    # via pytest
+isal==1.8.0 ; python_version < "3.14"
+    # via -r requirements/test-common.in
+markdown-it-py==4.0.0
+    # via rich
+mdurl==0.1.2
+    # via markdown-it-py
+multidict==6.6.4
+    # via
+    #   -r requirements/runtime-deps.in
+    #   yarl
+mypy==1.18.2 ; implementation_name == "cpython"
+    # via -r requirements/test-common.in
+mypy-extensions==1.1.0
+    # via mypy
+packaging==25.0
+    # via
+    #   gunicorn
+    #   pytest
+pathspec==0.12.1
+    # via mypy
+pkgconfig==1.5.5
+    # via -r requirements/test-common.in
+pluggy==1.6.0
+    # via
+    #   pytest
+    #   pytest-cov
+propcache==0.4.0
+    # via
+    #   -r requirements/runtime-deps.in
+    #   yarl
+proxy-py==2.4.10
+    # via -r requirements/test-common.in
+pycares==4.11.0
+    # via aiodns
+pycparser==2.23
+    # via cffi
+pydantic==2.12.0a1
+    # via python-on-whales
+pydantic-core==2.37.2
+    # via pydantic
+pygments==2.19.2
+    # via
+    #   pytest
+    #   rich
+pytest==8.4.2
+    # via
+    #   -r requirements/test-common.in
+    #   pytest-codspeed
+    #   pytest-cov
+    #   pytest-mock
+    #   pytest-xdist
+pytest-codspeed==4.0.0
+    # via -r requirements/test-common.in
+pytest-cov==7.0.0
+    # via -r requirements/test-common.in
+pytest-mock==3.15.1
+    # via -r requirements/test-common.in
+pytest-xdist==3.8.0
+    # via -r requirements/test-common.in
+python-dateutil==2.9.0.post0
+    # via freezegun
+python-on-whales==0.78.0
+    # via -r requirements/test-common.in
+re-assert==1.1.0
+    # via -r requirements/test-common.in
+regex==2025.9.18
+    # via re-assert
+rich==14.1.0
+    # via pytest-codspeed
+setuptools-git==1.2
+    # via -r requirements/test-common.in
+six==1.17.0
+    # via python-dateutil
+tomli==2.2.1
+    # via
+    #   coverage
+    #   mypy
+    #   pytest
+trustme==1.2.1 ; platform_machine != "i686"
+    # via -r requirements/test-common.in
+typing-extensions==4.15.0
+    # via
+    #   aiosignal
+    #   cryptography
+    #   exceptiongroup
+    #   multidict
+    #   mypy
+    #   pydantic
+    #   pydantic-core
+    #   python-on-whales
+    #   typing-inspection
+typing-inspection==0.4.2
+    # via pydantic
+wait-for-it==2.3.0
+    # via -r requirements/test-common.in
+yarl==1.21.0
+    # via -r requirements/runtime-deps.in
+zlib-ng==1.0.0
+    # via -r requirements/test-common.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
+    # via -r requirements/runtime-deps.in
diff --git requirements/test.in requirements/test.in
index 1563689deae..d37efd6b841 100644
--- requirements/test.in
+++ requirements/test.in
@@ -1,20 +1,2 @@
 -r base.in
-
-blockbuster
-coverage
-freezegun
-isal
-mypy; implementation_name == "cpython"
-pkgconfig
-proxy.py >= 2.4.4rc5
-pytest
-pytest-cov
-pytest-mock
-pytest-xdist
-pytest_codspeed
-python-on-whales
-re-assert
-setuptools-git
-trustme; platform_machine != "i686"  # no 32-bit wheels
-wait-for-it
-zlib_ng
+-r test-common.in
diff --git requirements/test.txt requirements/test.txt
index b1ff140b7cc..b7c53b0b6d5 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -1,10 +1,10 @@
 #
-# This file is autogenerated by pip-compile with python 3.10
+# This file is autogenerated by pip-compile with Python 3.10
 # by the following command:
 #
-#    pip-compile --allow-unsafe --output-file=requirements/test.txt --resolver=backtracking --strip-extras requirements/test.in
+#    pip-compile --allow-unsafe --output-file=requirements/test.txt --strip-extras requirements/test.in
 #
-aiodns==3.4.0
+aiodns==3.5.0
     # via -r requirements/runtime-deps.in
 aiohappyeyeballs==2.6.1
     # via -r requirements/runtime-deps.in
@@ -16,22 +16,22 @@ async-timeout==5.0.1 ; python_version < "3.11"
     # via -r requirements/runtime-deps.in
 attrs==25.3.0
     # via -r requirements/runtime-deps.in
-blockbuster==1.5.24
-    # via -r requirements/test.in
+blockbuster==1.5.25
+    # via -r requirements/test-common.in
 brotli==1.1.0 ; platform_python_implementation == "CPython"
     # via -r requirements/runtime-deps.in
-cffi==1.17.1
+cffi==2.0.0
     # via
     #   cryptography
     #   pycares
     #   pytest-codspeed
 click==8.1.8
     # via wait-for-it
-coverage==7.8.1
+coverage==7.10.7
     # via
-    #   -r requirements/test.in
+    #   -r requirements/test-common.in
     #   pytest-cov
-cryptography==45.0.2
+cryptography==46.0.2
     # via trustme
 exceptiongroup==1.3.0
     # via pytest
@@ -39,9 +39,9 @@ execnet==2.1.1
     # via pytest-xdist
 forbiddenfruit==0.1.4
     # via blockbuster
-freezegun==1.5.1
-    # via -r requirements/test.in
-frozenlist==1.6.0
+freezegun==1.5.5
+    # via -r requirements/test-common.in
+frozenlist==1.8.0
     # via
     #   -r requirements/runtime-deps.in
     #   aiosignal
@@ -53,71 +53,77 @@ idna==3.4
     #   yarl
 iniconfig==2.1.0
     # via pytest
-isal==1.7.2
-    # via -r requirements/test.in
+isal==1.7.2 ; python_version < "3.14"
+    # via -r requirements/test-common.in
 markdown-it-py==3.0.0
     # via rich
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.4.4
+multidict==6.6.4
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-mypy==1.15.0 ; implementation_name == "cpython"
-    # via -r requirements/test.in
+mypy==1.18.2 ; implementation_name == "cpython"
+    # via -r requirements/test-common.in
 mypy-extensions==1.1.0
     # via mypy
 packaging==25.0
     # via
     #   gunicorn
     #   pytest
+pathspec==0.12.1
+    # via mypy
 pkgconfig==1.5.5
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 pluggy==1.6.0
-    # via pytest
-propcache==0.3.1
+    # via
+    #   pytest
+    #   pytest-cov
+propcache==0.4.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
 proxy-py==2.4.10
-    # via -r requirements/test.in
-pycares==4.8.0
+    # via -r requirements/test-common.in
+pycares==4.11.0
     # via aiodns
-pycparser==2.22
+pycparser==2.23
     # via cffi
-pydantic==2.11.5
+pydantic==2.12.0a1
     # via python-on-whales
-pydantic-core==2.33.2
+pydantic-core==2.37.2
     # via pydantic
-pygments==2.19.1
-    # via rich
-pytest==8.3.5
+pygments==2.19.2
     # via
-    #   -r requirements/test.in
+    #   pytest
+    #   rich
+pytest==8.4.2
+    # via
+    #   -r requirements/test-common.in
     #   pytest-codspeed
     #   pytest-cov
     #   pytest-mock
     #   pytest-xdist
-pytest-codspeed==3.2.0
-    # via -r requirements/test.in
-pytest-cov==6.1.1
-    # via -r requirements/test.in
-pytest-mock==3.14.0
-    # via -r requirements/test.in
-pytest-xdist==3.6.1
-    # via -r requirements/test.in
+pytest-codspeed==4.0.0
+    # via -r requirements/test-common.in
+pytest-cov==7.0.0
+    # via -r requirements/test-common.in
+pytest-mock==3.15.1
+    # via -r requirements/test-common.in
+pytest-xdist==3.8.0
+    # via -r requirements/test-common.in
 python-dateutil==2.9.0.post0
     # via freezegun
-python-on-whales==0.76.1
-    # via -r requirements/test.in
+python-on-whales==0.78.0
+    # via -r requirements/test-common.in
 re-assert==1.1.0
-    # via -r requirements/test.in
-regex==2024.11.6
+    # via -r requirements/test-common.in
+regex==2025.9.18
     # via re-assert
-rich==14.0.0
+rich==14.1.0
     # via pytest-codspeed
 setuptools-git==1.2
-    # via -r requirements/test.in
+    # via -r requirements/test-common.in
 six==1.17.0
     # via python-dateutil
 tomli==2.2.1
@@ -126,25 +132,27 @@ tomli==2.2.1
     #   mypy
     #   pytest
 trustme==1.2.1 ; platform_machine != "i686"
-    # via -r requirements/test.in
-typing-extensions==4.13.2
+    # via -r requirements/test-common.in
+typing-extensions==4.15.0
     # via
     #   aiosignal
+    #   cryptography
     #   exceptiongroup
     #   multidict
     #   mypy
     #   pydantic
     #   pydantic-core
     #   python-on-whales
-    #   rich
     #   typing-inspection
-typing-inspection==0.4.1
+typing-inspection==0.4.2
     # via pydantic
 uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython"
     # via -r requirements/base.in
 wait-for-it==2.3.0
-    # via -r requirements/test.in
-yarl==1.20.0
+    # via -r requirements/test-common.in
+yarl==1.21.0
+    # via -r requirements/runtime-deps.in
+zlib-ng==1.0.0
+    # via -r requirements/test-common.in
+backports.zstd==0.5.0 ; platform_python_implementation == "CPython" and python_version < "3.14"
     # via -r requirements/runtime-deps.in
-zlib-ng==0.5.1
-    # via -r requirements/test.in
diff --git setup.cfg setup.cfg
index e3b7dc5102f..a78ae609f1b 100644
--- setup.cfg
+++ setup.cfg
@@ -1,55 +1,4 @@
-[metadata]
-name = aiohttp
-version = attr: aiohttp.__version__
-url = https://github.com/aio-libs/aiohttp
-project_urls =
-  Chat: Matrix = https://matrix.to/#/#aio-libs:matrix.org
-  Chat: Matrix Space = https://matrix.to/#/#aio-libs-space:matrix.org
-  CI: GitHub Actions = https://github.com/aio-libs/aiohttp/actions?query=workflow%%3ACI
-  Coverage: codecov = https://codecov.io/github/aio-libs/aiohttp
-  Docs: Changelog = https://docs.aiohttp.org/en/stable/changes.html
-  Docs: RTD = https://docs.aiohttp.org
-  GitHub: issues = https://github.com/aio-libs/aiohttp/issues
-  GitHub: repo = https://github.com/aio-libs/aiohttp
-description = Async http client/server framework (asyncio)
-long_description = file: README.rst
-long_description_content_type = text/x-rst
-maintainer = aiohttp team <[email protected]>
-maintainer_email = [email protected]
-license = Apache-2.0 AND MIT
-license_files =
-    LICENSE.txt
-    vendor/llhttp/LICENSE
-classifiers =
-  Development Status :: 5 - Production/Stable
-
-  Framework :: AsyncIO
-
-  Intended Audience :: Developers
-
-  Operating System :: POSIX
-  Operating System :: MacOS :: MacOS X
-  Operating System :: Microsoft :: Windows
-
-  Programming Language :: Python
-  Programming Language :: Python :: 3
-  Programming Language :: Python :: 3.9
-  Programming Language :: Python :: 3.10
-  Programming Language :: Python :: 3.11
-  Programming Language :: Python :: 3.12
-  Programming Language :: Python :: 3.13
-
-  Topic :: Internet :: WWW/HTTP
-
 [options]
-python_requires = >=3.9
-packages =
-  aiohttp
-  aiohttp._websocket
-# https://setuptools.readthedocs.io/en/latest/setuptools.html#setting-the-zip-safe-flag
-zip_safe = False
-include_package_data = True
-
 install_requires =
   aiohappyeyeballs >= 2.5.0
   aiosignal >= 1.4.0
@@ -60,27 +9,12 @@ install_requires =
   propcache >= 0.2.0
   yarl >= 1.17.0, < 2.0
 
-[options.exclude_package_data]
-* =
-    *.c
-    *.h
-
 [options.extras_require]
 speedups =
   aiodns >= 3.3.0
   Brotli; platform_python_implementation == 'CPython'
   brotlicffi; platform_python_implementation != 'CPython'
-
-[options.packages.find]
-exclude =
-  examples
-
-[options.package_data]
-# Ref:
-# https://setuptools.readthedocs.io/en/latest/setuptools.html#options
-# (see notes for the asterisk/`*` meaning)
-* =
-    *.so
+  backports.zstd; platform_python_implementation == 'CPython' and python_version < "3.14"
 
 [pep8]
 max-line-length=79
diff --git tests/conftest.py tests/conftest.py
index 54e0d3f21a7..6d91d08f10a 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -1,21 +1,28 @@
 import asyncio
 import base64
 import os
+import platform
 import socket
 import ssl
 import sys
-import zlib
+import time
+from collections.abc import AsyncIterator, Callable, Iterator
+from concurrent.futures import Future, ThreadPoolExecutor
 from hashlib import md5, sha1, sha256
 from pathlib import Path
 from tempfile import TemporaryDirectory
-from typing import Any, AsyncIterator, Generator, Iterator
+from typing import Any, Generator
 from unittest import mock
 from uuid import uuid4
 
-import isal.isal_zlib
 import pytest
-import zlib_ng.zlib_ng
-from blockbuster import blockbuster_ctx
+
+try:
+    from blockbuster import blockbuster_ctx
+
+    HAS_BLOCKBUSTER = True
+except ImportError:  # For downstreams only  # pragma: no cover
+    HAS_BLOCKBUSTER = False
 
 from aiohttp import payload
 from aiohttp.client_proto import ResponseHandler
@@ -48,7 +55,7 @@
 IS_LINUX = sys.platform.startswith("linux")
 
 
-@pytest.fixture(autouse=True)
+@pytest.fixture(autouse=HAS_BLOCKBUSTER)
 def blockbuster(request: pytest.FixtureRequest) -> Iterator[None]:
     # Allow selectively disabling blockbuster for specific tests
     # using the @pytest.mark.skip_blockbuster marker.
@@ -66,10 +73,6 @@ def blockbuster(request: pytest.FixtureRequest) -> Iterator[None]:
     with blockbuster_ctx(
         "aiohttp", excluded_modules=["aiohttp.pytest_plugin", "aiohttp.test_utils"]
     ) as bb:
-        # TODO: Fix blocking call in ClientRequest's constructor.
-        # https://github.com/aio-libs/aiohttp/issues/10435
-        for func in ["io.TextIOWrapper.read", "os.stat"]:
-            bb.functions[func].can_block_in("aiohttp/client_reqrep.py", "update_auth")
         for func in [
             "os.getcwd",
             "os.readlink",
@@ -251,6 +254,8 @@ def selector_loop() -> Iterator[asyncio.AbstractEventLoop]:
 
 @pytest.fixture
 def uvloop_loop() -> Iterator[asyncio.AbstractEventLoop]:
+    if uvloop is None:
+        pytest.skip("uvloop is not installed")
     factory = uvloop.new_event_loop
     with loop_context(factory) as _loop:
         asyncio.set_event_loop(_loop)
@@ -280,7 +285,52 @@ def netrc_contents(
 
 
 @pytest.fixture
-def start_connection():
+def netrc_default_contents(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> Path:
+    """Create a temporary netrc file with default test credentials and set NETRC env var."""
+    netrc_file = tmp_path / ".netrc"
+    netrc_file.write_text("default login netrc_user password netrc_pass\n")
+
+    monkeypatch.setenv("NETRC", str(netrc_file))
+
+    return netrc_file
+
+
+@pytest.fixture
+def no_netrc(monkeypatch: pytest.MonkeyPatch) -> None:
+    """Ensure NETRC environment variable is not set."""
+    monkeypatch.delenv("NETRC", raising=False)
+
+
+@pytest.fixture
+def netrc_other_host(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> Path:
+    """Create a temporary netrc file with credentials for a different host and set NETRC env var."""
+    netrc_file = tmp_path / ".netrc"
+    netrc_file.write_text("machine other.example.com login user password pass\n")
+
+    monkeypatch.setenv("NETRC", str(netrc_file))
+
+    return netrc_file
+
+
+@pytest.fixture
+def netrc_home_directory(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> Path:
+    """Create a netrc file in a mocked home directory without setting NETRC env var."""
+    home_dir = tmp_path / "home"
+    home_dir.mkdir()
+    netrc_filename = "_netrc" if platform.system() == "Windows" else ".netrc"
+    netrc_file = home_dir / netrc_filename
+    netrc_file.write_text("default login netrc_user password netrc_pass\n")
+
+    home_env_var = "USERPROFILE" if platform.system() == "Windows" else "HOME"
+    monkeypatch.setenv(home_env_var, str(home_dir))
+    # Ensure NETRC env var is not set
+    monkeypatch.delenv("NETRC", raising=False)
+
+    return netrc_file
+
+
+@pytest.fixture
+def start_connection() -> Iterator[mock.Mock]:
     with mock.patch(
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
@@ -331,13 +381,13 @@ def unused_port_socket() -> Generator[socket.socket, None, None]:
         s.close()
 
 
-@pytest.fixture(params=[zlib, zlib_ng.zlib_ng, isal.isal_zlib])
+@pytest.fixture(params=["zlib", "zlib_ng.zlib_ng", "isal.isal_zlib"])
 def parametrize_zlib_backend(
     request: pytest.FixtureRequest,
 ) -> Generator[None, None, None]:
     original_backend: ZLibBackendProtocol = ZLibBackend._zlib_backend
-    set_zlib_backend(request.param)
-
+    backend = pytest.importorskip(request.param)
+    set_zlib_backend(backend)
     yield
 
     set_zlib_backend(original_backend)
@@ -354,3 +404,27 @@ async def cleanup_payload_pending_file_closes(
         loop_futures = [f for f in payload._CLOSE_FUTURES if f.get_loop() is loop]
         if loop_futures:
             await asyncio.gather(*loop_futures, return_exceptions=True)
+
+
+@pytest.fixture
+def slow_executor() -> Iterator[ThreadPoolExecutor]:
+    """Executor that adds delay to simulate slow operations.
+
+    Useful for testing cancellation and race conditions in compression tests.
+    """
+
+    class SlowExecutor(ThreadPoolExecutor):
+        """Executor that adds delay to operations."""
+
+        def submit(
+            self, fn: Callable[..., Any], /, *args: Any, **kwargs: Any
+        ) -> Future[Any]:
+            def slow_fn(*args: Any, **kwargs: Any) -> Any:
+                time.sleep(0.05)  # Add delay to simulate slow operation
+                return fn(*args, **kwargs)
+
+            return super().submit(slow_fn, *args, **kwargs)
+
+    executor = SlowExecutor(max_workers=10)
+    yield executor
+    executor.shutdown(wait=True)
diff --git tests/test_client_functional.py tests/test_client_functional.py
index eb0c822a4be..34cc69f88a7 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -77,8 +77,25 @@ def fname(here):
     return here / "conftest.py"
 
 
-async def test_keepalive_two_requests_success(aiohttp_client) -> None:
-    async def handler(request):
+@pytest.fixture
+def headers_echo_client(
+    aiohttp_client: AiohttpClient,
+) -> Callable[..., Awaitable[TestClient[web.Request, web.Application]]]:
+    """Create a client with an app that echoes request headers as JSON."""
+
+    async def factory(**kwargs: Any) -> TestClient[web.Request, web.Application]:
+        async def handler(request: web.Request) -> web.Response:
+            return web.json_response({"headers": dict(request.headers)})
+
+        app = web.Application()
+        app.router.add_get("/", handler)
+        return await aiohttp_client(app, **kwargs)
+
+    return factory
+
+
+async def test_keepalive_two_requests_success(aiohttp_client: AiohttpClient) -> None:
+    async def handler(request: web.Request) -> web.Response:
         body = await request.read()
         assert b"" == body
         return web.Response(body=b"OK")
@@ -3712,14 +3729,12 @@ async def handler(request):
     assert not ctx._coro.cr_running
 
 
-async def test_session_auth(aiohttp_client) -> None:
-    async def handler(request):
-        return web.json_response({"headers": dict(request.headers)})
-
-    app = web.Application()
-    app.router.add_get("/", handler)
-
-    client = await aiohttp_client(app, auth=aiohttp.BasicAuth("login", "pass"))
+async def test_session_auth(
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    client = await headers_echo_client(auth=aiohttp.BasicAuth("login", "pass"))
 
     r = await client.get("/")
     assert r.status == 200
@@ -3727,14 +3742,12 @@ async def handler(request):
     assert content["headers"]["Authorization"] == "Basic bG9naW46cGFzcw=="
 
 
-async def test_session_auth_override(aiohttp_client) -> None:
-    async def handler(request):
-        return web.json_response({"headers": dict(request.headers)})
-
-    app = web.Application()
-    app.router.add_get("/", handler)
-
-    client = await aiohttp_client(app, auth=aiohttp.BasicAuth("login", "pass"))
+async def test_session_auth_override(
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    client = await headers_echo_client(auth=aiohttp.BasicAuth("login", "pass"))
 
     r = await client.get("/", auth=aiohttp.BasicAuth("other_login", "pass"))
     assert r.status == 200
@@ -3756,14 +3769,77 @@ async def handler(request):
         await client.get("/", headers=headers)
 
 
-async def test_session_headers(aiohttp_client) -> None:
-    async def handler(request):
-        return web.json_response({"headers": dict(request.headers)})
+@pytest.mark.usefixtures("netrc_default_contents")
+async def test_netrc_auth_from_env(  # type: ignore[misc]
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    """Test that netrc authentication works when NETRC env var is set and trust_env=True."""
+    client = await headers_echo_client(trust_env=True)
+    async with client.get("/") as r:
+        assert r.status == 200
+        content = await r.json()
+    # Base64 encoded "netrc_user:netrc_pass" is "bmV0cmNfdXNlcjpuZXRyY19wYXNz"
+    assert content["headers"]["Authorization"] == "Basic bmV0cmNfdXNlcjpuZXRyY19wYXNz"
 
-    app = web.Application()
-    app.router.add_get("/", handler)
 
-    client = await aiohttp_client(app, headers={"X-Real-IP": "192.168.0.1"})
+@pytest.mark.usefixtures("no_netrc")
+async def test_netrc_auth_skipped_without_netrc_file(  # type: ignore[misc]
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    """Test that netrc authentication is skipped when no netrc file exists."""
+    client = await headers_echo_client(trust_env=True)
+    async with client.get("/") as r:
+        assert r.status == 200
+        content = await r.json()
+    # No Authorization header should be present
+    assert "Authorization" not in content["headers"]
+
+
+@pytest.mark.usefixtures("netrc_home_directory")
+async def test_netrc_auth_from_home_directory(  # type: ignore[misc]
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    """Test that netrc authentication works from default ~/.netrc without NETRC env var."""
+    client = await headers_echo_client(trust_env=True)
+    async with client.get("/") as r:
+        assert r.status == 200
+        content = await r.json()
+    assert content["headers"]["Authorization"] == "Basic bmV0cmNfdXNlcjpuZXRyY19wYXNz"
+
+
+@pytest.mark.usefixtures("netrc_default_contents")
+async def test_netrc_auth_overridden_by_explicit_auth(  # type: ignore[misc]
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    """Test that explicit auth parameter overrides netrc authentication."""
+    client = await headers_echo_client(trust_env=True)
+    # Make request with explicit auth (should override netrc)
+    async with client.get(
+        "/", auth=aiohttp.BasicAuth("explicit_user", "explicit_pass")
+    ) as r:
+        assert r.status == 200
+        content = await r.json()
+    # Base64 encoded "explicit_user:explicit_pass" is "ZXhwbGljaXRfdXNlcjpleHBsaWNpdF9wYXNz"
+    assert (
+        content["headers"]["Authorization"]
+        == "Basic ZXhwbGljaXRfdXNlcjpleHBsaWNpdF9wYXNz"
+    )
+
+
+async def test_session_headers(
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    client = await headers_echo_client(headers={"X-Real-IP": "192.168.0.1"})
 
     r = await client.get("/")
     assert r.status == 200
@@ -3771,15 +3847,13 @@ async def handler(request):
     assert content["headers"]["X-Real-IP"] == "192.168.0.1"
 
 
-async def test_session_headers_merge(aiohttp_client) -> None:
-    async def handler(request):
-        return web.json_response({"headers": dict(request.headers)})
-
-    app = web.Application()
-    app.router.add_get("/", handler)
-
-    client = await aiohttp_client(
-        app, headers=[("X-Real-IP", "192.168.0.1"), ("X-Sent-By", "requests")]
+async def test_session_headers_merge(
+    headers_echo_client: Callable[
+        ..., Awaitable[TestClient[web.Request, web.Application]]
+    ],
+) -> None:
+    client = await headers_echo_client(
+        headers=[("X-Real-IP", "192.168.0.1"), ("X-Sent-By", "requests")]
     )
 
     r = await client.get("/", headers={"X-Sent-By": "aiohttp"})
@@ -5515,3 +5589,46 @@ async def handler(request: web.Request) -> web.Response:
 
     finally:
         await asyncio.to_thread(f.close)
+
+
+async def test_stream_reader_total_raw_bytes(aiohttp_client: AiohttpClient) -> None:
+    """Test whether StreamReader.total_raw_bytes returns the number of bytes downloaded"""
+    source_data = b"@dKal^pH>1h|YW1:c2J$" * 4096
+
+    async def handler(request: web.Request) -> web.Response:
+        response = web.Response(body=source_data)
+        response.enable_compression()
+        return response
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+
+    client = await aiohttp_client(app)
+
+    # Check for decompressed data
+    async with client.get(
+        "/", headers={"Accept-Encoding": "gzip"}, auto_decompress=True
+    ) as resp:
+        assert resp.headers["Content-Encoding"] == "gzip"
+        assert int(resp.headers["Content-Length"]) < len(source_data)
+        data = await resp.content.read()
+        assert len(data) == len(source_data)
+        assert resp.content.total_raw_bytes == int(resp.headers["Content-Length"])
+
+    # Check for compressed data
+    async with client.get(
+        "/", headers={"Accept-Encoding": "gzip"}, auto_decompress=False
+    ) as resp:
+        assert resp.headers["Content-Encoding"] == "gzip"
+        data = await resp.content.read()
+        assert resp.content.total_raw_bytes == len(data)
+        assert resp.content.total_raw_bytes == int(resp.headers["Content-Length"])
+
+    # Check for non-compressed data
+    async with client.get(
+        "/", headers={"Accept-Encoding": "identity"}, auto_decompress=True
+    ) as resp:
+        assert "Content-Encoding" not in resp.headers
+        data = await resp.content.read()
+        assert resp.content.total_raw_bytes == len(data)
+        assert resp.content.total_raw_bytes == int(resp.headers["Content-Length"])
diff --git tests/test_client_request.py tests/test_client_request.py
index 2af540599f8..db25f6ff910 100644
--- tests/test_client_request.py
+++ tests/test_client_request.py
@@ -15,7 +15,7 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import BaseConnector, hdrs, helpers, payload
+from aiohttp import BaseConnector, hdrs, payload
 from aiohttp.abc import AbstractStreamWriter
from aiohttp.client_exceptions import ClientConnectionError
 from aiohttp.client_reqrep import (
@@ -355,7 +355,7 @@ def test_headers(make_request) -> None:
 
     assert hdrs.CONTENT_TYPE in req.headers
     assert req.headers[hdrs.CONTENT_TYPE] == "text/plain"
-    assert req.headers[hdrs.ACCEPT_ENCODING] == "gzip, deflate, br"
+    assert "gzip" in req.headers[hdrs.ACCEPT_ENCODING]
 
 
 def test_headers_list(make_request) -> None:
@@ -1529,35 +1529,20 @@ def test_loose_cookies_types(loop) -> None:
 
 
 @pytest.mark.parametrize(
-    "has_brotli,expected",
+    "has_brotli,has_zstd,expected",
     [
-        (False, "gzip, deflate"),
-        (True, "gzip, deflate, br"),
+        (False, False, "gzip, deflate"),
+        (True, False, "gzip, deflate, br"),
+        (False, True, "gzip, deflate, zstd"),
+        (True, True, "gzip, deflate, br, zstd"),
     ],
 )
-def test_gen_default_accept_encoding(has_brotli, expected) -> None:
+def test_gen_default_accept_encoding(
+    has_brotli: bool, has_zstd: bool, expected: str
+) -> None:
     with mock.patch("aiohttp.client_reqrep.HAS_BROTLI", has_brotli):
-        assert _gen_default_accept_encoding() == expected
-
-
-@pytest.mark.parametrize(
-    ("netrc_contents", "expected_auth"),
-    [
-        (
-            "machine example.com login username password pass\n",
-            helpers.BasicAuth("username", "pass"),
-        )
-    ],
-    indirect=("netrc_contents",),
-)
-@pytest.mark.usefixtures("netrc_contents")
-def test_basicauth_from_netrc_present(
-    make_request: Any,
-    expected_auth: Optional[helpers.BasicAuth],
-):
-    """Test appropriate Authorization header is sent when netrc is not empty."""
-    req = make_request("get", "http://example.com", trust_env=True)
-    assert req.headers[hdrs.AUTHORIZATION] == expected_auth.encode()
+        with mock.patch("aiohttp.client_reqrep.HAS_ZSTD", has_zstd):
+            assert _gen_default_accept_encoding() == expected
 
 
 @pytest.mark.parametrize(
diff --git tests/test_client_response.py tests/test_client_response.py
index 2d70feaf06d..a5061e08fe1 100644
--- tests/test_client_response.py
+++ tests/test_client_response.py
@@ -15,6 +15,7 @@
 from aiohttp import ClientSession, hdrs, http
 from aiohttp.client_reqrep import ClientResponse, RequestInfo
 from aiohttp.helpers import TimerNoop
+from aiohttp.multipart import BadContentDispositionHeader
 
 
 class WriterMock(mock.AsyncMock):
@@ -965,6 +966,34 @@ def test_content_disposition_no_parameters() -> None:
     assert {} == response.content_disposition.parameters
 
 
+@pytest.mark.parametrize(
+    "content_disposition",
+    (
+        'attachment; filename="archive.tar.gz";',
+        'attachment;; filename="archive.tar.gz"',
+    ),
+)
+def test_content_disposition_empty_parts(content_disposition: str) -> None:
+    response = ClientResponse(
+        "get",
+        URL("http://def-cl-resp.org"),
+        request_info=mock.Mock(),
+        writer=WriterMock(),
+        continue100=None,
+        timer=TimerNoop(),
+        traces=[],
+        loop=mock.Mock(),
+        session=mock.Mock(),
+    )
+    h = {"Content-Disposition": content_disposition}
+    response._headers = CIMultiDictProxy(CIMultiDict(h))
+
+    with pytest.warns(BadContentDispositionHeader):
+        assert response.content_disposition is not None
+        assert "attachment" == response.content_disposition.type
+        assert "archive.tar.gz" == response.content_disposition.filename
+
+
 def test_content_disposition_no_header() -> None:
     response = ClientResponse(
         "get",
diff --git tests/test_client_session.py tests/test_client_session.py
index c296c9670b0..5d017c8d0ba 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -25,6 +25,7 @@
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
 from aiohttp.pytest_plugin import AiohttpServer
+from aiohttp.test_utils import TestServer
 from aiohttp.tracing import Trace
 
 
@@ -75,7 +76,24 @@ def params():
     )
 
 
-async def test_close_coro(create_session) -> None:
+@pytest.fixture
+async def auth_server(aiohttp_server: AiohttpServer) -> TestServer:
+    """Create a server with an auth handler that returns auth header or 'no_auth'."""
+
+    async def handler(request: web.Request) -> web.Response:
+        auth_header = request.headers.get(hdrs.AUTHORIZATION)
+        if auth_header:
+            return web.Response(text=f"auth:{auth_header}")
+        return web.Response(text="no_auth")
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    return await aiohttp_server(app)
+
+
+async def test_close_coro(
+    create_session: Callable[..., Awaitable[ClientSession]],
+) -> None:
     session = await create_session()
     await session.close()
 
@@ -1321,3 +1339,75 @@ async def test_properties(
     value = uuid4()
     setattr(session, inner_name, value)
     assert value == getattr(session, outer_name)
+
+
+@pytest.mark.usefixtures("netrc_default_contents")
+async def test_netrc_auth_with_trust_env(auth_server: TestServer) -> None:
+    """Test that netrc authentication works with ClientSession when NETRC env var is set."""
+    async with (
+        ClientSession(trust_env=True) as session,
+        session.get(auth_server.make_url("/")) as resp,
+    ):
+        text = await resp.text()
+        # Base64 encoded "netrc_user:netrc_pass" is "bmV0cmNfdXNlcjpuZXRyY19wYXNz"
+        assert text == "auth:Basic bmV0cmNfdXNlcjpuZXRyY19wYXNz"
+
+
+@pytest.mark.usefixtures("netrc_default_contents")
+async def test_netrc_auth_skipped_without_trust_env(auth_server: TestServer) -> None:
+    """Test that netrc authentication is skipped when trust_env=False."""
+    async with (
+        ClientSession(trust_env=False) as session,
+        session.get(auth_server.make_url("/")) as resp,
+    ):
+        text = await resp.text()
+        assert text == "no_auth"
+
+
+@pytest.mark.usefixtures("no_netrc")
+async def test_netrc_auth_skipped_without_netrc_file(auth_server: TestServer) -> None:
+    """Test that netrc authentication is skipped when no netrc file exists."""
+    async with (
+        ClientSession(trust_env=True) as session,
+        session.get(auth_server.make_url("/")) as resp,
+    ):
+        text = await resp.text()
+        assert text == "no_auth"
+
+
+@pytest.mark.usefixtures("netrc_home_directory")
+async def test_netrc_auth_from_home_directory(auth_server: TestServer) -> None:
+    """Test that netrc authentication works from default ~/.netrc location without NETRC env var."""
+    async with (
+        ClientSession(trust_env=True) as session,
+        session.get(auth_server.make_url("/")) as resp,
+    ):
+        text = await resp.text()
+        assert text == "auth:Basic bmV0cmNfdXNlcjpuZXRyY19wYXNz"
+
+
+@pytest.mark.usefixtures("netrc_default_contents")
+async def test_netrc_auth_overridden_by_explicit_auth(auth_server: TestServer) -> None:
+    """Test that explicit auth parameter overrides netrc authentication."""
+    async with (
+        ClientSession(trust_env=True) as session,
+        session.get(
+            auth_server.make_url("/"),
+            auth=aiohttp.BasicAuth("explicit_user", "explicit_pass"),
+        ) as resp,
+    ):
+        text = await resp.text()
+        # Base64 encoded "explicit_user:explicit_pass" is "ZXhwbGljaXRfdXNlcjpleHBsaWNpdF9wYXNz"
+        assert text == "auth:Basic ZXhwbGljaXRfdXNlcjpleHBsaWNpdF9wYXNz"
+
+
+@pytest.mark.usefixtures("netrc_other_host")
+async def test_netrc_auth_host_not_in_netrc(auth_server: TestServer) -> None:
+    """Test that netrc lookup returns None when host is not in netrc file."""
+    async with (
+        ClientSession(trust_env=True) as session,
+        session.get(auth_server.make_url("/")) as resp,
+    ):
+        text = await resp.text()
+        # Should not have auth since the host is not in netrc
+        assert text == "no_auth"
diff --git tests/test_connector.py tests/test_connector.py
index 9932dee581b..9048bf61e2f 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -492,10 +492,11 @@ async def test_release(loop, key) -> None:
     conn._acquired_per_host[key].add(proto)
 
     conn._release(key, proto)
+    loop_time = loop.time()
     assert conn._release_waiter.called
     assert conn._cleanup_handle is not None
     assert conn._conns[key][0][0] == proto
-    assert conn._conns[key][0][1] == pytest.approx(loop.time(), abs=0.1)
+    assert conn._conns[key][0][1] == pytest.approx(loop_time, abs=0.1)
     assert not conn._cleanup_closed_transports
     await conn.close()
 
@@ -1344,7 +1345,6 @@ async def test_tcp_connector_dns_throttle_requests_exception_spread(loop) -> Non
 async def test_tcp_connector_dns_throttle_requests_cancelled_when_close(
     loop, dns_response
 ):
-
     with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
         conn = aiohttp.TCPConnector(loop=loop, use_dns_cache=True, ttl_dns_cache=10)
         m_resolver().resolve.return_value = dns_response()
@@ -1375,7 +1375,6 @@ async def coro():
 async def test_tcp_connector_cancel_dns_error_captured(
     loop, dns_response_error
 ) -> None:
-
     exception_handler_called = False
 
     def exception_handler(loop, context):
@@ -1606,10 +1605,11 @@ async def test_release_not_started(loop) -> None:
     key = 1
     conn._acquired.add(proto)
     conn._release(key, proto)
+    loop_time = loop.time()
     # assert conn._conns == {1: [(proto, 10)]}
     rec = conn._conns[1]
     assert rec[0][0] == proto
-    assert rec[0][1] == pytest.approx(loop.time(), abs=0.05)
+    assert rec[0][1] == pytest.approx(loop_time, abs=0.05)
     assert not proto.close.called
     await conn.close()
 
@@ -3289,7 +3289,6 @@ async def f():
 
 
 async def test_connect_with_limit_cancelled(loop) -> None:
-
     proto = create_mocked_conn()
     proto.is_connected.return_value = True
 
diff --git tests/test_cookie_helpers.py tests/test_cookie_helpers.py
index 6deef6544c2..577e3156560 100644
--- tests/test_cookie_helpers.py
+++ tests/test_cookie_helpers.py
@@ -1,5 +1,6 @@
 """Tests for internal cookie helper functions."""
 
+import sys
 from http.cookies import (
     CookieError,
     Morsel,
@@ -427,6 +428,10 @@ def test_parse_set_cookie_headers_boolean_attrs() -> None:
         assert morsel.get("httponly") is True, f"{name} should have httponly=True"
 
 
+@pytest.mark.skipif(
+    sys.version_info < (3, 14),
+    reason="Partitioned cookies support requires Python 3.14+",
+)
 def test_parse_set_cookie_headers_boolean_attrs_with_partitioned() -> None:
     """Test that boolean attributes including partitioned work correctly."""
     # Test secure attribute variations
@@ -482,6 +487,10 @@ def test_parse_set_cookie_headers_encoded_values() -> None:
     assert result[2][1].value == "%21%40%23%24%25%5E%26*%28%29"
 
 
+@pytest.mark.skipif(
+    sys.version_info < (3, 14),
+    reason="Partitioned cookies support requires Python 3.14+",
+)
 def test_parse_set_cookie_headers_partitioned() -> None:
     """
     Test that parse_set_cookie_headers handles partitioned attribute correctly.
@@ -518,6 +527,10 @@ def test_parse_set_cookie_headers_partitioned() -> None:
     assert result[4][1].get("path") == "/"
 
 
+@pytest.mark.skipif(
+    sys.version_info < (3, 14),
+    reason="Partitioned cookies support requires Python 3.14+",
+)
 def test_parse_set_cookie_headers_partitioned_case_insensitive() -> None:
     """Test that partitioned attribute is recognized case-insensitively."""
     headers = [
@@ -555,6 +568,26 @@ def test_parse_set_cookie_headers_partitioned_not_set() -> None:
 
 
 # Tests that don't require partitioned support in SimpleCookie
+@pytest.mark.skipif(
+    sys.version_info >= (3, 14),
+    reason="Python 3.14+ has built-in partitioned cookie support",
+)
+def test_parse_set_cookie_headers_partitioned_not_set_if_no_support() -> None:
+    headers = [
+        "cookie1=value1; Partitioned",
+        "cookie2=value2; Partitioned=",
+        "cookie3=value3; Partitioned=true",
+    ]
+
+    result = parse_set_cookie_headers(headers)
+
+    assert len(result) == 3
+    for i, (_, morsel) in enumerate(result):
+        assert (
+            morsel.get("partitioned") is None
+        ), f"Cookie {i+1} should not have partitioned flag"
+
+
def test_parse_set_cookie_headers_partitioned_with_other_attrs_manual() -> None:
     """
     Test parsing logic for partitioned cookies combined with all other attributes.
@@ -1104,6 +1137,32 @@ def test_parse_cookie_header_empty() -> None:
     assert parse_cookie_header("   ") == []
 
 
+def test_parse_cookie_gstate_header() -> None:
+    header = (
+        "_ga=ga; "
+        "ajs_anonymous_id=0anonymous; "
+        "analytics_session_id=session; "
+        "cookies-analytics=true; "
+        "cookies-functional=true; "
+        "cookies-marketing=true; "
+        "cookies-preferences=true; "
+        'g_state={"i_l":0,"i_ll":12345,"i_b":"blah"}; '
+        "analytics_session_id.last_access=1760128947692; "
+        "landingPageURLRaw=landingPageURLRaw; "
+        "landingPageURL=landingPageURL; "
+        "referrerPageURLRaw=; "
+        "referrerPageURL=; "
+        "formURLRaw=formURLRaw; "
+        "formURL=formURL; "
+        "fbnAuthExpressCheckout=fbnAuthExpressCheckout; "
+        "is_express_checkout=1; "
+    )
+
+    result = parse_cookie_header(header)
+    assert result[7][0] == "g_state"
+    assert result[8][0] == "analytics_session_id.last_access"
+
+
 def test_parse_cookie_header_quoted_values() -> None:
     """Test parse_cookie_header handles quoted values correctly."""
     header = 'name="quoted value"; session="with;semicolon"; data="with\\"escaped\\""'
@@ -1384,6 +1443,142 @@ def test_parse_cookie_header_illegal_names(caplog: pytest.LogCaptureFixture) ->
     assert "Can not load cookie: Illegal cookie name 'invalid,cookie'" in caplog.text
 
 
+def test_parse_cookie_header_large_value() -> None:
+    """Test that large cookie values don't cause DoS."""
+    large_value = "A" * 8192
+    header = f"normal=value; large={large_value}; after=cookie"
+
+    result = parse_cookie_header(header)
+    cookie_names = [name for name, _ in result]
+
+    assert len(result) == 3
+    assert "normal" in cookie_names
+    assert "large" in cookie_names
+    assert "after" in cookie_names
+
+    large_cookie = next(morsel for name, morsel in result if name == "large")
+    assert len(large_cookie.value) == 8192
+
+
+def test_parse_cookie_header_multiple_equals() -> None:
+    """Test handling of multiple equals signs in cookie values."""
+    header = "session=abc123; data=key1=val1&key2=val2; token=xyz"
+
+    result = parse_cookie_header(header)
+
+    assert len(result) == 3
+
+    name1, morsel1 = result[0]
+    assert name1 == "session"
+    assert morsel1.value == "abc123"
+
+    name2, morsel2 = result[1]
+    assert name2 == "data"
+    assert morsel2.value == "key1=val1&key2=val2"
+
+    name3, morsel3 = result[2]
+    assert name3 == "token"
+    assert morsel3.value == "xyz"
+
+
+def test_parse_cookie_header_fallback_preserves_subsequent_cookies() -> None:
+    """Test that fallback parser doesn't lose subsequent cookies."""
+    header = 'normal=value; malformed={"json":"value"}; after1=cookie1; after2=cookie2'
+
+    result = parse_cookie_header(header)
+    cookie_names = [name for name, _ in result]
+
+    assert len(result) == 4
+    assert cookie_names == ["normal", "malformed", "after1", "after2"]
+
+    name1, morsel1 = result[0]
+    assert morsel1.value == "value"
+
+    name2, morsel2 = result[1]
+    assert morsel2.value == '{"json":"value"}'
+
+    name3, morsel3 = result[2]
+    assert morsel3.value == "cookie1"
+
+    name4, morsel4 = result[3]
+    assert morsel4.value == "cookie2"
+
+
+def test_parse_cookie_header_whitespace_in_fallback() -> None:
+    """Test that fallback parser handles whitespace correctly."""
+    header = "a=1; b = 2 ; c= 3; d =4"
+
+    result = parse_cookie_header(header)
+
+    assert len(result) == 4
+    for name, morsel in result:
+        assert name in ("a", "b", "c", "d")
+        assert morsel.value in ("1", "2", "3", "4")
+
+
+def test_parse_cookie_header_empty_value_in_fallback() -> None:
+    """Test that fallback handles empty values correctly."""
+    header = "normal=value; empty=; another=test"
+
+    result = parse_cookie_header(header)
+
+    assert len(result) == 3
+
+    name1, morsel1 = result[0]
+    assert name1 == "normal"
+    assert morsel1.value == "value"
+
+    name2, morsel2 = result[1]
+    assert name2 == "empty"
+    assert morsel2.value == ""
+
+    name3, morsel3 = result[2]
+    assert name3 == "another"
+    assert morsel3.value == "test"
+
+
+def test_parse_cookie_header_invalid_name_in_fallback(
+    caplog: pytest.LogCaptureFixture,
+) -> None:
+    """Test that fallback parser rejects cookies with invalid names."""
+    header = 'normal=value; invalid,name={"x":"y"}; another=test'
+
+    result = parse_cookie_header(header)
+
+    assert len(result) == 2
+
+    name1, morsel1 = result[0]
+    assert name1 == "normal"
+    assert morsel1.value == "value"
+
+    name2, morsel2 = result[1]
+    assert name2 == "another"
+    assert morsel2.value == "test"
+
+    assert "Can not load cookie: Illegal cookie name 'invalid,name'" in caplog.text
+
+
+def test_parse_cookie_header_empty_key_in_fallback(
+    caplog: pytest.LogCaptureFixture,
+) -> None:
+    """Test that fallback parser logs warning for empty cookie names."""
+    header = 'normal=value; ={"malformed":"json"}; another=test'
+
+    result = parse_cookie_header(header)
+
+    assert len(result) == 2
+
+    name1, morsel1 = result[0]
+    assert name1 == "normal"
+    assert morsel1.value == "value"
+
+    name2, morsel2 = result[1]
+    assert name2 == "another"
+    assert morsel2.value == "test"
+
+    assert "Can not load cookie: Illegal cookie name ''" in caplog.text
+
+
 @pytest.mark.parametrize(
     ("input_str", "expected"),
     [
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index 15557085b4e..17e27e8f7ae 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -5,6 +5,7 @@
 import logging
 import pathlib
 import pickle
+import sys
 import unittest
 from http.cookies import BaseCookie, Morsel, SimpleCookie
 from operator import not_
@@ -199,6 +200,19 @@ def test_save_load(tmp_path, loop, cookies_to_send, cookies_to_receive) -> None:
     assert jar_test == cookies_to_receive
 
 
+def test_save_load_partitioned_cookies(tmp_path, loop) -> None:
+    file_path = pathlib.Path(str(tmp_path)) / "aiohttp.test2.cookie"
+    # export cookie jar
+    jar_save = CookieJar(loop=loop)
+    jar_save.update_cookies_from_headers(
+        ["session=cookie; Partitioned"], URL("https://example.com/")
+    )
+    jar_save.save(file_path=file_path)
+    jar_load = CookieJar(loop=loop)
+    jar_load.load(file_path=file_path)
+    assert jar_save._cookies == jar_load._cookies
+
+
 async def test_update_cookie_with_unicode_domain(loop) -> None:
     cookies = (
         "idna-domain-first=first; Domain=xn--9caa.com; Path=/;",
@@ -1094,7 +1108,11 @@ async def test_pickle_format(cookies_to_send) -> None:
         with file_path.open("wb") as f:
             pickle.dump(cookies, f, pickle.HIGHEST_PROTOCOL)
     """
-    pickled = b"\x80\x04\x95\xc8\x0b\x00\x00\x00\x00\x00\x00\x8c\x0bcollections\x94\x8c\x0bdefaultdict\x94\x93\x94\x8c\x0chttp.cookies\x94\x8c\x0cSimpleCookie\x94\x93\x94\x85\x94R\x94(\x8c\x00\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rshared-cookie\x94h\x03\x8c\x06Morsel\x94\x93\x94)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94\x8c\x01/\x94\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x08\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(\x8c\x03key\x94h\x0b\x8c\x05value\x94\x8c\x05first\x94\x8c\x0bcoded_value\x94h\x1cubs\x8c\x0bexample.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\rdomain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x1e\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah!h\x1b\x8c\x06second\x94h\x1dh-ub\x8c\x14dotted-domain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0bexample.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah.h\x1b\x8c\x05fifth\x94h\x1dh;ubu\x8c\x11test1.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain1-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h<\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah?h\x1b\x8c\x05third\x94h\x1dhKubs\x8c\x11test2.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain2-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94hL\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ahOh\x1b\x8c\x06fourth\x94h\x1dh[ubs\x8c\rdifferent.org\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x17different-domain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\\\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah_h\x1b\x8c\x05sixth\x94h\x1dhkubs\x8c\nsecure.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rsecure-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94hl\x8c\x07max-age\x94h\x08\x8c\x06secure\x94\x88\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ahoh\x1b\x8c\x07seventh\x94h\x1dh{ubs\x8c\x0cpathtest.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x0eno-path-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h|\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x7fh\x1b\x8c\x06eighth\x94h\x1dh\x8bub\x8c\x0cpath1-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0cpathtest.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x8ch\x1b\x8c\x05ninth\x94h\x1dh\x99ubu\x8c\x0cpathtest.com\x94\x8c\x04/one\x94\x86\x94h\x05)\x81\x94\x8c\x0cpath2-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x9b\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x9a\x8c\x07max-age\x
94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x9eh\x1b\x8c\x05tenth\x94h\x1dh\xaaubs\x8c\x0cpathtest.com\x94\x8c\x08/one/two\x94\x86\x94h\x05)\x81\x94(\x8c\x0cpath3-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\xac\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xab\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xafh\x1b\x8c\x08eleventh\x94h\x1dh\xbbub\x8c\x0cpath4-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94\x8c\t/one/two/\x94\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0cpathtest.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xbch\x1b\x8c\x07twelfth\x94h\x1dh\xcaubu\x8c\x0fexpirestest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0eexpires-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94\x8c\x1cTue, 1 Jan 2999 12:00:00 GMT\x94\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xcb\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xceh\x1b\x8c\nthirteenth\x94h\x1dh\xdbubs\x8c\x0emaxagetest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0emax-age-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xdc\x8c\x07max-age\x94\x8c\x0260\x94\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xdfh\x1b\x8c\nfourteenth\x94h\x1dh\xecubs\x8c\x12invalid-values.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x16invalid-max-age-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xed\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xf0h\x1b\x8c\tfifteenth\x94h\x1dh\xfcub\x8c\x16invalid-expires-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x12invalid-values.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xfdh\x1b\x8c\tsixteenth\x94h\x1dj\n\x01\x00\x00ubuu."
+    if sys.version_info < (3, 14):
+        pickled = b"\x80\x04\x95\xc8\x0b\x00\x00\x00\x00\x00\x00\x8c\x0bcollections\x94\x8c\x0bdefaultdict\x94\x93\x94\x8c\x0chttp.cookies\x94\x8c\x0cSimpleCookie\x94\x93\x94\x85\x94R\x94(\x8c\x00\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rshared-cookie\x94h\x03\x8c\x06Morsel\x94\x93\x94)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94\x8c\x01/\x94\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x08\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(\x8c\x03key\x94h\x0b\x8c\x05value\x94\x8c\x05first\x94\x8c\x0bcoded_value\x94h\x1cubs\x8c\x0bexample.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\rdomain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x1e\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah!h\x1b\x8c\x06second\x94h\x1dh-ub\x8c\x14dotted-domain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0bexample.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah.h\x1b\x8c\x05fifth\x94h\x1dh;ubu\x8c\x11test1.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain1-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h<\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah?h\x1b\x8c\x05third\x94h\x1dhKubs\x8c\x11test2.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain2-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94hL\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ahOh\x1b\x8c\x06fourth\x94h\x1dh[ubs\x8c\rdifferent.org\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x17different-domain-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\\\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah_h\x1b\x8c\x05sixth\x94h\x1dhkubs\x8c\nsecure.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rsecure-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94hl\x8c\x07max-age\x94h\x08\x8c\x06secure\x94\x88\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ahoh\x1b\x8c\x07seventh\x94h\x1dh{ubs\x8c\x0cpathtest.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x0eno-path-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h|\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x7fh\x1b\x8c\x06eighth\x94h\x1dh\x8bub\x8c\x0cpath1-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0cpathtest.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x8ch\x1b\x8c\x05ninth\x94h\x1dh\x99ubu\x8c\x0cpathtest.com\x94\x8c\x04/one\x94\x86\x94h\x05)\x81\x94\x8c\x0cpath2-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x9b\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x9a\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\x9eh\x1b\x8c\x05tenth\x94h\x1dh\xaaubs\x8c\x0cpathtest.com\x94\x8c\x08/one/two\x94\x86\x94h\x05)\x81\x94(\x8c\x0cpath3-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\xac\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xab\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xafh\x1b\x8c\x08eleventh\x94h\x1dh\xbbub\x8c\x0cpath4-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94\x8c\t/one/two/\x94\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x0cpathtest.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xbch\x1b\x8c\x07twelfth\x94h\x1dh\xcaubu\x8c\x0fexpirestest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0eexpires-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94\x8c\x1cTue, 1 Jan 2999 12:00:00 GMT\x94\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xcb\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xceh\x1b\x8c\nthirteenth\x94h\x1dh\xdbubs\x8c\x0emaxagetest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0emax-age-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xdc\x8c\x07max-age\x94\x8c\x0260\x94\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xdfh\x1b\x8c\nfourteenth\x94h\x1dh\xecubs\x8c\x12invalid-values.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x16invalid-max-age-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\xed\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xf0h\x1b\x8c\tfifteenth\x94h\x1dh\xfcub\x8c\x16invalid-expires-cookie\x94h\r)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94h\x11\x8c\x07comment\x94h\x08\x8c\x06domain\x94\x8c\x12invalid-values.com\x94\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08u}\x94(h\x1ah\xfdh\x1b\x8c\tsixteenth\x94h\x1dj\n\x01\x00\x00ubuu."
+    else:
+        pickled = b'\x80\x05\x95\x06\x08\x00\x00\x00\x00\x00\x00\x8c\x0bcollections\x94\x8c\x0bdefaultdict\x94\x93\x94\x8c\x0chttp.cookies\x94\x8c\x0cSimpleCookie\x94\x93\x94\x85\x94R\x94(\x8c\x00\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rshared-cookie\x94h\x03\x8c\x06Morsel\x94\x93\x94)\x81\x94(\x8c\x07expires\x94h\x08\x8c\x04path\x94\x8c\x01/\x94\x8c\x07comment\x94h\x08\x8c\x06domain\x94h\x08\x8c\x07max-age\x94h\x08\x8c\x06secure\x94h\x08\x8c\x08httponly\x94h\x08\x8c\x07version\x94h\x08\x8c\x08samesite\x94h\x08\x8c\x0bpartitioned\x94h\x08u}\x94(\x8c\x03key\x94h\x0b\x8c\x05value\x94\x8c\x05first\x94\x8c\x0bcoded_value\x94h\x1dubs\x8c\x0bexample.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\rdomain-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13h\x1fh\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh"h\x1c\x8c\x06second\x94h\x1eh%ub\x8c\x14dotted-domain-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13\x8c\x0bexample.com\x94h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh&h\x1c\x8c\x05fifth\x94h\x1eh*ubu\x8c\x11test1.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain1-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13h+h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh.h\x1c\x8c\x05third\x94h\x1eh1ubs\x8c\x11test2.example.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x11subdomain2-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13h2h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh5h\x1c\x8c\x06fourth\x94h\x1eh8ubs\x8c\rdifferent.org\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x17different-domain-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13h9h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh<h\x1c\x8c\x05sixth\x94h\x1eh?ubs\x8c\nsecure.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\rsecure-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13h@h\x14h\x08h\x15\x88h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhCh\x1c\x8c\x07seventh\x94h\x1ehFubs\x8c\x0cpathtest.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x0eno-path-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13hGh\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhJh\x1c\x8c\x06eighth\x94h\x1ehMub\x8c\x0cpath1-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13\x8c\x0cpathtest.com\x94h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhNh\x1c\x8c\x05ninth\x94h\x1ehRubu\x8c\x0cpathtest.com\x94\x8c\x04/one\x94\x86\x94h\x05)\x81\x94\x8c\x0cpath2-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10hTh\x12h\x08h\x13hSh\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhWh\x1c\x8c\x05tenth\x94h\x1ehZubs\x8c\x0cpathtest.com\x94\x8c\x08/one/two\x94\x86\x94h\x05)\x81\x94(\x8c\x0cpath3-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\\h\x12h\x08h\x13h[h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh_h\x1c\x8c\x08eleventh\x94h\x1ehbub\x8c\x0cpath4-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10\x8c\t/one/two/\x94h\x12h\x08h\x13\x8c\x0cpathtest.com\x94h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhch\x1c\x8c\x07twelfth\x94h\x1ehhubu\x8c\x0fexpirestest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0eexpires-cookie\x94h\r)\x81\x94(h\x0f\x8c\x1cTue, 1 Jan 2999 12:00:00 GMT\x94h\x10h\x11h\x12h\x08h\x13hih\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhlh\x1c\x8c\nthirteenth\x94h\x1ehpubs\x8c\x0emaxagetest.com\x94h\x08\x86\x94h\x05)\x81\x94\x8c\x0emax-age-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13hqh\x14\x8c\x0260\x94h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bhth\x1c\x8c\nfourteenth\x94h\x1ehxubs\x8c\x12invalid-values.com\x94h\x08\x86\x94h\x05)\x81\x94(\x8c\x16invalid-max-age-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13hyh\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh|h\x1c\x8c\tfifteenth\x94h\x1eh\x7fub\x8c\x16invalid-expires-cookie\x94h\r)\x81\x94(h\x0fh\x08h\x10h\x11h\x12h\x08h\x13\x8c\x12invalid-values.com\x94h\x14h\x08h\x15h\x08h\x16h\x08h\x17h\x08h\x18h\x08h\x19h\x08u}\x94(h\x1bh\x80h\x1c\x8c\tsixteenth\x94h\x1eh\x84ubuu.'
+
     cookies = pickle.loads(pickled)
 
     cj = CookieJar()
diff --git tests/test_helpers.py tests/test_helpers.py
index a343cbdfedf..f4f28710123 100644
--- tests/test_helpers.py
+++ tests/test_helpers.py
@@ -6,11 +6,12 @@
 import weakref
 from math import ceil, modf
 from pathlib import Path
+from types import MappingProxyType
 from unittest import mock
 from urllib.request import getproxies_environment
 
 import pytest
-from multidict import MultiDict
+from multidict import MultiDict, MultiDictProxy
 from yarl import URL
 
 from aiohttp import helpers
@@ -65,6 +66,30 @@ def test_parse_mimetype(mimetype, expected) -> None:
     assert result == expected
 
 
+# ------------------- parse_content_type ------------------------------
+
+
+@pytest.mark.parametrize(
+    "content_type, expected",
+    [
+        (
+            "text/plain",
+            ("text/plain", MultiDictProxy(MultiDict())),
+        ),
+        (
+            "wrong",
+            ("application/octet-stream", MultiDictProxy(MultiDict())),
+        ),
+    ],
+)
+def test_parse_content_type(
+    content_type: str, expected: tuple[str, MappingProxyType[str, str]]
+) -> None:
+    result = helpers.parse_content_type(content_type)
+
+    assert result == expected
+
+
 # ------------------- guess_filename ----------------------------------
 
 
diff --git tests/test_http_parser.py tests/test_http_parser.py
index 385452c1cfb..7717e56f45e 100644
--- tests/test_http_parser.py
+++ tests/test_http_parser.py
@@ -2,6 +2,7 @@
 
 import asyncio
 import re
+import sys
 from contextlib import nullcontext
 from typing import Any, Dict, List
 from unittest import mock
@@ -33,6 +34,13 @@
 except ImportError:
     brotli = None
 
+try:
+    if sys.version_info >= (3, 14):
+        import compression.zstd as zstandard  # noqa: I900
+    else:
+        import backports.zstd as zstandard
+except ImportError:
+    zstandard = None  # type: ignore[assignment]
 
 REQUEST_PARSERS = [HttpRequestParserPy]
 RESPONSE_PARSERS = [HttpResponseParserPy]
@@ -558,7 +566,15 @@ def test_compression_brotli(parser) -> None:
     assert msg.compression == "br"
 
 
-def test_compression_unknown(parser) -> None:
+@pytest.mark.skipif(zstandard is None, reason="zstandard is not installed")
+def test_compression_zstd(parser: HttpRequestParser) -> None:
+    text = b"GET /test HTTP/1.1\r\ncontent-encoding: zstd\r\n\r\n"
+    messages, upgrade, tail = parser.feed_data(text)
+    msg = messages[0][0]
+    assert msg.compression == "zstd"
+
+
+def test_compression_unknown(parser: HttpRequestParser) -> None:
     text = b"GET /test HTTP/1.1\r\ncontent-encoding: compress\r\n\r\n"
     messages, upgrade, tail = parser.feed_data(text)
     msg = messages[0][0]
@@ -1768,10 +1784,24 @@ async def test_http_payload_brotli(self, protocol: BaseProtocol) -> None:
         assert b"brotli data" == out._buffer[0]
         assert out.is_eof()
 
+    @pytest.mark.skipif(zstandard is None, reason="zstandard is not installed")
+    async def test_http_payload_zstandard(self, protocol: BaseProtocol) -> None:
+        compressed = zstandard.compress(b"zstd data")
+        out = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+        p = HttpPayloadParser(
+            out,
+            length=len(compressed),
+            compression="zstd",
+            headers_parser=HeadersParser(),
+        )
+        p.feed_data(compressed)
+        assert b"zstd data" == out._buffer[0]
+        assert out.is_eof()
+
 
 class TestDeflateBuffer:
-    async def test_feed_data(self, stream) -> None:
-        buf = aiohttp.StreamReader(stream, 2**16, loop=asyncio.get_event_loop())
+    async def test_feed_data(self, protocol: BaseProtocol) -> None:
+        buf = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_event_loop())
         dbuf = DeflateBuffer(buf, "deflate")
 
         dbuf.decompressor = mock.Mock()
@@ -1781,8 +1811,8 @@ async def test_feed_data(self, stream) -> None:
         dbuf.feed_data(b"xxxx", 4)
         assert [b"line"] == list(buf._buffer)
 
-    async def test_feed_data_err(self, stream) -> None:
-        buf = aiohttp.StreamReader(stream, 2**16, loop=asyncio.get_event_loop())
+    async def test_feed_data_err(self, protocol: BaseProtocol) -> None:
+        buf = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_event_loop())
         dbuf = DeflateBuffer(buf, "deflate")
 
         exc = ValueError()
@@ -1838,6 +1868,18 @@ async def test_feed_eof_no_err_brotli(self, protocol: BaseProtocol) -> None:
         dbuf.feed_eof()
         assert [b"line"] == list(buf._buffer)
 
+    @pytest.mark.skipif(zstandard is None, reason="zstandard is not installed")
+    async def test_feed_eof_no_err_zstandard(self, protocol: BaseProtocol) -> None:
+        buf = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
+        dbuf = DeflateBuffer(buf, "zstd")
+
+        dbuf.decompressor = mock.Mock()
+        dbuf.decompressor.flush.return_value = b"line"
+        dbuf.decompressor.eof = False
+
+        dbuf.feed_eof()
+        assert [b"line"] == list(buf._buffer)
+
     async def test_empty_body(self, protocol: BaseProtocol) -> None:
         buf = aiohttp.StreamReader(protocol, 2**16, loop=asyncio.get_running_loop())
         dbuf = DeflateBuffer(buf, "deflate")
diff --git tests/test_loop.py tests/test_loop.py
index 944f17e69f0..a3520b457e4 100644
--- tests/test_loop.py
+++ tests/test_loop.py
@@ -33,9 +33,6 @@ async def on_startup_hook(self, app):
     async def test_on_startup_hook(self) -> None:
         self.assertTrue(self.on_startup_called)
 
-    def test_default_loop(self) -> None:
-        self.assertIs(self.loop, asyncio.get_event_loop_policy().get_event_loop())
-
 
 def test_default_loop(loop: asyncio.AbstractEventLoop) -> None:
     assert asyncio.get_event_loop() is loop
diff --git tests/test_run_app.py tests/test_run_app.py
index e269b452f86..c4c4d1784d9 100644
--- tests/test_run_app.py
+++ tests/test_run_app.py
@@ -621,14 +621,13 @@ def test_run_app_preexisting_inet6_socket(patched_loop) -> None:
 
 
 @pytest.mark.skipif(not hasattr(socket, "AF_UNIX"), reason="requires UNIX sockets")
-def test_run_app_preexisting_unix_socket(patched_loop, mocker) -> None:
+def test_run_app_preexisting_unix_socket(patched_loop, unix_sockname, mocker) -> None:
     app = web.Application()
 
-    sock_path = "/tmp/test_preexisting_sock1"
     sock = socket.socket(socket.AF_UNIX)
     with contextlib.closing(sock):
-        sock.bind(sock_path)
-        os.unlink(sock_path)
+        sock.bind(unix_sockname)
+        os.unlink(unix_sockname)
 
         printer = mock.Mock(wraps=stopper(patched_loop))
         web.run_app(app, sock=sock, print=printer, loop=patched_loop)
@@ -636,7 +635,7 @@ def test_run_app_preexisting_unix_socket(patched_loop, mocker) -> None:
         patched_loop.create_server.assert_called_with(
             mock.ANY, sock=sock, backlog=128, ssl=None
         )
-        assert f"http://unix:{sock_path}:" in printer.call_args[0][0]
+        assert f"http://unix:{unix_sockname}:" in printer.call_args[0][0]
 
 
 def test_run_app_multiple_preexisting_sockets(patched_loop) -> None:
@@ -894,22 +893,29 @@ async def on_startup(app):
     exc_handler.assert_called_with(patched_loop, msg)
 
 
-def test_run_app_keepalive_timeout(patched_loop, mocker, monkeypatch):
-    new_timeout = 1234
+@pytest.mark.parametrize(
+    "param",
+    (
+        "keepalive_timeout",
+        "max_line_size",
+        "max_headers",
+        "max_field_size",
+        "lingering_time",
+        "read_bufsize",
+        "auto_decompress",
+    ),
+)
+def test_run_app_pass_apprunner_kwargs(param, patched_loop, monkeypatch):
+    m = mock.Mock()
     base_runner_init_orig = BaseRunner.__init__
 
     def base_runner_init_spy(self, *args, **kwargs):
-        assert kwargs["keepalive_timeout"] == new_timeout
+        assert kwargs[param] is m
         base_runner_init_orig(self, *args, **kwargs)
 
     app = web.Application()
     monkeypatch.setattr(BaseRunner, "__init__", base_runner_init_spy)
-    web.run_app(
-        app,
-        keepalive_timeout=new_timeout,
-        print=stopper(patched_loop),
-        loop=patched_loop,
-    )
+    web.run_app(app, print=stopper(patched_loop), loop=patched_loop, **{param: m})
 
 
 def test_run_app_context_vars(patched_loop):
diff --git tests/test_web_functional.py tests/test_web_functional.py
index c33b3cec1ff..e0f123def0d 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2100,6 +2100,10 @@ async def resolve(self, host, port=0, family=socket.AF_INET):
     await client.close()
 
 
+@pytest.mark.skipif(
+    hasattr(sys, "_is_gil_enabled") and not sys._is_gil_enabled(),
+    reason="Fails to capture the warning",
+)
 async def test_return_http_exception_deprecated(aiohttp_client) -> None:
     async def handler(request):
         return web.HTTPForbidden()
diff --git tests/test_web_response.py tests/test_web_response.py
index c07bf671d8c..0525c1584f2 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -1164,10 +1164,10 @@ def test_ctor_content_type_with_extra() -> None:
     assert resp.headers["content-type"] == "text/plain; version=0.0.4; charset=utf-8"
 
 
-def test_invalid_content_type_parses_to_text_plain() -> None:
+def test_invalid_content_type_parses_to_application_octect_stream() -> None:
     resp = Response(text="test test", content_type="jpeg")
 
-    assert resp.content_type == "text/plain"
+    assert resp.content_type == "application/octet-stream"
     assert resp.headers["content-type"] == "jpeg; charset=utf-8"
 
 
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index ee60b6917c5..11ec47c1730 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -986,6 +986,28 @@ async def get(self) -> web.Response:
     await r.release()
 
 
+async def test_subapp_domain_routing_same_path(aiohttp_client: AiohttpClient) -> None:
+    """Regression test for #11665."""
+    app = web.Application()
+    sub_app = web.Application()
+
+    async def mainapp_handler(request: web.Request) -> web.Response:
+        assert False
+
+    async def subapp_handler(request: web.Request) -> web.Response:
+        return web.Response(text="SUBAPP")
+
+    app.router.add_get("/", mainapp_handler)
+    sub_app.router.add_get("/", subapp_handler)
+    app.add_domain("different.example.com", sub_app)
+
+    client = await aiohttp_client(app)
+    async with client.get("/", headers={"Host": "different.example.com"}) as r:
+        assert r.status == 200
+        result = await r.text()
+        assert result == "SUBAPP"
+
+
 async def test_route_with_regex(aiohttp_client: AiohttpClient) -> None:
     """Test a route with a regex preceded by a fixed string."""
     app = web.Application()
diff --git tests/test_websocket_writer.py tests/test_websocket_writer.py
index f5125dde361..6ec5aecb2a6 100644
--- tests/test_websocket_writer.py
+++ tests/test_websocket_writer.py
@@ -1,5 +1,7 @@
 import asyncio
 import random
+from concurrent.futures import ThreadPoolExecutor
+from contextlib import suppress
 from typing import Any, Callable
 from unittest import mock
 
@@ -7,6 +9,8 @@
 
 from aiohttp import WSMsgType
 from aiohttp._websocket.reader import WebSocketDataQueue
+from aiohttp.base_protocol import BaseProtocol
+from aiohttp.compression_utils import ZLibBackend
 from aiohttp.http import WebSocketReader, WebSocketWriter
 
 
@@ -25,7 +29,7 @@ def transport():
 
 
 @pytest.fixture
-def writer(protocol, transport):
+async def writer(loop, protocol, transport):
     return WebSocketWriter(protocol, transport, use_mask=False)
 
 
@@ -83,20 +87,44 @@ async def test_send_text_masked(protocol, transport) -> None:
     writer.transport.write.assert_called_with(b"\x81\x84\rg\xb3fy\x02\xcb\x12")  # type: ignore[attr-defined]
 
 
+@pytest.mark.usefixtures("parametrize_zlib_backend")
 async def test_send_compress_text(protocol, transport) -> None:
+    compress_obj = ZLibBackend.compressobj(level=ZLibBackend.Z_BEST_SPEED, wbits=-15)
     writer = WebSocketWriter(protocol, transport, compress=15)
+
+    msg = (
+        compress_obj.compress(b"text") + compress_obj.flush(ZLibBackend.Z_SYNC_FLUSH)
+    ).removesuffix(b"\x00\x00\xff\xff")
     await writer.send_frame(b"text", WSMsgType.TEXT)
-    writer.transport.write.assert_called_with(b"\xc1\x06*I\xad(\x01\x00")  # type: ignore[attr-defined]
+    writer.transport.write.assert_called_with(  # type: ignore[attr-defined]
+        b"\xc1" + len(msg).to_bytes(1, "big") + msg
+    )
+
+    msg = (
+        compress_obj.compress(b"text") + compress_obj.flush(ZLibBackend.Z_SYNC_FLUSH)
+    ).removesuffix(b"\x00\x00\xff\xff")
     await writer.send_frame(b"text", WSMsgType.TEXT)
-    writer.transport.write.assert_called_with(b"\xc1\x05*\x01b\x00\x00")  # type: ignore[attr-defined]
+    writer.transport.write.assert_called_with(  # type: ignore[attr-defined]
+        b"\xc1" + len(msg).to_bytes(1, "big") + msg
+    )
 
 
+@pytest.mark.usefixtures("parametrize_zlib_backend")
 async def test_send_compress_text_notakeover(protocol, transport) -> None:
+    compress_obj = ZLibBackend.compressobj(level=ZLibBackend.Z_BEST_SPEED, wbits=-15)
     writer = WebSocketWriter(protocol, transport, compress=15, notakeover=True)
+
+    msg = (
+        compress_obj.compress(b"text") + compress_obj.flush(ZLibBackend.Z_FULL_FLUSH)
+    ).removesuffix(b"\x00\x00\xff\xff")
     await writer.send_frame(b"text", WSMsgType.TEXT)
-    writer.transport.write.assert_called_with(b"\xc1\x06*I\xad(\x01\x00")  # type: ignore[attr-defined]
+    writer.transport.write.assert_called_with(  # type: ignore[attr-defined]
+        b"\xc1" + len(msg).to_bytes(1, "big") + msg
+    )
     await writer.send_frame(b"text", WSMsgType.TEXT)
-    writer.transport.write.assert_called_with(b"\xc1\x06*I\xad(\x01\x00")  # type: ignore[attr-defined]
+    writer.transport.write.assert_called_with(  # type: ignore[attr-defined]
+        b"\xc1" + len(msg).to_bytes(1, "big") + msg
+    )
 
 
 async def test_send_compress_text_per_message(protocol, transport) -> None:
@@ -109,6 +137,130 @@ async def test_send_compress_text_per_message(protocol, transport) -> None:
     writer.transport.write.assert_called_with(b"\xc1\x06*I\xad(\x01\x00")  # type: ignore[attr-defined]
 
 
+@pytest.mark.usefixtures("parametrize_zlib_backend")
+async def test_send_compress_cancelled(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    slow_executor: ThreadPoolExecutor,
+    monkeypatch: pytest.MonkeyPatch,
+) -> None:
+    """Test that cancelled compression doesn't corrupt subsequent sends.
+
+    Regression test for https://github.com/aio-libs/aiohttp/issues/11725
+    """
+    monkeypatch.setattr("aiohttp._websocket.writer.WEBSOCKET_MAX_SYNC_CHUNK_SIZE", 1024)
+    writer = WebSocketWriter(protocol, transport, compress=15)
+    loop = asyncio.get_running_loop()
+    queue = WebSocketDataQueue(mock.Mock(_reading_paused=False), 2**16, loop=loop)
+    reader = WebSocketReader(queue, 50000)
+
+    # Replace executor with slow one to make race condition reproducible
+    writer._compressobj = writer._get_compressor(None)
+    writer._compressobj._executor = slow_executor
+
+    # Create large data that will trigger executor-based compression
+    large_data_1 = b"A" * 10000
+    large_data_2 = b"B" * 10000
+
+    # Start first send and cancel it during compression
+    async def send_and_cancel() -> None:
+        await writer.send_frame(large_data_1, WSMsgType.BINARY)
+
+    task = asyncio.create_task(send_and_cancel())
+    # Give it a moment to start compression
+    await asyncio.sleep(0.01)
+    task.cancel()
+
+    # Await task cancellation (expected and intentionally ignored)
+    with suppress(asyncio.CancelledError):
+        await task
+
+    # Send second message - this should NOT be corrupted
+    await writer.send_frame(large_data_2, WSMsgType.BINARY)
+
+    # Verify the second send produced correct data
+    last_call = writer.transport.write.call_args_list[-1]  # type: ignore[attr-defined]
+    call_bytes = last_call[0][0]
+    result, _ = reader.feed_data(call_bytes)
+    assert result is False
+    msg = await queue.read()
+    assert msg.type is WSMsgType.BINARY
+    # The data should be all B's, not mixed with A's from the cancelled send
+    assert msg.data == large_data_2
+
+
+@pytest.mark.usefixtures("parametrize_zlib_backend")
+async def test_send_compress_multiple_cancelled(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    slow_executor: ThreadPoolExecutor,
+    monkeypatch: pytest.MonkeyPatch,
+) -> None:
+    """Test that multiple compressed sends all complete despite cancellation.
+
+    Regression test for https://github.com/aio-libs/aiohttp/issues/11725
+    This verifies that once a send operation enters the shield, it completes
+    even if cancelled. With the lock inside the shield, all tasks that enter
+    the shield will complete their sends, even while waiting for the lock.
+    """
+    monkeypatch.setattr("aiohttp._websocket.writer.WEBSOCKET_MAX_SYNC_CHUNK_SIZE", 1024)
+    writer = WebSocketWriter(protocol, transport, compress=15)
+    loop = asyncio.get_running_loop()
+    queue = WebSocketDataQueue(mock.Mock(_reading_paused=False), 2**16, loop=loop)
+    reader = WebSocketReader(queue, 50000)
+
+    # Replace executor with slow one
+    writer._compressobj = writer._get_compressor(None)
+    writer._compressobj._executor = slow_executor
+
+    # Create 5 large messages with different content
+    messages = [bytes([ord("A") + i]) * 10000 for i in range(5)]
+
+    # Start sending all 5 messages - they'll queue due to the lock
+    tasks = [
+        asyncio.create_task(writer.send_frame(msg, WSMsgType.BINARY))
+        for msg in messages
+    ]
+
+    # Cancel all tasks during execution
+    # With lock inside shield, all tasks that enter the shield will complete
+    # even while waiting for the lock
+    await asyncio.sleep(0.1)  # Let tasks enter the shield
+    for task in tasks:
+        task.cancel()
+
+    # Collect results
+    cancelled_count = 0
+    for task in tasks:
+        try:
+            await task
+        except asyncio.CancelledError:
+            cancelled_count += 1
+
+    # Wait for all background tasks to complete
+    # (they continue running even after cancellation due to shield)
+    await asyncio.gather(*writer._background_tasks, return_exceptions=True)
+
+    # All tasks that entered the shield should complete, even if cancelled
+    # With lock inside shield, all tasks enter shield immediately then wait for lock
+    sent_count = len(writer.transport.write.call_args_list)  # type: ignore[attr-defined]
+    assert (
+        sent_count == 5
+    ), "All 5 sends should complete due to shield protecting lock acquisition"
+
+    # Verify all sent messages are correct (no corruption)
+    for i in range(sent_count):
+        call = writer.transport.write.call_args_list[i]  # type: ignore[attr-defined]
+        call_bytes = call[0][0]
+        result, _ = reader.feed_data(call_bytes)
+        assert result is False
+        msg = await queue.read()
+        assert msg.type is WSMsgType.BINARY
+        # Verify the data matches the expected message
+        expected_byte = bytes([ord("A") + i])
+        assert msg.data == expected_byte * 10000, f"Message {i} corrupted"
+
+
 @pytest.mark.parametrize(
     ("max_sync_chunk_size", "payload_point_generator"),
     (
@@ -171,3 +323,6 @@ async def test_concurrent_messages(
         # we want to validate that all the bytes are
         # the same value
         assert bytes_data == bytes_data[0:1] * char_val
+
+    # Wait for any background tasks to complete
+    await asyncio.gather(*writer._background_tasks, return_exceptions=True)

Description

This PR updates aiohttp from 3.12.15 to 3.13.2, a minor-version bump that nevertheless brings significant new features and bug fixes. Key changes include:

  • Python 3.14 and free-threading support: Added support for Python 3.14 including experimental free-threaded builds
  • Zstandard compression: Added support for zstd content encoding, using the stdlib compression.zstd module on Python 3.14+ and the backports.zstd package on earlier versions
  • WebSocket improvements: Fixed critical cancellation safety issues in compressed WebSocket sends
  • Cookie parser robustness: Improved handling of malformed cookies and added fallback parsing
  • Netrc authentication: Fixed netrc credential loading from default locations
  • Dependency updates: Bumped numerous GitHub Actions and Python dependencies to latest versions
  • Build system modernization: Migrated core package metadata to pyproject.toml following PEP 621

The PR also includes extensive CI/CD workflow updates, new test fixtures, and documentation improvements.

Possible Issues

  1. Breaking changes in compression: The switch from zstandard to backports.zstd for Python <3.14 requires users to update their dependencies. While the PR includes migration notes, this could break existing installations that installed zstandard manually.

  2. WebSocket compression refactoring: The cancellation-safe WebSocket compression introduces significant complexity with locks and shields. While the tests are comprehensive, real-world race conditions might still occur under extreme load scenarios.

  3. Cookie parser fallback logic: The new fallback parser for malformed cookies adds multiple code paths that could have edge cases, particularly with unusual cookie formats that might not be covered by tests.

  4. Metadata migration risk: Moving from setup.cfg to pyproject.toml could cause issues for downstreams with older build tools, though the PR maintains backward compatibility with setuptools.

  5. Test dependency changes: Making blockbuster optional means CI runs without it installed will no longer detect blocking I/O in the event loop, potentially masking regressions the fixture previously caught.

Security Hotspots

  1. Cookie parser DoS potential (Low-Medium Risk): The new fallback cookie parser in _cookie_helpers.py has multiple loops and string operations that could be exploited with specially crafted cookie headers. The test for large values (8KB) provides some protection, but extremely large or deeply nested malformed cookies could still cause performance degradation.

  2. Netrc file handling (Low Risk): The netrc authentication feature now reads from ~/.netrc by default when trust_env=True. The lookup correctly runs via run_in_executor so it no longer blocks the event loop, but the implementation does not validate file permissions, so an overly permissive or attacker-writable netrc file could leak credentials.
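
One possible hardening (hypothetical, not part of this PR) would be to refuse group- or world-accessible netrc files on POSIX before reading them:

import os
import stat

def netrc_path_is_private(path: str) -> bool:
    """Reject netrc files readable or writable by group/others (POSIX only)."""
    if os.name != "posix":
        return True  # POSIX permission bits aren't meaningful elsewhere
    mode = os.stat(path).st_mode
    return not (mode & (stat.S_IRWXG | stat.S_IRWXO))

# Usage sketch: skip netrc auth entirely if the file is too permissive.
path = os.environ.get("NETRC") or os.path.expanduser("~/.netrc")
if os.path.exists(path) and not netrc_path_is_private(path):
    raise PermissionError(f"{path} must not be accessible to group/others")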

Changes

Core Package Structure

  • aiohttp/__init__.py: Version bump to 3.13.2
  • setup.cfg: Removed most metadata, keeping only dependency info
  • pyproject.toml: Added comprehensive package metadata following PEP 621

Compression Support

  • aiohttp/compression_utils.py:
    • Added ZSTDDecompressor class for zstd support
    • Refactored ZLibCompressor to remove internal lock (cancellation safety now handled by callers)
    • Added warning comments about cancellation safety
  • aiohttp/_http_parser.pyx: Added 'zstd' to recognized encodings
  • aiohttp/http_parser.py: Added zstd decompression support in DeflateBuffer
  • aiohttp/client_reqrep.py: Added zstd to default Accept-Encoding header
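
To make the conditional dependency concrete, here is a minimal sketch mirroring the import pattern used in the tests in the diff above; it assumes backports.zstd is installed on Python <3.14 (e.g. via aiohttp[speedups]):

import sys

# Python 3.14+ ships compression.zstd in the stdlib (PEP 784);
# older interpreters fall back to the backports.zstd package.
try:
    if sys.version_info >= (3, 14):
        import compression.zstd as zstd
    else:
        import backports.zstd as zstd
except ImportError:
    zstd = None  # no zstd support available

if zstd is not None:
    blob = zstd.compress(b"zstd data")
    assert zstd.decompress(blob) == b"zstd data"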

WebSocket Writer

  • aiohttp/_websocket/writer.py:
    • Added _send_lock to prevent compressor state corruption
    • Implemented _send_compressed_frame_sync() for small payloads
    • Implemented _send_compressed_frame_async_locked() with shield protection
    • Added _background_tasks set to track shielded operations
    • Split compression logic into separate methods based on payload size
    • Increased WEBSOCKET_MAX_SYNC_CHUNK_SIZE from 5KB to 16KB
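
The core idea can be shown in isolation. The following is a simplified sketch of the shield-plus-lock pattern, not aiohttp's actual writer code; the class and method names are hypothetical:

import asyncio

class CancellationSafeSender:
    """Sketch: serialize sends with a lock and shield the critical section
    so a cancelled caller cannot leave shared compressor state corrupted."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._background_tasks: set[asyncio.Task[None]] = set()

    async def _send_locked(self, data: bytes) -> None:
        async with self._lock:  # one compressed send at a time
            await asyncio.sleep(0)  # stand-in for compress + transport.write

    async def send(self, data: bytes) -> None:
        # Wrap the locked send in a task and shield it: if the caller is
        # cancelled, the inner task keeps running to completion.
        task = asyncio.create_task(self._send_locked(data))
        self._background_tasks.add(task)
        task.add_done_callback(self._background_tasks.discard)
        await asyncio.shield(task)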

Cookie Parsing

  • aiohttp/_cookie_helpers.py:
    • Added fallback parser for malformed cookies in parse_cookie_header()
    • Fixed handling of partitioned cookies for Python <3.14
    • Improved validation and error handling
  • tests/test_cookie_helpers.py: Extensive new tests for edge cases
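
For illustration only (this is not the _cookie_helpers implementation): the strict http.cookies parser can stop at a malformed pair such as Google's g_state cookie, while a naive pair-splitting fallback recovers the cookies that follow it:

from http.cookies import SimpleCookie

header = 'good=1; g_state={"i_l":0}; after=2'  # unescaped quotes in a value

strict = SimpleCookie()
strict.load(header)
# Depending on the Python version, strict parsing may stop at the
# malformed g_state pair and silently drop the cookies after it.

# Naive fallback: split pairs manually and keep whatever has a name.
fallback = {}
for pair in header.split(";"):
    name, _, value = pair.strip().partition("=")
    if name:
        fallback[name] = value

assert fallback["after"] == "2"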

Authentication

  • aiohttp/client.py:
    • Added _get_netrc_auth() method for executor-based netrc lookup
    • Integrated netrc auth into request flow when trust_env=True
  • aiohttp/client_reqrep.py: Removed blocking netrc lookup from update_auth()
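
A minimal sketch of the executor-based lookup, assuming a bare host string; lookup_netrc_auth is a hypothetical helper, not aiohttp's private API:

import asyncio
import netrc
import os
from typing import Optional, Tuple

def lookup_netrc_auth(host: str) -> Optional[Tuple[str, str]]:
    """Blocking: read $NETRC (or ~/.netrc) and return (login, password)."""
    path = os.environ.get("NETRC")  # stdlib netrc defaults to ~/.netrc if None
    try:
        info = netrc.netrc(path).authenticators(host)
    except (FileNotFoundError, netrc.NetrcParseError):
        return None
    if info is None:
        return None
    login, _account, password = info
    return login, password or ""

async def get_auth(host: str) -> Optional[Tuple[str, str]]:
    loop = asyncio.get_running_loop()
    # File I/O runs in the default executor so the event loop never blocks.
    return await loop.run_in_executor(None, lookup_netrc_auth, host)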

HTTP Response Handling

  • aiohttp/helpers.py:
    • Added EnsureOctetStream class to handle invalid Content-Type headers
    • Changed default from text/plain to application/octet-stream per RFC 9110
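
A simplified sketch of the fallback rule (the token regex below is an assumption based on RFC 9110, not helpers.py verbatim):

import re

# token per RFC 9110: one or more tchar
_TOKEN = r"[!#$%&'*+\-.^_`|~0-9A-Za-z]+"
_MIME_RE = re.compile(rf"^{_TOKEN}/{_TOKEN}$")

def sniff_content_type(header: str) -> str:
    """Return the media type, or application/octet-stream for invalid syntax."""
    mimetype = header.split(";", 1)[0].strip()
    if _MIME_RE.match(mimetype):
        return mimetype.lower()
    return "application/octet-stream"  # RFC 9110 §8.3: treat as opaque bytes

assert sniff_content_type("text/plain; charset=utf-8") == "text/plain"
assert sniff_content_type("wrong") == "application/octet-stream"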

Testing Infrastructure

  • tests/conftest.py:
    • Made blockbuster fixture conditional on installation
    • Added netrc_* fixtures for authentication testing
    • Added slow_executor fixture for concurrency testing
    • Made zlib backend parametrization conditional on module availability
  • requirements/test-ft.in: New file for free-threading test dependencies
  • Multiple new test files and fixtures for netrc, compression, and WebSocket scenarios
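
The conditional-fixture pattern might look like this in a conftest.py; treat the blockbuster_ctx call as an assumption about that library's API:

import pytest

try:
    import blockbuster
except ImportError:  # blockbuster is an optional test dependency
    blockbuster = None

@pytest.fixture
def block_detector():
    """Enable blocking-call detection only when blockbuster is installed."""
    if blockbuster is None:
        yield None  # degrade to a no-op instead of failing collection
        return
    with blockbuster.blockbuster_ctx() as bb:  # assumed API
        yield bb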

CI/CD Workflows

  • .github/workflows/ci-cd.yml:
    • Updated all action versions (checkout v4→v5, setup-python v5→v6, etc.)
    • Added Python 3.14 and 3.14t (free-threading) to test matrix
    • Added conditional dependency groups for free-threading
    • Added riscv64 to build matrix
    • Updated cibuildwheel from v2 to v3
The diagram below visualizes the new netrc authentication flow and the cancellation-safe WebSocket send path:

sequenceDiagram
    participant Client
    participant Session
    participant Executor
    participant NetrcFile
    participant Request
    
    Client->>Session: GET with trust_env=True
    Session->>Session: Check for explicit auth
    alt No explicit auth
        Session->>Executor: run_in_executor(_get_netrc_auth)
        Executor->>NetrcFile: Read from NETRC or ~/.netrc
        NetrcFile-->>Executor: Return credentials
        Executor-->>Session: BasicAuth or None
    end
    Session->>Request: Create with auth
    Request->>Client: Execute request with Authorization header
    
    participant WSWriter as WebSocket Writer
    participant Compressor
    participant Shield
    participant Transport
    
    Note over WSWriter: Send compressed frame
    WSWriter->>WSWriter: Check payload size
    
    alt Small payload (< 16KB)
        WSWriter->>WSWriter: Acquire send_lock
        WSWriter->>Compressor: compress_sync()
        WSWriter->>Transport: write_websocket_frame()
        WSWriter->>WSWriter: Release send_lock
    else Large payload
        WSWriter->>WSWriter: Create shielded task
        WSWriter->>Shield: shield(compress_async)
        Shield->>WSWriter: Acquire send_lock
        Shield->>Compressor: compress() in executor
        Shield->>Transport: write_websocket_frame()
        Shield->>WSWriter: Release send_lock
    end

@mihaiplesa mihaiplesa merged commit f969cb7 into main Nov 15, 2025
5 checks passed
@mihaiplesa mihaiplesa deleted the renovate/aiohttp-3-x branch November 15, 2025 08:38