
chore(deps): update dependency aiohttp to v3.11.18 #40


Open · wants to merge 1 commit into main

Conversation

renovate[bot] (Contributor) commented on Apr 15, 2025

This PR contains the following updates:

| Package | Change |
| --- | --- |
| aiohttp | `==3.11.9` -> `==3.11.18` |

Release Notes

aio-libs/aiohttp (aiohttp)

v3.11.18

Compare Source


Bug fixes

  • Disabled TLS in TLS warning (when using HTTPS proxies) for uvloop and newer Python versions -- by :user:lezgomatt.

    Related issues and pull requests on GitHub:
    :issue:7686.
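
    For context, this affects clients that tunnel TLS through an HTTPS proxy. A minimal sketch of the affected path (the proxy URL is hypothetical):

    ```python
    import asyncio

    import aiohttp


    async def main() -> None:
        async with aiohttp.ClientSession() as session:
            # Tunnelling TLS through an HTTPS proxy previously triggered a
            # spurious "TLS in TLS" warning on uvloop and newer CPython.
            async with session.get(
                "https://example.org",
                proxy="https://proxy.example.com:8443",  # hypothetical proxy
            ) as resp:
                print(resp.status)


    asyncio.run(main())
    ```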

  • Fixed reading fragmented WebSocket messages when the payload was masked -- by :user:bdraco.

    The problem first appeared in 3.11.17

    Related issues and pull requests on GitHub:
    :issue:10764.


v3.11.17

Compare Source


Miscellaneous internal changes

  • Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10713.

  • Improved web server performance when connection can be reused -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10714.

  • Improved performance of the WebSocket reader -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10740.

  • Improved performance of the WebSocket reader with large messages -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10744.


v3.11.16

Compare Source


Bug fixes

  • Replaced deprecated asyncio.iscoroutinefunction with its counterpart from inspect
    -- by :user:layday.

    Related issues and pull requests on GitHub:
    :issue:10634.
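
    The swap is mechanical; a sketch of the equivalence:

    ```python
    import asyncio
    import inspect


    async def fixture() -> None: ...


    # inspect.iscoroutinefunction answers the same question as the
    # asyncio variant (now deprecated) for plain coroutine functions.
    assert inspect.iscoroutinefunction(fixture)
    assert asyncio.iscoroutinefunction(fixture)  # deprecated counterpart
    ```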

  • Fixed :class:multidict.CIMultiDict being mutated when passed to :class:aiohttp.web.Response -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10672.
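
    A sketch of the behaviour the fix guarantees (illustrative; before the fix, constructing the response could add headers such as Content-Type to the caller's mapping):

    ```python
    from multidict import CIMultiDict

    from aiohttp import web

    headers = CIMultiDict({"X-Custom": "1"})
    resp = web.Response(text="hello", headers=headers)

    # The caller's mapping is no longer mutated by Response construction.
    assert list(headers.keys()) == ["X-Custom"]
    ```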


v3.11.15

Compare Source


Bug fixes

  • Reverted explicitly closing sockets if an exception is raised during create_connection -- by :user:bdraco.

    This change originally appeared in aiohttp 3.11.13

    Related issues and pull requests on GitHub:
    :issue:10464, :issue:10617, :issue:10656.

Miscellaneous internal changes

  • Improved performance of WebSocket buffer handling -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10601.

  • Improved performance of serializing headers -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10625.


v3.11.14

Compare Source


Bug fixes

  • Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a trace.send_dns_cache_miss
    -- by :user:logioniz.

    Related issues and pull requests on GitHub:
    :issue:10529.

  • Fixed DNS resolution on platforms that don't support socket.AI_ADDRCONFIG -- by :user:maxbachmann.

    Related issues and pull requests on GitHub:
    :issue:10542.
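
    The workaround, taken from the resolver hunk in the diff below, masks the flag where the platform requires it:

    ```python
    import socket

    # Some platforms reject AI_ADDRCONFIG unless it is masked with AI_MASK;
    # fall back gracefully where AI_MASK does not exist.
    _AI_ADDRCONFIG = socket.AI_ADDRCONFIG
    if hasattr(socket, "AI_MASK"):
        _AI_ADDRCONFIG &= socket.AI_MASK
    ```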

  • The connector now raises :exc:aiohttp.ClientConnectionError instead of :exc:OSError when failing to explicitly close the socket after :py:meth:asyncio.loop.create_connection fails -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10551.
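
    Callers can now catch a single aiohttp exception type for this failure mode; a sketch (the host is deliberately unresolvable):

    ```python
    import asyncio

    import aiohttp


    async def main() -> None:
        try:
            async with aiohttp.ClientSession() as session:
                async with session.get("https://unreachable.invalid/") as resp:
                    print(resp.status)
        except aiohttp.ClientConnectionError as exc:
            # Cleanup failures after create_connection() now surface here
            # instead of leaking a bare OSError.
            print(f"connection failed: {exc}")


    asyncio.run(main())
    ```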

  • Break cyclic references at connection close when there was a traceback -- by :user:bdraco.

    Special thanks to :user:availov for reporting the issue.

    Related issues and pull requests on GitHub:
    :issue:10556.

  • Break cyclic references when there is an exception handling a request -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10569.

Features

  • Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10564.

Miscellaneous internal changes

  • Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10552.


v3.11.13

Compare Source


Bug fixes

  • Removed a break statement inside the finally block in :py:class:~aiohttp.web.RequestHandler
    -- by :user:Cycloctane.

    Related issues and pull requests on GitHub:
    :issue:10434.

  • Changed connection creation to explicitly close sockets if an exception is raised in the event loop's create_connection method -- by :user:top-oai.

    Related issues and pull requests on GitHub:
    :issue:10464.

Packaging updates and notes for downstreams

  • Fixed test test_write_large_payload_deflate_compression_data_in_eof_writelines failing with Python 3.12.9+ or 3.13.2+ -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10423.

Miscellaneous internal changes

  • Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:bdraco.

    Previously, the error messages were empty strings, which made it hard to determine what went wrong.

    Related issues and pull requests on GitHub:
    :issue:10422.
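
    With a heartbeat configured, the timeout error now carries a message; a sketch (the endpoint URL is hypothetical):

    ```python
    import aiohttp


    async def ws_client() -> None:
        async with aiohttp.ClientSession() as session:
            # heartbeat=5.0 sends periodic PINGs; if the peer stops answering,
            # the close exception is a ServerTimeoutError whose message now
            # states how long aiohttp waited for a PONG (previously it was "").
            async with session.ws_connect("wss://ws.example/", heartbeat=5.0) as ws:
                async for msg in ws:
                    print(msg.type, msg.data)
    ```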


v3.11.12

Compare Source


Bug fixes

  • MultipartForm.decode() now follows RFC1341 7.2.1 with a CRLF after the boundary
    -- by :user:imnotjames.

    Related issues and pull requests on GitHub:
    :issue:10270.
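
    A quick way to see the corrected serialization, assuming MultipartWriter (which provides decode()) stands in for the form here:

    ```python
    from aiohttp import MultipartWriter

    with MultipartWriter("form-data") as mp:
        mp.append("value")
        # The boundary line is now terminated with CRLF per RFC 1341 §7.2.1.
        first_line = mp.decode().split("\r\n", 1)[0]
        print(repr(first_line))  # "--<boundary>"
    ```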

  • Restored the missing total_bytes attribute to EmptyStreamReader -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10387.

Features

  • Updated :py:func:~aiohttp.request to make it accept _RequestOptions kwargs.
    -- by :user:Cycloctane.

    Related issues and pull requests on GitHub:
    :issue:10300.
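
    Usage stays the same; type checkers now accept the session-level keyword options. A sketch (URL is a placeholder):

    ```python
    import asyncio

    import aiohttp


    async def main() -> None:
        # The kwargs below are part of _RequestOptions and now type-check.
        async with aiohttp.request(
            "GET",
            "https://example.org/",
            params={"q": "aiohttp"},
            timeout=aiohttp.ClientTimeout(total=10),
        ) as resp:
            print(resp.status)


    asyncio.run(main())
    ```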

  • Improved logging of HTTP protocol errors to include the remote address -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10332.

Improved documentation

  • Added aiohttp-openmetrics to list of third-party libraries -- by :user:jelmer.

    Related issues and pull requests on GitHub:
    :issue:10304.

Packaging updates and notes for downstreams

  • Added missing files to the source distribution to fix Makefile targets.
    Added a cythonize-nodeps target to run Cython without invoking pip to install dependencies.

    Related issues and pull requests on GitHub:
    :issue:10366.

  • Started building armv7l musllinux wheels -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10404.

Contributor-facing changes

  • The CI/CD workflow has been updated to use upload-artifact v4 and download-artifact v4 GitHub Actions -- by :user:silamon.

    Related issues and pull requests on GitHub:
    :issue:10281.

Miscellaneous internal changes

  • Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:bdraco.

    Zero copy writes were previously disabled due to :cve:2024-12254 which is resolved in these Python versions.

    Related issues and pull requests on GitHub:
    :issue:10137.
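
    The gate guarding this, from the http_writer.py hunk in the diff below:

    ```python
    import sys

    # writelines() could buffer unboundedly before these CPython releases
    # (CVE-2024-12254), so zero copy writes are only re-enabled on
    # 3.12.9+ / 3.13.2+.
    IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
    IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
    SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
    ```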


v3.11.11

Compare Source


Bug fixes

  • Updated :py:meth:~aiohttp.ClientSession.request to reuse the quote_cookie setting from ClientSession._cookie_jar when processing cookies parameter.
    -- by :user:Cycloctane.

    Related issues and pull requests on GitHub:
    :issue:10093.
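
    A sketch of the now-consistent behaviour when a custom jar disables quoting (URL is a placeholder):

    ```python
    import aiohttp


    async def main() -> None:
        jar = aiohttp.CookieJar(quote_cookie=False)
        async with aiohttp.ClientSession(cookie_jar=jar) as session:
            # The per-request `cookies` mapping is now processed with the
            # session jar's quote_cookie=False instead of the default True.
            async with session.get(
                "https://example.org/", cookies={"token": "a b c"}
            ) as resp:
                print(resp.status)
    ```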

  • Fixed type of SSLContext for some static type checkers (e.g. pyright).

    Related issues and pull requests on GitHub:
    :issue:10099.

  • Updated :meth:aiohttp.web.StreamResponse.write annotation to also allow :class:bytearray and :class:memoryview as inputs -- by :user:cdce8p.

    Related issues and pull requests on GitHub:
    :issue:10154.
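
    Handlers can now pass buffer types straight to write() without a cast; a sketch:

    ```python
    from aiohttp import web


    async def handler(request: web.Request) -> web.StreamResponse:
        resp = web.StreamResponse()
        await resp.prepare(request)
        # bytes, bytearray and memoryview are all accepted by the annotation.
        await resp.write(bytearray(b"part one\n"))
        await resp.write(memoryview(b"part two\n"))
        await resp.write_eof()
        return resp
    ```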

  • Fixed a hang where a connection previously used for a streaming
    download could be returned to the pool in a paused state.
    -- by :user:javitonino.

    Related issues and pull requests on GitHub:
    :issue:10169.

Features

  • Enabled ALPN on default SSL contexts. This improves compatibility with some
    proxies which don't work without this extension.
    -- by :user:Cycloctane.

    Related issues and pull requests on GitHub:
    :issue:10156.
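
    The change amounts to advertising HTTP/1.1 via ALPN on the contexts aiohttp builds; roughly (see _make_ssl_context in the connector.py hunk below):

    ```python
    import ssl

    # What the default client contexts now do:
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(("http/1.1",))
    ```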

Miscellaneous internal changes

  • Fixed an infinite loop that can occur when using aiohttp in combination
    with async-solipsism_ -- by :user:bmerry.

    .. _async-solipsism: https://github.com/bmerry/async-solipsism

    Related issues and pull requests on GitHub:
    :issue:10149.


v3.11.10

Compare Source


Bug fixes

  • Fixed race condition in :class:aiohttp.web.FileResponse that could have resulted in an incorrect response if the file was replaced on the file system during prepare -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10101, :issue:10113.

  • Replaced deprecated call to :func:mimetypes.guess_type with :func:mimetypes.guess_file_type when using Python 3.13+ -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10102.
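
    A sketch of the version-gated call, mirroring the payload.py hunk in the diff below:

    ```python
    import mimetypes
    import sys

    # guess_file_type() was added in Python 3.13; guess_type() remains the
    # fallback on older interpreters.
    if sys.version_info >= (3, 13):
        guesser = mimetypes.guess_file_type
    else:
        guesser = mimetypes.guess_type

    print(guesser("archive.tar.gz")[0])  # application/x-tar
    ```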

  • Disabled zero copy writes in the StreamWriter -- by :user:bdraco.

    Related issues and pull requests on GitHub:
    :issue:10125.



Configuration

📅 Schedule: Branch creation - Tuesday through Thursday ( * * * * 2-4 ) (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.


[puLL-Merge] - aio-libs/aiohttp@v3.11.18

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..a794dc65d77 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -250,11 +250,11 @@ jobs:
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..00d728e775d 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,418 @@
 
 .. towncrier release notes start
 
+3.11.16 (2025-04-01)
+====================
+
+Bug fixes
+---------
+
+- Replaced deprecated ``asyncio.iscoroutinefunction`` with its counterpart from ``inspect``
+  -- by :user:`layday`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10634`.
+
+
+
+- Fixed :class:`multidict.CIMultiDict` being mutated when passed to :class:`aiohttp.web.Response` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10672`.
+
+
+
+
+----
+
+
+3.11.15 (2025-03-31)
+====================
+
+Bug fixes
+---------
+
+- Reverted explicitly closing sockets if an exception is raised during ``create_connection`` -- by :user:`bdraco`.
+
+  This change originally appeared in aiohttp 3.11.13
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`, :issue:`10617`, :issue:`10656`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of WebSocket buffer handling -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10601`.
+
+
+
+- Improved performance of serializing headers -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10625`.
+
+
+
+
+----
+
+
+3.11.14 (2025-03-16)
+====================
+
+Bug fixes
+---------
+
+- Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a ``trace.send_dns_cache_miss``
+  -- by :user:`logioniz`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10529`.
+
+
+
+- Fixed DNS resolution on platforms that don't support ``socket.AI_ADDRCONFIG`` -- by :user:`maxbachmann`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10542`.
+
+
+
+- The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10551`.
+
+
+
+- Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.
+
+  Special thanks to :user:`availov` for reporting the issue.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10556`.
+
+
+
+- Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10569`.
+
+
+
+
+Features
+--------
+
+- Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10564`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10552`.
+
+
+
+
+----
+
+
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..953af52498a 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -30,6 +31,7 @@ Alexandru Mihai
 Alexey Firsov
 Alexey Nikitin
 Alexey Popravka
+Alexey Stavrov
 Alexey Stepanov
 Amin Etesamian
 Amit Tulshyan
@@ -41,6 +43,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +169,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -364,6 +369,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..93b06c7367a 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.16"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_http_writer.pyx aiohttp/_http_writer.pyx
index 287371334f8..4a3ae1f9e68 100644
--- aiohttp/_http_writer.pyx
+++ aiohttp/_http_writer.pyx
@@ -97,27 +97,34 @@ cdef inline int _write_str(Writer* writer, str s):
             return -1
 
 
-# --------------- _serialize_headers ----------------------
-
-cdef str to_str(object s):
+cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
+    cdef Py_UCS4 ch
+    cdef str out_str
     if type(s) is str:
-        return <str>s
+        out_str = <str>s
     elif type(s) is _istr:
-        return PyObject_Str(s)
+        out_str = PyObject_Str(s)
     elif not isinstance(s, str):
         raise TypeError("Cannot serialize non-str key {!r}".format(s))
     else:
-        return str(s)
+        out_str = str(s)
+
+    for ch in out_str:
+        if ch == 0x0D or ch == 0x0A:
+            raise ValueError(
+                "Newline or carriage return detected in headers. "
+                "Potential header injection attack."
+            )
+        if _write_utf8(writer, ch) < 0:
+            return -1
 
 
+# --------------- _serialize_headers ----------------------
 
 def _serialize_headers(str status_line, headers):
     cdef Writer writer
     cdef object key
     cdef object val
-    cdef bytes ret
-    cdef str key_str
-    cdef str val_str
 
     _init_writer(&writer)
 
@@ -130,22 +137,13 @@ def _serialize_headers(str status_line, headers):
             raise
 
         for key, val in headers.items():
-            key_str = to_str(key)
-            val_str = to_str(val)
-
-            if "\r" in key_str or "\n" in key_str or "\r" in val_str or "\n" in val_str:
-                raise ValueError(
-                    "Newline or carriage return character detected in HTTP status message or "
-                    "header. This is a potential security issue."
-                )
-
-            if _write_str(&writer, key_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, key) < 0:
                 raise
             if _write_byte(&writer, b':') < 0:
                 raise
             if _write_byte(&writer, b' ') < 0:
                 raise
-            if _write_str(&writer, val_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, val) < 0:
                 raise
             if _write_byte(&writer, b'\r') < 0:
                 raise
diff --git aiohttp/_websocket/reader_c.pxd aiohttp/_websocket/reader_c.pxd
index 461e658e116..f156a7ff704 100644
--- aiohttp/_websocket/reader_c.pxd
+++ aiohttp/_websocket/reader_c.pxd
@@ -93,6 +93,7 @@ cdef class WebSocketReader:
         chunk_size="unsigned int",
         chunk_len="unsigned int",
         buf_length="unsigned int",
+        buf_cstr="const unsigned char *",
         first_byte="unsigned char",
         second_byte="unsigned char",
         end_pos="unsigned int",
diff --git aiohttp/_websocket/reader_py.py aiohttp/_websocket/reader_py.py
index 94d20010890..92ad47a52f0 100644
--- aiohttp/_websocket/reader_py.py
+++ aiohttp/_websocket/reader_py.py
@@ -93,6 +93,7 @@ def _release_waiter(self) -> None:
     def feed_eof(self) -> None:
         self._eof = True
         self._release_waiter()
+        self._exception = None  # Break cyclic references
 
     def feed_data(self, data: "WSMessage", size: "int_") -> None:
         self._size += size
@@ -193,9 +194,8 @@ def _feed_data(self, data: bytes) -> None:
                     if self._max_msg_size and len(self._partial) >= self._max_msg_size:
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Message size {} exceeds limit {}".format(
-                                len(self._partial), self._max_msg_size
-                            ),
+                            f"Message size {len(self._partial)} "
+                            f"exceeds limit {self._max_msg_size}",
                         )
                     continue
 
@@ -214,7 +214,7 @@ def _feed_data(self, data: bytes) -> None:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
                         "The opcode in non-fin frame is expected "
-                        "to be zero, got {!r}".format(opcode),
+                        f"to be zero, got {opcode!r}",
                     )
 
                 assembled_payload: Union[bytes, bytearray]
@@ -227,9 +227,8 @@ def _feed_data(self, data: bytes) -> None:
                 if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
                     raise WebSocketError(
                         WSCloseCode.MESSAGE_TOO_BIG,
-                        "Message size {} exceeds limit {}".format(
-                            len(assembled_payload), self._max_msg_size
-                        ),
+                        f"Message size {len(assembled_payload)} "
+                        f"exceeds limit {self._max_msg_size}",
                     )
 
                 # Decompress process must to be done after all packets
@@ -246,9 +245,8 @@ def _feed_data(self, data: bytes) -> None:
                         left = len(self._decompressobj.unconsumed_tail)
                         raise WebSocketError(
                             WSCloseCode.MESSAGE_TOO_BIG,
-                            "Decompressed message size {} exceeds limit {}".format(
-                                self._max_msg_size + left, self._max_msg_size
-                            ),
+                            f"Decompressed message size {self._max_msg_size + left}"
+                            f" exceeds limit {self._max_msg_size}",
                         )
                 elif type(assembled_payload) is bytes:
                     payload_merged = assembled_payload
@@ -327,14 +325,15 @@ def parse_frame(
 
         start_pos: int = 0
         buf_length = len(buf)
+        buf_cstr = buf
 
         while True:
             # read header
             if self._state == READ_HEADER:
                 if buf_length - start_pos < 2:
                     break
-                first_byte = buf[start_pos]
-                second_byte = buf[start_pos + 1]
+                first_byte = buf_cstr[start_pos]
+                second_byte = buf_cstr[start_pos + 1]
                 start_pos += 2
 
                 fin = (first_byte >> 7) & 1
@@ -399,14 +398,14 @@ def parse_frame(
                 if length_flag == 126:
                     if buf_length - start_pos < 2:
                         break
-                    first_byte = buf[start_pos]
-                    second_byte = buf[start_pos + 1]
+                    first_byte = buf_cstr[start_pos]
+                    second_byte = buf_cstr[start_pos + 1]
                     start_pos += 2
                     self._payload_length = first_byte << 8 | second_byte
                 elif length_flag > 126:
                     if buf_length - start_pos < 8:
                         break
-                    data = buf[start_pos : start_pos + 8]
+                    data = buf_cstr[start_pos : start_pos + 8]
                     start_pos += 8
                     self._payload_length = UNPACK_LEN3(data)[0]
                 else:
@@ -418,7 +417,7 @@ def parse_frame(
             if self._state == READ_PAYLOAD_MASK:
                 if buf_length - start_pos < 4:
                     break
-                self._frame_mask = buf[start_pos : start_pos + 4]
+                self._frame_mask = buf_cstr[start_pos : start_pos + 4]
                 start_pos += 4
                 self._state = READ_PAYLOAD
 
@@ -434,10 +433,10 @@ def parse_frame(
                 if self._frame_payload_len:
                     if type(self._frame_payload) is not bytearray:
                         self._frame_payload = bytearray(self._frame_payload)
-                    self._frame_payload += buf[start_pos:end_pos]
+                    self._frame_payload += buf_cstr[start_pos:end_pos]
                 else:
                     # Fast path for the first frame
-                    self._frame_payload = buf[start_pos:end_pos]
+                    self._frame_payload = buf_cstr[start_pos:end_pos]
 
                 self._frame_payload_len += end_pos - start_pos
                 start_pos = end_pos
@@ -463,6 +462,7 @@ def parse_frame(
                 self._frame_payload_len = 0
                 self._state = READ_HEADER
 
-        self._tail = buf[start_pos:] if start_pos < buf_length else b""
+        # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
+        self._tail = buf_cstr[start_pos:buf_length] if start_pos < buf_length else b""
 
         return frames
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
+        auth - aiohttp.helpers.BasicAuth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_proto.py aiohttp/client_proto.py
index 79f033e3e12..2d64b3f3644 100644
--- aiohttp/client_proto.py
+++ aiohttp/client_proto.py
@@ -64,6 +64,7 @@ def force_close(self) -> None:
         self._should_close = True
 
     def close(self) -> None:
+        self._exception = None  # Break cyclic references
         transport = self.transport
         if transport is not None:
             transport.close()
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7420bd6070a 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
@@ -1009,11 +1015,11 @@ async def _resolve_host_with_throttle(
         This method must be run in a task and shielded from cancellation
         to avoid cancelling the underlying lookup.
         """
-        if traces:
-            for trace in traces:
-                await trace.send_dns_cache_miss(host)
         try:
             if traces:
+                for trace in traces:
+                    await trace.send_dns_cache_miss(host)
+
                 for trace in traces:
                     await trace.send_dns_resolvehost_start(host)
 
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/helpers.py aiohttp/helpers.py
index 8038931ebec..ace4f0e9b53 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -21,7 +21,7 @@
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
-from types import TracebackType
+from types import MappingProxyType, TracebackType
 from typing import (
     Any,
     Callable,
@@ -357,6 +357,20 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+@functools.lru_cache(maxsize=56)
+def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
+    """Parse Content-Type header.
+
+    Returns a tuple of the parsed content type and a
+    MappingProxyType of parameters.
+    """
+    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    content_type = msg.get_content_type()
+    params = msg.get_params(())
+    content_dict = dict(params[1:])  # First element is content type again
+    return content_type, MappingProxyType(content_dict)
+
+
 def guess_filename(obj: Any, default: Optional[str] = None) -> Optional[str]:
     name = getattr(obj, "name", None)
     if name and isinstance(name, str) and name[0] != "<" and name[-1] != ">":
@@ -710,10 +724,10 @@ def _parse_content_type(self, raw: Optional[str]) -> None:
             self._content_type = "application/octet-stream"
             self._content_dict = {}
         else:
-            msg = HeaderParser().parsestr("Content-Type: " + raw)
-            self._content_type = msg.get_content_type()
-            params = msg.get_params(())
-            self._content_dict = dict(params[1:])  # First element is content type again
+            content_type, content_mapping_proxy = parse_content_type(raw)
+            self._content_type = content_type
+            # _content_dict needs to be mutable so we can update it
+            self._content_dict = content_mapping_proxy.copy()
 
     @property
     def content_type(self) -> str:
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
diff --git aiohttp/pytest_plugin.py aiohttp/pytest_plugin.py
index 7ce60faa4a4..21d6ea7bbcd 100644
--- aiohttp/pytest_plugin.py
+++ aiohttp/pytest_plugin.py
@@ -98,7 +98,7 @@ def pytest_fixture_setup(fixturedef):  # type: ignore[no-untyped-def]
     if inspect.isasyncgenfunction(func):
         # async generator fixture
         is_async_gen = True
-    elif asyncio.iscoroutinefunction(func):
+    elif inspect.iscoroutinefunction(func):
         # regular async fixture
         is_async_gen = False
     else:
@@ -200,14 +200,14 @@ def _passthrough_loop_context(loop, fast=False):  # type: ignore[no-untyped-def]
 
 def pytest_pycollect_makeitem(collector, name, obj):  # type: ignore[no-untyped-def]
     """Fix pytest collecting for coroutines."""
-    if collector.funcnamefilter(name) and asyncio.iscoroutinefunction(obj):
+    if collector.funcnamefilter(name) and inspect.iscoroutinefunction(obj):
         return list(collector._genfunctions(name, obj))
 
 
 def pytest_pyfunc_call(pyfuncitem):  # type: ignore[no-untyped-def]
     """Run coroutines in an event loop instead of a normal function call."""
     fast = pyfuncitem.config.getoption("--aiohttp-fast")
-    if asyncio.iscoroutinefunction(pyfuncitem.function):
+    if inspect.iscoroutinefunction(pyfuncitem.function):
         existing_loop = pyfuncitem.funcargs.get(
             "proactor_loop"
         ) or pyfuncitem.funcargs.get("loop", None)
diff --git aiohttp/resolver.py aiohttp/resolver.py
index 9c744514fae..e14179cc8a2 100644
--- aiohttp/resolver.py
+++ aiohttp/resolver.py
@@ -18,6 +18,9 @@
 
 _NUMERIC_SOCKET_FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV
 _NAME_SOCKET_FLAGS = socket.NI_NUMERICHOST | socket.NI_NUMERICSERV
+_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
+if hasattr(socket, "AI_MASK"):
+    _AI_ADDRCONFIG &= socket.AI_MASK
 
 
 class ThreadedResolver(AbstractResolver):
@@ -38,7 +41,7 @@ async def resolve(
             port,
             type=socket.SOCK_STREAM,
             family=family,
-            flags=socket.AI_ADDRCONFIG,
+            flags=_AI_ADDRCONFIG,
         )
 
         hosts: List[ResolveResult] = []
@@ -105,7 +108,7 @@ async def resolve(
                 port=port,
                 type=socket.SOCK_STREAM,
                 family=family,
-                flags=socket.AI_ADDRCONFIG,
+                flags=_AI_ADDRCONFIG,
             )
         except aiodns.error.DNSError as exc:
             msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed"
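On BSD-derived platforms (including macOS), ``getaddrinfo`` rejects flag bits outside ``socket.AI_MASK``, so the resolver now masks ``AI_ADDRCONFIG`` down to the supported set; elsewhere ``AI_MASK`` simply does not exist and the flag passes through unchanged. The same guard in isolation:

    import socket

    flags = socket.AI_ADDRCONFIG
    if hasattr(socket, "AI_MASK"):
        # BSD/macOS: strip any bits getaddrinfo() would otherwise reject.
        flags &= socket.AI_MASK
    infos = socket.getaddrinfo(
        "example.com", 443, type=socket.SOCK_STREAM, flags=flags
    )
    print(len(infos), "address(es) resolved")
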
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
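The ``TYPE_CHECKING`` split gives type checkers the real ``ssl.SSLContext`` unconditionally, while the runtime fallback becomes ``object`` rather than ``Any``, which keeps ``isinstance()`` usable on ssl-free builds. A sketch of the pattern on its own:

    from typing import TYPE_CHECKING

    if TYPE_CHECKING:
        from ssl import SSLContext
    else:
        try:
            from ssl import SSLContext
        except ImportError:
            # `object` is a safe runtime stand-in: isinstance(x, SSLContext)
            # then returns True for everything instead of raising, whereas
            # typing.Any would fail isinstance() with a TypeError.
            SSLContext = object

    def takes_ctx(ctx: "SSLContext") -> bool:
        return isinstance(ctx, SSLContext)
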
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # I.e. a regular file to send
+    NOT_ACCEPTABLE = auto()  # I.e. a socket or other non-regular file
+    PRE_CONDITION_FAILED = auto()  # I.e. If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms.
+            # Once the file is open, stat it again via its descriptor to
+            # ensure it has not changed between the initial stat() and the
+            # open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here: waiting for the executor to finish
+            # before returning would delay the response, and the connection
+            # should be free to service another request as soon as
+            # possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # start may be None or 0 here; `start or 0` maps both to offset 0.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
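For applications nothing changes in how ``FileResponse`` is used; the refactor folds the stat/open plus all the RFC 9110 precondition checks into one executor pass, and closes the file in the background after the response returns. A hedged usage sketch (the served path is illustrative):

    from aiohttp import web

    async def report(request: web.Request) -> web.FileResponse:
        # One executor round-trip now covers stat(), the conditional-request
        # checks (If-Match, If-None-Match, If-Modified-Since, ...) and open().
        return web.FileResponse(path="static/report.pdf")  # hypothetical path

    app = web.Application()
    app.router.add_get("/report", report)

    if __name__ == "__main__":
        web.run_app(app)  # answers 200/206/304/412 based on request headers
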
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..1dba9606ea0 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -458,7 +458,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -520,8 +520,6 @@ async def start(self) -> None:
         keep_alive(True) specified.
         """
         loop = self._loop
-        handler = asyncio.current_task(loop)
-        assert handler is not None
         manager = self._manager
         assert manager is not None
         keepalive_timeout = self._keepalive_timeout
@@ -551,7 +549,16 @@ async def start(self) -> None:
             else:
                 request_handler = self._request_handler
 
-            request = self._request_factory(message, payload, self, writer, handler)
+            # Important: don't hold a reference to the current task here,
+            # as a traceback that retains the request would then prevent
+            # the task from being collected and cause a memory leak.
+            request = self._request_factory(
+                message,
+                payload,
+                self,
+                writer,
+                self._task_handler or asyncio.current_task(loop),  # type: ignore[arg-type]
+            )
             try:
                 # a new task is used for copy context vars (#3406)
                 coro = self._handle_request(request, start, request_handler)
@@ -608,26 +615,29 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
+                request._task = None  # type: ignore[assignment] # Break reference cycle in case of exception
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -694,9 +704,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
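The comparison flip from ``now <= close_time`` to ``now < close_time`` matters because ``loop.call_at()`` can fire exactly at the scheduled time: with ``<=``, a callback landing precisely on ``close_time`` would reschedule itself instead of closing. A runnable toy model of the corrected check:

    import asyncio

    async def main() -> None:
        loop = asyncio.get_running_loop()
        close_time = loop.time() + 0.05

        def check() -> None:
            if loop.time() < close_time:
                # Fired early: try again exactly at the deadline.
                loop.call_at(close_time, check)
            else:
                print("keep-alive expired, closing")

        loop.call_at(close_time, check)
        await asyncio.sleep(0.1)

    asyncio.run(main())
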
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..367ac6e8c0a 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
@@ -629,10 +629,8 @@ def __init__(
 
         if headers is None:
             real_headers: CIMultiDict[str] = CIMultiDict()
-        elif not isinstance(headers, CIMultiDict):
-            real_headers = CIMultiDict(headers)
         else:
-            real_headers = headers  # = cast('CIMultiDict[str]', headers)
+            real_headers = CIMultiDict(headers)
 
         if content_type is not None and "charset" in content_type:
             raise ValueError("charset must not be in content_type argument")
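Widening ``write()`` to ``Union[bytes, bytearray, memoryview]`` matches the runtime assertion that was already in place, so zero-copy slices no longer trip type checkers. A short handler sketch:

    from aiohttp import web

    async def stream(request: web.Request) -> web.StreamResponse:
        resp = web.StreamResponse()
        await resp.prepare(request)
        buf = bytearray(b"x" * 65536)
        # A memoryview slice avoids copying the underlying buffer.
        await resp.write(memoryview(buf)[:4096])
        await resp.write_eof()
        return resp
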
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/web_urldispatcher.py aiohttp/web_urldispatcher.py
index 6443c500a33..28ae2518fec 100644
--- aiohttp/web_urldispatcher.py
+++ aiohttp/web_urldispatcher.py
@@ -180,8 +180,8 @@ def __init__(
         if expect_handler is None:
             expect_handler = _default_expect_handler
 
-        assert asyncio.iscoroutinefunction(
-            expect_handler
+        assert inspect.iscoroutinefunction(expect_handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(expect_handler)
         ), f"Coroutine is expected, got {expect_handler!r}"
 
         method = method.upper()
@@ -189,7 +189,9 @@ def __init__(
             raise ValueError(f"{method} is not allowed HTTP method")
 
         assert callable(handler), handler
-        if asyncio.iscoroutinefunction(handler):
+        if inspect.iscoroutinefunction(handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(handler)
+        ):
             pass
         elif inspect.isgeneratorfunction(handler):
             warnings.warn(
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..439b8049987 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
@@ -248,7 +252,8 @@ def _handshake(
             else:
                 # No overlap found: Return no protocol as per spec
                 ws_logger.warning(
-                    "Client protocols %r don’t overlap server-known ones %r",
+                    "%s: Client protocols %r don’t overlap server-known ones %r",
+                    request.remote,
                     req_protocols,
                     self._protocols,
                 )
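Both changes add context for operators: the heartbeat timeout now reports how long it waited, and the protocol-mismatch warning names the peer. With heartbeats enabled, the improved error surfaces like this (sketch; the ``heartbeat`` value is illustrative):

    from aiohttp import WSMsgType, web

    async def ws_handler(request: web.Request) -> web.WebSocketResponse:
        # A PING is sent every 25s; if no PONG arrives in time, the failure
        # now reads "No PONG received after 12.5 seconds" (by default the
        # pong timeout is half the heartbeat interval).
        ws = web.WebSocketResponse(heartbeat=25.0)
        await ws.prepare(request)
        async for msg in ws:
            if msg.type is WSMsgType.TEXT:
                await ws.send_str(msg.data)
        return ws
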
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..f7281bfde75 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -1,12 +1,13 @@
 """Async gunicorn worker for aiohttp.web"""
 
 import asyncio
+import inspect
 import os
 import re
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +18,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
@@ -66,7 +72,9 @@ async def _run(self) -> None:
         runner = None
         if isinstance(self.wsgi, Application):
             app = self.wsgi
-        elif asyncio.iscoroutinefunction(self.wsgi):
+        elif inspect.iscoroutinefunction(self.wsgi) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(self.wsgi)
+        ):
             wsgi = await self.wsgi()
             if isinstance(wsgi, web.AppRunner):
                 runner = wsgi
diff --git docs/client_quickstart.rst docs/client_quickstart.rst
index f99339cf4a6..0e03f104e90 100644
--- docs/client_quickstart.rst
+++ docs/client_quickstart.rst
@@ -93,7 +93,7 @@ Passing Parameters In URLs
 You often want to send some sort of data in the URL's query string. If
 you were constructing the URL by hand, this data would be given as key/value
 pairs in the URL after a question mark, e.g. ``httpbin.org/get?key=val``.
-Requests allows you to provide these arguments as a :class:`dict`, using the
+aiohttp allows you to provide these arguments as a :class:`dict`, using the
 ``params`` keyword argument. As an example, if you wanted to pass
 ``key1=value1`` and ``key2=value2`` to ``httpbin.org/get``, you would use the
 following code::
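The example block that follows in the quickstart lies outside this hunk; a minimal sketch of the call it describes, with httpbin.org as the illustrative host:

    import asyncio
    import aiohttp

    async def main() -> None:
        params = {"key1": "value1", "key2": "value2"}
        async with aiohttp.ClientSession() as session:
            async with session.get("http://httpbin.org/get", params=params) as resp:
                # yarl builds http://httpbin.org/get?key1=value1&key2=value2
                print(resp.url)

    asyncio.run(main())
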
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..26537161971 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
      :param bool compress: Set to ``True`` if request has to be compressed
         with deflate encoding. ``compress`` can not be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of tuple of *key*/*value* pairs or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                    :class:`multidict.MultiDict` or
+                    :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                    :class:`list`
+                  - :class:`str`, preferably with url-encoded content
+                    (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter allows skipping
+      that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` can not be combined
+                         with the *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-Encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-Encoding* and *Content-Length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` for skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the parameter proxy has been provided.
+
+   :param trace_request_ctx: Object passed as a keyword parameter to each new
+      :class:`TraceConfig` object instantiated, carrying information for the
+      tracers that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
 
diff --git docs/contributing-admins.rst docs/contributing-admins.rst
index acfaebc0e97..b17cbe1019a 100644
--- docs/contributing-admins.rst
+++ docs/contributing-admins.rst
@@ -21,9 +21,9 @@ To create a new release:
 #. Run ``towncrier``.
 #. Check and cleanup the changes in ``CHANGES.rst``.
 #. Checkout a new branch: e.g. ``git checkout -b release/v3.8.6``
-#. Commit and create a PR. Once PR is merged, continue.
+#. Commit and create a PR. Verify the changelog and release notes look good on Read the Docs. Once PR is merged, continue.
 #. Go back to the release branch: e.g. ``git checkout 3.8 && git pull``
-#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6'``
+#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6' -s``
 #. Push the tag: e.g. ``git push origin v3.8.6``
 #. Monitor CI to ensure release process completes without errors.
 
@@ -49,6 +49,10 @@ first merge into the newer release branch (e.g. 3.8 into 3.9) and then to master
 
 Back on the original release branch, bump the version number and append ``.dev0`` in ``__init__.py``.
 
+Post the release announcement to social media:
+ - BlueSky: https://bsky.app/profile/aiohttp.org and re-post to https://bsky.app/profile/aio-libs.org
+ - Mastodon: https://fosstodon.org/@aiohttp and re-post to https://fosstodon.org/@aio_libs
+
 If doing a minor release:
 
 #. Create a new release branch for future features to go to: e.g. ``git checkout -b 3.10 3.9 && git push``
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..d79bdab3893 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -30,7 +30,7 @@ multidict==6.1.0
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..041a3737ab0 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..a99644dff81 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..cf81a7bf257 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..95a98cd4fc0 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -221,6 +221,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
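
``mock.create_autospec(socket.socket, ...)`` returns a mock constrained to the real socket API, so connector code exercising the fixture's return value fails loudly on misspelled attributes or wrong signatures:

    import socket
    from unittest import mock

    sock = mock.create_autospec(socket.socket, spec_set=True, instance=True)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)  # matches the real API
    # sock.no_such_method()  # would raise AttributeError thanks to spec_set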
 
diff --git a/tests/isolated/check_for_client_response_leak.py b/tests/isolated/check_for_client_response_leak.py
new file mode 100644
index 00000000000..67393c2c2d8
--- /dev/null
+++ tests/isolated/check_for_client_response_leak.py
@@ -0,0 +1,47 @@
+import asyncio
+import contextlib
+import gc
+import sys
+
+from aiohttp import ClientError, ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def stream_handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        request.transport.close()  # Forcefully closing connection
+        return web.Response()
+
+    app.router.add_get("/stream", stream_handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    session = ClientSession()
+
+    async def fetch_stream(url: str) -> None:
+        """Fetch a stream and read a few bytes from it."""
+        with contextlib.suppress(ClientError):
+            await session.get(url)
+
+    client_task = asyncio.create_task(fetch_stream(f"http://localhost:{port}/stream"))
+    await client_task
+    gc.collect()
+    client_response_present = any(
+        type(obj).__name__ == "ClientResponse" for obj in gc.garbage
+    )
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if client_response_present else 0)
+
+
+asyncio.run(main())
diff --git a/tests/isolated/check_for_request_leak.py b/tests/isolated/check_for_request_leak.py
new file mode 100644
index 00000000000..6f340a05277
--- /dev/null
+++ tests/isolated/check_for_request_leak.py
@@ -0,0 +1,41 @@
+import asyncio
+import gc
+import sys
+from typing import NoReturn
+
+from aiohttp import ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def handler(request: web.Request) -> NoReturn:
+        await request.json()
+        assert False
+
+    app.router.add_route("GET", "/json", handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    async with ClientSession() as session:
+        async with session.get(f"http://127.0.0.1:{port}/json") as resp:
+            await resp.read()
+
+    # Give time for the cancelled task to be collected
+    await asyncio.sleep(0.5)
+    gc.collect()
+    request_present = any(type(obj).__name__ == "Request" for obj in gc.garbage)
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if request_present else 0)
+
+
+asyncio.run(main())
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..aa3536be820 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
@@ -289,3 +319,158 @@ async def run_client_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
+
+
+def test_one_hundred_json_post_requests(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 JSON POST requests that check the content-type."""
+    message_count = 100
+
+    async def handler(request: web.Request) -> web.Response:
+        _ = request.content_type
+        _ = request.charset
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("POST", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            await client.post("/", json={"key": "value"})
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_any(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_any."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_any():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_4096(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 4096."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 4096 iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(4096):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_65536(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 65536."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 64 KiB iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(65536):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunks(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunks."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..0ca57ab3ab2 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -315,7 +315,6 @@ async def test_concurrent_close(aiohttp_client) -> None:
     client_ws = None
 
     async def handler(request):
-        nonlocal client_ws
         ws = web.WebSocketResponse()
         await ws.prepare(request)
 
@@ -902,6 +901,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
@@ -935,7 +935,7 @@ async def delayed_send_frame(
         message: bytes, opcode: int, compress: Optional[int] = None
     ) -> None:
         assert opcode == WSMsgType.PING
-        nonlocal cancelled, ping_started
+        nonlocal cancelled
         ping_started.set_result(None)
         try:
             await asyncio.sleep(1)
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..a3fffc447ae 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -3474,6 +3474,61 @@ async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
     await connector.close()
 
 
+async def test_connector_resolve_in_case_of_trace_cache_miss_exception(
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    token: ResolveResult = {
+        "hostname": "localhost",
+        "host": "127.0.0.1",
+        "port": 80,
+        "family": socket.AF_INET,
+        "proto": 0,
+        "flags": socket.AI_NUMERICHOST,
+    }
+
+    request_count = 0
+
+    class DummyTracer(Trace):
+        def __init__(self) -> None:
+            """Dummy"""
+
+        async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
+            """Dummy send_dns_cache_hit"""
+
+        async def send_dns_resolvehost_start(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_start"""
+
+        async def send_dns_resolvehost_end(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_end"""
+
+        async def send_dns_cache_miss(self, *args: object, **kwargs: object) -> None:
+            nonlocal request_count
+            request_count += 1
+            if request_count <= 1:
+                raise Exception("first attempt")
+
+    async def resolve_response() -> List[ResolveResult]:
+        await asyncio.sleep(0)
+        return [token]
+
+    with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
+        m_resolver().resolve.return_value = resolve_response()
+
+        connector = TCPConnector()
+        traces = [DummyTracer()]
+
+        with pytest.raises(Exception):
+            await connector._resolve_host("", 0, traces)
+
+        assert await connector._resolve_host("", 0, traces) == [token]
+
+    await connector.close()
+
+
 async def test_connector_does_not_remove_needed_waiters(
     loop: asyncio.AbstractEventLoop, key: ConnectionKey
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_helpers.py tests/test_helpers.py
index 2a83032e557..a343cbdfedf 100644
--- tests/test_helpers.py
+++ tests/test_helpers.py
@@ -351,7 +351,6 @@ async def test_timer_context_timeout_does_swallow_cancellation() -> None:
     ctx = helpers.TimerContext(loop)
 
     async def task_with_timeout() -> None:
-        nonlocal ctx
         new_task = asyncio.current_task()
         assert new_task is not None
         with pytest.raises(asyncio.TimeoutError):
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..420816b3137 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,19 +2,38 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
 from multidict import CIMultiDict
 
-from aiohttp import ClientConnectionResetError, http
+from aiohttp import ClientConnectionResetError, hdrs, http
 from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_writer import _serialize_headers
 from aiohttp.test_utils import make_mocked_coro
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +111,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +120,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +226,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +282,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +317,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
@@ -421,3 +535,29 @@ async def test_set_eof_after_write_headers(
     msg.set_eof()
     await msg.write_eof()
     assert not transport.write.called
+
+
+@pytest.mark.parametrize(
+    "char",
+    [
+        "\n",
+        "\r",
+    ],
+)
+def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> None:
+    """Verify serialize_headers raises on cr or nl in the headers."""
+    status_line = "HTTP/1.1 200 OK"
+    headers = CIMultiDict(
+        {
+            hdrs.CONTENT_TYPE: f"text/plain{char}",
+        }
+    )
+
+    with pytest.raises(
+        ValueError,
+        match=(
+            "Newline or carriage return detected in headers. "
+            "Potential header injection attack."
+        ),
+    ):
+        _serialize_headers(status_line, headers)
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git a/tests/test_leaks.py b/tests/test_leaks.py
new file mode 100644
index 00000000000..07b506bdb99
--- /dev/null
+++ tests/test_leaks.py
@@ -0,0 +1,37 @@
+import pathlib
+import platform
+import subprocess
+import sys
+
+import pytest
+
+IS_PYPY = platform.python_implementation() == "PyPy"
+
+
+@pytest.mark.skipif(IS_PYPY, reason="gc.DEBUG_LEAK not available on PyPy")
+@pytest.mark.parametrize(
+    ("script", "message"),
+    [
+        (
+            # Test that ClientResponse is collected after server disconnects.
+            # https://github.com/aio-libs/aiohttp/issues/10535
+            "check_for_client_response_leak.py",
+            "ClientResponse leaked",
+        ),
+        (
+            # Test that Request object is collected when the handler raises.
+            # https://github.com/aio-libs/aiohttp/issues/10548
+            "check_for_request_leak.py",
+            "Request leaked",
+        ),
+    ],
+)
+def test_leak(script: str, message: str) -> None:
+    """Run isolated leak test script and check for leaks."""
+    leak_test_script = pathlib.Path(__file__).parent.joinpath("isolated", script)
+
+    with subprocess.Popen(
+        [sys.executable, "-u", str(leak_test_script)],
+        stdout=subprocess.PIPE,
+    ) as proc:
+        assert proc.wait() == 0, message
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: Any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..95769161804 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -10,7 +10,7 @@
 
 import aiosignal
 import pytest
-from multidict import CIMultiDict, CIMultiDictProxy
+from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
 from re_assert import Matches
 
 from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
@@ -1479,3 +1479,15 @@ def test_text_is_json_encoded(self) -> None:
     def test_content_type_is_overrideable(self) -> None:
         resp = json_response({"foo": 42}, content_type="application/vnd.json+api")
         assert "application/vnd.json+api" == resp.content_type
+
+
+@pytest.mark.parametrize("loose_header_type", (MultiDict, CIMultiDict, dict))
+async def test_passing_cimultidict_to_web_response_not_mutated(
+    loose_header_type: type,
+) -> None:
+    req = make_request("GET", "/")
+    headers = loose_header_type({})
+    resp = Response(body=b"answer", headers=headers)
+    await resp.prepare(req)
+    assert resp.content_length == 6
+    assert not headers
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..d2f1341afe0 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
@@ -311,7 +347,6 @@ async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
     port = sock.getsockname()[1]
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal event
         try:
             await asyncio.sleep(10)
         except asyncio.CancelledError:
@@ -353,7 +388,7 @@ async def test_no_handler_cancellation(unused_port_socket: socket.socket) -> Non
     started = False
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal done_event, started, timeout_event
+        nonlocal started
         started = True
         await asyncio.wait_for(timeout_event.wait(), timeout=5)
         done_event.set()
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tests/test_websocket_handshake.py tests/test_websocket_handshake.py
index bbfa1d9260d..53d5d9152bb 100644
--- tests/test_websocket_handshake.py
+++ tests/test_websocket_handshake.py
@@ -174,7 +174,7 @@ async def test_handshake_protocol_unsupported(caplog) -> None:
 
     assert (
         caplog.records[-1].msg
-        == "Client protocols %r don’t overlap server-known ones %r"
+        == "%s: Client protocols %r don’t overlap server-known ones %r"
     )
     assert ws.ws_protocol is None
 
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This PR updates aiohttp from 3.11.9 to 3.11.18 with several maintenance and improvement changes. It includes dependency updates, bug fixes, performance improvements, and packaging enhancements. The PR also updates GitHub workflow actions, moves deployment to Python 3.13, and addresses several memory leaks and cyclic references.

Changes

GitHub Workflow Changes

  • Updated GitHub Actions to newer versions: upload-artifact and download-artifact v3 → v4, plus a newer actions/cache

Python Version Updates

  • Updated Python version for deployment from 3.12 to 3.13.2
  • Added support for building musllinux wheels for more architectures
  • Reorganized wheel building matrix to include more platform combinations

Bug Fixes

  • Fixed CIMultiDict being mutated when passed to web.Response (a minimal sketch follows this list)
  • Replaced deprecated asyncio.iscoroutinefunction with inspect.iscoroutinefunction
  • Fixed a race condition in FileResponse that could produce an incorrect response if the file was replaced on disk during prepare
  • Hardened header serialization to reject newlines and carriage returns in header values, closing a potential header injection vector
  • Disabled zero copy writes to address CVE-2024-12254
  • Fixed DNS resolution on platforms without socket.AI_ADDRCONFIG support
  • Broke cyclic references at several points (connection close, request-handling exceptions) to prevent memory leaks
  • Added human-readable error messages for WebSocket disconnect exceptions
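
A minimal sketch of the CIMultiDict fix mentioned above (the handler and header names are illustrative, not from the PR): before 3.11.16, preparing a response could write derived headers such as Content-Length back into the mapping the caller passed in; the fix keeps the caller's mapping untouched.

```python
from multidict import CIMultiDict

from aiohttp import web

# A header mapping the application reuses across many responses (illustrative).
SHARED_HEADERS = CIMultiDict({"X-Demo": "1"})


async def handler(request: web.Request) -> web.Response:
    # web.Response copies the mapping, so preparing the response adds derived
    # headers (e.g. Content-Length) to resp.headers only, leaving
    # SHARED_HEADERS untouched.
    return web.Response(body=b"answer", headers=SHARED_HEADERS)
```

The regression test added in this diff (test_passing_cimultidict_to_web_response_not_mutated) checks the same property: a mapping passed as headers is still empty after prepare().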

Performance Improvements

  • Improved WebSocket buffer handling
  • Enhanced header serialization performance
  • Added caching for content type parsing (see the sketch after this list)
  • Optimized memory usage in stream handling
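
As a rough illustration of the path the content-type cache speeds up (the handler is hypothetical; the properties are the ones exercised by the new test_one_hundred_json_post_requests benchmark): request.content_type and request.charset both parse the Content-Type header, and repeated identical values can now be served from a cache instead of being re-parsed.

```python
from aiohttp import web


async def handler(request: web.Request) -> web.Response:
    # Both property accesses parse the raw Content-Type header; with the
    # cache, requests repeating the same header value skip the re-parse.
    if request.content_type != "application/json":
        raise web.HTTPUnsupportedMediaType()
    charset = request.charset or "utf-8"
    data = await request.json()
    return web.json_response({"received": data, "charset": charset})
```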

Documentation Updates

  • Added aiohttp-openmetrics to third-party libraries
  • Updated client reference documentation
  • Added new testing scripts and benchmarks

Packaging Updates

  • Started building armv7l musllinux wheels
  • Fixed test issues with Python 3.12.9+ and 3.13.2+
  • Added missing files to source distribution

Other Changes

  • Various dependency updates in requirements files
  • Updated Sphinx and other documentation tools
  • Added new isolated test cases for leak detection (pattern sketched below)
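
The leak checks use a subprocess-isolation pattern; the sketch below paraphrases tests/test_leaks.py from the diff further down (run_isolated is a hypothetical helper, not part of the PR). Each check script exits non-zero when the watched object survives garbage collection, and the wrapper inspects only the exit code, so gc state in the test runner cannot mask a leak.

```python
import pathlib
import subprocess
import sys


def run_isolated(script: str, message: str) -> None:
    """Run one isolated leak-check script; fail with `message` if it leaks."""
    path = pathlib.Path("tests") / "isolated" / script
    with subprocess.Popen(
        [sys.executable, "-u", str(path)], stdout=subprocess.PIPE
    ) as proc:
        assert proc.wait() == 0, message


if __name__ == "__main__":
    # Mirrors the parametrized cases in tests/test_leaks.py:
    run_isolated("check_for_client_response_leak.py", "ClientResponse leaked")
    run_isolated("check_for_request_leak.py", "Request leaked")
```

The bot-generated sequence diagram below summarizes the flows these changes touch.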
sequenceDiagram
    participant Client
    participant ClientSession
    participant Connector
    participant WebSocketReader
    participant HttpWriter
    participant Server
    participant FileResponse

    Client->>ClientSession: request(method, url)
    ClientSession->>Connector: _create_connection(request_info)
    Connector->>Connector: _resolve_host(host)
    
    alt DNS cache miss
        Note over Connector: Now traces DNS cache misses properly
        Connector->>Connector: resolve_host()
    end
    
    Connector->>Server: connect()
    Server->>ClientSession: response
    
    alt WebSocket Connection
        Client->>ClientSession: ws_connect()
        ClientSession->>WebSocketReader: create()
        WebSocketReader->>WebSocketReader: feed_data()
        Note over WebSocketReader: Improved buffer handling
        
        alt Ping timeout
            WebSocketReader->>Client: Error with descriptive message
        end
    end
    
    alt File Download
        Client->>Server: GET /file
        Server->>FileResponse: create()
        FileResponse->>FileResponse: _make_response()
        Note over FileResponse: Fixed race condition when files modified during response
        FileResponse->>Client: file data
    end
    
    alt Headers Processing
        HttpWriter->>HttpWriter: _serialize_headers()
        Note over HttpWriter: Better header injection protection
        HttpWriter->>Server: send headers
    end
    
    alt Connection Cleanup
        ClientSession->>Client: response complete
        Note over ClientSession: Break cyclic references
        Connector->>Connector: release connection
    end

@renovate renovate bot force-pushed the renovate/aiohttp-3-x branch from bfb3760 to 8a23550 on April 28, 2025 17:31
@renovate renovate bot changed the title from "chore(deps): update dependency aiohttp to v3.11.16" to "chore(deps): update dependency aiohttp to v3.11.18" on Apr 28, 2025

[puLL-Merge] - aio-libs/aiohttp@v3.11.9..v3.11.18

Diff
diff --git .github/workflows/ci-cd.yml .github/workflows/ci-cd.yml
index 765047b933f..23266b2b2d5 100644
--- .github/workflows/ci-cd.yml
+++ .github/workflows/ci-cd.yml
@@ -47,7 +47,7 @@ jobs:
       with:
         python-version: 3.11
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-lint-${{ hashFiles('requirements/*.txt') }}
         path: ~/.cache/pip
@@ -99,7 +99,7 @@ jobs:
       with:
         submodules: true
     - name: Cache llhttp generated files
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       id: cache
       with:
         key: llhttp-${{ hashFiles('vendor/llhttp/package*.json', 'vendor/llhttp/src/**/*') }}
@@ -114,7 +114,7 @@ jobs:
       run: |
         make generate-llhttp
     - name: Upload llhttp generated files
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build
@@ -163,7 +163,7 @@ jobs:
         echo "dir=$(pip cache dir)" >> "${GITHUB_OUTPUT}"
       shell: bash
     - name: Cache PyPI
-      uses: actions/[email protected]
+      uses: actions/[email protected]
       with:
         key: pip-ci-${{ runner.os }}-${{ matrix.pyver }}-${{ matrix.no-extensions }}-${{ hashFiles('requirements/*.txt') }}
         path: ${{ steps.pip-cache.outputs.dir }}
@@ -177,7 +177,7 @@ jobs:
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
       if: ${{ matrix.no-extensions == '' }}
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -244,17 +244,17 @@ jobs:
     needs: gen_llhttp
 
     runs-on: ubuntu-latest
-    timeout-minutes: 7
+    timeout-minutes: 9
     steps:
     - name: Checkout project
       uses: actions/checkout@v4
       with:
         submodules: true
-    - name: Setup Python 3.12
+    - name: Setup Python 3.13.2
       id: python-install
       uses: actions/setup-python@v5
       with:
-        python-version: 3.12
+        python-version: 3.13.2
         cache: pip
         cache-dependency-path: requirements/*.txt
     - name: Update pip, wheel, setuptools, build, twine
@@ -264,7 +264,7 @@ jobs:
       run: |
         python -m pip install -r requirements/test.in -c requirements/test.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -325,7 +325,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -336,27 +336,41 @@ jobs:
       run: |
         python -m build --sdist
     - name: Upload artifacts
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: dist-sdist
         path: dist
 
   build-wheels:
-    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }}
-    runs-on: ${{ matrix.os }}-latest
+    name: Build wheels on ${{ matrix.os }} ${{ matrix.qemu }} ${{ matrix.musl }}
+    runs-on: ${{ matrix.os }}
     needs: pre-deploy
     strategy:
       matrix:
-        os: [ubuntu, windows, macos]
+        os: ["ubuntu-latest", "windows-latest", "macos-latest", "ubuntu-24.04-arm"]
         qemu: ['']
+        musl: [""]
         include:
-          # Split ubuntu job for the sake of speed-up
-        - os: ubuntu
-          qemu: aarch64
-        - os: ubuntu
+          # Split ubuntu/musl jobs for the sake of speed-up
+        - os: ubuntu-latest
+          qemu: ppc64le
+          musl: ""
+        - os: ubuntu-latest
           qemu: ppc64le
-        - os: ubuntu
+          musl: musllinux
+        - os: ubuntu-latest
           qemu: s390x
+          musl: ""
+        - os: ubuntu-latest
+          qemu: s390x
+          musl: musllinux
+        - os: ubuntu-latest
+          qemu: armv7l
+          musl: musllinux
+        - os: ubuntu-latest
+          musl: musllinux
+        - os: ubuntu-24.04-arm
+          musl: musllinux
     steps:
     - name: Checkout
       uses: actions/checkout@v4
@@ -367,6 +381,10 @@ jobs:
       uses: docker/setup-qemu-action@v3
       with:
         platforms: all
+        # This should be temporary
+        # xref https://github.com/docker/setup-qemu-action/issues/188
+        # xref https://github.com/tonistiigi/binfmt/issues/215
+        image: tonistiigi/binfmt:qemu-v8.1.5
       id: qemu
     - name: Prepare emulation
       run: |
@@ -388,7 +406,7 @@ jobs:
         python -m
         pip install -r requirements/cython.in -c requirements/cython.txt
     - name: Restore llhttp generated files
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
         name: llhttp
         path: vendor/llhttp/build/
@@ -398,10 +416,17 @@ jobs:
     - name: Build wheels
       uses: pypa/[email protected]
       env:
+        CIBW_SKIP: pp* ${{ matrix.musl == 'musllinux' && '*manylinux*' || '*musllinux*' }}
         CIBW_ARCHS_MACOS: x86_64 arm64 universal2
-    - uses: actions/upload-artifact@v3
+    - name: Upload wheels
+      uses: actions/upload-artifact@v4
       with:
-        name: dist
+        name: >-
+          dist-${{ matrix.os }}-${{ matrix.musl }}-${{
+            matrix.qemu
+            && matrix.qemu
+            || 'native'
+          }}
         path: ./wheelhouse/*.whl
 
   deploy:
@@ -426,10 +451,11 @@ jobs:
       run: |
         echo "${{ secrets.GITHUB_TOKEN }}" | gh auth login --with-token
     - name: Download distributions
-      uses: actions/download-artifact@v3
+      uses: actions/download-artifact@v4
       with:
-        name: dist
         path: dist
+        pattern: dist-*
+        merge-multiple: true
     - name: Collected dists
       run: |
         tree dist
diff --git .readthedocs.yml .readthedocs.yml
index b3edaf4b8ea..b7d8a9236f6 100644
--- .readthedocs.yml
+++ .readthedocs.yml
@@ -5,6 +5,10 @@
 ---
 version: 2
 
+sphinx:
+  # Path to your Sphinx configuration file.
+  configuration: docs/conf.py
+
 submodules:
   include: all
   exclude: []
diff --git CHANGES.rst CHANGES.rst
index 8352236c320..11fd19153e3 100644
--- CHANGES.rst
+++ CHANGES.rst
@@ -10,6 +10,488 @@
 
 .. towncrier release notes start
 
+3.11.18 (2025-04-20)
+====================
+
+Bug fixes
+---------
+
+- Disabled TLS in TLS warning (when using HTTPS proxies) for uvloop and newer Python versions -- by :user:`lezgomatt`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`7686`.
+
+
+
+- Fixed reading fragmented WebSocket messages when the payload was masked -- by :user:`bdraco`.
+
+  The problem first appeared in 3.11.17
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10764`.
+
+
+
+
+----
+
+
+3.11.17 (2025-04-19)
+====================
+
+Miscellaneous internal changes
+------------------------------
+
+- Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10713`.
+
+
+
+- Improved web server performance when connection can be reused -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10714`.
+
+
+
+- Improved performance of the WebSocket reader -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10740`.
+
+
+
+- Improved performance of the WebSocket reader with large messages -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10744`.
+
+
+
+
+----
+
+
+3.11.16 (2025-04-01)
+====================
+
+Bug fixes
+---------
+
+- Replaced deprecated ``asyncio.iscoroutinefunction`` with its counterpart from ``inspect``
+  -- by :user:`layday`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10634`.
+
+
+
+- Fixed :class:`multidict.CIMultiDict` being mutated when passed to :class:`aiohttp.web.Response` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10672`.
+
+
+
+
+----
+
+
+3.11.15 (2025-03-31)
+====================
+
+Bug fixes
+---------
+
+- Reverted explicitly closing sockets if an exception is raised during ``create_connection`` -- by :user:`bdraco`.
+
+  This change originally appeared in aiohttp 3.11.13
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`, :issue:`10617`, :issue:`10656`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of WebSocket buffer handling -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10601`.
+
+
+
+- Improved performance of serializing headers -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10625`.
+
+
+
+
+----
+
+
+3.11.14 (2025-03-16)
+====================
+
+Bug fixes
+---------
+
+- Fixed an issue where dns queries were delayed indefinitely when an exception occurred in a ``trace.send_dns_cache_miss``
+  -- by :user:`logioniz`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10529`.
+
+
+
+- Fixed DNS resolution on platforms that don't support ``socket.AI_ADDRCONFIG`` -- by :user:`maxbachmann`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10542`.
+
+
+
+- The connector now raises :exc:`aiohttp.ClientConnectionError` instead of :exc:`OSError` when failing to explicitly close the socket after :py:meth:`asyncio.loop.create_connection` fails -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10551`.
+
+
+
+- Break cyclic references at connection close when there was a traceback -- by :user:`bdraco`.
+
+  Special thanks to :user:`availov` for reporting the issue.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10556`.
+
+
+
+- Break cyclic references when there is an exception handling a request -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10569`.
+
+
+
+
+Features
+--------
+
+- Improved logging on non-overlapping WebSocket client protocols to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10564`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Improved performance of parsing content types by adding a cache in the same manner currently done with mime types -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10552`.
+
+
+
+
+----
+
+
+3.11.13 (2025-02-24)
+====================
+
+Bug fixes
+---------
+
+- Removed a break statement inside the finally block in :py:class:`~aiohttp.web.RequestHandler`
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10434`.
+
+
+
+- Changed connection creation to explicitly close sockets if an exception is raised in the event loop's ``create_connection`` method -- by :user:`top-oai`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10464`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Fixed test ``test_write_large_payload_deflate_compression_data_in_eof_writelines`` failing with Python 3.12.9+ or 3.13.2+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10423`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Added human-readable error messages to the exceptions for WebSocket disconnects due to PONG not being received -- by :user:`bdraco`.
+
+  Previously, the error messages were empty strings, which made it hard to determine what went wrong.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10422`.
+
+
+
+
+----
+
+
+3.11.12 (2025-02-05)
+====================
+
+Bug fixes
+---------
+
+- ``MultipartForm.decode()`` now follows RFC1341 7.2.1 with a ``CRLF`` after the boundary
+  -- by :user:`imnotjames`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10270`.
+
+
+
+- Restored the missing ``total_bytes`` attribute to ``EmptyStreamReader`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10387`.
+
+
+
+
+Features
+--------
+
+- Updated :py:func:`~aiohttp.request` to make it accept ``_RequestOptions`` kwargs.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10300`.
+
+
+
+- Improved logging of HTTP protocol errors to include the remote address -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10332`.
+
+
+
+
+Improved documentation
+----------------------
+
+- Added ``aiohttp-openmetrics`` to list of third-party libraries -- by :user:`jelmer`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10304`.
+
+
+
+
+Packaging updates and notes for downstreams
+-------------------------------------------
+
+- Added missing files to the source distribution to fix ``Makefile`` targets.
+  Added a ``cythonize-nodeps`` target to run Cython without invoking pip to install dependencies.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10366`.
+
+
+
+- Started building armv7l musllinux wheels -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10404`.
+
+
+
+
+Contributor-facing changes
+--------------------------
+
+- The CI/CD workflow has been updated to use `upload-artifact` v4 and `download-artifact` v4 GitHub Actions -- by :user:`silamon`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10281`.
+
+
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Restored support for zero copy writes when using Python 3.12 versions 3.12.9 and later or Python 3.13.2+ -- by :user:`bdraco`.
+
+  Zero copy writes were previously disabled due to :cve:`2024-12254` which is resolved in these Python versions.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10137`.
+
+
+
+
+----
+
+
+3.11.11 (2024-12-18)
+====================
+
+Bug fixes
+---------
+
+- Updated :py:meth:`~aiohttp.ClientSession.request` to reuse the ``quote_cookie`` setting from ``ClientSession._cookie_jar`` when processing cookies parameter.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10093`.
+
+
+
+- Fixed type of ``SSLContext`` for some static type checkers (e.g. pyright).
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10099`.
+
+
+
+- Updated :meth:`aiohttp.web.StreamResponse.write` annotation to also allow :class:`bytearray` and :class:`memoryview` as inputs -- by :user:`cdce8p`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10154`.
+
+
+
+- Fixed a hang where a connection previously used for a streaming
+  download could be returned to the pool in a paused state.
+  -- by :user:`javitonino`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10169`.
+
+
+
+
+Features
+--------
+
+- Enabled ALPN on default SSL contexts. This improves compatibility with some
+  proxies which don't work without this extension.
+  -- by :user:`Cycloctane`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10156`.
+
+
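+  The default contexts now advertise HTTP/1.1 via ALPN, equivalent to the
+  ``aiohttp/connector.py`` hunk below:
+
+  .. code-block:: python
+
+      import ssl
+
+      ctx = ssl.create_default_context()
+      # aiohttp only speaks HTTP/1.1 over TLS, so that is all it offers.
+      ctx.set_alpn_protocols(("http/1.1",))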
+
+
+Miscellaneous internal changes
+------------------------------
+
+- Fixed an infinite loop that can occur when using aiohttp in combination
+  with `async-solipsism`_ -- by :user:`bmerry`.
+
+  .. _async-solipsism: https://github.com/bmerry/async-solipsism
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10149`.
+
+
+
+
+----
+
+
+3.11.10 (2024-12-05)
+====================
+
+Bug fixes
+---------
+
+- Fixed race condition in :class:`aiohttp.web.FileResponse` that could have resulted in an incorrect response if the file was replaced on the file system during ``prepare`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10101`, :issue:`10113`.
+
+
+
+- Replaced deprecated call to :func:`mimetypes.guess_type` with :func:`mimetypes.guess_file_type` when using Python 3.13+ -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10102`.
+
+
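+  The selection mirrors the ``aiohttp/payload.py`` hunk below
+  (``"archive.tar.gz"`` is an arbitrary example path):
+
+  .. code-block:: python
+
+      import mimetypes
+      import sys
+
+      # guess_file_type() is the preferred API for filesystem
+      # paths on Python 3.13+.
+      if sys.version_info >= (3, 13):
+          guesser = mimetypes.guess_file_type
+      else:
+          guesser = mimetypes.guess_type
+
+      content_type = guesser("archive.tar.gz")[0]  # "application/x-tar"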
+
+- Disabled zero copy writes in the ``StreamWriter`` -- by :user:`bdraco`.
+
+
+  *Related issues and pull requests on GitHub:*
+  :issue:`10125`.
+
+
+
+
+----
+
+
 3.11.9 (2024-12-01)
 ===================
 
diff --git CONTRIBUTORS.txt CONTRIBUTORS.txt
index 6adb3b97fb1..9fec4933dc0 100644
--- CONTRIBUTORS.txt
+++ CONTRIBUTORS.txt
@@ -9,6 +9,7 @@ Adam Mills
 Adrian Krupa
 Adrián Chaves
 Ahmed Tahri
+Alan Bogarin
 Alan Tse
 Alec Hanefeld
 Alejandro Gómez
@@ -30,6 +31,7 @@ Alexandru Mihai
 Alexey Firsov
 Alexey Nikitin
 Alexey Popravka
+Alexey Stavrov
 Alexey Stepanov
 Amin Etesamian
 Amit Tulshyan
@@ -41,6 +43,7 @@ Andrej Antonov
 Andrew Leech
 Andrew Lytvyn
 Andrew Svetlov
+Andrew Top
 Andrew Zhou
 Andrii Soldatenko
 Anes Abismail
@@ -166,10 +169,12 @@ Jaesung Lee
 Jake Davis
 Jakob Ackermann
 Jakub Wilk
+James Ward
 Jan Buchar
 Jan Gosmann
 Jarno Elonen
 Jashandeep Sohi
+Javier Torres
 Jean-Baptiste Estival
 Jens Steinhauser
 Jeonghun Lee
@@ -236,6 +241,7 @@ Martin Sucha
 Mathias Fröjdman
 Mathieu Dugré
 Matt VanEseltine
+Matthew Go
 Matthias Marquardt
 Matthieu Hauglustaine
 Matthieu Rigal
@@ -364,6 +370,7 @@ William S.
 Wilson Ong
 wouter bolsterlee
 Xavier Halloran
+Xi Rui
 Xiang Li
 Yang Zhou
 Yannick Koechlin
diff --git MANIFEST.in MANIFEST.in
index d7c5cef6aad..64cee139a1f 100644
--- MANIFEST.in
+++ MANIFEST.in
@@ -7,6 +7,7 @@ graft aiohttp
 graft docs
 graft examples
 graft tests
+graft tools
 graft requirements
 recursive-include vendor *
 global-include aiohttp *.pyi
diff --git Makefile Makefile
index b0a3ef3226b..c6193fea9e4 100644
--- Makefile
+++ Makefile
@@ -81,6 +81,9 @@ generate-llhttp: .llhttp-gen
 .PHONY: cythonize
 cythonize: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
 
+.PHONY: cythonize-nodeps
+cythonize-nodeps: $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c
+
 .install-deps: .install-cython $(PYXS:.pyx=.c) aiohttp/_websocket/reader_c.c $(call to-hash,$(CYS) $(REQS))
 	@python -m pip install -r requirements/dev.in -c requirements/dev.txt
 	@touch .install-deps
diff --git aiohttp/__init__.py aiohttp/__init__.py
index 5615e5349ae..e3e0f3cc51e 100644
--- aiohttp/__init__.py
+++ aiohttp/__init__.py
@@ -1,4 +1,4 @@
-__version__ = "3.11.9"
+__version__ = "3.11.18"
 
 from typing import TYPE_CHECKING, Tuple
 
diff --git aiohttp/_http_writer.pyx aiohttp/_http_writer.pyx
index 287371334f8..4a3ae1f9e68 100644
--- aiohttp/_http_writer.pyx
+++ aiohttp/_http_writer.pyx
@@ -97,27 +97,34 @@ cdef inline int _write_str(Writer* writer, str s):
             return -1
 
 
-# --------------- _serialize_headers ----------------------
-
-cdef str to_str(object s):
+cdef inline int _write_str_raise_on_nlcr(Writer* writer, object s):
+    cdef Py_UCS4 ch
+    cdef str out_str
     if type(s) is str:
-        return <str>s
+        out_str = <str>s
     elif type(s) is _istr:
-        return PyObject_Str(s)
+        out_str = PyObject_Str(s)
     elif not isinstance(s, str):
         raise TypeError("Cannot serialize non-str key {!r}".format(s))
     else:
-        return str(s)
+        out_str = str(s)
+
+    for ch in out_str:
+        if ch == 0x0D or ch == 0x0A:
+            raise ValueError(
+                "Newline or carriage return detected in headers. "
+                "Potential header injection attack."
+            )
+        if _write_utf8(writer, ch) < 0:
+            return -1
 
 
+# --------------- _serialize_headers ----------------------
 
 def _serialize_headers(str status_line, headers):
     cdef Writer writer
     cdef object key
     cdef object val
-    cdef bytes ret
-    cdef str key_str
-    cdef str val_str
 
     _init_writer(&writer)
 
@@ -130,22 +137,13 @@ def _serialize_headers(str status_line, headers):
             raise
 
         for key, val in headers.items():
-            key_str = to_str(key)
-            val_str = to_str(val)
-
-            if "\r" in key_str or "\n" in key_str or "\r" in val_str or "\n" in val_str:
-                raise ValueError(
-                    "Newline or carriage return character detected in HTTP status message or "
-                    "header. This is a potential security issue."
-                )
-
-            if _write_str(&writer, key_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, key) < 0:
                 raise
             if _write_byte(&writer, b':') < 0:
                 raise
             if _write_byte(&writer, b' ') < 0:
                 raise
-            if _write_str(&writer, val_str) < 0:
+            if _write_str_raise_on_nlcr(&writer, val) < 0:
                 raise
             if _write_byte(&writer, b'\r') < 0:
                 raise
diff --git aiohttp/_websocket/reader_c.pxd aiohttp/_websocket/reader_c.pxd
index 461e658e116..a7620d8e87f 100644
--- aiohttp/_websocket/reader_c.pxd
+++ aiohttp/_websocket/reader_c.pxd
@@ -8,12 +8,17 @@ cdef unsigned int READ_PAYLOAD_LENGTH
 cdef unsigned int READ_PAYLOAD_MASK
 cdef unsigned int READ_PAYLOAD
 
-cdef unsigned int OP_CODE_CONTINUATION
-cdef unsigned int OP_CODE_TEXT
-cdef unsigned int OP_CODE_BINARY
-cdef unsigned int OP_CODE_CLOSE
-cdef unsigned int OP_CODE_PING
-cdef unsigned int OP_CODE_PONG
+cdef int OP_CODE_NOT_SET
+cdef int OP_CODE_CONTINUATION
+cdef int OP_CODE_TEXT
+cdef int OP_CODE_BINARY
+cdef int OP_CODE_CLOSE
+cdef int OP_CODE_PING
+cdef int OP_CODE_PONG
+
+cdef int COMPRESSED_NOT_SET
+cdef int COMPRESSED_FALSE
+cdef int COMPRESSED_TRUE
 
 cdef object UNPACK_LEN3
 cdef object UNPACK_CLOSE_CODE
@@ -60,18 +65,18 @@ cdef class WebSocketReader:
     cdef bytearray _partial
     cdef unsigned int _state
 
-    cdef object _opcode
-    cdef object _frame_fin
-    cdef object _frame_opcode
-    cdef object _frame_payload
-    cdef unsigned long long _frame_payload_len
+    cdef int _opcode
+    cdef bint _frame_fin
+    cdef int _frame_opcode
+    cdef list _payload_fragments
+    cdef Py_ssize_t _frame_payload_len
 
     cdef bytes _tail
     cdef bint _has_mask
     cdef bytes _frame_mask
-    cdef unsigned long long _payload_length
-    cdef unsigned int _payload_length_flag
-    cdef object _compressed
+    cdef Py_ssize_t _payload_bytes_to_read
+    cdef unsigned int _payload_len_flag
+    cdef int _compressed
     cdef object _decompressobj
     cdef bint _compress
 
@@ -82,21 +87,24 @@ cdef class WebSocketReader:
         fin=bint,
         has_partial=bint,
         payload_merged=bytes,
-        opcode="unsigned int",
     )
-    cpdef void _feed_data(self, bytes data)
+    cpdef void _handle_frame(self, bint fin, int opcode, object payload, int compressed) except *
 
     @cython.locals(
-        start_pos="unsigned int",
-        buf_len="unsigned int",
-        length="unsigned int",
-        chunk_size="unsigned int",
-        chunk_len="unsigned int",
-        buf_length="unsigned int",
+        start_pos=Py_ssize_t,
+        data_len=Py_ssize_t,
+        length=Py_ssize_t,
+        chunk_size=Py_ssize_t,
+        chunk_len=Py_ssize_t,
+        data_len=Py_ssize_t,
+        data_cstr="const unsigned char *",
         first_byte="unsigned char",
         second_byte="unsigned char",
-        end_pos="unsigned int",
+        f_start_pos=Py_ssize_t,
+        f_end_pos=Py_ssize_t,
         has_mask=bint,
         fin=bint,
+        had_fragments=Py_ssize_t,
+        payload_bytearray=bytearray,
     )
-    cpdef list parse_frame(self, bytes buf)
+    cpdef void _feed_data(self, bytes data) except *
diff --git aiohttp/_websocket/reader_py.py aiohttp/_websocket/reader_py.py
index 94d20010890..f0060fd723c 100644
--- aiohttp/_websocket/reader_py.py
+++ aiohttp/_websocket/reader_py.py
@@ -3,7 +3,7 @@
 import asyncio
 import builtins
 from collections import deque
-from typing import Deque, Final, List, Optional, Set, Tuple, Union
+from typing import Deque, Final, Optional, Set, Tuple, Union
 
 from ..base_protocol import BaseProtocol
 from ..compression_utils import ZLibDecompressor
@@ -31,6 +31,7 @@
 WS_MSG_TYPE_TEXT = WSMsgType.TEXT
 
 # WSMsgType values unpacked so they can be cythonized to ints
+OP_CODE_NOT_SET = -1
 OP_CODE_CONTINUATION = WSMsgType.CONTINUATION.value
 OP_CODE_TEXT = WSMsgType.TEXT.value
 OP_CODE_BINARY = WSMsgType.BINARY.value
@@ -41,9 +42,13 @@
 EMPTY_FRAME_ERROR = (True, b"")
 EMPTY_FRAME = (False, b"")
 
+COMPRESSED_NOT_SET = -1
+COMPRESSED_FALSE = 0
+COMPRESSED_TRUE = 1
+
 TUPLE_NEW = tuple.__new__
 
-int_ = int  # Prevent Cython from converting to PyInt
+cython_int = int  # Typed to int in Python, but Cython will use a signed int in the pxd
 
 
 class WebSocketDataQueue:
@@ -93,8 +98,9 @@ def _release_waiter(self) -> None:
     def feed_eof(self) -> None:
         self._eof = True
         self._release_waiter()
+        self._exception = None  # Break cyclic references
 
-    def feed_data(self, data: "WSMessage", size: "int_") -> None:
+    def feed_data(self, data: "WSMessage", size: "cython_int") -> None:
         self._size += size
         self._put_buffer((data, size))
         self._release_waiter()
@@ -135,18 +141,18 @@ def __init__(
         self._partial = bytearray()
         self._state = READ_HEADER
 
-        self._opcode: Optional[int] = None
+        self._opcode: int = OP_CODE_NOT_SET
         self._frame_fin = False
-        self._frame_opcode: Optional[int] = None
-        self._frame_payload: Union[bytes, bytearray] = b""
+        self._frame_opcode: int = OP_CODE_NOT_SET
+        self._payload_fragments: list[bytes] = []
         self._frame_payload_len = 0
 
         self._tail: bytes = b""
         self._has_mask = False
         self._frame_mask: Optional[bytes] = None
-        self._payload_length = 0
-        self._payload_length_flag = 0
-        self._compressed: Optional[bool] = None
+        self._payload_bytes_to_read = 0
+        self._payload_len_flag = 0
+        self._compressed: int = COMPRESSED_NOT_SET
         self._decompressobj: Optional[ZLibDecompressor] = None
         self._compress = compress
 
@@ -174,167 +180,153 @@ def feed_data(
 
         return EMPTY_FRAME
 
-    def _feed_data(self, data: bytes) -> None:
+    def _handle_frame(
+        self,
+        fin: bool,
+        opcode: Union[int, cython_int],  # Union intended: Cython pxd uses C int
+        payload: Union[bytes, bytearray],
+        compressed: Union[int, cython_int],  # Union intended: Cython pxd uses C int
+    ) -> None:
         msg: WSMessage
-        for frame in self.parse_frame(data):
-            fin = frame[0]
-            opcode = frame[1]
-            payload = frame[2]
-            compressed = frame[3]
-
-            is_continuation = opcode == OP_CODE_CONTINUATION
-            if opcode == OP_CODE_TEXT or opcode == OP_CODE_BINARY or is_continuation:
-                # load text/binary
-                if not fin:
-                    # got partial frame payload
-                    if not is_continuation:
-                        self._opcode = opcode
-                    self._partial += payload
-                    if self._max_msg_size and len(self._partial) >= self._max_msg_size:
-                        raise WebSocketError(
-                            WSCloseCode.MESSAGE_TOO_BIG,
-                            "Message size {} exceeds limit {}".format(
-                                len(self._partial), self._max_msg_size
-                            ),
-                        )
-                    continue
-
-                has_partial = bool(self._partial)
-                if is_continuation:
-                    if self._opcode is None:
-                        raise WebSocketError(
-                            WSCloseCode.PROTOCOL_ERROR,
-                            "Continuation frame for non started message",
-                        )
-                    opcode = self._opcode
-                    self._opcode = None
-                # previous frame was non finished
-                # we should get continuation opcode
-                elif has_partial:
+        if opcode in {OP_CODE_TEXT, OP_CODE_BINARY, OP_CODE_CONTINUATION}:
+            # load text/binary
+            if not fin:
+                # got partial frame payload
+                if opcode != OP_CODE_CONTINUATION:
+                    self._opcode = opcode
+                self._partial += payload
+                if self._max_msg_size and len(self._partial) >= self._max_msg_size:
+                    raise WebSocketError(
+                        WSCloseCode.MESSAGE_TOO_BIG,
+                        f"Message size {len(self._partial)} "
+                        f"exceeds limit {self._max_msg_size}",
+                    )
+                return
+
+            has_partial = bool(self._partial)
+            if opcode == OP_CODE_CONTINUATION:
+                if self._opcode == OP_CODE_NOT_SET:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
-                        "The opcode in non-fin frame is expected "
-                        "to be zero, got {!r}".format(opcode),
+                        "Continuation frame for non started message",
                     )
+                opcode = self._opcode
+                self._opcode = OP_CODE_NOT_SET
+            # previous frame was non finished
+            # we should get continuation opcode
+            elif has_partial:
+                raise WebSocketError(
+                    WSCloseCode.PROTOCOL_ERROR,
+                    "The opcode in non-fin frame is expected "
+                    f"to be zero, got {opcode!r}",
+                )
 
-                assembled_payload: Union[bytes, bytearray]
-                if has_partial:
-                    assembled_payload = self._partial + payload
-                    self._partial.clear()
-                else:
-                    assembled_payload = payload
+            assembled_payload: Union[bytes, bytearray]
+            if has_partial:
+                assembled_payload = self._partial + payload
+                self._partial.clear()
+            else:
+                assembled_payload = payload
+
+            if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
+                raise WebSocketError(
+                    WSCloseCode.MESSAGE_TOO_BIG,
+                    f"Message size {len(assembled_payload)} "
+                    f"exceeds limit {self._max_msg_size}",
+                )
 
-                if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
+            # Decompression must be done after all packets
+            # have been received.
+            if compressed:
+                if not self._decompressobj:
+                    self._decompressobj = ZLibDecompressor(suppress_deflate_header=True)
+                payload_merged = self._decompressobj.decompress_sync(
+                    assembled_payload + WS_DEFLATE_TRAILING, self._max_msg_size
+                )
+                if self._decompressobj.unconsumed_tail:
+                    left = len(self._decompressobj.unconsumed_tail)
                     raise WebSocketError(
                         WSCloseCode.MESSAGE_TOO_BIG,
-                        "Message size {} exceeds limit {}".format(
-                            len(assembled_payload), self._max_msg_size
-                        ),
+                        f"Decompressed message size {self._max_msg_size + left}"
+                        f" exceeds limit {self._max_msg_size}",
                     )
+            elif type(assembled_payload) is bytes:
+                payload_merged = assembled_payload
+            else:
+                payload_merged = bytes(assembled_payload)
 
-                # Decompress process must to be done after all packets
-                # received.
-                if compressed:
-                    if not self._decompressobj:
-                        self._decompressobj = ZLibDecompressor(
-                            suppress_deflate_header=True
-                        )
-                    payload_merged = self._decompressobj.decompress_sync(
-                        assembled_payload + WS_DEFLATE_TRAILING, self._max_msg_size
-                    )
-                    if self._decompressobj.unconsumed_tail:
-                        left = len(self._decompressobj.unconsumed_tail)
-                        raise WebSocketError(
-                            WSCloseCode.MESSAGE_TOO_BIG,
-                            "Decompressed message size {} exceeds limit {}".format(
-                                self._max_msg_size + left, self._max_msg_size
-                            ),
-                        )
-                elif type(assembled_payload) is bytes:
-                    payload_merged = assembled_payload
-                else:
-                    payload_merged = bytes(assembled_payload)
-
-                if opcode == OP_CODE_TEXT:
-                    try:
-                        text = payload_merged.decode("utf-8")
-                    except UnicodeDecodeError as exc:
-                        raise WebSocketError(
-                            WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                        ) from exc
-
-                    # XXX: The Text and Binary messages here can be a performance
-                    # bottleneck, so we use tuple.__new__ to improve performance.
-                    # This is not type safe, but many tests should fail in
-                    # test_client_ws_functional.py if this is wrong.
-                    self.queue.feed_data(
-                        TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")),
-                        len(payload_merged),
-                    )
-                else:
-                    self.queue.feed_data(
-                        TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")),
-                        len(payload_merged),
-                    )
-            elif opcode == OP_CODE_CLOSE:
-                if len(payload) >= 2:
-                    close_code = UNPACK_CLOSE_CODE(payload[:2])[0]
-                    if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES:
-                        raise WebSocketError(
-                            WSCloseCode.PROTOCOL_ERROR,
-                            f"Invalid close code: {close_code}",
-                        )
-                    try:
-                        close_message = payload[2:].decode("utf-8")
-                    except UnicodeDecodeError as exc:
-                        raise WebSocketError(
-                            WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                        ) from exc
-                    msg = TUPLE_NEW(
-                        WSMessage, (WSMsgType.CLOSE, close_code, close_message)
-                    )
-                elif payload:
+            if opcode == OP_CODE_TEXT:
+                try:
+                    text = payload_merged.decode("utf-8")
+                except UnicodeDecodeError as exc:
+                    raise WebSocketError(
+                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
+                    ) from exc
+
+                # XXX: The Text and Binary messages here can be a performance
+                # bottleneck, so we use tuple.__new__ to improve performance.
+                # This is not type safe, but many tests should fail in
+                # test_client_ws_functional.py if this is wrong.
+                self.queue.feed_data(
+                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")),
+                    len(payload_merged),
+                )
+            else:
+                self.queue.feed_data(
+                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")),
+                    len(payload_merged),
+                )
+        elif opcode == OP_CODE_CLOSE:
+            if len(payload) >= 2:
+                close_code = UNPACK_CLOSE_CODE(payload[:2])[0]
+                if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
-                        f"Invalid close frame: {fin} {opcode} {payload!r}",
+                        f"Invalid close code: {close_code}",
                     )
-                else:
-                    msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, ""))
-
-                self.queue.feed_data(msg, 0)
-            elif opcode == OP_CODE_PING:
-                msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, ""))
-                self.queue.feed_data(msg, len(payload))
-
-            elif opcode == OP_CODE_PONG:
-                msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, ""))
-                self.queue.feed_data(msg, len(payload))
-
-            else:
+                try:
+                    close_message = payload[2:].decode("utf-8")
+                except UnicodeDecodeError as exc:
+                    raise WebSocketError(
+                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
+                    ) from exc
+                msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, close_code, close_message))
+            elif payload:
                 raise WebSocketError(
-                    WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}"
+                    WSCloseCode.PROTOCOL_ERROR,
+                    f"Invalid close frame: {fin} {opcode} {payload!r}",
                 )
+            else:
+                msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, ""))
+
+            self.queue.feed_data(msg, 0)
+        elif opcode == OP_CODE_PING:
+            msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, ""))
+            self.queue.feed_data(msg, len(payload))
+        elif opcode == OP_CODE_PONG:
+            msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, ""))
+            self.queue.feed_data(msg, len(payload))
+        else:
+            raise WebSocketError(
+                WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}"
+            )
 
-    def parse_frame(
-        self, buf: bytes
-    ) -> List[Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]]]:
+    def _feed_data(self, data: bytes) -> None:
         """Return the next frame from the socket."""
-        frames: List[
-            Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]]
-        ] = []
         if self._tail:
-            buf, self._tail = self._tail + buf, b""
+            data, self._tail = self._tail + data, b""
 
         start_pos: int = 0
-        buf_length = len(buf)
+        data_len = len(data)
+        data_cstr = data
 
         while True:
             # read header
             if self._state == READ_HEADER:
-                if buf_length - start_pos < 2:
+                if data_len - start_pos < 2:
                     break
-                first_byte = buf[start_pos]
-                second_byte = buf[start_pos + 1]
+                first_byte = data_cstr[start_pos]
+                second_byte = data_cstr[start_pos + 1]
                 start_pos += 2
 
                 fin = (first_byte >> 7) & 1
@@ -379,8 +371,8 @@ def parse_frame(
                 # Set compress status if last package is FIN
                 # OR set compress status if this is first fragment
                 # Raise error if not first fragment with rsv1 = 0x1
-                if self._frame_fin or self._compressed is None:
-                    self._compressed = True if rsv1 else False
+                if self._frame_fin or self._compressed == COMPRESSED_NOT_SET:
+                    self._compressed = COMPRESSED_TRUE if rsv1 else COMPRESSED_FALSE
                 elif rsv1:
                     raise WebSocketError(
                         WSCloseCode.PROTOCOL_ERROR,
@@ -390,79 +382,87 @@ def parse_frame(
                 self._frame_fin = bool(fin)
                 self._frame_opcode = opcode
                 self._has_mask = bool(has_mask)
-                self._payload_length_flag = length
+                self._payload_len_flag = length
                 self._state = READ_PAYLOAD_LENGTH
 
             # read payload length
             if self._state == READ_PAYLOAD_LENGTH:
-                length_flag = self._payload_length_flag
-                if length_flag == 126:
-                    if buf_length - start_pos < 2:
+                len_flag = self._payload_len_flag
+                if len_flag == 126:
+                    if data_len - start_pos < 2:
                         break
-                    first_byte = buf[start_pos]
-                    second_byte = buf[start_pos + 1]
+                    first_byte = data_cstr[start_pos]
+                    second_byte = data_cstr[start_pos + 1]
                     start_pos += 2
-                    self._payload_length = first_byte << 8 | second_byte
-                elif length_flag > 126:
-                    if buf_length - start_pos < 8:
+                    self._payload_bytes_to_read = first_byte << 8 | second_byte
+                elif len_flag > 126:
+                    if data_len - start_pos < 8:
                         break
-                    data = buf[start_pos : start_pos + 8]
+                    self._payload_bytes_to_read = UNPACK_LEN3(data, start_pos)[0]
                     start_pos += 8
-                    self._payload_length = UNPACK_LEN3(data)[0]
                 else:
-                    self._payload_length = length_flag
+                    self._payload_bytes_to_read = len_flag
 
                 self._state = READ_PAYLOAD_MASK if self._has_mask else READ_PAYLOAD
 
             # read payload mask
             if self._state == READ_PAYLOAD_MASK:
-                if buf_length - start_pos < 4:
+                if data_len - start_pos < 4:
                     break
-                self._frame_mask = buf[start_pos : start_pos + 4]
+                self._frame_mask = data_cstr[start_pos : start_pos + 4]
                 start_pos += 4
                 self._state = READ_PAYLOAD
 
             if self._state == READ_PAYLOAD:
-                chunk_len = buf_length - start_pos
-                if self._payload_length >= chunk_len:
-                    end_pos = buf_length
-                    self._payload_length -= chunk_len
+                chunk_len = data_len - start_pos
+                if self._payload_bytes_to_read >= chunk_len:
+                    f_end_pos = data_len
+                    self._payload_bytes_to_read -= chunk_len
                 else:
-                    end_pos = start_pos + self._payload_length
-                    self._payload_length = 0
-
-                if self._frame_payload_len:
-                    if type(self._frame_payload) is not bytearray:
-                        self._frame_payload = bytearray(self._frame_payload)
-                    self._frame_payload += buf[start_pos:end_pos]
-                else:
-                    # Fast path for the first frame
-                    self._frame_payload = buf[start_pos:end_pos]
-
-                self._frame_payload_len += end_pos - start_pos
-                start_pos = end_pos
-
-                if self._payload_length != 0:
+                    f_end_pos = start_pos + self._payload_bytes_to_read
+                    self._payload_bytes_to_read = 0
+
+                had_fragments = self._frame_payload_len
+                self._frame_payload_len += f_end_pos - start_pos
+                f_start_pos = start_pos
+                start_pos = f_end_pos
+
+                if self._payload_bytes_to_read != 0:
+                    # If we don't have a complete frame, we need to save the
+                    # data for the next call to feed_data.
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
                     break
 
-                if self._has_mask:
+                payload: Union[bytes, bytearray]
+                if had_fragments:
+                    # We have to join the payload fragments to get the payload
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
+                    if self._has_mask:
+                        assert self._frame_mask is not None
+                        payload_bytearray = bytearray(b"".join(self._payload_fragments))
+                        websocket_mask(self._frame_mask, payload_bytearray)
+                        payload = payload_bytearray
+                    else:
+                        payload = b"".join(self._payload_fragments)
+                    self._payload_fragments.clear()
+                elif self._has_mask:
                     assert self._frame_mask is not None
-                    if type(self._frame_payload) is not bytearray:
-                        self._frame_payload = bytearray(self._frame_payload)
-                    websocket_mask(self._frame_mask, self._frame_payload)
-
-                frames.append(
-                    (
-                        self._frame_fin,
-                        self._frame_opcode,
-                        self._frame_payload,
-                        self._compressed,
-                    )
+                    payload_bytearray = data_cstr[f_start_pos:f_end_pos]  # type: ignore[assignment]
+                    if type(payload_bytearray) is not bytearray:  # pragma: no branch
+                        # Cython will do the conversion for us
+                        # but we need to do it for Python and we
+                        # will always get here in Python
+                        payload_bytearray = bytearray(payload_bytearray)
+                    websocket_mask(self._frame_mask, payload_bytearray)
+                    payload = payload_bytearray
+                else:
+                    payload = data_cstr[f_start_pos:f_end_pos]
+
+                self._handle_frame(
+                    self._frame_fin, self._frame_opcode, payload, self._compressed
                 )
-                self._frame_payload = b""
                 self._frame_payload_len = 0
                 self._state = READ_HEADER
 
-        self._tail = buf[start_pos:] if start_pos < buf_length else b""
-
-        return frames
+        # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
+        self._tail = data_cstr[start_pos:data_len] if start_pos < data_len else b""
diff --git aiohttp/abc.py aiohttp/abc.py
index d6f9f782b0f..5794a9108b0 100644
--- aiohttp/abc.py
+++ aiohttp/abc.py
@@ -17,6 +17,7 @@
     Optional,
     Tuple,
     TypedDict,
+    Union,
 )
 
 from multidict import CIMultiDict
@@ -175,6 +176,11 @@ class AbstractCookieJar(Sized, IterableBase):
     def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop] = None) -> None:
         self._loop = loop or asyncio.get_running_loop()
 
+    @property
+    @abstractmethod
+    def quote_cookie(self) -> bool:
+        """Return True if cookies should be quoted."""
+
     @abstractmethod
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         """Clear all cookies if no predicate is passed."""
@@ -200,7 +206,7 @@ class AbstractStreamWriter(ABC):
     length: Optional[int] = 0
 
     @abstractmethod
-    async def write(self, chunk: bytes) -> None:
+    async def write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         """Write chunk into stream."""
 
     @abstractmethod
diff --git aiohttp/client.py aiohttp/client.py
index e04a6ff989a..7c788e825eb 100644
--- aiohttp/client.py
+++ aiohttp/client.py
@@ -658,7 +658,9 @@ async def _request(
                     all_cookies = self._cookie_jar.filter_cookies(url)
 
                     if cookies is not None:
-                        tmp_cookie_jar = CookieJar()
+                        tmp_cookie_jar = CookieJar(
+                            quote_cookie=self._cookie_jar.quote_cookie
+                        )
                         tmp_cookie_jar.update_cookies(cookies)
                         req_cookies = tmp_cookie_jar.filter_cookies(url)
                         if req_cookies:
@@ -1469,106 +1471,80 @@ async def __aexit__(
         await self._session.close()
 
 
-def request(
-    method: str,
-    url: StrOrURL,
-    *,
-    params: Query = None,
-    data: Any = None,
-    json: Any = None,
-    headers: Optional[LooseHeaders] = None,
-    skip_auto_headers: Optional[Iterable[str]] = None,
-    auth: Optional[BasicAuth] = None,
-    allow_redirects: bool = True,
-    max_redirects: int = 10,
-    compress: Optional[str] = None,
-    chunked: Optional[bool] = None,
-    expect100: bool = False,
-    raise_for_status: Optional[bool] = None,
-    read_until_eof: bool = True,
-    proxy: Optional[StrOrURL] = None,
-    proxy_auth: Optional[BasicAuth] = None,
-    timeout: Union[ClientTimeout, object] = sentinel,
-    cookies: Optional[LooseCookies] = None,
-    version: HttpVersion = http.HttpVersion11,
-    connector: Optional[BaseConnector] = None,
-    read_bufsize: Optional[int] = None,
-    loop: Optional[asyncio.AbstractEventLoop] = None,
-    max_line_size: int = 8190,
-    max_field_size: int = 8190,
-) -> _SessionRequestContextManager:
-    """Constructs and sends a request.
-
-    Returns response object.
-    method - HTTP method
-    url - request url
-    params - (optional) Dictionary or bytes to be sent in the query
-      string of the new request
-    data - (optional) Dictionary, bytes, or file-like object to
-      send in the body of the request
-    json - (optional) Any json compatible python object
-    headers - (optional) Dictionary of HTTP Headers to send with
-      the request
-    cookies - (optional) Dict object to send with the request
-    auth - (optional) BasicAuth named tuple represent HTTP Basic Auth
-    auth - aiohttp.helpers.BasicAuth
-    allow_redirects - (optional) If set to False, do not follow
-      redirects
-    version - Request HTTP version.
-    compress - Set to True if request has to be compressed
-       with deflate encoding.
-    chunked - Set to chunk size for chunked transfer encoding.
-    expect100 - Expect 100-continue response from server.
-    connector - BaseConnector sub-class instance to support
-       connection pooling.
-    read_until_eof - Read response until eof if response
-       does not have Content-Length header.
-    loop - Optional event loop.
-    timeout - Optional ClientTimeout settings structure, 5min
-       total timeout by default.
-    Usage::
-      >>> import aiohttp
-      >>> resp = await aiohttp.request('GET', 'http://python.org/')
-      >>> resp
-      <ClientResponse(python.org/) [200]>
-      >>> data = await resp.read()
-    """
-    connector_owner = False
-    if connector is None:
-        connector_owner = True
-        connector = TCPConnector(loop=loop, force_close=True)
-
-    session = ClientSession(
-        loop=loop,
-        cookies=cookies,
-        version=version,
-        timeout=timeout,
-        connector=connector,
-        connector_owner=connector_owner,
-    )
+if sys.version_info >= (3, 11) and TYPE_CHECKING:
 
-    return _SessionRequestContextManager(
-        session._request(
-            method,
-            url,
-            params=params,
-            data=data,
-            json=json,
-            headers=headers,
-            skip_auto_headers=skip_auto_headers,
-            auth=auth,
-            allow_redirects=allow_redirects,
-            max_redirects=max_redirects,
-            compress=compress,
-            chunked=chunked,
-            expect100=expect100,
-            raise_for_status=raise_for_status,
-            read_until_eof=read_until_eof,
-            proxy=proxy,
-            proxy_auth=proxy_auth,
-            read_bufsize=read_bufsize,
-            max_line_size=max_line_size,
-            max_field_size=max_field_size,
-        ),
-        session,
-    )
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Unpack[_RequestOptions],
+    ) -> _SessionRequestContextManager: ...
+
+else:
+
+    def request(
+        method: str,
+        url: StrOrURL,
+        *,
+        version: HttpVersion = http.HttpVersion11,
+        connector: Optional[BaseConnector] = None,
+        loop: Optional[asyncio.AbstractEventLoop] = None,
+        **kwargs: Any,
+    ) -> _SessionRequestContextManager:
+        """Constructs and sends a request.
+
+        Returns response object.
+        method - HTTP method
+        url - request url
+        params - (optional) Dictionary or bytes to be sent in the query
+        string of the new request
+        data - (optional) Dictionary, bytes, or file-like object to
+        send in the body of the request
+        json - (optional) Any json compatible python object
+        headers - (optional) Dictionary of HTTP Headers to send with
+        the request
+        cookies - (optional) Dict object to send with the request
+        auth - (optional) aiohttp.helpers.BasicAuth named tuple
+        representing HTTP Basic Auth
+        allow_redirects - (optional) If set to False, do not follow
+        redirects
+        version - Request HTTP version.
+        compress - Set to True if request has to be compressed
+        with deflate encoding.
+        chunked - Set to chunk size for chunked transfer encoding.
+        expect100 - Expect 100-continue response from server.
+        connector - BaseConnector sub-class instance to support
+        connection pooling.
+        read_until_eof - Read response until eof if response
+        does not have Content-Length header.
+        loop - Optional event loop.
+        timeout - Optional ClientTimeout settings structure, 5min
+        total timeout by default.
+        Usage::
+        >>> import aiohttp
+        >>> async with aiohttp.request('GET', 'http://python.org/') as resp:
+        ...    print(resp)
+        ...    data = await resp.read()
+        <ClientResponse(https://www.python.org/) [200 OK]>
+        """
+        connector_owner = False
+        if connector is None:
+            connector_owner = True
+            connector = TCPConnector(loop=loop, force_close=True)
+
+        session = ClientSession(
+            loop=loop,
+            cookies=kwargs.pop("cookies", None),
+            version=version,
+            timeout=kwargs.pop("timeout", sentinel),
+            connector=connector,
+            connector_owner=connector_owner,
+        )
+
+        return _SessionRequestContextManager(
+            session._request(method, url, **kwargs),
+            session,
+        )
diff --git aiohttp/client_exceptions.py aiohttp/client_exceptions.py
index 667da8d5084..1d298e9a8cf 100644
--- aiohttp/client_exceptions.py
+++ aiohttp/client_exceptions.py
@@ -8,13 +8,17 @@
 
 from .typedefs import StrOrURL
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = SSLContext = None  # type: ignore[assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = SSLContext = None  # type: ignore[assignment]
 
 if TYPE_CHECKING:
     from .client_reqrep import ClientResponse, ConnectionKey, Fingerprint, RequestInfo
diff --git aiohttp/client_proto.py aiohttp/client_proto.py
index 79f033e3e12..2d64b3f3644 100644
--- aiohttp/client_proto.py
+++ aiohttp/client_proto.py
@@ -64,6 +64,7 @@ def force_close(self) -> None:
         self._should_close = True
 
     def close(self) -> None:
+        self._exception = None  # Break cyclic references
         transport = self.transport
         if transport is not None:
             transport.close()
diff --git aiohttp/client_reqrep.py aiohttp/client_reqrep.py
index e97c40ce0e5..43b48063c6e 100644
--- aiohttp/client_reqrep.py
+++ aiohttp/client_reqrep.py
@@ -72,12 +72,16 @@
     RawHeaders,
 )
 
-try:
+if TYPE_CHECKING:
     import ssl
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("ClientRequest", "ClientResponse", "RequestInfo", "Fingerprint")
diff --git aiohttp/client_ws.py aiohttp/client_ws.py
index f4cfa1bffe8..daa57d1930b 100644
--- aiohttp/client_ws.py
+++ aiohttp/client_ws.py
@@ -163,7 +163,9 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
         self._ping_task = None
 
     def _pong_not_received(self) -> None:
-        self._handle_ping_pong_exception(ServerTimeoutError())
+        self._handle_ping_pong_exception(
+            ServerTimeoutError(f"No PONG received after {self._pong_heartbeat} seconds")
+        )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
diff --git aiohttp/connector.py aiohttp/connector.py
index 93bc2513b20..7d5bcf755ec 100644
--- aiohttp/connector.py
+++ aiohttp/connector.py
@@ -60,14 +60,18 @@
 )
 from .resolver import DefaultResolver
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
 
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 EMPTY_SCHEMA_SET = frozenset({""})
 HTTP_SCHEMA_SET = frozenset({"http", "https"})
@@ -776,14 +780,16 @@ def _make_ssl_context(verified: bool) -> SSLContext:
         # No ssl support
         return None
     if verified:
-        return ssl.create_default_context()
-    sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
-    sslcontext.options |= ssl.OP_NO_SSLv2
-    sslcontext.options |= ssl.OP_NO_SSLv3
-    sslcontext.check_hostname = False
-    sslcontext.verify_mode = ssl.CERT_NONE
-    sslcontext.options |= ssl.OP_NO_COMPRESSION
-    sslcontext.set_default_verify_paths()
+        sslcontext = ssl.create_default_context()
+    else:
+        sslcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        sslcontext.options |= ssl.OP_NO_SSLv2
+        sslcontext.options |= ssl.OP_NO_SSLv3
+        sslcontext.check_hostname = False
+        sslcontext.verify_mode = ssl.CERT_NONE
+        sslcontext.options |= ssl.OP_NO_COMPRESSION
+        sslcontext.set_default_verify_paths()
+    sslcontext.set_alpn_protocols(("http/1.1",))
     return sslcontext
 
 
@@ -1009,11 +1015,11 @@ async def _resolve_host_with_throttle(
         This method must be run in a task and shielded from cancellation
         to avoid cancelling the underlying lookup.
         """
-        if traces:
-            for trace in traces:
-                await trace.send_dns_cache_miss(host)
         try:
             if traces:
+                for trace in traces:
+                    await trace.send_dns_cache_miss(host)
+
                 for trace in traces:
                     await trace.send_dns_resolvehost_start(host)
 
@@ -1197,7 +1203,13 @@ def _warn_about_tls_in_tls(
         if req.request_info.url.scheme != "https":
             return
 
-        asyncio_supports_tls_in_tls = getattr(
+        # Check if uvloop is being used, which supports TLS in TLS,
+        # otherwise assume that asyncio's native transport is being used.
+        if type(underlying_transport).__module__.startswith("uvloop"):
+            return
+
+        # Support in asyncio was added in Python 3.11 (bpo-44011)
+        asyncio_supports_tls_in_tls = sys.version_info >= (3, 11) or getattr(
             underlying_transport,
             "_start_tls_compatible",
             False,
diff --git aiohttp/cookiejar.py aiohttp/cookiejar.py
index ef04bda5ad6..f6b9a921767 100644
--- aiohttp/cookiejar.py
+++ aiohttp/cookiejar.py
@@ -117,6 +117,10 @@ def __init__(
         self._expire_heap: List[Tuple[float, Tuple[str, str, str]]] = []
         self._expirations: Dict[Tuple[str, str, str], float] = {}
 
+    @property
+    def quote_cookie(self) -> bool:
+        return self._quote_cookie
+
     def save(self, file_path: PathLike) -> None:
         file_path = pathlib.Path(file_path)
         with file_path.open(mode="wb") as f:
@@ -474,6 +478,10 @@ def __iter__(self) -> "Iterator[Morsel[str]]":
     def __len__(self) -> int:
         return 0
 
+    @property
+    def quote_cookie(self) -> bool:
+        return True
+
     def clear(self, predicate: Optional[ClearCookiePredicate] = None) -> None:
         pass
 
diff --git aiohttp/helpers.py aiohttp/helpers.py
index 8038931ebec..ace4f0e9b53 100644
--- aiohttp/helpers.py
+++ aiohttp/helpers.py
@@ -21,7 +21,7 @@
 from email.utils import parsedate
 from math import ceil
 from pathlib import Path
-from types import TracebackType
+from types import MappingProxyType, TracebackType
 from typing import (
     Any,
     Callable,
@@ -357,6 +357,20 @@ def parse_mimetype(mimetype: str) -> MimeType:
     )
 
 
+@functools.lru_cache(maxsize=56)
+def parse_content_type(raw: str) -> Tuple[str, MappingProxyType[str, str]]:
+    """Parse Content-Type header.
+
+    Returns a tuple of the parsed content type and a
+    MappingProxyType of parameters.
+    """
+    msg = HeaderParser().parsestr(f"Content-Type: {raw}")
+    content_type = msg.get_content_type()
+    params = msg.get_params(())
+    content_dict = dict(params[1:])  # First element is content type again
+    return content_type, MappingProxyType(content_dict)
+
+
 def guess_filename(obj: Any, default: Optional[str] = None) -> Optional[str]:
     name = getattr(obj, "name", None)
     if name and isinstance(name, str) and name[0] != "<" and name[-1] != ">":
@@ -710,10 +724,10 @@ def _parse_content_type(self, raw: Optional[str]) -> None:
             self._content_type = "application/octet-stream"
             self._content_dict = {}
         else:
-            msg = HeaderParser().parsestr("Content-Type: " + raw)
-            self._content_type = msg.get_content_type()
-            params = msg.get_params(())
-            self._content_dict = dict(params[1:])  # First element is content type again
+            content_type, content_mapping_proxy = parse_content_type(raw)
+            self._content_type = content_type
+            # _content_dict needs to be mutable so we can update it
+            self._content_dict = content_mapping_proxy.copy()
 
     @property
     def content_type(self) -> str:
diff --git aiohttp/http_writer.py aiohttp/http_writer.py
index c66fda3d8d0..e031a97708d 100644
--- aiohttp/http_writer.py
+++ aiohttp/http_writer.py
@@ -1,6 +1,7 @@
 """Http related parsers and protocol."""
 
 import asyncio
+import sys
 import zlib
 from typing import (  # noqa
     Any,
@@ -24,6 +25,17 @@
 __all__ = ("StreamWriter", "HttpVersion", "HttpVersion10", "HttpVersion11")
 
 
+MIN_PAYLOAD_FOR_WRITELINES = 2048
+IS_PY313_BEFORE_313_2 = (3, 13, 0) <= sys.version_info < (3, 13, 2)
+IS_PY_BEFORE_312_9 = sys.version_info < (3, 12, 9)
+SKIP_WRITELINES = IS_PY313_BEFORE_313_2 or IS_PY_BEFORE_312_9
+# writelines is not safe for use
+# on Python 3.12+ until 3.12.9
+# on Python 3.13+ until 3.13.2
+# and on older versions it is not any faster than write
+# CVE-2024-12254: https://github.com/python/cpython/pull/127656
+
+
 class HttpVersion(NamedTuple):
     major: int
     minor: int
@@ -72,7 +84,7 @@ def enable_compression(
     ) -> None:
         self._compress = ZLibCompressor(encoding=encoding, strategy=strategy)
 
-    def _write(self, chunk: bytes) -> None:
+    def _write(self, chunk: Union[bytes, bytearray, memoryview]) -> None:
         size = len(chunk)
         self.buffer_size += size
         self.output_size += size
@@ -90,10 +102,17 @@ def _writelines(self, chunks: Iterable[bytes]) -> None:
         transport = self._protocol.transport
         if transport is None or transport.is_closing():
             raise ClientConnectionResetError("Cannot write to closing transport")
-        transport.writelines(chunks)
+        if SKIP_WRITELINES or size < MIN_PAYLOAD_FOR_WRITELINES:
+            transport.write(b"".join(chunks))
+        else:
+            transport.writelines(chunks)
 
     async def write(
-        self, chunk: bytes, *, drain: bool = True, LIMIT: int = 0x10000
+        self,
+        chunk: Union[bytes, bytearray, memoryview],
+        *,
+        drain: bool = True,
+        LIMIT: int = 0x10000,
     ) -> None:
         """Writes chunk of data to a stream.
 
diff --git aiohttp/multipart.py aiohttp/multipart.py
index e0bcce07449..bd4d8ae1ddf 100644
--- aiohttp/multipart.py
+++ aiohttp/multipart.py
@@ -979,7 +979,7 @@ def decode(self, encoding: str = "utf-8", errors: str = "strict") -> str:
         return "".join(
             "--"
             + self.boundary
-            + "\n"
+            + "\r\n"
             + part._binary_headers.decode(encoding, errors)
             + part.decode()
             for part, _e, _te in self._parts
diff --git aiohttp/payload.py aiohttp/payload.py
index c8c01814698..3f6d3672db2 100644
--- aiohttp/payload.py
+++ aiohttp/payload.py
@@ -4,6 +4,7 @@
 import json
 import mimetypes
 import os
+import sys
 import warnings
 from abc import ABC, abstractmethod
 from itertools import chain
@@ -169,7 +170,11 @@ def __init__(
         if content_type is not sentinel and content_type is not None:
             self._headers[hdrs.CONTENT_TYPE] = content_type
         elif self._filename is not None:
-            content_type = mimetypes.guess_type(self._filename)[0]
+            if sys.version_info >= (3, 13):
+                guesser = mimetypes.guess_file_type
+            else:
+                guesser = mimetypes.guess_type
+            content_type = guesser(self._filename)[0]
             if content_type is None:
                 content_type = self._default_content_type
             self._headers[hdrs.CONTENT_TYPE] = content_type
diff --git aiohttp/pytest_plugin.py aiohttp/pytest_plugin.py
index 7ce60faa4a4..21d6ea7bbcd 100644
--- aiohttp/pytest_plugin.py
+++ aiohttp/pytest_plugin.py
@@ -98,7 +98,7 @@ def pytest_fixture_setup(fixturedef):  # type: ignore[no-untyped-def]
     if inspect.isasyncgenfunction(func):
         # async generator fixture
         is_async_gen = True
-    elif asyncio.iscoroutinefunction(func):
+    elif inspect.iscoroutinefunction(func):
         # regular async fixture
         is_async_gen = False
     else:
@@ -200,14 +200,14 @@ def _passthrough_loop_context(loop, fast=False):  # type: ignore[no-untyped-def]
 
 def pytest_pycollect_makeitem(collector, name, obj):  # type: ignore[no-untyped-def]
     """Fix pytest collecting for coroutines."""
-    if collector.funcnamefilter(name) and asyncio.iscoroutinefunction(obj):
+    if collector.funcnamefilter(name) and inspect.iscoroutinefunction(obj):
         return list(collector._genfunctions(name, obj))
 
 
 def pytest_pyfunc_call(pyfuncitem):  # type: ignore[no-untyped-def]
     """Run coroutines in an event loop instead of a normal function call."""
     fast = pyfuncitem.config.getoption("--aiohttp-fast")
-    if asyncio.iscoroutinefunction(pyfuncitem.function):
+    if inspect.iscoroutinefunction(pyfuncitem.function):
         existing_loop = pyfuncitem.funcargs.get(
             "proactor_loop"
         ) or pyfuncitem.funcargs.get("loop", None)
diff --git aiohttp/resolver.py aiohttp/resolver.py
index 9c744514fae..e14179cc8a2 100644
--- aiohttp/resolver.py
+++ aiohttp/resolver.py
@@ -18,6 +18,9 @@
 
 _NUMERIC_SOCKET_FLAGS = socket.AI_NUMERICHOST | socket.AI_NUMERICSERV
 _NAME_SOCKET_FLAGS = socket.NI_NUMERICHOST | socket.NI_NUMERICSERV
+_AI_ADDRCONFIG = socket.AI_ADDRCONFIG
+if hasattr(socket, "AI_MASK"):
+    _AI_ADDRCONFIG &= socket.AI_MASK
 
 
 class ThreadedResolver(AbstractResolver):
@@ -38,7 +41,7 @@ async def resolve(
             port,
             type=socket.SOCK_STREAM,
             family=family,
-            flags=socket.AI_ADDRCONFIG,
+            flags=_AI_ADDRCONFIG,
         )
 
         hosts: List[ResolveResult] = []
@@ -105,7 +108,7 @@ async def resolve(
                 port=port,
                 type=socket.SOCK_STREAM,
                 family=family,
-                flags=socket.AI_ADDRCONFIG,
+                flags=_AI_ADDRCONFIG,
             )
         except aiodns.error.DNSError as exc:
             msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed"
diff --git aiohttp/streams.py aiohttp/streams.py
index b97846171b1..7a3f64d1289 100644
--- aiohttp/streams.py
+++ aiohttp/streams.py
@@ -220,6 +220,9 @@ def feed_eof(self) -> None:
             self._eof_waiter = None
             set_result(waiter, None)
 
+        if self._protocol._reading_paused:
+            self._protocol.resume_reading()
+
         for cb in self._eof_callbacks:
             try:
                 cb()
@@ -517,8 +520,9 @@ def _read_nowait_chunk(self, n: int) -> bytes:
         else:
             data = self._buffer.popleft()
 
-        self._size -= len(data)
-        self._cursor += len(data)
+        data_len = len(data)
+        self._size -= data_len
+        self._cursor += data_len
 
         chunk_splits = self._http_chunk_splits
         # Prevent memory leak: drop useless chunk splits
@@ -551,6 +555,7 @@ class EmptyStreamReader(StreamReader):  # lgtm [py/missing-call-to-init]
 
     def __init__(self) -> None:
         self._read_eof_chunk = False
+        self.total_bytes = 0
 
     def __repr__(self) -> str:
         return "<%s>" % self.__class__.__name__
diff --git aiohttp/test_utils.py aiohttp/test_utils.py
index be6e9b3353e..87c31427867 100644
--- aiohttp/test_utils.py
+++ aiohttp/test_utils.py
@@ -730,6 +730,10 @@ def make_mocked_request(
     if protocol is sentinel:
         protocol = mock.Mock()
         protocol.transport = transport
+        type(protocol).peername = mock.PropertyMock(
+            return_value=transport.get_extra_info("peername")
+        )
+        type(protocol).ssl_context = mock.PropertyMock(return_value=sslcontext)
 
     if writer is sentinel:
         writer = mock.Mock()
diff --git aiohttp/web.py aiohttp/web.py
index f975b665331..d6ab6f6fad4 100644
--- aiohttp/web.py
+++ aiohttp/web.py
@@ -9,6 +9,7 @@
 from contextlib import suppress
 from importlib import import_module
 from typing import (
+    TYPE_CHECKING,
     Any,
     Awaitable,
     Callable,
@@ -287,10 +288,13 @@
 )
 
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:  # pragma: no cover
-    SSLContext = Any  # type: ignore[misc,assignment]
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 # Only display warning when using -Wdefault, -We, -X dev or similar.
 warnings.filterwarnings("ignore", category=NotAppKeyWarning, append=True)
diff --git aiohttp/web_fileresponse.py aiohttp/web_fileresponse.py
index 3b2bc2caf12..be9cf87e069 100644
--- aiohttp/web_fileresponse.py
+++ aiohttp/web_fileresponse.py
@@ -1,7 +1,10 @@
 import asyncio
+import io
 import os
 import pathlib
+import sys
 from contextlib import suppress
+from enum import Enum, auto
 from mimetypes import MimeTypes
 from stat import S_ISREG
 from types import MappingProxyType
@@ -15,6 +18,7 @@
     Iterator,
     List,
     Optional,
+    Set,
     Tuple,
     Union,
     cast,
@@ -66,12 +70,25 @@
     }
 )
 
+
+class _FileResponseResult(Enum):
+    """The result of the file response."""
+
+    SEND_FILE = auto()  # i.e. a regular file to send
+    NOT_ACCEPTABLE = auto()  # i.e. a socket, or non-regular file
+    PRE_CONDITION_FAILED = auto()  # i.e. If-Match or If-None-Match failed
+    NOT_MODIFIED = auto()  # 304 Not Modified
+
+
 # Add custom pairs and clear the encodings map so guess_type ignores them.
 CONTENT_TYPES.encodings_map.clear()
 for content_type, extension in ADDITIONAL_CONTENT_TYPES.items():
     CONTENT_TYPES.add_type(content_type, extension)  # type: ignore[attr-defined]
 
 
+_CLOSE_FUTURES: Set[asyncio.Future[None]] = set()
+
+
 class FileResponse(StreamResponse):
     """A response object can be used to send files."""
 
@@ -160,10 +177,12 @@ async def _precondition_failed(
         self.content_length = 0
         return await super().prepare(request)
 
-    def _get_file_path_stat_encoding(
-        self, accept_encoding: str
-    ) -> Tuple[pathlib.Path, os.stat_result, Optional[str]]:
-        """Return the file path, stat result, and encoding.
+    def _make_response(
+        self, request: "BaseRequest", accept_encoding: str
+    ) -> Tuple[
+        _FileResponseResult, Optional[io.BufferedReader], os.stat_result, Optional[str]
+    ]:
+        """Return the response result, io object, stat result, and encoding.
 
         If an uncompressed file is returned, the encoding is set to
         :py:data:`None`.
@@ -171,6 +190,52 @@ def _get_file_path_stat_encoding(
         This method should be called from a thread executor
         since it calls os.stat which may block.
         """
+        file_path, st, file_encoding = self._get_file_path_stat_encoding(
+            accept_encoding
+        )
+        if not file_path:
+            return _FileResponseResult.NOT_ACCEPTABLE, None, st, None
+
+        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
+        if (ifmatch := request.if_match) is not None and not self._etag_match(
+            etag_value, ifmatch, weak=False
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        if (
+            (unmodsince := request.if_unmodified_since) is not None
+            and ifmatch is None
+            and st.st_mtime > unmodsince.timestamp()
+        ):
+            return _FileResponseResult.PRE_CONDITION_FAILED, None, st, file_encoding
+
+        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
+        if (ifnonematch := request.if_none_match) is not None and self._etag_match(
+            etag_value, ifnonematch, weak=True
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        if (
+            (modsince := request.if_modified_since) is not None
+            and ifnonematch is None
+            and st.st_mtime <= modsince.timestamp()
+        ):
+            return _FileResponseResult.NOT_MODIFIED, None, st, file_encoding
+
+        fobj = file_path.open("rb")
+        with suppress(OSError):
+            # fstat() may not be available on all platforms
+            # Once we open the file, we want the fstat() to ensure
+            # the file has not changed between the first stat()
+            # and the open().
+            st = os.stat(fobj.fileno())
+        return _FileResponseResult.SEND_FILE, fobj, st, file_encoding
+
+    def _get_file_path_stat_encoding(
+        self, accept_encoding: str
+    ) -> Tuple[Optional[pathlib.Path], os.stat_result, Optional[str]]:
         file_path = self._path
         for file_extension, file_encoding in ENCODING_EXTENSIONS.items():
             if file_encoding not in accept_encoding:
@@ -184,7 +249,8 @@ def _get_file_path_stat_encoding(
                     return compressed_path, st, file_encoding
 
         # Fallback to the uncompressed file
-        return file_path, file_path.stat(), None
+        st = file_path.stat()
+        return file_path if S_ISREG(st.st_mode) else None, st, None
 
     async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter]:
         loop = asyncio.get_running_loop()
@@ -192,9 +258,12 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # https://www.rfc-editor.org/rfc/rfc9110#section-8.4.1
         accept_encoding = request.headers.get(hdrs.ACCEPT_ENCODING, "").lower()
         try:
-            file_path, st, file_encoding = await loop.run_in_executor(
-                None, self._get_file_path_stat_encoding, accept_encoding
+            response_result, fobj, st, file_encoding = await loop.run_in_executor(
+                None, self._make_response, request, accept_encoding
             )
+        except PermissionError:
+            self.set_status(HTTPForbidden.status_code)
+            return await super().prepare(request)
         except OSError:
             # Most likely to be FileNotFoundError or OSError for circular
             # symlinks in python >= 3.13, so respond with 404.
@@ -202,51 +271,46 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             return await super().prepare(request)
 
         # Forbid special files like sockets, pipes, devices, etc.
-        if not S_ISREG(st.st_mode):
+        if response_result is _FileResponseResult.NOT_ACCEPTABLE:
             self.set_status(HTTPForbidden.status_code)
             return await super().prepare(request)
 
-        etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
-        last_modified = st.st_mtime
-
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.1-2
-        ifmatch = request.if_match
-        if ifmatch is not None and not self._etag_match(
-            etag_value, ifmatch, weak=False
-        ):
-            return await self._precondition_failed(request)
-
-        unmodsince = request.if_unmodified_since
-        if (
-            unmodsince is not None
-            and ifmatch is None
-            and st.st_mtime > unmodsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.PRE_CONDITION_FAILED:
             return await self._precondition_failed(request)
 
-        # https://www.rfc-editor.org/rfc/rfc9110#section-13.1.2-2
-        ifnonematch = request.if_none_match
-        if ifnonematch is not None and self._etag_match(
-            etag_value, ifnonematch, weak=True
-        ):
-            return await self._not_modified(request, etag_value, last_modified)
-
-        modsince = request.if_modified_since
-        if (
-            modsince is not None
-            and ifnonematch is None
-            and st.st_mtime <= modsince.timestamp()
-        ):
+        if response_result is _FileResponseResult.NOT_MODIFIED:
+            etag_value = f"{st.st_mtime_ns:x}-{st.st_size:x}"
+            last_modified = st.st_mtime
             return await self._not_modified(request, etag_value, last_modified)
 
+        assert fobj is not None
+        try:
+            return await self._prepare_open_file(request, fobj, st, file_encoding)
+        finally:
+            # We do not await here because we do not want to wait
+            # for the executor to finish before returning the response
+            # so the connection can begin servicing another request
+            # as soon as possible.
+            close_future = loop.run_in_executor(None, fobj.close)
+            # Hold a strong reference to the future to prevent it from being
+            # garbage collected before it completes.
+            _CLOSE_FUTURES.add(close_future)
+            close_future.add_done_callback(_CLOSE_FUTURES.remove)
+
+    async def _prepare_open_file(
+        self,
+        request: "BaseRequest",
+        fobj: io.BufferedReader,
+        st: os.stat_result,
+        file_encoding: Optional[str],
+    ) -> Optional[AbstractStreamWriter]:
         status = self._status
-        file_size = st.st_size
-        count = file_size
-
-        start = None
+        file_size: int = st.st_size
+        file_mtime: float = st.st_mtime
+        count: int = file_size
+        start: Optional[int] = None
 
-        ifrange = request.if_range
-        if ifrange is None or st.st_mtime <= ifrange.timestamp():
+        if (ifrange := request.if_range) is None or file_mtime <= ifrange.timestamp():
             # If-Range header check:
             # condition = cached date >= last modification date
             # return 206 if True else 200.
@@ -257,7 +321,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
             try:
                 rng = request.http_range
                 start = rng.start
-                end = rng.stop
+                end: Optional[int] = rng.stop
             except ValueError:
                 # https://tools.ietf.org/html/rfc7233:
                 # A server generating a 416 (Range Not Satisfiable) response to
@@ -268,13 +332,13 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                 #
                 # Will do the same below. Many servers ignore this and do not
                 # send a Content-Range header with HTTP 416
-                self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                 self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                 return await super().prepare(request)
 
             # If a range request has been made, convert start, end slice
             # notation into file pointer offset and count
-            if start is not None or end is not None:
+            if start is not None:
                 if start < 0 and end is None:  # return tail of file
                     start += file_size
                     if start < 0:
@@ -304,7 +368,7 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
                     # suffix-byte-range-spec with a non-zero suffix-length,
                     # then the byte-range-set is satisfiable. Otherwise, the
                     # byte-range-set is unsatisfiable.
-                    self.headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
+                    self._headers[hdrs.CONTENT_RANGE] = f"bytes */{file_size}"
                     self.set_status(HTTPRequestRangeNotSatisfiable.status_code)
                     return await super().prepare(request)
 
@@ -316,48 +380,39 @@ async def prepare(self, request: "BaseRequest") -> Optional[AbstractStreamWriter
         # If the Content-Type header is not already set, guess it based on the
         # extension of the request path. The encoding returned by guess_type
         #  can be ignored since the map was cleared above.
-        if hdrs.CONTENT_TYPE not in self.headers:
-            self.content_type = (
-                CONTENT_TYPES.guess_type(self._path)[0] or FALLBACK_CONTENT_TYPE
-            )
+        if hdrs.CONTENT_TYPE not in self._headers:
+            if sys.version_info >= (3, 13):
+                guesser = CONTENT_TYPES.guess_file_type
+            else:
+                guesser = CONTENT_TYPES.guess_type
+            self.content_type = guesser(self._path)[0] or FALLBACK_CONTENT_TYPE
 
         if file_encoding:
-            self.headers[hdrs.CONTENT_ENCODING] = file_encoding
-            self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
+            self._headers[hdrs.CONTENT_ENCODING] = file_encoding
+            self._headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING
             # Disable compression if we are already sending
             # a compressed file since we don't want to double
             # compress.
             self._compression = False
 
-        self.etag = etag_value  # type: ignore[assignment]
-        self.last_modified = st.st_mtime  # type: ignore[assignment]
+        self.etag = f"{st.st_mtime_ns:x}-{st.st_size:x}"  # type: ignore[assignment]
+        self.last_modified = file_mtime  # type: ignore[assignment]
         self.content_length = count
 
-        self.headers[hdrs.ACCEPT_RANGES] = "bytes"
-
-        real_start = cast(int, start)
+        self._headers[hdrs.ACCEPT_RANGES] = "bytes"
 
         if status == HTTPPartialContent.status_code:
-            self.headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
+            real_start = start
+            assert real_start is not None
+            self._headers[hdrs.CONTENT_RANGE] = "bytes {}-{}/{}".format(
                 real_start, real_start + count - 1, file_size
             )
 
         # If we are sending 0 bytes calling sendfile() will throw a ValueError
-        if count == 0 or must_be_empty_body(request.method, self.status):
-            return await super().prepare(request)
-
-        try:
-            fobj = await loop.run_in_executor(None, file_path.open, "rb")
-        except PermissionError:
-            self.set_status(HTTPForbidden.status_code)
+        if count == 0 or must_be_empty_body(request.method, status):
             return await super().prepare(request)
 
-        if start:  # be aware that start could be None or int=0 here.
-            offset = start
-        else:
-            offset = 0
+        # be aware that start could be None or 0 here.
+        offset = start or 0
 
-        try:
-            return await self._sendfile(request, fobj, offset, count)
-        finally:
-            await asyncio.shield(loop.run_in_executor(None, fobj.close))
+        return await self._sendfile(request, fobj, offset, count)
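The `_CLOSE_FUTURES` set introduced above implements a common asyncio pattern: the future returned by `loop.run_in_executor()` is deliberately not awaited, so a strong reference must be held somewhere until it completes, otherwise it can be garbage-collected early. Reduced to its essentials (names here are illustrative):

    import asyncio
    from typing import Set

    _PENDING: Set["asyncio.Future[None]"] = set()

    def close_in_background(loop: asyncio.AbstractEventLoop, fobj) -> None:
        # Run the blocking close() on the default executor without awaiting.
        fut = loop.run_in_executor(None, fobj.close)
        _PENDING.add(fut)                       # strong reference keeps it alive
        fut.add_done_callback(_PENDING.remove)  # drop the reference once done
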
diff --git aiohttp/web_protocol.py aiohttp/web_protocol.py
index e8bb41abf97..e1923aac24b 100644
--- aiohttp/web_protocol.py
+++ aiohttp/web_protocol.py
@@ -24,6 +24,7 @@
 
 import attr
 import yarl
+from propcache import under_cached_property
 
 from .abc import AbstractAccessLogger, AbstractStreamWriter
 from .base_protocol import BaseProtocol
@@ -47,6 +48,8 @@
 __all__ = ("RequestHandler", "RequestPayloadError", "PayloadAccessError")
 
 if TYPE_CHECKING:
+    import ssl
+
     from .web_server import Server
 
 
@@ -167,6 +170,8 @@ class RequestHandler(BaseProtocol):
         "_current_request",
         "_timeout_ceil_threshold",
         "_request_in_progress",
+        "_logging_enabled",
+        "_cache",
     )
 
     def __init__(
@@ -240,12 +245,15 @@ def __init__(
             self.access_logger: Optional[AbstractAccessLogger] = access_log_class(
                 access_log, access_log_format
             )
+            self._logging_enabled = self.access_logger.enabled
         else:
             self.access_logger = None
+            self._logging_enabled = False
 
         self._close = False
         self._force_close = False
         self._request_in_progress = False
+        self._cache: dict[str, Any] = {}
 
     def __repr__(self) -> str:
         return "<{} {}>".format(
@@ -253,6 +261,26 @@ def __repr__(self) -> str:
             "connected" if self.transport is not None else "disconnected",
         )
 
+    @under_cached_property
+    def ssl_context(self) -> Optional["ssl.SSLContext"]:
+        """Return SSLContext if available."""
+        return (
+            None
+            if self.transport is None
+            else self.transport.get_extra_info("sslcontext")
+        )
+
+    @under_cached_property
+    def peername(
+        self,
+    ) -> Optional[Union[str, Tuple[str, int, int, int], Tuple[str, int]]]:
+        """Return peername if available."""
+        return (
+            None
+            if self.transport is None
+            else self.transport.get_extra_info("peername")
+        )
+
     @property
     def keepalive_timeout(self) -> float:
         return self._keepalive_timeout
@@ -438,9 +466,11 @@ def force_close(self) -> None:
             self.transport = None
 
     def log_access(
-        self, request: BaseRequest, response: StreamResponse, time: float
+        self, request: BaseRequest, response: StreamResponse, time: Optional[float]
     ) -> None:
         if self.access_logger is not None and self.access_logger.enabled:
+            if TYPE_CHECKING:
+                assert time is not None
             self.access_logger.log(request, response, self._loop.time() - time)
 
     def log_debug(self, *args: Any, **kw: Any) -> None:
@@ -458,7 +488,7 @@ def _process_keepalive(self) -> None:
         loop = self._loop
         now = loop.time()
         close_time = self._next_keepalive_close_time
-        if now <= close_time:
+        if now < close_time:
             # Keep alive close check fired too early, reschedule
             self._keepalive_handle = loop.call_at(close_time, self._process_keepalive)
             return
@@ -470,7 +500,7 @@ def _process_keepalive(self) -> None:
     async def _handle_request(
         self,
         request: BaseRequest,
-        start_time: float,
+        start_time: Optional[float],
         request_handler: Callable[[BaseRequest], Awaitable[StreamResponse]],
     ) -> Tuple[StreamResponse, bool]:
         self._request_in_progress = True
@@ -520,8 +550,6 @@ async def start(self) -> None:
         keep_alive(True) specified.
         """
         loop = self._loop
-        handler = asyncio.current_task(loop)
-        assert handler is not None
         manager = self._manager
         assert manager is not None
         keepalive_timeout = self._keepalive_timeout
@@ -540,7 +568,9 @@ async def start(self) -> None:
 
             message, payload = self._messages.popleft()
 
-            start = loop.time()
+            # The time is only fetched if logging is enabled, as otherwise
+            # it's thrown away and never used.
+            start = loop.time() if self._logging_enabled else None
 
             manager.requests_count += 1
             writer = StreamWriter(self, loop)
@@ -551,7 +581,16 @@ async def start(self) -> None:
             else:
                 request_handler = self._request_handler
 
-            request = self._request_factory(message, payload, self, writer, handler)
+            # Important: don't hold a reference to the current task, as a
+            # stored traceback would prevent the task from being collected
+            # and cause a memory leak.
+            request = self._request_factory(
+                message,
+                payload,
+                self,
+                writer,
+                self._task_handler or asyncio.current_task(loop),  # type: ignore[arg-type]
+            )
             try:
                 # a new task is used for copy context vars (#3406)
                 coro = self._handle_request(request, start, request_handler)
@@ -608,26 +647,29 @@ async def start(self) -> None:
 
             except asyncio.CancelledError:
                 self.log_debug("Ignored premature client disconnection")
+                self.force_close()
                 raise
             except Exception as exc:
                 self.log_exception("Unhandled exception", exc_info=exc)
                 self.force_close()
+            except BaseException:
+                self.force_close()
+                raise
             finally:
+                request._task = None  # type: ignore[assignment] # Break reference cycle in case of exception
                 if self.transport is None and resp is not None:
                     self.log_debug("Ignored premature client disconnection.")
-                elif not self._force_close:
-                    if self._keepalive and not self._close:
-                        # start keep-alive timer
-                        if keepalive_timeout is not None:
-                            now = loop.time()
-                            close_time = now + keepalive_timeout
-                            self._next_keepalive_close_time = close_time
-                            if self._keepalive_handle is None:
-                                self._keepalive_handle = loop.call_at(
-                                    close_time, self._process_keepalive
-                                )
-                    else:
-                        break
+
+            if self._keepalive and not self._close and not self._force_close:
+                # start keep-alive timer
+                close_time = loop.time() + keepalive_timeout
+                self._next_keepalive_close_time = close_time
+                if self._keepalive_handle is None:
+                    self._keepalive_handle = loop.call_at(
+                        close_time, self._process_keepalive
+                    )
+            else:
+                break
 
         # remove handler, close transport if no handlers left
         if not self._force_close:
@@ -636,7 +678,7 @@ async def start(self) -> None:
                 self.transport.close()
 
     async def finish_response(
-        self, request: BaseRequest, resp: StreamResponse, start_time: float
+        self, request: BaseRequest, resp: StreamResponse, start_time: Optional[float]
     ) -> Tuple[StreamResponse, bool]:
         """Prepare the response and write_eof, then log access.
 
@@ -694,9 +736,13 @@ def handle_error(
             # or encrypted traffic to an HTTP port. This is expected
             # to happen when connected to the public internet so we log
             # it at the debug level as to not fill logs with noise.
-            self.logger.debug("Error handling request", exc_info=exc)
+            self.logger.debug(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
         else:
-            self.log_exception("Error handling request", exc_info=exc)
+            self.log_exception(
+                "Error handling request from %s", request.remote, exc_info=exc
+            )
 
         # some data already got sent, connection is broken
         if request.writer.output_size > 0:
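The new `ssl_context` and `peername` accessors use `propcache.under_cached_property`, which stores the computed value in the instance's `_cache` dict; that is why `_cache` was added to `__slots__` and initialized in `__init__`. A minimal sketch of that contract:

    from typing import Any, Dict

    from propcache import under_cached_property

    class Connection:
        def __init__(self) -> None:
            self._cache: Dict[str, Any] = {}  # required backing store

        @under_cached_property
        def peername(self) -> str:
            print("computed once")
            return "127.0.0.1"

    conn = Connection()
    conn.peername  # prints "computed once"
    conn.peername  # served from conn._cache, no recomputation
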
diff --git aiohttp/web_request.py aiohttp/web_request.py
index f11d49020a0..6bf5a9dea74 100644
--- aiohttp/web_request.py
+++ aiohttp/web_request.py
@@ -198,10 +198,8 @@ def __init__(
         self._client_max_size = client_max_size
         self._loop = loop
 
-        transport = protocol.transport
-        assert transport is not None
-        self._transport_sslcontext = transport.get_extra_info("sslcontext")
-        self._transport_peername = transport.get_extra_info("peername")
+        self._transport_sslcontext = protocol.ssl_context
+        self._transport_peername = protocol.peername
 
         if remote is not None:
             self._cache["remote"] = remote
diff --git aiohttp/web_response.py aiohttp/web_response.py
index cd2be24f1a3..367ac6e8c0a 100644
--- aiohttp/web_response.py
+++ aiohttp/web_response.py
@@ -537,7 +537,7 @@ async def _write_headers(self) -> None:
         status_line = f"HTTP/{version[0]}.{version[1]} {self._status} {self._reason}"
         await writer.write_headers(status_line, self._headers)
 
-    async def write(self, data: bytes) -> None:
+    async def write(self, data: Union[bytes, bytearray, memoryview]) -> None:
         assert isinstance(
             data, (bytes, bytearray, memoryview)
         ), "data argument must be byte-ish (%r)" % type(data)
@@ -629,10 +629,8 @@ def __init__(
 
         if headers is None:
             real_headers: CIMultiDict[str] = CIMultiDict()
-        elif not isinstance(headers, CIMultiDict):
-            real_headers = CIMultiDict(headers)
         else:
-            real_headers = headers  # = cast('CIMultiDict[str]', headers)
+            real_headers = CIMultiDict(headers)
 
         if content_type is not None and "charset" in content_type:
             raise ValueError("charset must not be in content_type argument")
diff --git aiohttp/web_runner.py aiohttp/web_runner.py
index f8933383435..bcfec727c84 100644
--- aiohttp/web_runner.py
+++ aiohttp/web_runner.py
@@ -3,7 +3,7 @@
 import socket
 import warnings
 from abc import ABC, abstractmethod
-from typing import Any, List, Optional, Set
+from typing import TYPE_CHECKING, Any, List, Optional, Set
 
 from yarl import URL
 
@@ -11,11 +11,13 @@
 from .web_app import Application
 from .web_server import Server
 
-try:
+if TYPE_CHECKING:
     from ssl import SSLContext
-except ImportError:
-    SSLContext = object  # type: ignore[misc,assignment]
-
+else:
+    try:
+        from ssl import SSLContext
+    except ImportError:  # pragma: no cover
+        SSLContext = object  # type: ignore[misc,assignment]
 
 __all__ = (
     "BaseSite",
diff --git aiohttp/web_urldispatcher.py aiohttp/web_urldispatcher.py
index 6443c500a33..28ae2518fec 100644
--- aiohttp/web_urldispatcher.py
+++ aiohttp/web_urldispatcher.py
@@ -180,8 +180,8 @@ def __init__(
         if expect_handler is None:
             expect_handler = _default_expect_handler
 
-        assert asyncio.iscoroutinefunction(
-            expect_handler
+        assert inspect.iscoroutinefunction(expect_handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(expect_handler)
         ), f"Coroutine is expected, got {expect_handler!r}"
 
         method = method.upper()
@@ -189,7 +189,9 @@ def __init__(
             raise ValueError(f"{method} is not allowed HTTP method")
 
         assert callable(handler), handler
-        if asyncio.iscoroutinefunction(handler):
+        if inspect.iscoroutinefunction(handler) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(handler)
+        ):
             pass
         elif inspect.isgeneratorfunction(handler):
             warnings.warn(
diff --git aiohttp/web_ws.py aiohttp/web_ws.py
index 0fb1549a3aa..439b8049987 100644
--- aiohttp/web_ws.py
+++ aiohttp/web_ws.py
@@ -182,7 +182,11 @@ def _ping_task_done(self, task: "asyncio.Task[None]") -> None:
 
     def _pong_not_received(self) -> None:
         if self._req is not None and self._req.transport is not None:
-            self._handle_ping_pong_exception(asyncio.TimeoutError())
+            self._handle_ping_pong_exception(
+                asyncio.TimeoutError(
+                    f"No PONG received after {self._pong_heartbeat} seconds"
+                )
+            )
 
     def _handle_ping_pong_exception(self, exc: BaseException) -> None:
         """Handle exceptions raised during ping/pong processing."""
@@ -248,7 +252,8 @@ def _handshake(
             else:
                 # No overlap found: Return no protocol as per spec
                 ws_logger.warning(
-                    "Client protocols %r don’t overlap server-known ones %r",
+                    "%s: Client protocols %r don’t overlap server-known ones %r",
+                    request.remote,
                     req_protocols,
                     self._protocols,
                 )
diff --git aiohttp/worker.py aiohttp/worker.py
index 9b307697336..f7281bfde75 100644
--- aiohttp/worker.py
+++ aiohttp/worker.py
@@ -1,12 +1,13 @@
 """Async gunicorn worker for aiohttp.web"""
 
 import asyncio
+import inspect
 import os
 import re
 import signal
 import sys
 from types import FrameType
-from typing import Any, Awaitable, Callable, Optional, Union  # noqa
+from typing import TYPE_CHECKING, Any, Optional
 
 from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat
 from gunicorn.workers import base
@@ -17,13 +18,18 @@
 from .web_app import Application
 from .web_log import AccessLogger
 
-try:
+if TYPE_CHECKING:
     import ssl
 
     SSLContext = ssl.SSLContext
-except ImportError:  # pragma: no cover
-    ssl = None  # type: ignore[assignment]
-    SSLContext = object  # type: ignore[misc,assignment]
+else:
+    try:
+        import ssl
+
+        SSLContext = ssl.SSLContext
+    except ImportError:  # pragma: no cover
+        ssl = None  # type: ignore[assignment]
+        SSLContext = object  # type: ignore[misc,assignment]
 
 
 __all__ = ("GunicornWebWorker", "GunicornUVLoopWebWorker")
@@ -66,7 +72,9 @@ async def _run(self) -> None:
         runner = None
         if isinstance(self.wsgi, Application):
             app = self.wsgi
-        elif asyncio.iscoroutinefunction(self.wsgi):
+        elif inspect.iscoroutinefunction(self.wsgi) or (
+            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(self.wsgi)
+        ):
             wsgi = await self.wsgi()
             if isinstance(wsgi, web.AppRunner):
                 runner = wsgi
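The repeated `inspect.iscoroutinefunction(...) or (sys.version_info < (3, 14) and asyncio.iscoroutinefunction(...))` checks in `web_urldispatcher.py` and `worker.py` exist because `asyncio.iscoroutinefunction` is deprecated on newer Pythons, yet before 3.14 it additionally recognizes legacy `@asyncio.coroutine` callables that `inspect` does not. Extracted into a helper (a sketch; the diff inlines the expression):

    import asyncio
    import inspect
    import sys
    from typing import Any, Callable

    def is_coroutine_callable(fn: Callable[..., Any]) -> bool:
        # inspect covers native `async def` functions on every version;
        # the asyncio fallback also catches legacy @asyncio.coroutine
        # callables, but only where it still exists (pre-3.14).
        return inspect.iscoroutinefunction(fn) or (
            sys.version_info < (3, 14) and asyncio.iscoroutinefunction(fn)
        )
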
diff --git docs/client_quickstart.rst docs/client_quickstart.rst
index f99339cf4a6..0e03f104e90 100644
--- docs/client_quickstart.rst
+++ docs/client_quickstart.rst
@@ -93,7 +93,7 @@ Passing Parameters In URLs
 You often want to send some sort of data in the URL's query string. If
 you were constructing the URL by hand, this data would be given as key/value
 pairs in the URL after a question mark, e.g. ``httpbin.org/get?key=val``.
-Requests allows you to provide these arguments as a :class:`dict`, using the
+aiohttp allows you to provide these arguments as a :class:`dict`, using the
 ``params`` keyword argument. As an example, if you wanted to pass
 ``key1=value1`` and ``key2=value2`` to ``httpbin.org/get``, you would use the
 following code::
diff --git docs/client_reference.rst docs/client_reference.rst
index c9031de5383..130ba6cc336 100644
--- docs/client_reference.rst
+++ docs/client_reference.rst
@@ -364,7 +364,7 @@ The client session supports the context manager protocol for self closing.
 
       .. versionadded:: 3.7
 
-   .. attribute:: trace_config
+   .. attribute:: trace_configs
 
       A list of :class:`TraceConfig` instances used for client
       tracing.  ``None`` (default) is used for request tracing
@@ -448,11 +448,16 @@ The client session supports the context manager protocol for self closing.
       :param aiohttp.BasicAuth auth: an object that represents HTTP
                                      Basic Authorization (optional)
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed (up to ``max_redirects`` times)
+         and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :param int max_redirects: Maximum number of redirects to follow.
-                                ``10`` by default.
+         :exc:`TooManyRedirects` is raised if the number is exceeded.
+         Ignored when ``allow_redirects=False``.
+         ``10`` by default.
 
       :param bool compress: Set to ``True`` if request has to be compressed
          with deflate encoding. ``compress`` cannot be combined
@@ -508,7 +513,7 @@ The client session supports the context manager protocol for self closing.
          .. versionadded:: 3.0
 
       :param str server_hostname: Sets or overrides the host name that the
-         target server’s certificate will be matched against.
+         target server's certificate will be matched against.
 
          See :py:meth:`asyncio.loop.create_connection` for more information.
 
@@ -554,8 +559,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -623,8 +631,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``False`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``False`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -641,8 +652,11 @@ The client session supports the context manager protocol for self closing.
 
       :param url: Request URL, :class:`str` or :class:`~yarl.URL`
 
-      :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                   ``True`` by default (optional).
+      :param bool allow_redirects: Whether to process redirects or not.
+         When ``True``, redirects are followed and logged into
+         :attr:`ClientResponse.history`.
+         When ``False``, the original response is returned.
+         ``True`` by default (optional).
 
       :return ClientResponse: a :class:`client response
                               <ClientResponse>` object.
@@ -836,14 +850,21 @@ certification chaining.
 
 .. function:: request(method, url, *, params=None, data=None, \
                         json=None,\
-                        headers=None, cookies=None, auth=None, \
+                        cookies=None, headers=None, skip_auto_headers=None, auth=None, \
                         allow_redirects=True, max_redirects=10, \
-                        encoding='utf-8', \
-                        version=HttpVersion(major=1, minor=1), \
-                        compress=None, chunked=None, expect100=False, raise_for_status=False, \
+                        compress=False, chunked=None, expect100=False, raise_for_status=None, \
+                        read_until_eof=True, \
+                        proxy=None, proxy_auth=None, \
+                        timeout=sentinel, ssl=True, \
+                        server_hostname=None, \
+                        proxy_headers=None, \
+                        trace_request_ctx=None, \
                         read_bufsize=None, \
-                        connector=None, loop=None,\
-                        read_until_eof=True, timeout=sentinel)
+                        auto_decompress=None, \
+                        max_line_size=None, \
+                        max_field_size=None, \
+                        version=aiohttp.HttpVersion11, \
+                        connector=None)
    :async:
 
    Asynchronous context manager for performing an asynchronous HTTP
@@ -856,8 +877,20 @@ certification chaining.
                be encoded with :class:`~yarl.URL` (see :class:`~yarl.URL`
                to skip encoding).
 
-   :param dict params: Parameters to be sent in the query
-                       string of the new request (optional)
+   :param params: Mapping, iterable of *key*/*value* tuple pairs, or
+                  string to be sent as parameters in the query
+                  string of the new request. Ignored for subsequent
+                  redirected requests (optional)
+
+                  Allowed values are:
+
+                  - :class:`collections.abc.Mapping` e.g. :class:`dict`,
+                     :class:`multidict.MultiDict` or
+                     :class:`multidict.MultiDictProxy`
+                  - :class:`collections.abc.Iterable` e.g. :class:`tuple` or
+                     :class:`list`
+                  - :class:`str` with preferably url-encoded content
+                     (**Warning:** content will not be encoded by *aiohttp*)
 
    :param data: The data to send in the body of the request. This can be a
                 :class:`FormData` object or anything that can be passed into
@@ -867,25 +900,46 @@ certification chaining.
    :param json: Any json compatible python object (optional). *json* and *data*
                 parameters could not be used at the same time.
 
+   :param dict cookies: HTTP Cookies to send with the request (optional)
+
    :param dict headers: HTTP Headers to send with the request (optional)
 
-   :param dict cookies: Cookies to send with the request (optional)
+   :param skip_auto_headers: set of headers for which autogeneration
+      should be skipped.
+
+      *aiohttp* autogenerates headers like ``User-Agent`` or
+      ``Content-Type`` if these headers are not explicitly
+      passed. Using the ``skip_auto_headers`` parameter allows you to
+      skip that generation.
+
+      Iterable of :class:`str` or :class:`~multidict.istr`
+      (optional)
 
    :param aiohttp.BasicAuth auth: an object that represents HTTP Basic
                                   Authorization (optional)
 
-   :param bool allow_redirects: If set to ``False``, do not follow redirects.
-                                ``True`` by default (optional).
+   :param bool allow_redirects: Whether to process redirects or not.
+      When ``True``, redirects are followed (up to ``max_redirects`` times)
+      and logged into :attr:`ClientResponse.history` and ``trace_configs``.
+      When ``False``, the original response is returned.
+      ``True`` by default (optional).
 
-   :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional)
+   :param int max_redirects: Maximum number of redirects to follow.
+      :exc:`TooManyRedirects` is raised if the number is exceeded.
+      Ignored when ``allow_redirects=False``.
+      ``10`` by default.
 
    :param bool compress: Set to ``True`` if request has to be compressed
-                         with deflate encoding.
-                         ``False`` instructs aiohttp to not compress data.
+                         with deflate encoding. ``compress`` cannot be combined
+                         with *Content-Encoding* and *Content-Length* headers.
                          ``None`` by default (optional).
 
    :param int chunked: Enables chunked transfer encoding.
-                       ``None`` by default (optional).
+      It is up to the developer
+      to decide how to chunk data streams. If chunking is enabled, aiohttp
+      encodes the provided chunks in the "Transfer-encoding: chunked" format.
+      If *chunked* is set, then the *Transfer-encoding* and *content-length*
+      headers are disallowed. ``None`` by default (optional).
 
    :param bool expect100: Expect 100-continue response from server.
                           ``False`` by default (optional).
@@ -899,28 +953,60 @@ certification chaining.
 
       .. versionadded:: 3.4
 
-   :param aiohttp.BaseConnector connector: BaseConnector sub-class
-      instance to support connection pooling.
-
    :param bool read_until_eof: Read response until EOF if response
                                does not have Content-Length header.
                                ``True`` by default (optional).
 
+   :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional)
+
+   :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP
+                                        Basic Authorization (optional)
+
+   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
+        total timeout, 30 seconds socket connect timeout by default.
+
+   :param ssl: SSL validation mode. ``True`` for default SSL check
+               (:func:`ssl.create_default_context` is used),
+               ``False`` for skip SSL certificate validation,
+               :class:`aiohttp.Fingerprint` for fingerprint
+               validation, :class:`ssl.SSLContext` for custom SSL
+               certificate validation.
+
+               Supersedes *verify_ssl*, *ssl_context* and
+               *fingerprint* parameters.
+
+   :param str server_hostname: Sets or overrides the host name that the
+      target server's certificate will be matched against.
+
+      See :py:meth:`asyncio.loop.create_connection`
+      for more information.
+
+   :param collections.abc.Mapping proxy_headers: HTTP headers to send to the proxy
+      if the parameter proxy has been provided.
+
+   :param trace_request_ctx: Object used to give as a kw param for each new
+      :class:`TraceConfig` object instantiated,
+      used to give information to the
+      tracers that is only available at request time.
+
    :param int read_bufsize: Size of the read buffer (:attr:`ClientResponse.content`).
                             ``None`` by default,
                             it means that the session global value is used.
 
       .. versionadded:: 3.7
 
-   :param timeout: a :class:`ClientTimeout` settings structure, 300 seconds (5min)
-        total timeout, 30 seconds socket connect timeout by default.
+   :param bool auto_decompress: Automatically decompress response body.
+      May be used to enable/disable auto decompression on a per-request basis.
 
-   :param loop: :ref:`event loop<asyncio-event-loop>`
-                used for processing HTTP requests.
-                If param is ``None``, :func:`asyncio.get_event_loop`
-                is used for getting default event loop.
+   :param int max_line_size: Maximum allowed size of lines in responses.
 
-      .. deprecated:: 2.0
+   :param int max_field_size: Maximum allowed size of header fields in responses.
+
+   :param aiohttp.protocol.HttpVersion version: Request HTTP version,
+      ``HTTP 1.1`` by default. (optional)
+
+   :param aiohttp.BaseConnector connector: BaseConnector sub-class
+      instance to support connection pooling. (optional)
 
    :return ClientResponse: a :class:`client response <ClientResponse>` object.
 
diff --git docs/contributing-admins.rst docs/contributing-admins.rst
index acfaebc0e97..b17cbe1019a 100644
--- docs/contributing-admins.rst
+++ docs/contributing-admins.rst
@@ -21,9 +21,9 @@ To create a new release:
 #. Run ``towncrier``.
 #. Check and cleanup the changes in ``CHANGES.rst``.
 #. Checkout a new branch: e.g. ``git checkout -b release/v3.8.6``
-#. Commit and create a PR. Once PR is merged, continue.
+#. Commit and create a PR. Verify the changelog and release notes look good on Read the Docs. Once PR is merged, continue.
 #. Go back to the release branch: e.g. ``git checkout 3.8 && git pull``
-#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6'``
+#. Add a tag: e.g. ``git tag -a v3.8.6 -m 'Release 3.8.6' -s``
 #. Push the tag: e.g. ``git push origin v3.8.6``
 #. Monitor CI to ensure release process completes without errors.
 
@@ -49,6 +49,10 @@ first merge into the newer release branch (e.g. 3.8 into 3.9) and then to master
 
 Back on the original release branch, bump the version number and append ``.dev0`` in ``__init__.py``.
 
+Post the release announcement to social media:
+ - BlueSky: https://bsky.app/profile/aiohttp.org and re-post to https://bsky.app/profile/aio-libs.org
+ - Mastodon: https://fosstodon.org/@aiohttp and re-post to https://fosstodon.org/@aio_libs
+
 If doing a minor release:
 
 #. Create a new release branch for future features to go to: e.g. ``git checkout -b 3.10 3.9 && git push``
diff --git docs/spelling_wordlist.txt docs/spelling_wordlist.txt
index a1f3d944584..59ea99c40bb 100644
--- docs/spelling_wordlist.txt
+++ docs/spelling_wordlist.txt
@@ -13,6 +13,8 @@ app
 app’s
 apps
 arg
+args
+armv
 Arsenic
 async
 asyncio
@@ -169,6 +171,7 @@ keepaliving
 kib
 KiB
 kwarg
+kwargs
 latin
 lifecycle
 linux
@@ -199,6 +202,7 @@ multidicts
 Multidicts
 multipart
 Multipart
+musllinux
 mypy
 Nagle
 Nagle’s
@@ -245,6 +249,7 @@ py
 pydantic
 pyenv
 pyflakes
+pyright
 pytest
 Pytest
 Quickstart
diff --git docs/third_party.rst docs/third_party.rst
index e8095c7f09d..145a505a5de 100644
--- docs/third_party.rst
+++ docs/third_party.rst
@@ -305,3 +305,6 @@ ask to raise the status.
 
 - `aiohttp-asgi-connector <https://github.com/thearchitector/aiohttp-asgi-connector>`_
   An aiohttp connector for using a ``ClientSession`` to interface directly with separate ASGI applications.
+
+- `aiohttp-openmetrics <https://github.com/jelmer/aiohttp-openmetrics>`_
+  An aiohttp middleware for exposing Prometheus metrics.
diff --git requirements/base.txt requirements/base.txt
index 1e7c0bbe6c1..f279c187ebc 100644
--- requirements/base.txt
+++ requirements/base.txt
@@ -26,11 +26,11 @@ gunicorn==23.0.0
     # via -r requirements/base.in
 idna==3.4
     # via yarl
-multidict==6.1.0
+multidict==6.4.3
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
-packaging==24.1
+packaging==24.2
     # via gunicorn
 propcache==0.2.0
     # via
diff --git requirements/constraints.txt requirements/constraints.txt
index d32acc7b773..16816dcd426 100644
--- requirements/constraints.txt
+++ requirements/constraints.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -116,7 +116,7 @@ markupsafe==2.1.5
     # via jinja2
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.1.0
+multidict==6.4.3
     # via
     #   -r requirements/multidict.in
     #   -r requirements/runtime-deps.in
@@ -129,7 +129,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -236,22 +236,22 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/cython.txt requirements/cython.txt
index f67cc903a0b..b2ff3e71d39 100644
--- requirements/cython.txt
+++ requirements/cython.txt
@@ -6,7 +6,7 @@
 #
 cython==3.0.11
     # via -r requirements/cython.in
-multidict==6.1.0
+multidict==6.4.3
     # via -r requirements/multidict.in
 typing-extensions==4.12.2
     # via multidict
diff --git requirements/dev.txt requirements/dev.txt
index 168ce639d19..6ab9baf6b59 100644
--- requirements/dev.txt
+++ requirements/dev.txt
@@ -14,7 +14,7 @@ aiohttp-theme==0.1.7
     # via -r requirements/doc.in
 aiosignal==1.3.1
     # via -r requirements/runtime-deps.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 annotated-types==0.7.0
     # via pydantic
@@ -110,7 +110,7 @@ markupsafe==2.1.5
     # via jinja2
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.1.0
+multidict==6.4.3
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
@@ -122,7 +122,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via
     #   build
     #   gunicorn
@@ -210,21 +210,21 @@ slotscheck==0.19.1
     # via -r requirements/lint.in
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/doc-spelling.txt requirements/doc-spelling.txt
index df393012548..43b3822706e 100644
--- requirements/doc-spelling.txt
+++ requirements/doc-spelling.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pyenchant==3.2.2
     # via sphinxcontrib-spelling
@@ -46,22 +46,22 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-spelling
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-spelling==8.0.0 ; platform_system != "Windows"
     # via -r requirements/doc-spelling.in
diff --git requirements/doc.txt requirements/doc.txt
index 43b7c6b7e8b..6ddfc47455b 100644
--- requirements/doc.txt
+++ requirements/doc.txt
@@ -6,7 +6,7 @@
 #
 aiohttp-theme==0.1.7
     # via -r requirements/doc.in
-alabaster==0.7.13
+alabaster==1.0.0
     # via sphinx
 babel==2.16.0
     # via sphinx
@@ -34,7 +34,7 @@ jinja2==3.1.4
     #   towncrier
 markupsafe==2.1.5
     # via jinja2
-packaging==24.1
+packaging==24.2
     # via sphinx
 pygments==2.18.0
     # via sphinx
@@ -44,21 +44,21 @@ requests==2.32.3
     # via sphinx
 snowballstemmer==2.2.0
     # via sphinx
-sphinx==7.1.2
+sphinx==8.1.3
     # via
     #   -r requirements/doc.in
     #   sphinxcontrib-towncrier
-sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-applehelp==2.0.0
     # via sphinx
-sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-devhelp==2.0.0
     # via sphinx
-sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-htmlhelp==2.1.0
     # via sphinx
 sphinxcontrib-jsmath==1.0.1
     # via sphinx
-sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-qthelp==2.0.0
     # via sphinx
-sphinxcontrib-serializinghtml==1.1.5
+sphinxcontrib-serializinghtml==2.0.0
     # via sphinx
 sphinxcontrib-towncrier==0.4.0a0
     # via -r requirements/doc.in
diff --git requirements/lint.txt requirements/lint.txt
index d7d97277bce..e2547d13da5 100644
--- requirements/lint.txt
+++ requirements/lint.txt
@@ -55,7 +55,7 @@ mypy-extensions==1.0.0
     # via mypy
 nodeenv==1.9.1
     # via pre-commit
-packaging==24.1
+packaging==24.2
     # via pytest
 platformdirs==4.3.6
     # via virtualenv
diff --git requirements/multidict.txt requirements/multidict.txt
index b8b44428920..a83b5029c3f 100644
--- requirements/multidict.txt
+++ requirements/multidict.txt
@@ -4,7 +4,7 @@
 #
 #    pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in
 #
-multidict==6.1.0
+multidict==6.4.3
     # via -r requirements/multidict.in
 typing-extensions==4.12.2
     # via multidict
diff --git requirements/runtime-deps.txt requirements/runtime-deps.txt
index cf7f0e396f6..6c9fcc5ccd0 100644
--- requirements/runtime-deps.txt
+++ requirements/runtime-deps.txt
@@ -24,7 +24,7 @@ frozenlist==1.5.0
     #   aiosignal
 idna==3.4
     # via yarl
-multidict==6.1.0
+multidict==6.4.3
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
diff --git requirements/test.txt requirements/test.txt
index 33510f18682..025940dcf50 100644
--- requirements/test.txt
+++ requirements/test.txt
@@ -62,7 +62,7 @@ markdown-it-py==3.0.0
     # via rich
 mdurl==0.1.2
     # via markdown-it-py
-multidict==6.1.0
+multidict==6.4.3
     # via
     #   -r requirements/runtime-deps.in
     #   yarl
@@ -70,7 +70,7 @@ mypy==1.11.2 ; implementation_name == "cpython"
     # via -r requirements/test.in
 mypy-extensions==1.0.0
     # via mypy
-packaging==24.1
+packaging==24.2
     # via
     #   gunicorn
     #   pytest
diff --git tests/conftest.py tests/conftest.py
index 44ae384b633..bceec5212a9 100644
--- tests/conftest.py
+++ tests/conftest.py
@@ -7,7 +7,7 @@
 from hashlib import md5, sha1, sha256
 from pathlib import Path
 from tempfile import TemporaryDirectory
-from typing import Any, Generator
+from typing import Any, Generator, Iterator
 from unittest import mock
 from uuid import uuid4
 
@@ -27,6 +27,12 @@
 except ImportError:
     TRUSTME = False
 
+
+try:
+    import uvloop
+except ImportError:
+    uvloop = None  # type: ignore[assignment]
+
 pytest_plugins = ["aiohttp.pytest_plugin", "pytester"]
 
 IS_HPUX = sys.platform.startswith("hp-ux")
@@ -193,6 +199,16 @@ def selector_loop():
         yield _loop
 
 
+@pytest.fixture
+def uvloop_loop() -> Iterator[asyncio.AbstractEventLoop]:
+    policy = uvloop.EventLoopPolicy()
+    asyncio.set_event_loop_policy(policy)
+
+    with loop_context(policy.new_event_loop) as _loop:
+        asyncio.set_event_loop(_loop)
+        yield _loop
+
+
 @pytest.fixture
 def netrc_contents(
     tmp_path: Path,
@@ -221,6 +237,7 @@ def start_connection():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     ) as start_connection_mock:
         yield start_connection_mock
 
diff --git a/tests/isolated/check_for_client_response_leak.py b/tests/isolated/check_for_client_response_leak.py
new file mode 100644
index 00000000000..67393c2c2d8
--- /dev/null
+++ tests/isolated/check_for_client_response_leak.py
@@ -0,0 +1,47 @@
+import asyncio
+import contextlib
+import gc
+import sys
+
+from aiohttp import ClientError, ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def stream_handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        request.transport.close()  # Forcefully closing connection
+        return web.Response()
+
+    app.router.add_get("/stream", stream_handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    session = ClientSession()
+
+    async def fetch_stream(url: str) -> None:
+        """Fetch a stream and read a few bytes from it."""
+        with contextlib.suppress(ClientError):
+            await session.get(url)
+
+    client_task = asyncio.create_task(fetch_stream(f"http://localhost:{port}/stream"))
+    await client_task
+    gc.collect()
+    client_response_present = any(
+        type(obj).__name__ == "ClientResponse" for obj in gc.garbage
+    )
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if client_response_present else 0)
+
+
+asyncio.run(main())
diff --git a/tests/isolated/check_for_request_leak.py b/tests/isolated/check_for_request_leak.py
new file mode 100644
index 00000000000..6f340a05277
--- /dev/null
+++ tests/isolated/check_for_request_leak.py
@@ -0,0 +1,41 @@
+import asyncio
+import gc
+import sys
+from typing import NoReturn
+
+from aiohttp import ClientSession, web
+from aiohttp.test_utils import get_unused_port_socket
+
+gc.set_debug(gc.DEBUG_LEAK)
+
+
+async def main() -> None:
+    app = web.Application()
+
+    async def handler(request: web.Request) -> NoReturn:
+        await request.json()
+        assert False
+
+    app.router.add_route("GET", "/json", handler)
+    sock = get_unused_port_socket("127.0.0.1")
+    port = sock.getsockname()[1]
+
+    runner = web.AppRunner(app)
+    await runner.setup()
+    site = web.SockSite(runner, sock)
+    await site.start()
+
+    async with ClientSession() as session:
+        async with session.get(f"http://127.0.0.1:{port}/json") as resp:
+            await resp.read()
+
+    # Give time for the cancelled task to be collected
+    await asyncio.sleep(0.5)
+    gc.collect()
+    request_present = any(type(obj).__name__ == "Request" for obj in gc.garbage)
+    await session.close()
+    await runner.cleanup()
+    sys.exit(1 if request_present else 0)
+
+
+asyncio.run(main())
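Both isolated scripts share one detection idiom: `gc.set_debug(gc.DEBUG_LEAK)` makes the collector keep everything it frees in `gc.garbage`, so after a forced `gc.collect()` the script exits non-zero if the object under test shows up there, i.e. if it was kept alive by a reference cycle instead of being freed promptly by reference counting. The core of the idiom:

    import gc

    gc.set_debug(gc.DEBUG_LEAK)  # DEBUG_SAVEALL: collected objects land in gc.garbage

    class Request:  # stand-in for the real object under test
        pass

    req = Request()
    req.self_ref = req  # reference cycle: only the cycle collector can free it
    del req

    gc.collect()
    leaked = any(type(obj).__name__ == "Request" for obj in gc.garbage)
    print("leak detected" if leaked else "clean")
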
diff --git tests/test_benchmarks_client.py tests/test_benchmarks_client.py
index 61439183334..aa3536be820 100644
--- tests/test_benchmarks_client.py
+++ tests/test_benchmarks_client.py
@@ -124,7 +124,7 @@ def test_one_hundred_get_requests_with_512kib_chunked_payload(
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
 ) -> None:
-    """Benchmark 100 GET requests with a payload of 512KiB."""
+    """Benchmark 100 GET requests with a payload of 512KiB using read."""
     message_count = 100
     payload = b"a" * (2**19)
 
@@ -148,6 +148,36 @@ def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
 
 
+def test_one_hundred_get_requests_iter_chunks_on_512kib_chunked_payload(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 GET requests with a payload of 512KiB using iter_chunks."""
+    message_count = 100
+    payload = b"a" * (2**19)
+
+    async def handler(request: web.Request) -> web.Response:
+        resp = web.Response(body=payload)
+        resp.enable_chunked_encoding()
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
 def test_get_request_with_251308_compressed_chunked_payload(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
@@ -289,3 +319,158 @@ async def run_client_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_client_benchmark())
+
+
+def test_one_hundred_json_post_requests(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 100 JSON POST requests that c,heck the content-type."""
+    message_count = 100
+
+    async def handler(request: web.Request) -> web.Response:
+        _ = request.content_type
+        _ = request.charset
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("POST", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            await client.post("/", json={"key": "value"})
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_any(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_any."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_any():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_4096(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 4096."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 4096 iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(4096):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunked_65536(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunked 65536."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size, 64 KiB iter_chunked
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunked(65536):
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
+
+
+def test_ten_streamed_responses_iter_chunks(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark 10 streamed responses using iter_chunks."""
+    message_count = 10
+    data = b"x" * 65536  # 64 KiB chunk size
+
+    async def handler(request: web.Request) -> web.StreamResponse:
+        resp = web.StreamResponse()
+        await resp.prepare(request)
+        for _ in range(10):
+            await resp.write(data)
+        return resp
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_client_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(message_count):
+            resp = await client.get("/")
+            async for _ in resp.content.iter_chunks():
+                pass
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_client_benchmark())
diff --git tests/test_benchmarks_client_ws.py tests/test_benchmarks_client_ws.py
index 6d4cf309cad..044c1c1eb6d 100644
--- tests/test_benchmarks_client_ws.py
+++ tests/test_benchmarks_client_ws.py
@@ -2,6 +2,7 @@
 
 import asyncio
 
+import pytest
 from pytest_codspeed import BenchmarkFixture
 
 from aiohttp import web
@@ -40,19 +41,22 @@ def _run() -> None:
         loop.run_until_complete(run_websocket_benchmark())
 
 
+@pytest.mark.parametrize("msg_size", [6, MSG_SIZE * 4], ids=["small", "large"])
 def test_one_thousand_round_trip_websocket_binary_messages(
     loop: asyncio.AbstractEventLoop,
     aiohttp_client: AiohttpClient,
     benchmark: BenchmarkFixture,
+    msg_size: int,
 ) -> None:
     """Benchmark round trip of 1000 WebSocket binary messages."""
     message_count = 1000
+    raw_message = b"x" * msg_size
 
     async def handler(request: web.Request) -> web.WebSocketResponse:
         ws = web.WebSocketResponse()
         await ws.prepare(request)
         for _ in range(message_count):
-            await ws.send_bytes(b"answer")
+            await ws.send_bytes(raw_message)
         await ws.close()
         return ws
 
@@ -101,3 +105,67 @@ async def run_websocket_benchmark() -> None:
     @benchmark
     def _run() -> None:
         loop.run_until_complete(run_websocket_benchmark())
+
+
+def test_client_send_large_websocket_compressed_messages(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark send of compressed WebSocket binary messages."""
+    message_count = 10
+    raw_message = b"x" * 2**19  # 512 KiB
+
+    async def handler(request: web.Request) -> web.WebSocketResponse:
+        ws = web.WebSocketResponse()
+        await ws.prepare(request)
+        for _ in range(message_count):
+            await ws.receive()
+        await ws.close()
+        return ws
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_websocket_benchmark() -> None:
+        client = await aiohttp_client(app)
+        resp = await client.ws_connect("/", compress=15)
+        for _ in range(message_count):
+            await resp.send_bytes(raw_message)
+        await resp.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_websocket_benchmark())
+
+
+def test_client_receive_large_websocket_compressed_messages(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark receive of compressed WebSocket binary messages."""
+    message_count = 10
+    raw_message = b"x" * 2**19  # 512 KiB
+
+    async def handler(request: web.Request) -> web.WebSocketResponse:
+        ws = web.WebSocketResponse()
+        await ws.prepare(request)
+        for _ in range(message_count):
+            await ws.send_bytes(raw_message)
+        await ws.close()
+        return ws
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_websocket_benchmark() -> None:
+        client = await aiohttp_client(app)
+        resp = await client.ws_connect("/", compress=15)
+        for _ in range(message_count):
+            await resp.receive()
+        await resp.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_websocket_benchmark())
diff --git a/tests/test_benchmarks_web_fileresponse.py b/tests/test_benchmarks_web_fileresponse.py
new file mode 100644
index 00000000000..01aa7448c86
--- /dev/null
+++ b/tests/test_benchmarks_web_fileresponse.py
@@ -0,0 +1,105 @@
+"""codspeed benchmarks for the web file responses."""
+
+import asyncio
+import pathlib
+
+from multidict import CIMultiDict
+from pytest_codspeed import BenchmarkFixture
+
+from aiohttp import ClientResponse, web
+from aiohttp.pytest_plugin import AiohttpClient
+
+
+def test_simple_web_file_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_sendfile_fallback_response(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark creating 100 simple web.FileResponse without sendfile."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        transport = request.transport
+        assert transport is not None
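+        # Pretend sendfile is unavailable so the benchmark exercises the
+        # fallback copy path instead of os.sendfile.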
+        transport._sendfile_compatible = False  # type: ignore[attr-defined]
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def run_file_response_benchmark() -> None:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            await client.get("/")
+        await client.close()
+
+    @benchmark
+    def _run() -> None:
+        loop.run_until_complete(run_file_response_benchmark())
+
+
+def test_simple_web_file_response_not_modified(
+    loop: asyncio.AbstractEventLoop,
+    aiohttp_client: AiohttpClient,
+    benchmark: BenchmarkFixture,
+) -> None:
+    """Benchmark web.FileResponse that return a 304."""
+    response_count = 100
+    filepath = pathlib.Path(__file__).parent / "sample.txt"
+
+    async def handler(request: web.Request) -> web.FileResponse:
+        return web.FileResponse(path=filepath)
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    async def make_last_modified_header() -> CIMultiDict[str]:
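+        # Fetch once to learn the file's Last-Modified value, then replay it
+        # as If-Modified-Since so subsequent requests get a 304.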
+        client = await aiohttp_client(app)
+        resp = await client.get("/")
+        last_modified = resp.headers["Last-Modified"]
+        headers = CIMultiDict({"If-Modified-Since": last_modified})
+        return headers
+
+    async def run_file_response_benchmark(
+        headers: CIMultiDict[str],
+    ) -> ClientResponse:
+        client = await aiohttp_client(app)
+        for _ in range(response_count):
+            resp = await client.get("/", headers=headers)
+
+        await client.close()
+        return resp  # type: ignore[possibly-undefined]
+
+    headers = loop.run_until_complete(make_last_modified_header())
+
+    @benchmark
+    def _run() -> None:
+        resp = loop.run_until_complete(run_file_response_benchmark(headers))
+        assert resp.status == 304
diff --git tests/test_client_functional.py tests/test_client_functional.py
index b34ccdb600d..ba75e8e93c6 100644
--- tests/test_client_functional.py
+++ tests/test_client_functional.py
@@ -603,6 +603,30 @@ async def handler(request):
     assert txt == "Test message"
 
 
+async def test_ssl_client_alpn(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    ssl_ctx: ssl.SSLContext,
+) -> None:
+
+    async def handler(request: web.Request) -> web.Response:
+        assert request.transport is not None
+        sslobj = request.transport.get_extra_info("ssl_object")
+        return web.Response(text=sslobj.selected_alpn_protocol())
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
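+    # Advertise http/1.1 via ALPN so the handler can echo the negotiated protocol.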
+    ssl_ctx.set_alpn_protocols(("http/1.1",))
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    connector = aiohttp.TCPConnector(ssl=False)
+    client = await aiohttp_client(server, connector=connector)
+    resp = await client.get("/")
+    assert resp.status == 200
+    txt = await resp.text()
+    assert txt == "http/1.1"
+
+
 async def test_tcp_connector_fingerprint_ok(
     aiohttp_server,
     aiohttp_client,
@@ -3358,6 +3382,22 @@ async def handler(request: web.Request) -> web.Response:
     await server.close()
 
 
+async def test_aiohttp_request_ssl(
+    aiohttp_server: AiohttpServer,
+    ssl_ctx: ssl.SSLContext,
+    client_ssl_ctx: ssl.SSLContext,
+) -> None:
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, ssl=ssl_ctx)
+
+    async with aiohttp.request("GET", server.make_url("/"), ssl=client_ssl_ctx) as resp:
+        assert resp.status == 200
+
+
 async def test_yield_from_in_session_request(aiohttp_client: AiohttpClient) -> None:
     # a test for backward compatibility with yield from syntax
     async def handler(request):
diff --git tests/test_client_session.py tests/test_client_session.py
index 65f80b6abe9..6309c5daf2e 100644
--- tests/test_client_session.py
+++ tests/test_client_session.py
@@ -15,13 +15,14 @@
 from yarl import URL
 
 import aiohttp
-from aiohttp import client, hdrs, web
+from aiohttp import CookieJar, client, hdrs, web
 from aiohttp.client import ClientSession
 from aiohttp.client_proto import ResponseHandler
 from aiohttp.client_reqrep import ClientRequest
 from aiohttp.connector import BaseConnector, Connection, TCPConnector, UnixConnector
 from aiohttp.helpers import DEBUG
 from aiohttp.http import RawResponseMessage
+from aiohttp.pytest_plugin import AiohttpServer
 from aiohttp.test_utils import make_mocked_coro
 from aiohttp.tracing import Trace
 
@@ -634,8 +635,24 @@ async def handler(request):
     assert resp_cookies["response"].value == "resp_value"
 
 
-async def test_session_default_version(loop) -> None:
-    session = aiohttp.ClientSession(loop=loop)
+async def test_cookies_with_not_quoted_cookie_jar(
+    aiohttp_server: AiohttpServer,
+) -> None:
+    async def handler(_: web.Request) -> web.Response:
+        return web.Response()
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+    server = await aiohttp_server(app)
+    jar = CookieJar(quote_cookie=False)
+    cookies = {"name": "val=foobar"}
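+    # With quote_cookie=False the embedded "=" must be sent as-is, unquoted.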
+    async with aiohttp.ClientSession(cookie_jar=jar) as sess:
+        resp = await sess.request("GET", server.make_url("/"), cookies=cookies)
+    assert resp.request_info.headers.get("Cookie", "") == "name=val=foobar"
+
+
+async def test_session_default_version(loop: asyncio.AbstractEventLoop) -> None:
+    session = aiohttp.ClientSession()
     assert session.version == aiohttp.HttpVersion11
     await session.close()
 
diff --git tests/test_client_ws_functional.py tests/test_client_ws_functional.py
index 7ede7432adf..0ca57ab3ab2 100644
--- tests/test_client_ws_functional.py
+++ tests/test_client_ws_functional.py
@@ -315,7 +315,6 @@ async def test_concurrent_close(aiohttp_client) -> None:
     client_ws = None
 
     async def handler(request):
-        nonlocal client_ws
         ws = web.WebSocketResponse()
         await ws.prepare(request)
 
@@ -902,6 +901,7 @@ async def handler(request):
         assert resp.close_code is WSCloseCode.ABNORMAL_CLOSURE
         assert msg.type is WSMsgType.ERROR
         assert isinstance(msg.data, ServerTimeoutError)
+        assert str(msg.data) == "No PONG received after 0.05 seconds"
 
 
 async def test_close_websocket_while_ping_inflight(
@@ -935,7 +935,7 @@ async def delayed_send_frame(
         message: bytes, opcode: int, compress: Optional[int] = None
     ) -> None:
         assert opcode == WSMsgType.PING
-        nonlocal cancelled, ping_started
+        nonlocal cancelled
         ping_started.set_result(None)
         try:
             await asyncio.sleep(1)
diff --git tests/test_connector.py tests/test_connector.py
index 483759a4180..a3fffc447ae 100644
--- tests/test_connector.py
+++ tests/test_connector.py
@@ -3474,6 +3474,61 @@ async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
     await connector.close()
 
 
+async def test_connector_resolve_in_case_of_trace_cache_miss_exception(
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    token: ResolveResult = {
+        "hostname": "localhost",
+        "host": "127.0.0.1",
+        "port": 80,
+        "family": socket.AF_INET,
+        "proto": 0,
+        "flags": socket.AI_NUMERICHOST,
+    }
+
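+    # The first send_dns_cache_miss call raises; the retry below must still resolve.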
+    request_count = 0
+
+    class DummyTracer(Trace):
+        def __init__(self) -> None:
+            """Dummy"""
+
+        async def send_dns_cache_hit(self, *args: object, **kwargs: object) -> None:
+            """Dummy send_dns_cache_hit"""
+
+        async def send_dns_resolvehost_start(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_start"""
+
+        async def send_dns_resolvehost_end(
+            self, *args: object, **kwargs: object
+        ) -> None:
+            """Dummy send_dns_resolvehost_end"""
+
+        async def send_dns_cache_miss(self, *args: object, **kwargs: object) -> None:
+            nonlocal request_count
+            request_count += 1
+            if request_count <= 1:
+                raise Exception("first attempt")
+
+    async def resolve_response() -> List[ResolveResult]:
+        await asyncio.sleep(0)
+        return [token]
+
+    with mock.patch("aiohttp.connector.DefaultResolver") as m_resolver:
+        m_resolver().resolve.return_value = resolve_response()
+
+        connector = TCPConnector()
+        traces = [DummyTracer()]
+
+        with pytest.raises(Exception):
+            await connector._resolve_host("", 0, traces)
+
+        assert await connector._resolve_host("", 0, traces) == [token]
+
+    await connector.close()
+
+
 async def test_connector_does_not_remove_needed_waiters(
     loop: asyncio.AbstractEventLoop, key: ConnectionKey
 ) -> None:
diff --git tests/test_cookiejar.py tests/test_cookiejar.py
index bdcf54fa796..0b440bc2ca6 100644
--- tests/test_cookiejar.py
+++ tests/test_cookiejar.py
@@ -807,6 +807,7 @@ async def make_jar():
 async def test_dummy_cookie_jar() -> None:
     cookie = SimpleCookie("foo=bar; Domain=example.com;")
     dummy_jar = DummyCookieJar()
+    assert dummy_jar.quote_cookie is True
     assert len(dummy_jar) == 0
     dummy_jar.update_cookies(cookie)
     assert len(dummy_jar) == 0
diff --git tests/test_flowcontrol_streams.py tests/test_flowcontrol_streams.py
index 68e623b6dd7..9874cc2511e 100644
--- tests/test_flowcontrol_streams.py
+++ tests/test_flowcontrol_streams.py
@@ -4,6 +4,7 @@
 import pytest
 
 from aiohttp import streams
+from aiohttp.base_protocol import BaseProtocol
 
 
 @pytest.fixture
@@ -112,6 +113,15 @@ async def test_read_nowait(self, stream) -> None:
         assert res == b""
         assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
 
+    async def test_resumed_on_eof(self, stream: streams.StreamReader) -> None:
+        stream.feed_data(b"data")
+        assert stream._protocol.pause_reading.call_count == 1  # type: ignore[attr-defined]
+        assert stream._protocol.resume_reading.call_count == 0  # type: ignore[attr-defined]
+        stream._protocol._reading_paused = True
+
+        stream.feed_eof()
+        assert stream._protocol.resume_reading.call_count == 1  # type: ignore[attr-defined]
+
 
 async def test_flow_control_data_queue_waiter_cancelled(
     buffer: streams.FlowControlDataQueue,
@@ -180,3 +190,16 @@ async def test_flow_control_data_queue_read_eof(
     buffer.feed_eof()
     with pytest.raises(streams.EofStream):
         await buffer.read()
+
+
+async def test_stream_reader_eof_when_full() -> None:
+    loop = asyncio.get_event_loop()
+    protocol = BaseProtocol(loop=loop)
+    protocol.transport = asyncio.Transport()
+    stream = streams.StreamReader(protocol, 1024, loop=loop)
+
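+    # Overfilling past the high-water mark pauses the protocol; EOF must resume it.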
+    data_len = stream._high_water + 1
+    stream.feed_data(b"0" * data_len)
+    assert protocol._reading_paused
+    stream.feed_eof()
+    assert not protocol._reading_paused
diff --git tests/test_helpers.py tests/test_helpers.py
index 2a83032e557..a343cbdfedf 100644
--- tests/test_helpers.py
+++ tests/test_helpers.py
@@ -351,7 +351,6 @@ async def test_timer_context_timeout_does_swallow_cancellation() -> None:
     ctx = helpers.TimerContext(loop)
 
     async def task_with_timeout() -> None:
-        nonlocal ctx
         new_task = asyncio.current_task()
         assert new_task is not None
         with pytest.raises(asyncio.TimeoutError):
diff --git tests/test_http_writer.py tests/test_http_writer.py
index 0ed0e615700..420816b3137 100644
--- tests/test_http_writer.py
+++ tests/test_http_writer.py
@@ -2,19 +2,38 @@
 import array
 import asyncio
 import zlib
-from typing import Iterable
+from typing import Generator, Iterable
 from unittest import mock
 
 import pytest
 from multidict import CIMultiDict
 
-from aiohttp import ClientConnectionResetError, http
+from aiohttp import ClientConnectionResetError, hdrs, http
 from aiohttp.base_protocol import BaseProtocol
+from aiohttp.http_writer import _serialize_headers
 from aiohttp.test_utils import make_mocked_coro
 
 
 @pytest.fixture
-def buf():
+def enable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", False):
+        yield
+
+
+@pytest.fixture
+def disable_writelines() -> Generator[None, None, None]:
+    with mock.patch("aiohttp.http_writer.SKIP_WRITELINES", True):
+        yield
+
+
+@pytest.fixture
+def force_writelines_small_payloads() -> Generator[None, None, None]:
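+    # Drop MIN_PAYLOAD_FOR_WRITELINES to 1 byte so even tiny test payloads
+    # take the writelines() path.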
+    with mock.patch("aiohttp.http_writer.MIN_PAYLOAD_FOR_WRITELINES", 1):
+        yield
+
+
+@pytest.fixture
+def buf() -> bytearray:
     return bytearray()
 
 
@@ -92,6 +111,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None:
     assert b"da" == content.split(b"\r\n\r\n", 1)[-1]
 
 
+@pytest.mark.usefixtures("disable_writelines")
 async def test_write_large_payload_deflate_compression_data_in_eof(
     protocol: BaseProtocol,
     transport: asyncio.Transport,
@@ -100,6 +120,32 @@ async def test_write_large_payload_deflate_compression_data_in_eof(
     msg = http.StreamWriter(protocol, loop)
     msg.enable_compression("deflate")
 
+    await msg.write(b"data" * 4096)
+    assert transport.write.called  # type: ignore[attr-defined]
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    transport.write.reset_mock()  # type: ignore[attr-defined]
+
+    # This payload compresses to 20447 bytes
+    payload = b"".join(
+        [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)]
+    )
+    await msg.write_eof(payload)
+    chunks.extend([c[1][0] for c in list(transport.write.mock_calls)])  # type: ignore[attr-defined]
+
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+async def test_write_large_payload_deflate_compression_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+
     await msg.write(b"data" * 4096)
     assert transport.write.called  # type: ignore[attr-defined]
     chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
@@ -180,6 +226,26 @@ async def test_write_payload_deflate_compression_chunked(
     await msg.write(b"data")
     await msg.write_eof()
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
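+    # Chunked framing: each chunk is "<size-hex>\r\n<bytes>\r\n", ended by "0\r\n\r\n".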
+    expected = b"2\r\nx\x9c\r\na\r\nKI,I\x04\x00\x04\x00\x01\x9b\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof()
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -216,6 +282,26 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof(
     await msg.write(b"data")
     await msg.write_eof(b"end")
 
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    assert all(chunks)
+    content = b"".join(chunks)
+    assert content == expected
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    expected = b"2\r\nx\x9c\r\nd\r\nKI,IL\xcdK\x01\x00\x0b@\x02\xd2\r\n0\r\n\r\n"
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+    await msg.write(b"data")
+    await msg.write_eof(b"end")
+
     chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)]  # type: ignore[attr-defined]
     assert all(chunks)
     content = b"".join(chunks)
@@ -231,6 +317,34 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof(
     msg.enable_compression("deflate")
     msg.enable_chunking()
 
+    await msg.write(b"data" * 4096)
+    # This payload compresses to 1111 bytes
+    payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
+    await msg.write_eof(payload)
+
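+    # Re-assemble the deflate stream by stripping the chunked framing:
+    # the split alternates between size lines and payload segments.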
+    compressed = []
+    chunks = [c[1][0] for c in list(transport.write.mock_calls)]  # type: ignore[attr-defined]
+    chunked_body = b"".join(chunks)
+    split_body = chunked_body.split(b"\r\n")
+    while split_body:
+        if split_body.pop(0):
+            compressed.append(split_body.pop(0))
+
+    content = b"".join(compressed)
+    assert zlib.decompress(content) == (b"data" * 4096) + payload
+
+
+@pytest.mark.usefixtures("enable_writelines")
+@pytest.mark.usefixtures("force_writelines_small_payloads")
+async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines(
+    protocol: BaseProtocol,
+    transport: asyncio.Transport,
+    loop: asyncio.AbstractEventLoop,
+) -> None:
+    msg = http.StreamWriter(protocol, loop)
+    msg.enable_compression("deflate")
+    msg.enable_chunking()
+
     await msg.write(b"data" * 4096)
     # This payload compresses to 1111 bytes
     payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)])
@@ -421,3 +535,29 @@ async def test_set_eof_after_write_headers(
     msg.set_eof()
     await msg.write_eof()
     assert not transport.write.called
+
+
+@pytest.mark.parametrize(
+    "char",
+    [
+        "\n",
+        "\r",
+    ],
+)
+def test_serialize_headers_raises_on_new_line_or_carriage_return(char: str) -> None:
+    """Verify serialize_headers raises on cr or nl in the headers."""
+    status_line = "HTTP/1.1 200 OK"
+    headers = CIMultiDict(
+        {
+            hdrs.CONTENT_TYPE: f"text/plain{char}",
+        }
+    )
+
+    with pytest.raises(
+        ValueError,
+        match=(
+            "Newline or carriage return detected in headers. "
+            "Potential header injection attack."
+        ),
+    ):
+        _serialize_headers(status_line, headers)
diff --git tests/test_imports.py tests/test_imports.py
index 5a2bb76b03c..b3f545ad900 100644
--- tests/test_imports.py
+++ tests/test_imports.py
@@ -38,7 +38,7 @@ def test_web___all__(pytester: pytest.Pytester) -> None:
         # and even slower under pytest-xdist, especially in CI
         _XDIST_WORKER_COUNT * 100 * (1 if _IS_CI_ENV else 1.53)
         if _IS_XDIST_RUN
-        else 265
+        else 295
     ),
 }
 _TARGET_TIMINGS_BY_PYTHON_VERSION["3.13"] = _TARGET_TIMINGS_BY_PYTHON_VERSION["3.12"]
diff --git a/tests/test_leaks.py b/tests/test_leaks.py
new file mode 100644
index 00000000000..07b506bdb99
--- /dev/null
+++ b/tests/test_leaks.py
@@ -0,0 +1,37 @@
+import pathlib
+import platform
+import subprocess
+import sys
+
+import pytest
+
+IS_PYPY = platform.python_implementation() == "PyPy"
+
+
+@pytest.mark.skipif(IS_PYPY, reason="gc.DEBUG_LEAK not available on PyPy")
+@pytest.mark.parametrize(
+    ("script", "message"),
+    [
+        (
+            # Test that ClientResponse is collected after server disconnects.
+            # https://github.com/aio-libs/aiohttp/issues/10535
+            "check_for_client_response_leak.py",
+            "ClientResponse leaked",
+        ),
+        (
+            # Test that Request object is collected when the handler raises.
+            # https://github.com/aio-libs/aiohttp/issues/10548
+            "check_for_request_leak.py",
+            "Request leaked",
+        ),
+    ],
+)
+def test_leak(script: str, message: str) -> None:
+    """Run isolated leak test script and check for leaks."""
+    leak_test_script = pathlib.Path(__file__).parent.joinpath("isolated", script)
+
+    with subprocess.Popen(
+        [sys.executable, "-u", str(leak_test_script)],
+        stdout=subprocess.PIPE,
+    ) as proc:
+        assert proc.wait() == 0, message
diff --git tests/test_proxy.py tests/test_proxy.py
index 1679b68909f..83457de891f 100644
--- tests/test_proxy.py
+++ tests/test_proxy.py
@@ -207,6 +207,7 @@ async def make_conn():
         "aiohttp.connector.aiohappyeyeballs.start_connection",
         autospec=True,
         spec_set=True,
+        return_value=mock.create_autospec(socket.socket, spec_set=True, instance=True),
     )
     def test_proxy_connection_error(self, start_connection: Any) -> None:
         async def make_conn():
diff --git tests/test_proxy_functional.py tests/test_proxy_functional.py
index 0921d5487bb..02d77700d96 100644
--- tests/test_proxy_functional.py
+++ tests/test_proxy_functional.py
@@ -1,6 +1,7 @@
 import asyncio
 import os
 import pathlib
+import platform
 import ssl
 import sys
 from re import match as match_regex
@@ -202,6 +203,32 @@ async def test_https_proxy_unsupported_tls_in_tls(
     await asyncio.sleep(0.1)
 
 
+@pytest.mark.usefixtures("uvloop_loop")
+@pytest.mark.skipif(
+    platform.system() == "Windows" or sys.implementation.name != "cpython",
+    reason="uvloop is not supported on Windows and non-CPython implementations",
+)
+@pytest.mark.filterwarnings(r"ignore:.*ssl.OP_NO_SSL*")
+# Filter out the warning from
+# https://github.com/abhinavsingh/proxy.py/blob/30574fd0414005dfa8792a6e797023e862bdcf43/proxy/common/utils.py#L226
+# otherwise this test will fail because the proxy will die with an error.
+async def test_uvloop_secure_https_proxy(
+    client_ssl_ctx: ssl.SSLContext,
+    secure_proxy_url: URL,
+) -> None:
+    """Ensure HTTPS sites are accessible through a secure proxy without warning when using uvloop."""
+    conn = aiohttp.TCPConnector()
+    sess = aiohttp.ClientSession(connector=conn)
+    url = URL("https://example.com")
+
+    async with sess.get(url, proxy=secure_proxy_url, ssl=client_ssl_ctx) as response:
+        assert response.status == 200
+
+    await sess.close()
+    await conn.close()
+    await asyncio.sleep(0.1)
+
+
 @pytest.fixture
 def proxy_test_server(aiohttp_raw_server, loop, monkeypatch):
     # Handle all proxy requests and imitate remote server response.
diff --git tests/test_streams.py tests/test_streams.py
index fcf13a91eb3..1b65f771c77 100644
--- tests/test_streams.py
+++ tests/test_streams.py
@@ -1141,6 +1141,7 @@ async def test_empty_stream_reader() -> None:
     with pytest.raises(asyncio.IncompleteReadError):
         await s.readexactly(10)
     assert s.read_nowait() == b""
+    assert s.total_bytes == 0
 
 
 async def test_empty_stream_reader_iter_chunks() -> None:
diff --git tests/test_urldispatch.py tests/test_urldispatch.py
index 8ee3df33202..ba6bdff23a0 100644
--- tests/test_urldispatch.py
+++ tests/test_urldispatch.py
@@ -358,7 +358,7 @@ def test_add_static_path_resolution(router: Any) -> None:
     """Test that static paths are expanded and absolute."""
     res = router.add_static("/", "~/..")
     directory = str(res.get_info()["directory"])
-    assert directory == str(pathlib.Path.home().parent)
+    assert directory == str(pathlib.Path.home().resolve(strict=True).parent)
 
 
 def test_add_static(router) -> None:
diff --git tests/test_web_app.py tests/test_web_app.py
index 6a86a3458a3..8c03a6041b2 100644
--- tests/test_web_app.py
+++ tests/test_web_app.py
@@ -144,6 +144,20 @@ def log(self, request, response, time):
     )
 
 
+async def test_app_make_handler_no_access_log_class(mocker) -> None:
+    srv = mocker.patch("aiohttp.web_app.Server")
+    app = web.Application(handler_args={"access_log": None})
+    app._make_handler(access_log=None)
+    srv.assert_called_with(
+        app._handle,
+        request_factory=app._make_request,
+        loop=asyncio.get_event_loop(),
+        access_log=None,
+        debug=mock.ANY,
+        access_log_class=mock.ANY,
+    )
+
+
 async def test_app_make_handler_raises_deprecation_warning() -> None:
     app = web.Application()
 
diff --git tests/test_web_functional.py tests/test_web_functional.py
index a3a990141a1..e4979851300 100644
--- tests/test_web_functional.py
+++ tests/test_web_functional.py
@@ -2324,3 +2324,41 @@ async def handler(request: web.Request) -> web.Response:
         # Make 2nd request which will hit the race condition.
         async with client.get("/") as resp:
             assert resp.status == 200
+
+
+async def test_keepalive_expires_on_time(aiohttp_client: AiohttpClient) -> None:
+    """Test that the keepalive handle expires on time."""
+
+    async def handler(request: web.Request) -> web.Response:
+        body = await request.read()
+        assert b"" == body
+        return web.Response(body=b"OK")
+
+    app = web.Application()
+    app.router.add_route("GET", "/", handler)
+
+    connector = aiohttp.TCPConnector(limit=1)
+    client = await aiohttp_client(app, connector=connector)
+
+    loop = asyncio.get_running_loop()
+    now = loop.time()
+
+    # Patch loop time so we can control when the keepalive timeout is processed
+    with mock.patch.object(loop, "time") as loop_time_mock:
+        loop_time_mock.return_value = now
+        resp1 = await client.get("/")
+        await resp1.read()
+        request_handler = client.server.handler.connections[0]
+
+        # Ensure the keep alive handle is set
+        assert request_handler._keepalive_handle is not None
+
+        # Set the loop time to exactly the keepalive timeout
+        loop_time_mock.return_value = request_handler._next_keepalive_close_time
+
+        # sleep twice to ensure the keep alive timeout is processed
+        await asyncio.sleep(0)
+        await asyncio.sleep(0)
+
+        # Ensure the keep alive handle expires
+        assert request_handler._keepalive_handle is None
diff --git tests/test_web_log.py tests/test_web_log.py
index 0896c41c9e1..16c4b976daa 100644
--- tests/test_web_log.py
+++ tests/test_web_log.py
@@ -255,3 +255,29 @@ def enabled(self) -> bool:
     resp = await client.get("/")
     assert 200 == resp.status
     assert "This should not be logged" not in caplog.text
+
+
+async def test_logger_set_to_none(
+    aiohttp_server: AiohttpServer,
+    aiohttp_client: AiohttpClient,
+    caplog: pytest.LogCaptureFixture,
+) -> None:
+    """Test logger does nothing when access_log is set to None."""
+
+    async def handler(request: web.Request) -> web.Response:
+        return web.Response()
+
+    class Logger(AbstractAccessLogger):
+
+        def log(
+            self, request: web.BaseRequest, response: web.StreamResponse, time: float
+        ) -> None:
+            self.logger.critical("This should not be logged")  # pragma: no cover
+
+    app = web.Application()
+    app.router.add_get("/", handler)
+    server = await aiohttp_server(app, access_log=None, access_log_class=Logger)
+    client = await aiohttp_client(server)
+    resp = await client.get("/")
+    assert 200 == resp.status
+    assert "This should not be logged" not in caplog.text
diff --git tests/test_web_response.py tests/test_web_response.py
index f4acf23f61b..95769161804 100644
--- tests/test_web_response.py
+++ tests/test_web_response.py
@@ -10,7 +10,7 @@
 
 import aiosignal
 import pytest
-from multidict import CIMultiDict, CIMultiDictProxy
+from multidict import CIMultiDict, CIMultiDictProxy, MultiDict
 from re_assert import Matches
 
 from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs
@@ -1201,7 +1201,7 @@ def read(self, size: int = -1) -> bytes:
         (BodyPartReader("x", CIMultiDictProxy(CIMultiDict()), mock.Mock()), None),
         (
             mpwriter,
-            "--x\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
+            "--x\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 4\r\n\r\ntest",
         ),
     ),
 )
@@ -1479,3 +1479,15 @@ def test_text_is_json_encoded(self) -> None:
     def test_content_type_is_overrideable(self) -> None:
         resp = json_response({"foo": 42}, content_type="application/vnd.json+api")
         assert "application/vnd.json+api" == resp.content_type
+
+
+@pytest.mark.parametrize("loose_header_type", (MultiDict, CIMultiDict, dict))
+async def test_passing_cimultidict_to_web_response_not_mutated(
+    loose_header_type: type,
+) -> None:
+    req = make_request("GET", "/")
+    headers = loose_header_type({})
+    resp = Response(body=b"answer", headers=headers)
+    await resp.prepare(req)
+    assert resp.content_length == 6
+    assert not headers
diff --git tests/test_web_server.py tests/test_web_server.py
index 7b9b87a374a..d2f1341afe0 100644
--- tests/test_web_server.py
+++ tests/test_web_server.py
@@ -56,7 +56,9 @@ async def handler(request):
     assert txt.startswith("500 Internal Server Error")
     assert "Traceback" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_with_loop_debug(
@@ -85,7 +87,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
     logger.debug.reset_mock()
 
     # Now make another connection to the server
@@ -99,7 +103,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_without_loop_debug(
@@ -128,7 +134,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     # on the first request since the client may
     # be probing for TLS/SSL support which is
     # expected to fail
-    logger.debug.assert_called_with("Error handling request", exc_info=exc)
+    logger.debug.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_invalid_method_second_request(
@@ -159,7 +167,9 @@ async def handler(request: web.BaseRequest) -> web.Response:
     # BadHttpMethod should be logged as an exception
     # if its not the first request since we know
     # that the client already was speaking HTTP
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_logs_bad_status_line_as_exception(
@@ -184,7 +194,9 @@ async def handler(request: web.BaseRequest) -> NoReturn:
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" not in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_handler_timeout(
@@ -221,6 +233,24 @@ async def handler(request):
     logger.debug.assert_called_with("Ignored premature client disconnection")
 
 
+async def test_raw_server_does_not_swallow_base_exceptions(
+    aiohttp_raw_server: AiohttpRawServer, aiohttp_client: AiohttpClient
+) -> None:
+    class UnexpectedException(BaseException):
+        """Dummy base exception."""
+
+    async def handler(request: web.BaseRequest) -> NoReturn:
+        raise UnexpectedException()
+
+    loop = asyncio.get_event_loop()
+    loop.set_debug(True)
+    server = await aiohttp_raw_server(handler)
+    cli = await aiohttp_client(server)
+
+    with pytest.raises(client.ServerDisconnectedError):
+        await cli.get("/path/to", timeout=client.ClientTimeout(10))
+
+
 async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server, aiohttp_client):
     async def handler(request):
         resp = web.Response(text=str(request.rel_url))
@@ -254,7 +284,9 @@ async def handler(request):
     txt = await resp.text()
     assert "Traceback (most recent call last):\n" in txt
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception(aiohttp_raw_server, aiohttp_client):
@@ -278,7 +310,9 @@ async def handler(request):
         "</body></html>\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_raw_server_html_exception_debug(aiohttp_raw_server, aiohttp_client):
@@ -302,7 +336,9 @@ async def handler(request):
         "<pre>Traceback (most recent call last):\n"
     )
 
-    logger.exception.assert_called_with("Error handling request", exc_info=exc)
+    logger.exception.assert_called_with(
+        "Error handling request from %s", cli.host, exc_info=exc
+    )
 
 
 async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
@@ -311,7 +347,6 @@ async def test_handler_cancellation(unused_port_socket: socket.socket) -> None:
     port = sock.getsockname()[1]
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal event
         try:
             await asyncio.sleep(10)
         except asyncio.CancelledError:
@@ -353,7 +388,7 @@ async def test_no_handler_cancellation(unused_port_socket: socket.socket) -> None:
     started = False
 
     async def on_request(_: web.Request) -> web.Response:
-        nonlocal done_event, started, timeout_event
+        nonlocal started
         started = True
         await asyncio.wait_for(timeout_event.wait(), timeout=5)
         done_event.set()
diff --git tests/test_web_urldispatcher.py tests/test_web_urldispatcher.py
index 92066f09b7d..ee60b6917c5 100644
--- tests/test_web_urldispatcher.py
+++ tests/test_web_urldispatcher.py
@@ -585,16 +585,17 @@ async def test_access_mock_special_resource(
     my_special.touch()
 
     real_result = my_special.stat()
-    real_stat = pathlib.Path.stat
+    real_stat = os.stat
 
-    def mock_stat(self: pathlib.Path, **kwargs: Any) -> os.stat_result:
-        s = real_stat(self, **kwargs)
+    def mock_stat(path: Any, **kwargs: Any) -> os.stat_result:
+        s = real_stat(path, **kwargs)
         if os.path.samestat(s, real_result):
             mock_mode = S_IFIFO | S_IMODE(s.st_mode)
             s = os.stat_result([mock_mode] + list(s)[1:])
         return s
 
     monkeypatch.setattr("pathlib.Path.stat", mock_stat)
+    monkeypatch.setattr("os.stat", mock_stat)
 
     app = web.Application()
     app.router.add_static("/", str(tmp_path))
diff --git tests/test_web_websocket_functional.py tests/test_web_websocket_functional.py
index b7494d9265f..945096a2af3 100644
--- tests/test_web_websocket_functional.py
+++ tests/test_web_websocket_functional.py
@@ -797,6 +797,7 @@ async def handler(request: web.Request) -> NoReturn:
     assert ws.close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert ws_server_close_code == WSCloseCode.ABNORMAL_CLOSURE
     assert isinstance(ws_server_exception, asyncio.TimeoutError)
+    assert str(ws_server_exception) == "No PONG received after 0.025 seconds"
     await ws.close()
 
 
diff --git tests/test_websocket_handshake.py tests/test_websocket_handshake.py
index bbfa1d9260d..53d5d9152bb 100644
--- tests/test_websocket_handshake.py
+++ tests/test_websocket_handshake.py
@@ -174,7 +174,7 @@ async def test_handshake_protocol_unsupported(caplog) -> None:
 
     assert (
         caplog.records[-1].msg
-        == "Client protocols %r don’t overlap server-known ones %r"
+        == "%s: Client protocols %r don’t overlap server-known ones %r"
     )
     assert ws.ws_protocol is None
 
diff --git tests/test_websocket_parser.py tests/test_websocket_parser.py
index 7f8b98d4566..d1d96f716fd 100644
--- tests/test_websocket_parser.py
+++ tests/test_websocket_parser.py
@@ -27,6 +27,25 @@
 class PatchableWebSocketReader(WebSocketReader):
     """WebSocketReader subclass that allows for patching parse_frame."""
 
+    def parse_frame(
+        self, data: bytes
+    ) -> list[tuple[bool, int, Union[bytes, bytearray], int]]:
+        # This method is overridden to allow for patching in tests.
+        frames: list[tuple[bool, int, Union[bytes, bytearray], int]] = []
+
+        def _handle_frame(
+            fin: bool,
+            opcode: int,
+            payload: Union[bytes, bytearray],
+            compressed: int,
+        ) -> None:
+            # This method is overridden to allow for patching in tests.
+            frames.append((fin, opcode, payload, compressed))
+
+        with mock.patch.object(self, "_handle_frame", _handle_frame):
+            self._feed_data(data)
+        return frames
+
 
 def build_frame(
     message, opcode, use_mask=False, noheader=False, is_fin=True, compress=False
@@ -127,32 +146,32 @@ def test_feed_data_remembers_exception(parser: WebSocketReader) -> None:
     assert data == b""
 
 
-def test_parse_frame(parser) -> None:
+def test_parse_frame(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b00000001, 0b00000001))
     res = parser.parse_frame(b"1")
     fin, opcode, payload, compress = res[0]
 
-    assert (0, 1, b"1", False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, b"1", 0) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_frame_length0(parser) -> None:
+def test_parse_frame_length0(parser: PatchableWebSocketReader) -> None:
     fin, opcode, payload, compress = parser.parse_frame(
         struct.pack("!BB", 0b00000001, 0b00000000)
     )[0]
 
-    assert (0, 1, b"", False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, b"", 0) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_frame_length2(parser) -> None:
+def test_parse_frame_length2(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b00000001, 126))
     parser.parse_frame(struct.pack("!H", 4))
     res = parser.parse_frame(b"1234")
     fin, opcode, payload, compress = res[0]
 
-    assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, b"1234", 0) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None:
+def test_parse_frame_length2_multi_byte(parser: PatchableWebSocketReader) -> None:
     """Ensure a multi-byte length is parsed correctly."""
     expected_payload = b"1" * 32768
     parser.parse_frame(struct.pack("!BB", 0b00000001, 126))
@@ -160,10 +179,12 @@ def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None:
     res = parser.parse_frame(b"1" * 32768)
     fin, opcode, payload, compress = res[0]
 
-    assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) -> None:
+def test_parse_frame_length2_multi_byte_multi_packet(
+    parser: PatchableWebSocketReader,
+) -> None:
     """Ensure a multi-byte length with multiple packets is parsed correctly."""
     expected_payload = b"1" * 32768
     assert parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) == []
@@ -174,44 +195,53 @@ def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) ->
     res = parser.parse_frame(b"1" * 8192)
     fin, opcode, payload, compress = res[0]
     assert len(payload) == 32768
-    assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_frame_length4(parser: WebSocketReader) -> None:
+def test_parse_frame_length4(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b00000001, 127))
     parser.parse_frame(struct.pack("!Q", 4))
     fin, opcode, payload, compress = parser.parse_frame(b"1234")[0]
 
-    assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, b"1234", 0) == (fin, opcode, payload, compress)
 
 
-def test_parse_frame_mask(parser) -> None:
+def test_parse_frame_mask(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b00000001, 0b10000001))
     parser.parse_frame(b"0001")
     fin, opcode, payload, compress = parser.parse_frame(b"1")[0]
 
-    assert (0, 1, b"\x01", False) == (fin, opcode, payload, not not compress)
+    assert (0, 1, b"\x01", 0) == (fin, opcode, payload, compress)
 
 
-def test_parse_frame_header_reversed_bits(out, parser) -> None:
+def test_parse_frame_header_reversed_bits(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     with pytest.raises(WebSocketError):
         parser.parse_frame(struct.pack("!BB", 0b01100000, 0b00000000))
         raise out.exception()
 
 
-def test_parse_frame_header_control_frame(out, parser) -> None:
+def test_parse_frame_header_control_frame(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     with pytest.raises(WebSocketError):
         parser.parse_frame(struct.pack("!BB", 0b00001000, 0b00000000))
         raise out.exception()
 
 
-def _test_parse_frame_header_new_data_err(out, parser):
+@pytest.mark.xfail()
+def test_parse_frame_header_new_data_err(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     with pytest.raises(WebSocketError):
         parser.parse_frame(struct.pack("!BB", 0b000000000, 0b00000000))
         raise out.exception()
 
 
-def test_parse_frame_header_payload_size(out, parser) -> None:
+def test_parse_frame_header_payload_size(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     with pytest.raises(WebSocketError):
         parser.parse_frame(struct.pack("!BB", 0b10001000, 0b01111110))
         raise out.exception()
@@ -226,54 +256,45 @@ def test_parse_frame_header_payload_size(out, parser) -> None:
 )
 def test_ping_frame(
     out: WebSocketDataQueue,
-    parser: WebSocketReader,
+    parser: PatchableWebSocketReader,
     data: Union[bytes, bytearray, memoryview],
 ) -> None:
-    with mock.patch.object(parser, "parse_frame", autospec=True) as m:
-        m.return_value = [(1, WSMsgType.PING, b"data", False)]
-
-        parser.feed_data(data)
-        res = out._buffer[0]
-        assert res == ((WSMsgType.PING, b"data", ""), 4)
-
+    parser._handle_frame(True, WSMsgType.PING, b"data", 0)
+    res = out._buffer[0]
+    assert res == ((WSMsgType.PING, b"data", ""), 4)
 
-def test_pong_frame(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.PONG, b"data", False)]
 
-    parser.feed_data(b"")
+def test_pong_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None:
+    parser._handle_frame(True, WSMsgType.PONG, b"data", 0)
     res = out._buffer[0]
     assert res == ((WSMsgType.PONG, b"data", ""), 4)
 
 
-def test_close_frame(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"", False)]
-
-    parser.feed_data(b"")
+def test_close_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None:
+    parser._handle_frame(True, WSMsgType.CLOSE, b"", 0)
     res = out._buffer[0]
     assert res == ((WSMsgType.CLOSE, 0, ""), 0)
 
 
-def test_close_frame_info(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"0112345", False)]
-
-    parser.feed_data(b"")
+def test_close_frame_info(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(True, WSMsgType.CLOSE, b"0112345", 0)
     res = out._buffer[0]
     assert res == (WSMessage(WSMsgType.CLOSE, 12337, "12345"), 0)
 
 
-def test_close_frame_invalid(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"1", False)]
-    parser.feed_data(b"")
-
-    assert isinstance(out.exception(), WebSocketError)
-    assert out.exception().code == WSCloseCode.PROTOCOL_ERROR
+def test_close_frame_invalid(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    with pytest.raises(WebSocketError) as ctx:
+        parser._handle_frame(True, WSMsgType.CLOSE, b"1", 0)
+    assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR
 
 
-def test_close_frame_invalid_2(out, parser) -> None:
+def test_close_frame_invalid_2(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     data = build_close_frame(code=1)
 
     with pytest.raises(WebSocketError) as ctx:
@@ -282,7 +303,7 @@ def test_close_frame_invalid_2(out, parser) -> None:
     assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR
 
 
-def test_close_frame_unicode_err(parser) -> None:
+def test_close_frame_unicode_err(parser: PatchableWebSocketReader) -> None:
     data = build_close_frame(code=1000, message=b"\xf4\x90\x80\x80")
 
     with pytest.raises(WebSocketError) as ctx:
@@ -291,23 +312,21 @@ def test_close_frame_unicode_err(parser) -> None:
     assert ctx.value.code == WSCloseCode.INVALID_TEXT
 
 
-def test_unknown_frame(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.CONTINUATION, b"", False)]
-
+def test_unknown_frame(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     with pytest.raises(WebSocketError):
-        parser.feed_data(b"")
-        raise out.exception()
+        parser._handle_frame(True, WSMsgType.CONTINUATION, b"", 0)
 
 
-def test_simple_text(out, parser) -> None:
+def test_simple_text(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None:
     data = build_frame(b"text", WSMsgType.TEXT)
     parser._feed_data(data)
     res = out._buffer[0]
     assert res == ((WSMsgType.TEXT, "text", ""), 4)
 
 
-def test_simple_text_unicode_err(parser) -> None:
+def test_simple_text_unicode_err(parser: PatchableWebSocketReader) -> None:
     data = build_frame(b"\xf4\x90\x80\x80", WSMsgType.TEXT)
 
     with pytest.raises(WebSocketError) as ctx:
@@ -316,16 +335,29 @@ def test_simple_text_unicode_err(parser) -> None:
     assert ctx.value.code == WSCloseCode.INVALID_TEXT
 
 
-def test_simple_binary(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [(1, WSMsgType.BINARY, b"binary", False)]
+def test_simple_binary(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    data = build_frame(b"binary", WSMsgType.BINARY)
+    parser._feed_data(data)
+    res = out._buffer[0]
+    assert res == ((WSMsgType.BINARY, b"binary", ""), 6)
+
 
-    parser.feed_data(b"")
+def test_one_byte_at_a_time(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    """Send one byte at a time to the parser."""
+    data = build_frame(b"binary", WSMsgType.BINARY)
+    for i in range(len(data)):
+        parser._feed_data(data[i : i + 1])
     res = out._buffer[0]
     assert res == ((WSMsgType.BINARY, b"binary", ""), 6)
 
 
-def test_fragmentation_header(out, parser) -> None:
+def test_fragmentation_header(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     data = build_frame(b"a", WSMsgType.TEXT)
     parser._feed_data(data[:1])
     parser._feed_data(data[1:])
@@ -334,7 +366,54 @@ def test_fragmentation_header(out, parser) -> None:
     assert res == (WSMessage(WSMsgType.TEXT, "a", ""), 1)
 
 
-def test_continuation(out, parser) -> None:
+def test_large_message(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    large_payload = b"b" * 131072
+    data = build_frame(large_payload, WSMsgType.BINARY)
+    parser._feed_data(data)
+
+    res = out._buffer[0]
+    assert res == ((WSMsgType.BINARY, large_payload, ""), 131072)
+
+
+def test_large_masked_message(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    large_payload = b"b" * 131072
+    data = build_frame(large_payload, WSMsgType.BINARY, use_mask=True)
+    parser._feed_data(data)
+
+    res = out._buffer[0]
+    assert res == ((WSMsgType.BINARY, large_payload, ""), 131072)
+
+
+def test_fragmented_masked_message(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    large_payload = b"b" * 100
+    data = build_frame(large_payload, WSMsgType.BINARY, use_mask=True)
+    for i in range(len(data)):
+        parser._feed_data(data[i : i + 1])
+
+    res = out._buffer[0]
+    assert res == ((WSMsgType.BINARY, large_payload, ""), 100)
+
+
+def test_large_fragmented_masked_message(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    large_payload = b"b" * 131072
+    data = build_frame(large_payload, WSMsgType.BINARY, use_mask=True)
+    for i in range(0, len(data), 16384):
+        parser._feed_data(data[i : i + 16384])
+    res = out._buffer[0]
+    assert res == ((WSMsgType.BINARY, large_payload, ""), 131072)
+
+
+def test_continuation(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False)
     parser._feed_data(data1)
 
@@ -345,14 +424,9 @@ def test_continuation(out, parser) -> None:
     assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10)
 
 
-def test_continuation_with_ping(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (0, WSMsgType.PING, b"", False),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
-
+def test_continuation_with_ping(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
     data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False)
     parser._feed_data(data1)
 
@@ -368,90 +442,78 @@ def test_continuation_with_ping(out, parser) -> None:
     assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10)
 
 
-def test_continuation_err(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (1, WSMsgType.TEXT, b"line2", False),
-    ]
-
+def test_continuation_err(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
     with pytest.raises(WebSocketError):
-        parser._feed_data(b"")
+        parser._handle_frame(True, WSMsgType.TEXT, b"line2", 0)
 
 
-def test_continuation_with_close(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (0, WSMsgType.CLOSE, build_close_frame(1002, b"test", noheader=True), False),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
-
-    parser.feed_data(b"")
+def test_continuation_with_close(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
+    parser._handle_frame(
+        False,
+        WSMsgType.CLOSE,
+        build_close_frame(1002, b"test", noheader=True),
+        0,
+    )
+    parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0)
     res = out._buffer[0]
-    assert res, (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0)
+    assert res == (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0)
     res = out._buffer[1]
     assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10)
 
 
-def test_continuation_with_close_unicode_err(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (
-            0,
+def test_continuation_with_close_unicode_err(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
+    with pytest.raises(WebSocketError) as ctx:
+        parser._handle_frame(
+            False,
             WSMsgType.CLOSE,
             build_close_frame(1000, b"\xf4\x90\x80\x80", noheader=True),
-            False,
-        ),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
-
-    with pytest.raises(WebSocketError) as ctx:
-        parser._feed_data(b"")
-
+            0,
+        )
+    parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0)
     assert ctx.value.code == WSCloseCode.INVALID_TEXT
 
 
-def test_continuation_with_close_bad_code(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (0, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), False),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
-
+def test_continuation_with_close_bad_code(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
     with pytest.raises(WebSocketError) as ctx:
-        parser._feed_data(b"")
 
+        parser._handle_frame(
+            False, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), 0
+        )
     assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR
+    parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0)
 
 
-def test_continuation_with_close_bad_payload(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (0, WSMsgType.CLOSE, b"1", False),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
-
+def test_continuation_with_close_bad_payload(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
     with pytest.raises(WebSocketError) as ctx:
-        parser._feed_data(b"")
-
-    assert ctx.value.code, WSCloseCode.PROTOCOL_ERROR
+        parser._handle_frame(False, WSMsgType.CLOSE, b"1", 0)
+    assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR
+    parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0)
 
 
-def test_continuation_with_close_empty(out, parser) -> None:
-    parser.parse_frame = mock.Mock()
-    parser.parse_frame.return_value = [
-        (0, WSMsgType.TEXT, b"line1", False),
-        (0, WSMsgType.CLOSE, b"", False),
-        (1, WSMsgType.CONTINUATION, b"line2", False),
-    ]
+def test_continuation_with_close_empty(
+    out: WebSocketDataQueue, parser: PatchableWebSocketReader
+) -> None:
+    parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0)
+    parser._handle_frame(False, WSMsgType.CLOSE, b"", 0)
+    parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0)
 
-    parser.feed_data(b"")
     res = out._buffer[0]
-    assert res, (WSMessage(WSMsgType.CLOSE, 0, ""), 0)
+    assert res == (WSMessage(WSMsgType.CLOSE, 0, ""), 0)
     res = out._buffer[1]
     assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10)
 
@@ -506,7 +568,7 @@ def test_msgtype_aliases() -> None:
     assert aiohttp.WSMsgType.ERROR == aiohttp.WSMsgType.error
 
 
-def test_parse_compress_frame_single(parser) -> None:
+def test_parse_compress_frame_single(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001))
     res = parser.parse_frame(b"1")
     fin, opcode, payload, compress = res[0]
@@ -514,7 +576,7 @@ def test_parse_compress_frame_single(parser) -> None:
     assert (1, 1, b"1", True) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_compress_frame_multi(parser) -> None:
+def test_parse_compress_frame_multi(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b01000001, 126))
     parser.parse_frame(struct.pack("!H", 4))
     res = parser.parse_frame(b"1234")
@@ -534,7 +596,7 @@ def test_parse_compress_frame_multi(parser) -> None:
     assert (1, 1, b"1234", False) == (fin, opcode, payload, not not compress)
 
 
-def test_parse_compress_error_frame(parser) -> None:
+def test_parse_compress_error_frame(parser: PatchableWebSocketReader) -> None:
     parser.parse_frame(struct.pack("!BB", 0b01000001, 0b00000001))
     parser.parse_frame(b"1")
 
@@ -545,10 +607,8 @@ def test_parse_compress_error_frame(parser) -> None:
     assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR
 
 
-async def test_parse_no_compress_frame_single(
-    loop: asyncio.AbstractEventLoop, out: WebSocketDataQueue
-) -> None:
-    parser_no_compress = WebSocketReader(out, 0, compress=False)
+def test_parse_no_compress_frame_single(out: WebSocketDataQueue) -> None:
+    parser_no_compress = PatchableWebSocketReader(out, 0, compress=False)
     with pytest.raises(WebSocketError) as ctx:
         parser_no_compress.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001))
         parser_no_compress.parse_frame(b"1")
@@ -600,34 +660,28 @@ def test_pickle(self) -> None:
 def test_flow_control_binary(
     protocol: BaseProtocol,
     out_low_limit: WebSocketDataQueue,
-    parser_low_limit: WebSocketReader,
+    parser_low_limit: PatchableWebSocketReader,
 ) -> None:
     large_payload = b"b" * (1 + 16 * 2)
-    large_payload_len = len(large_payload)
-    with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m:
-        m.return_value = [(1, WSMsgType.BINARY, large_payload, False)]
-
-        parser_low_limit.feed_data(b"")
-
+    large_payload_size = len(large_payload)
+    parser_low_limit._handle_frame(True, WSMsgType.BINARY, large_payload, 0)
     res = out_low_limit._buffer[0]
-    assert res == (WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_len)
+    assert res == (WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_size)
     assert protocol._reading_paused is True
 
 
 def test_flow_control_multi_byte_text(
     protocol: BaseProtocol,
     out_low_limit: WebSocketDataQueue,
-    parser_low_limit: WebSocketReader,
+    parser_low_limit: PatchableWebSocketReader,
 ) -> None:
     large_payload_text = "𒀁" * (1 + 16 * 2)
     large_payload = large_payload_text.encode("utf-8")
-    large_payload_len = len(large_payload)
-
-    with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m:
-        m.return_value = [(1, WSMsgType.TEXT, large_payload, False)]
-
-        parser_low_limit.feed_data(b"")
-
+    large_payload_size = len(large_payload)
+    parser_low_limit._handle_frame(True, WSMsgType.TEXT, large_payload, 0)
     res = out_low_limit._buffer[0]
-    assert res == (WSMessage(WSMsgType.TEXT, large_payload_text, ""), large_payload_len)
+    assert res == (
+        WSMessage(WSMsgType.TEXT, large_payload_text, ""),
+        large_payload_size,
+    )
     assert protocol._reading_paused is True
diff --git tools/gen.py tools/gen.py
index ab2b39a2df0..24fb71bdd9d 100755
--- tools/gen.py
+++ tools/gen.py
@@ -7,7 +7,7 @@
 import multidict
 
 ROOT = pathlib.Path.cwd()
-while ROOT.parent != ROOT and not (ROOT / ".git").exists():
+while ROOT.parent != ROOT and not (ROOT / "pyproject.toml").exists():
     ROOT = ROOT.parent
 
 

Description

This PR updates various dependencies and GitHub Actions, adds Python 3.13 support, improves WebSocket performance, fixes memory leaks, hardens FileResponse, and restructures wheel building. It also bundles smaller optimizations and bug fixes across the codebase.

Changes

GitHub Actions Updates

  • Updated actions/cache to v4.2.0
  • Updated actions/upload-artifact and actions/download-artifact to v4
  • Increased timeout for deploy jobs
  • Updated Python version from 3.12 to 3.13.2 in the deploy workflow
  • Enhanced wheel building to include musllinux support for multiple architectures

Python 3.13 Support and Dependencies

  • Fixed compatibility with Python 3.13 by switching to the new mimetypes API and adjusting imports (see the shim after this list)
  • Updated multidict from 6.1.0 to 6.4.3
  • Updated packaging from 24.1 to 24.2
  • Updated various Sphinx components to newer versions
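
The mimetypes part of the 3.13 fix comes down to preferring the path-based API that Python 3.13 introduced. A minimal compatibility shim, assuming an illustrative helper name (guess_content_type is our own, not aiohttp's actual function):

import mimetypes
import sys


def guess_content_type(path: str) -> str:
    # Python 3.13 adds mimetypes.guess_file_type() for filesystem paths
    # and deprecates passing bare paths to guess_type().
    if sys.version_info >= (3, 13):
        ctype, _encoding = mimetypes.guess_file_type(path)
    else:
        ctype, _encoding = mimetypes.guess_type(path)
    return ctype or "application/octet-stream"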

WebSocket Optimizations

  • Complete rewrite of WebSocket reader for improved performance
  • Fixed reading fragmented WebSocket messages with masked payloads (masking recap in the sketch after this list)
  • Large-scale optimization for WebSocket buffer handling
  • Added better error messages for WebSocket disconnects
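
For context on the masked-fragment fix: RFC 6455 clients XOR every payload byte with a rotating 4-byte mask key, and the 3.11.17 regression was in undoing that mask when a message arrived split across reads. A pure-Python sketch of the primitive (aiohttp ships an accelerated version; this one is only illustrative):

def websocket_mask(mask_key: bytes, payload: bytearray) -> None:
    """XOR payload in place with the 4-byte mask key (RFC 6455, section 5.3)."""
    for i in range(len(payload)):
        payload[i] ^= mask_key[i % 4]

Because XOR is an involution, applying the same function twice restores the original bytes, which is why one helper serves both masking and unmasking.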

Memory Leak Fixes

  • Added two isolated tests for client response and request leak detection
  • Broke cyclic references at connection close when a traceback was stored (see the schematic sketch below)
  • Broke cyclic references when an exception is raised while handling a request
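
The cyclic-reference fixes follow a common CPython pattern: an exception stored on a long-lived object carries a traceback whose frames refer back to that object, so nothing is freed until the cyclic garbage collector runs. A schematic sketch under assumed names (Connection and _exc are illustrative, not aiohttp's internals):

from typing import Optional


class Connection:
    """Toy stand-in for a long-lived transport object."""

    def __init__(self) -> None:
        self._exc: Optional[BaseException] = None

    def set_exception(self, exc: BaseException) -> None:
        # exc.__traceback__ references frames that may in turn reference
        # self, closing a cycle: self -> exc -> traceback -> frame -> self.
        self._exc = exc

    def close(self) -> None:
        # Dropping the stored exception at close time breaks the cycle,
        # letting reference counting reclaim memory without waiting for gc.
        self._exc = None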

Security Improvements

  • Disabled zero copy writes in the StreamWriter to address CVE-2024-12254
  • Added validation for headers to prevent header injection attacks (illustrated below)
  • Re-enabled zero copy writes when using Python versions that fix the CVE
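
The header-validation hardening reduces to rejecting control characters in values before serialization, since a CR/LF embedded in a value would let a caller smuggle extra header lines. An illustrative check only, not aiohttp's actual validator:

import re

_BAD_HDR_CHARS = re.compile(r"[\r\n\x00]")


def validate_header_value(value: str) -> str:
    # CR, LF and NUL are the characters that can terminate or split the
    # header block; reject them outright.
    if _BAD_HDR_CHARS.search(value):
        raise ValueError(f"Invalid character in header value: {value!r}")
    return value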

File Response Improvements

  • Refactored FileResponse to fix race condition and improve safety
  • Added proper cleanup of file objects using a global reference set (sketched after this list)
  • Improved error handling in FileResponse
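
A sketch of the "global reference set" idea, with names of our own choosing rather than aiohttp's: file objects opened for a response are parked in a module-level set so they can always be closed, even if sending fails partway through.

from typing import IO, Set

_OPEN_FILES: Set[IO[bytes]] = set()


def register_file(fobj: IO[bytes]) -> None:
    # A strong reference keeps the file object alive until it is
    # deliberately released, independent of the response's lifetime.
    _OPEN_FILES.add(fobj)


def release_file(fobj: IO[bytes]) -> None:
    _OPEN_FILES.discard(fobj)
    if not fobj.closed:
        fobj.close()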

Documentation and Build System

  • Updated changelog with entries for versions 3.11.10 through 3.11.18
  • Added new contributors to CONTRIBUTORS.txt
  • Enhanced Makefile with a new cythonize-nodeps target
  • Included missing files in the source distribution

Small Optimizations

  • Improved performance of serializing headers
  • Added caching for content type parsing (see the sketch after this list)
  • Optimized web server when access logging is disabled
  • Better handling of connection reuse in the web server
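
The content-type cache is plain memoization: a server sees the same handful of Content-Type strings over and over, so each distinct value only needs to be parsed once. A sketch with an illustrative parser and an assumed cache size (aiohttp's real parser handles more edge cases):

from functools import lru_cache
from typing import Dict, Tuple


@lru_cache(maxsize=64)  # cache size is an assumption, not aiohttp's value
def parse_content_type(raw: str) -> Tuple[str, Dict[str, str]]:
    # Callers must treat the returned dict as read-only, since the same
    # object is shared by every cache hit.
    mimetype, _, rest = raw.partition(";")
    params: Dict[str, str] = {}
    for item in rest.split(";"):
        key, _, value = item.strip().partition("=")
        if key:
            params[key.lower()] = value.strip().strip('"')
    return mimetype.strip().lower(), params

The sequence diagram below summarizes how these changes flow through CI, the build system, and the runtime modules.
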
sequenceDiagram
    participant CI as GitHub Actions CI
    participant Build as Build System
    participant Aiohttp as Aiohttp Core
    participant WS as WebSocket Module
    participant FR as FileResponse
    
    CI->>Build: Update actions/cache to v4.2.0
    CI->>Build: Update actions/upload-artifact to v4
    CI->>Build: Update actions/download-artifact to v4
    
    CI->>Build: Add musllinux wheel building
    CI->>Build: Update Python to 3.13.2 in deploy job
    
    Build->>Aiohttp: Fix Python 3.13 compatibility
    Build->>Aiohttp: Update multidict dependency
    
    Aiohttp->>WS: Optimize WebSocket reader
    Aiohttp->>WS: Fix masked fragmented messages
    Aiohttp->>WS: Improve error messages
    
    Aiohttp->>Aiohttp: Add header injection protection
    Aiohttp->>Aiohttp: Handle CVE-2024-12254 safely
    
    Aiohttp->>FR: Fix race condition
    Aiohttp->>FR: Improve file handling
    Aiohttp->>FR: Better cleanup of resources
    
    Aiohttp->>Aiohttp: Break cyclic references
    Aiohttp->>Aiohttp: Fix memory leaks
    Aiohttp->>Aiohttp: Optimize HTTP performance
