Pass max_length parameter to ZLibDecompressor #8248

Open
@ikrivosheev

Description

Is your feature request related to a problem?

Steps to reproduce:

  1. Create a file: dd if=/dev/zero bs=1M count=1024 | gzip > 1G.gz
  2. Start nginx on localhost with this location block:

    location /1G.gz {
        try_files $uri /tmp/1G.gz;
        add_header Content-Encoding gzip;
    }

  3. Try to download the file:

    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.get('http://localhost/1G.gz') as response:
            async for content in response.content.iter_any():
                print(len(content))

  4. Watch the memory usage: it grows very large. I run my service in Docker with a memory limit on the container, and the process gets OOMKilled (the expansion is sketched just below this list).
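
The growth comes from the decompression step rather than the network read: response.content.iter_any() yields whatever the decompressor produces, and each small compressed chunk can expand into an arbitrarily large buffer because the decompressor is invoked with the default max_length of 0 (unbounded). A minimal stdlib-only sketch of that expansion (plain zlib instead of gzip, sizes chosen only for illustration):

    import zlib

    # A tiny compressed input expands into one huge buffer when the
    # decompressor is called with no max_length bound.
    compressed = zlib.compress(b"\x00" * (64 * 1024 * 1024))  # a few tens of KiB of compressed data
    decompressor = zlib.decompressobj()
    out = decompressor.decompress(compressed)                 # max_length omitted -> unbounded output
    print(len(compressed), len(out))                          # tiny input, ~64 MiB out in one piece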

Describe the solution you'd like

It would be great to have an option to pass max_length to ZLibDecompressor (https://github.com/aio-libs/aiohttp/blob/master/aiohttp/compression_utils.py#L104).
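
For reference, the underlying stdlib object already supports such a bound: zlib.Decompress.decompress() accepts a max_length argument and keeps any input it did not process in unconsumed_tail. Below is a rough sketch of how a caller could drain one compressed chunk in bounded pieces; iter_decompress and the 16 KiB default are hypothetical, and the wiring into aiohttp's ZLibDecompressor and payload parser is not shown:

    import zlib
    from typing import Iterator

    def iter_decompress(decompressor: "zlib._Decompress",
                        data: bytes,
                        max_length: int = 16 * 1024) -> Iterator[bytes]:
        # Emit at most max_length decompressed bytes per step; zlib keeps the
        # not-yet-processed input in decompressor.unconsumed_tail, so peak
        # memory stays bounded no matter how well the stream compresses.
        chunk = decompressor.decompress(data, max_length)
        while chunk:
            yield chunk
            if not decompressor.unconsumed_tail:
                break
            chunk = decompressor.decompress(decompressor.unconsumed_tail, max_length)

Keeping the parameter's default at 0 (unbounded, the current behaviour) would stay backward compatible while letting the HTTP payload parser opt into a bound.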

Describe alternatives you've considered

Change the default max_length from 0 to 4 KiB or greater.

Related component

Client

Additional context

No response

Code of Conduct

  • I agree to follow the aio-libs Code of Conduct

Metadata

Assignees

No one assigned

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
