Description
Is your feature request related to a problem?
Steps to reproduce:
- Create file:
dd if=/dev/zero bs=1M count=1024 | gzip > 1G.gz
- Start nginx on localhost with location:
location /1G.gz {
    try_files $uri /tmp/1G.gz;
    add_header Content-Encoding gzip;
}
- Try to download file:
async with aiohttp.ClientSession() as session:
    async with session.get('http://localhost/1G.gz') as response:
        async for content in response.content.iter_any():
            print(len(content))
- See memory usage. It grows very large. I run my service in Docker with a memory limit on the container, and the process gets OOMKilled.
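As a workaround sketch (the helper name `iter_decompressed` is mine, not part of aiohttp): automatic decompression can be turned off with `ClientSession(auto_decompress=False)`, and the raw gzip chunks can then be decompressed manually using the `max_length` argument of the standard-library `zlib` decompressor, so the peak decompressed buffer stays bounded:

```python
import zlib

async def iter_decompressed(raw_chunks, max_length=4096):
    """Decompress gzip-encoded chunks from an async iterator, yielding
    at most max_length decompressed bytes at a time."""
    d = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS)  # accept gzip framing
    async for chunk in raw_chunks:
        while chunk:
            out = d.decompress(chunk, max_length)
            if out:
                yield out
            # Input deferred because the output cap was hit is kept here
            # and fed back into the next decompress() call.
            chunk = d.unconsumed_tail
    tail = d.flush()
    if tail:
        yield tail
```

With a session created via `auto_decompress=False`, the download loop above becomes `async for content in iter_decompressed(response.content.iter_any()): ...`.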
Describe the solution you'd like
It would be great to have an option to pass max_length to ZLibDecompressor
(https://github.com/aio-libs/aiohttp/blob/master/aiohttp/compression_utils.py#L104).
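For context, this mirrors the `max_length` argument already available on `zlib.Decompress.decompress` in the standard library: it caps the returned buffer and parks the unprocessed input in `unconsumed_tail`. A minimal sketch of the semantics such an option would expose:

```python
import gzip
import zlib

# 1 MiB of zeros compresses to roughly a kilobyte of gzip data.
compressed = gzip.compress(b"\x00" * (1 << 20))

d = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS)  # accept gzip framing

# With max_length=0 (the current default), a single decompress call may
# return the entire 1 MiB at once. With a nonzero max_length, each call
# returns at most that many bytes:
total = 0
data = compressed
while data:
    out = d.decompress(data, 4096)  # never returns more than 4096 bytes
    total += len(out)
    data = d.unconsumed_tail        # input deferred by the output cap
total += len(d.flush())
print(total)  # 1048576
```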
Describe alternatives you've considered
Change the default max_length
from 0 (unlimited)
to 4 KiB or greater.
Related component
Client
Additional context
No response
Code of Conduct
- I agree to follow the aio-libs Code of Conduct