
Large plain/text files returned as gzip and library doesn't handle this case #19

Open
@grunichev

Description


python --version
Python 3.6.1

smartfile==2.19

Example from documentation:

>>> import shutil
>>> from smartfile import BasicClient
>>> api = BasicClient()
>>> f = api.get('/path/data/', 'small.txt')
>>> with open('small.txt', 'wb') as o:
...     shutil.copyfileobj(f, o)

This works fine with small files, but when the file is large (I tested with a 100 MB text file), f.getheaders().get('Content-Encoding') returns gzip, and instead of the expected source text file, the gzip-compressed content is downloaded and saved to disk.

It can be handled on the caller's side (note this requires import gzip and reads the whole response into memory before decompressing):

        if f.getheaders().get('Content-Encoding') == 'gzip':
            o.write(gzip.decompress(f.read()))
        else:
            shutil.copyfileobj(f, o)

but ideally I think the library should handle this case itself.
Or, at the least, the docs should be updated to mention this possible issue.
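For a 100 MB file, decompressing with gzip.decompress(f.read()) holds the entire body in memory. A streaming alternative is to wrap the response in gzip.GzipFile, which inflates lazily as it is copied. This is only a sketch, not smartfile's API: save_response is a hypothetical helper, and the response object is simulated here with io.BytesIO so the example is self-contained.

```python
import gzip
import io
import shutil

def save_response(resp, dest, content_encoding=None):
    """Copy a file-like response to dest, transparently inflating gzip bodies."""
    if content_encoding == 'gzip':
        # GzipFile reads and decompresses from the underlying stream in
        # chunks, so memory use stays bounded regardless of file size.
        with gzip.GzipFile(fileobj=resp) as gz:
            shutil.copyfileobj(gz, dest)
    else:
        shutil.copyfileobj(resp, dest)

# Simulated server response: a gzip-compressed text body.
body = gzip.compress(b'hello world\n' * 1000)
out = io.BytesIO()
save_response(io.BytesIO(body), out, content_encoding='gzip')
assert out.getvalue() == b'hello world\n' * 1000
```

With the real client one would pass f and the open output file, using f.getheaders().get('Content-Encoding') as the content_encoding argument.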
