
disk caching very large items fails #313

Open
@jllanfranchi

Description

This is probably a limitation of sqlite3; we may need to use the filesystem instead. The workaround for now is to wrap cache writes in try/except so they fail gracefully, and simply move on as if the item weren't in the cache. However, a failed write can apparently corrupt the cache: items that were stored successfully become unretrievable, subsequent writes fail regardless of the item's size, and even checking whether a key is present fails.
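A minimal sketch of that workaround, assuming the cache behaves like a dict-like, sqlite3-backed store (`cache_set_safe` and its signature are hypothetical, not the actual cache API here):

```python
import sqlite3


def cache_set_safe(cache, key, value):
    """Try to store an item in the disk cache; on failure, proceed as if
    the item simply weren't cacheable (the interim workaround described
    above). `cache` is assumed to be a dict-like sqlite3-backed store."""
    try:
        cache[key] = value
    except (sqlite3.Error, OverflowError) as err:
        # Very large blobs can exceed sqlite3's size limits; the exact
        # exception raised depends on the sqlite3 build. Note that a
        # failed write may still leave the DB corrupted, so this only
        # degrades gracefully -- it doesn't fix the underlying problem.
        print("Could not cache item %r: %s" % (key, err))
```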

A possible solution would be to move to simple file storage, keeping a sqlite3 db alongside the files to store their metadata, and to implement inter-process locking. Once a lock has been obtained, the file can be written directly to disk, presumably in the cache dir (or a subdir of it) where the db resides.
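A hedged sketch of what that design could look like; all names (`CACHE_DIR`, `cache_meta`, `disk_cache_set`/`disk_cache_get`) are hypothetical. It leans on sqlite3's own write lock (via `isolation_level="IMMEDIATE"`) for the inter-process locking and stores only key-to-filename metadata in the db:

```python
import hashlib
import os
import pickle
import sqlite3

CACHE_DIR = "/tmp/cache"  # assumption: payload files and metadata.db live here
DB_PATH = os.path.join(CACHE_DIR, "metadata.db")


def _connect():
    os.makedirs(CACHE_DIR, exist_ok=True)
    # isolation_level="IMMEDIATE" makes the implicit BEGIN a BEGIN IMMEDIATE,
    # so the first write statement takes sqlite3's exclusive write lock --
    # which doubles as the inter-process lock for the file write below.
    conn = sqlite3.connect(DB_PATH, timeout=30, isolation_level="IMMEDIATE")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cache_meta (key TEXT PRIMARY KEY, fname TEXT)"
    )
    return conn


def disk_cache_set(key, value):
    """Store `value` as its own file; only metadata goes into sqlite3."""
    fname = hashlib.sha256(key.encode()).hexdigest() + ".pkl"
    conn = _connect()
    try:
        with conn:  # commits on success, rolls back the metadata on failure
            # Taking the write lock first serializes concurrent processes.
            conn.execute(
                "INSERT OR REPLACE INTO cache_meta (key, fname) VALUES (?, ?)",
                (key, fname),
            )
            # The payload goes straight to the filesystem, so there is no
            # sqlite3 blob-size ceiling. (Write-to-temp-then-rename would
            # be more robust; omitted for brevity.)
            with open(os.path.join(CACHE_DIR, fname), "wb") as f:
                pickle.dump(value, f, protocol=pickle.HIGHEST_PROTOCOL)
    finally:
        conn.close()


def disk_cache_get(key):
    """Return the cached value for `key`, or None if absent."""
    conn = _connect()
    try:
        row = conn.execute(
            "SELECT fname FROM cache_meta WHERE key = ?", (key,)
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        return None
    with open(os.path.join(CACHE_DIR, row[0]), "rb") as f:
        return pickle.load(f)
```

Since payloads never enter the db, a failed or oversized write can at worst leave a stray file behind; the metadata db itself stays small and uncorrupted.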

For now this is not a blocking bug, since caching works for everything except unusually large items.
