Command import:files too slow to be usable #2747

Description

@miiimooo

Hi,

I have a files backup that I copy to S3 by piping the output of `backup:get --element=files` through the AWS CLI. The backup is ~50 GB, and copying it from Google Storage to AWS S3 takes ~6 minutes.
Importing the same files archive using `import:files` reliably times out after exactly 2 hours. For comparison, transferring 50 GB at a (slow) 16 MB/s should take roughly 3,100 seconds, i.e. under an hour.
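Roughly what I'm running, as a sketch: `cli` stands in for the tool, the site and bucket names are placeholders, and I'm assuming `backup:get` prints a signed download URL and `import:files` accepts an archive URL (my actual invocations differ only in names):

```sh
# Sketch of the workflow above. SITE.ENV and my-bucket are placeholders;
# backup:get is assumed to print a signed download URL for the archive.

# 1. Export: stream the ~50 GB files backup straight into S3.
#    `aws s3 cp - s3://...` reads the upload from stdin, no local copy.
curl -sL "$(cli backup:get SITE.ENV --element=files)" \
  | aws s3 cp - s3://my-bucket/SITE-files.tar.gz
# -> ~6 minutes for 50 GB.

# 2. Import: hand import:files a URL for the same archive
#    (here a presigned S3 URL, valid for 24 hours).
url=$(aws s3 presign s3://my-bucket/SITE-files.tar.gz --expires-in 86400)
cli import:files SITE.ENV "$url"
# -> reliably times out after exactly 2 hours.
# 50 GB at 16 MB/s is ~3,100 s (~52 min), so transfer alone doesn't explain it.
```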

This is too slow to be usable for any larger site. From the outside it's hard to tell whether the bottleneck is the network transfer or the unpacking. For the record, I'm testing this on a sandbox site; are there limits on sandbox sites that could explain this?
