Hi,
I have a files backup that was copied to S3 using the AWS CLI, piped from `backup:get --element=files`. The backup is ~50 GB in size; copying it from Google Storage to AWS S3 takes ~6 minutes.
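For reference, a minimal sketch of the export step I'm describing, assuming standard Terminus syntax; the site name `my-site`, environment `dev`, and bucket `my-backups` are hypothetical placeholders:

```sh
# backup:get prints a signed download URL for the latest files backup
# (it does not stream the archive itself), so fetch the URL first and
# then stream the download straight into S3 via stdin.
url="$(terminus backup:get my-site.dev --element=files)"
curl -sL "$url" | aws s3 cp - s3://my-backups/files_backup.tar.gz
```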
Importing the same files archive using `import:files` reliably times out after exactly 2 hours. By my estimate, transferring 50 GB at a slow 16 MB/s should take around 1 hour.
This is too slow for any larger site. From the outside it's hard to tell whether the bottleneck is the network transfer or the unpacking. For the record, I'm testing this on a sandbox site; are there limitations on sandbox sites that could explain this?
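And the import step that times out, again a sketch with the same hypothetical names; the archive URL handed to `import:files` is a presigned S3 link so the platform can fetch it:

```sh
# Presign a GET URL for the archive (valid for 6 hours here) and pass it
# to import:files, which downloads and unpacks it on the target environment.
url="$(aws s3 presign s3://my-backups/files_backup.tar.gz --expires-in 21600)"
terminus import:files my-site.dev "$url"
```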