
Big file issue #634

Open

Description

@lyandr

Hello,

I am trying to download and upload files between S3 and the local filesystem using Knp Gaufrette in a Symfony project.

My code runs in a container on OpenShift and works fine for small files, but with a large file (1 GB) I hit memory issues: I am limited to 1.5 GB in my container. I don't understand why my memory usage grows so much. Maybe I misunderstand the concept of streams in PHP, but I thought that with a stream the file wasn't loaded into memory whole, but read chunk by chunk.
With my code I can see that as soon as I do $srcStream = $this->fs_s3->createStream($filename); $srcStream->open(new StreamMode('rb+')); my memory grows with the size of the file. A simplified version of my copy loop is shown below.
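For context, here is roughly what I'm doing (simplified; fs_nfs is the destination Gaufrette filesystem from my configuration, named like fs_s3, and the 1 MiB chunk size is just an example):

```php
<?php

use Gaufrette\StreamMode;

// Source: S3 filesystem; destination: NFS filesystem (both configured via Gaufrette)
$srcStream = $this->fs_s3->createStream($filename);
$dstStream = $this->fs_nfs->createStream($filename);

$srcStream->open(new StreamMode('rb'));
$dstStream->open(new StreamMode('wb'));

// Copy chunk by chunk; I expected memory usage to stay around the chunk size
while (!$srcStream->eof()) {
    $dstStream->write($srcStream->read(1024 * 1024)); // 1 MiB per iteration
}

$srcStream->close();
$dstStream->close();
```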
I also tried copy('gaufrette://s3/'.$filename, 'gaufrette://nfs/'.$filename); through the stream wrapper, but the result is the same.
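For completeness, the gaufrette:// protocol is registered roughly like this (a sketch following Gaufrette's StreamWrapper API; the map keys match the filesystem names in my URLs):

```php
<?php

use Gaufrette\StreamWrapper;

// Map the protocol prefixes to the configured filesystems, then register the wrapper
$map = StreamWrapper::getFilesystemMap();
$map->set('s3', $this->fs_s3);
$map->set('nfs', $this->fs_nfs);
StreamWrapper::register();

// After registration, plain PHP file functions work on gaufrette:// URLs
copy('gaufrette://s3/'.$filename, 'gaufrette://nfs/'.$filename);
```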

Am I using streams the wrong way? Any advice?

Thank you in advance for your help.
Regards
