Description
When uploading a file with streams/chunks set to dynamic, or with the chunk size set to anything smaller than the size of the file being uploaded, artifacts are introduced into the file. They are quite apparent after download, at least for image files, i.e. binary data.
On the client side we use the pipe function to encrypt the data before it reaches the server. It seems that every chunk gets encrypted individually, and the encrypted chunks are then somehow merged when the file is stored on the server. The file is then decrypted and encrypted once again. When the file is subsequently downloaded, the artifacts are present: missing or wrong pixel data.
If the chunk size is set to a number larger than the size of the file, the pipe function encrypts only once and everything seems to work fine. The problem is that such huge chunks pretty much freeze the browser during upload, which can take minutes.
Am I doing something wrong, or is there an issue with binary data not being preserved correctly when chunked?
ostrio:[email protected]
[email protected]
Seems like a client-side issue.