I'm trying to zip a large file with streams: while reading the resulting stream I slice the zipped output into chunks and insert them into IndexedDB. I'm getting this error:
conflux.esm.js:815 Uncaught (in promise) RangeError: Array buffer allocation failed
at ZipTransformer._callee$ (conflux.esm.js:815:31)
Can't the library handle large zip files, or am I doing something wrong?
This is my code:
this.logger.info('zipping files', this.files);

const iterator = this.files.entries();
const myReadable = new ReadableStream({
  async pull(controller) {
    const { value, done } = await iterator.next();
    if (done) {
      controller.close();
    } else {
      controller.enqueue({
        name: `/${value[1].name}`,
        stream: () => value[1].stream(),
      });
    }
  },
});

const appDB = await openDB<DB>('db');
const writableStream = new WritableStream({
  async write(chunk) {
    await appDB.add('chunks', { transferId: '1', index: 'index', chunkOrder: 1, blob: chunk });
  },
  close() {
    console.log('[close]');
  },
  abort(reason) {
    /* … */
  },
});

await myReadable.pipeThrough(new Writer()).pipeThrough(chunkSlicer(640000)).pipeTo(writableStream);
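In case it's relevant, chunkSlicer is just a TransformStream that re-slices the zipped byte stream into fixed-size Uint8Array pieces. A simplified sketch of that kind of helper (names and buffering details are illustrative, not the exact code):

```typescript
// Re-slices an incoming byte stream into fixed-size Uint8Array chunks;
// the final chunk may be smaller than chunkSize.
function chunkSlicer(chunkSize: number): TransformStream<Uint8Array, Uint8Array> {
  let buffer = new Uint8Array(0);
  return new TransformStream<Uint8Array, Uint8Array>({
    transform(chunk, controller) {
      // Append the new bytes to whatever was carried over from the last call.
      const merged = new Uint8Array(buffer.length + chunk.length);
      merged.set(buffer);
      merged.set(chunk, buffer.length);
      buffer = merged;
      // Emit full-size slices while enough bytes are buffered.
      while (buffer.length >= chunkSize) {
        controller.enqueue(buffer.slice(0, chunkSize));
        buffer = buffer.slice(chunkSize);
      }
    },
    flush(controller) {
      // Emit any remaining bytes as the final, shorter chunk.
      if (buffer.length > 0) controller.enqueue(buffer);
    },
  });
}
```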