
Zipping large file 3.5 GB memory issue #116

@Guling85

Description

I'm trying to zip a large file (about 3.5 GB) with streams: while reading the zip stream I slice it into chunks and insert them into IndexedDB. I'm getting this error:

```
conflux.esm.js:815 Uncaught (in promise) RangeError: Array buffer allocation failed
    at ZipTransformer._callee$ (conflux.esm.js:815:31)
```

Can't the lib handle large zip files, or am I doing something wrong?

This is my code:

```ts
this.logger.info('zipping files', this.files);

const iterator = this.files.entries();

const myReadable = new ReadableStream({
  async pull(controller) {
    const { value, done } = await iterator.next();
    console.log('TEST', value);
    if (done) {
      controller.close();
    } else {
      console.log('TEST2', value);
      return controller.enqueue({
        name: `/${value[1].name}`,
        stream: () => value[1].stream(),
      });
    }
  },
});

const appDB = await openDB<DB>('db');

const writableStream = new WritableStream({
  start(controller) {},

  async write(chunk, controller) {
    await appDB.add('chunks', { transferId: '1', index: 'index', chunkOrder: 1, blob: chunk });
    //console.log('data', chunk);
    chunk = null;
  },
  close() {
    console.log('[close]');
  },
  abort(reason) {
    /* … */
  },
});

myReadable.pipeThrough(new Writer()).pipeThrough(chunkSlicer(640000)).pipeTo(writableStream);
```
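For context, `chunkSlicer` is my own helper, not part of conflux: it buffers the `Uint8Array` chunks coming out of the zip stream and re-emits them in fixed-size slices so each IndexedDB record has a predictable size. A minimal version looks roughly like this:

```js
// Sketch of a fixed-size slicing TransformStream (my own helper, not a
// conflux API). Assumes incoming chunks are Uint8Array.
function chunkSlicer(size) {
  let buffer = new Uint8Array(0);
  return new TransformStream({
    transform(chunk, controller) {
      // Append the incoming bytes to whatever was carried over.
      const merged = new Uint8Array(buffer.length + chunk.length);
      merged.set(buffer);
      merged.set(chunk, buffer.length);
      buffer = merged;
      // Emit as many full slices as possible.
      while (buffer.length >= size) {
        controller.enqueue(buffer.slice(0, size));
        buffer = buffer.slice(size);
      }
    },
    flush(controller) {
      // Emit the final partial slice, if any.
      if (buffer.length > 0) controller.enqueue(buffer);
    },
  });
}
```

Since `pipeTo` returns a promise, I assume awaiting it (rather than firing it off as above) would also let the stream machinery apply backpressure while IndexedDB writes are in flight.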
