Large files in a serverless/edge environment #472
-
Hi everyone, I've been wanting to try this library, but I'm not sure whether and how it fits my use case. What I'm trying to put together is essentially a proxy that pulls files from a cloud blob store (like S3 or R2) and lets the user download them as a zipped archive, which would include some pretty large files of several GB each. I'm a bit confused about how zip.js works internally, particularly how it uses memory. Serverless environments have tight memory limits (e.g. Cloudflare Workers allow only 128 MB), so I'm wondering how zip.js could work in my case: would it be possible to pipe the zipped file through to a streamed HTTP response?
Replies: 1 comment
-
Hi,

The problem is that zip.js needs some improvements regarding error handling when using streams. Today, errors are not propagated into the streams, so you have to use the promises returned by the zip.js APIs to detect them. This issue makes the code less intuitive to write.

Below is an example running in Deno that shows what you want to achieve. It creates a zip file on the fly from data coming from third-party servers. Normally, this code should be able to generate zip files with constant memory consumption.

```javascript
import { serve } from "https://deno.land/[email protected]/http/server.ts";
import { ZipWriter, configure } from "https://deno.land/x/[email protected]/index.js";

serve(async () => {
  configure({ maxWorkers: 1 });
  const { readable, writable } = new TransformStream();
  const zipWriter = new ZipWriter(writable);
  // Queue the entries and the close() call; errors surface via these
  // promises, not via the output stream.
  const promises = [
    zipWriter.add("example.com.html", (await fetch("https://www.example.com")).body),
    zipWriter.add("example.org.html", (await fetch("https://www.example.org")).body),
    zipWriter.close()
  ];
  Promise.all(promises).catch(error => console.error(error));
  // Stream the zip to the client while it is being produced.
  return new Response(readable, {
    headers: [["Content-Disposition", "attachment; filename=\"hello.zip\""]]
  });
});
```

You can test it here: https://dash.deno.com/projects/bitter-kingfisher-23
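To illustrate the caveat about error propagation, here is a minimal sketch of the promise-based detection pattern using only standard Web Streams, with no zip.js involved. It assumes a runtime with global `TransformStream`, such as Deno or Node 18+; the deliberately failing producer is a hypothetical stand-in for a `zipWriter.add()` call whose upstream fetch fails.

```javascript
// Sketch: errors on the producing side of a TransformStream do not
// necessarily reach the consumer, so they must be observed via a promise.
async function main() {
  const { readable, writable } = new TransformStream();

  // Producer: writes one chunk, then fails (stand-in for a dropped fetch).
  const producer = (async () => {
    const writer = writable.getWriter();
    await writer.write(new TextEncoder().encode("partial zip data"));
    throw new Error("upstream fetch failed");
  })();

  // Consumer: reads the chunk that already arrived; it does not see the
  // failure, which is why the producer's promise must be checked separately.
  const reader = readable.getReader();
  const { value } = await reader.read();
  console.log("consumer got:", new TextDecoder().decode(value));

  // Detect the failure via the promise, as recommended for zip.js APIs.
  await producer.catch(error =>
    console.log("detected via promise:", error.message));
}

main();
```

The same pattern is what `Promise.all(promises).catch(...)` does in the Deno example above: the HTTP response keeps streaming from `readable`, while failures are observed on the side through the promises.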