I need to allow users to upload files of roughly 150 MB to S3. It doesn't happen often, but since waffle_ecto uses the binary upload path, it loads the entire file into memory. I'd rather not pay for more memory when Waffle already supports streaming a file to S3 when it's given a path on disk instead of a binary.
Is there a way to configure waffle_ecto to write the upload to local disk first and then stream it to S3? That way I wouldn't have 150 MB in memory for every concurrent upload. On Gigalixir I'd essentially have to pay $10 per concurrent upload I want to support, just to avoid crashing from running out of memory.
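For context, this is roughly the behavior I'm hoping waffle_ecto can be configured to use under the hood: a chunked, disk-backed upload done directly with ExAws. The bucket name and file path below are placeholders, and it assumes `:ex_aws`/`:ex_aws_s3` are configured with credentials:

```elixir
# Stream a file from disk to S3 as a multipart upload, so only one
# chunk at a time is held in memory rather than the whole 150 MB.
# "my-bucket" and the temp-file path are placeholders.
"/tmp/plug_upload_tmpfile"
|> ExAws.S3.Upload.stream_file()
|> ExAws.S3.upload("my-bucket", "uploads/big_file.bin")
|> ExAws.request!()
```

As I understand it, Waffle already does something like this when the attachment comes in as a path rather than a binary, so maybe the question reduces to making sure the upload reaches Waffle as a `%Plug.Upload{}` temp file instead of an in-memory binary.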