
Manual chunk upload for GCS #2480

@AmbroiseCouissin

Description


Hello! Thank you for the work on this SDK; it has all been working perfectly so far.

We are trying to upload chunks manually in the following way:

  1. The client calls our backend service, which in turn calls the Google.Cloud.Storage.V1 APIs to initiate a chunked upload of a large file (multiple GBs). For this we use client.CreateObjectUploader and then await uploader.InitiateSessionAsync, which gives us a session URI.

```csharp
// We only need the session URI at this point, so an empty stream is passed as content.
using MemoryStream emptyMemoryStream = new();
Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload uploader =
    client.CreateObjectUploader(command.BucketName, command.Key, GetMimeType(command.Key), emptyMemoryStream);

// The session URI is returned to the client and reused for every chunk.
Uri uploadUri = await uploader.InitiateSessionAsync(cancellationToken);
```
  2. The client uploads chunks of data manually, one by one, to our backend, which in turn calls the Google.Cloud.Storage.V1 APIs.
    Every time the client uploads a chunk of data, we do the following:

```csharp
// Recreate an uploader from the session URI, wrapping only the bytes of the current chunk.
ResumableUpload actualUploader = ResumableUpload.CreateFromUploadUri(
    new Uri(request.ResumableUrl),
    new MemoryStream(Convert.FromBase64String(request.ContentAsBase64)));

IUploadProgress uploadProgress = await actualUploader.ResumeAsync(new Uri(request.ResumableUrl), cancellationToken);
```

Unfortunately, every time a new chunk is uploaded it replaces the previous one instead of being appended to the object. We are not sure what we are missing, as we have tried a few variations of this approach.
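For reference, the behaviour we are after is what the raw resumable-upload protocol provides when each chunk is PUT to the session URI with a Content-Range header giving its absolute byte offsets. The sketch below shows how we picture that with a plain HttpClient; it is not what we currently run, and UploadChunkAsync is just a hypothetical helper name:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;

public static class ResumableChunkUploader
{
    // Hypothetical helper: PUTs a single chunk to the resumable session URI with an
    // explicit Content-Range, so the server stores it at the given offset instead of
    // treating the chunk as the whole object.
    public static async Task<int> UploadChunkAsync(
        HttpClient httpClient,
        Uri sessionUri,
        byte[] chunk,
        long offset,        // absolute position of this chunk within the final object
        long? totalLength,  // null for intermediate chunks if the final size isn't known yet
        CancellationToken cancellationToken)
    {
        using var request = new HttpRequestMessage(HttpMethod.Put, sessionUri)
        {
            Content = new ByteArrayContent(chunk)
        };

        long lastByte = offset + chunk.Length - 1;
        // Renders as "bytes 0-262143/*" for an intermediate chunk,
        // or e.g. "bytes 262144-300000/300001" for the final one.
        request.Content.Headers.ContentRange = totalLength.HasValue
            ? new ContentRangeHeaderValue(offset, lastByte, totalLength.Value)
            : new ContentRangeHeaderValue(offset, lastByte);

        HttpResponseMessage response = await httpClient.SendAsync(request, cancellationToken);
        // 308 ("Resume Incomplete") is expected after an intermediate chunk; 200/201 once the object is complete.
        return (int)response.StatusCode;
    }
}
```

(As far as we understand, every chunk except the last would also have to be a multiple of 256 KiB.)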

Can you please let us know if you have any idea what we are doing wrong?


Labels

priority: p3 (Desirable enhancement or fix. May not be included in next release.)
type: feature request ('Nice-to-have' improvement, new feature or different behavior or design.)
