DataLakeServiceAsyncClient chunked Flux<ByteBuffer> read() #45307

Open
@o-shevchenko

Description

Query/Question
Is there a way to configure the chunk size for the data stream?
As far as I can see, it returns 8 KB chunks and this can't be changed.
We would like to set the chunk size:

```kotlin
val chunkSize = DataSize.ofMegabytes(1).toBytes()
val data: Flux<ByteBuffer> = client
    .getFileSystemAsyncClient(fileSystemName)
    .getFileAsyncClient(path)
    .read(chunkSize) // <--- Set chunk size
```
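Until the SDK exposes such a parameter, one possible workaround is to coalesce the small buffers downstream into larger chunks before consuming them. The sketch below shows only the core coalescing logic over a list of `ByteBuffer`s; the `rechunk` helper is hypothetical (not part of the SDK), and in practice you would wrap this logic in a Reactor operator over the `Flux<ByteBuffer>`:

```kotlin
import java.nio.ByteBuffer

// Hypothetical helper: coalesce a sequence of small buffers into
// chunks of at most `chunkSize` bytes. The last chunk may be smaller.
fun rechunk(buffers: List<ByteBuffer>, chunkSize: Int): List<ByteBuffer> {
    val out = mutableListOf<ByteBuffer>()
    var current = ByteBuffer.allocate(chunkSize)
    for (buf in buffers) {
        val src = buf.duplicate() // don't disturb the caller's position
        while (src.hasRemaining()) {
            val n = minOf(src.remaining(), current.remaining())
            repeat(n) { current.put(src.get()) }
            if (!current.hasRemaining()) {
                current.flip()
                out.add(current)
                current = ByteBuffer.allocate(chunkSize)
            }
        }
    }
    if (current.position() > 0) { // flush the partial final chunk
        current.flip()
        out.add(current)
    }
    return out
}
```

The byte-by-byte copy keeps the sketch simple; a production version would use bulk `put` calls and emit chunks lazily as the upstream buffers arrive.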

Why is this not a Bug or a feature Request?
I'm not sure whether such common functionality already exists. If not, it would be great to convert this into a feature request.

Setup (please complete the following information if applicable):

  • OS: MacOS
  • IDE: IntelliJ
  • Library/Libraries: com.azure:azure-storage-file-datalake:12.23.0

Information Checklist
Kindly make sure that you have added all of the following information above and checked off the required fields; otherwise we will treat the issue as an incomplete report.

  • Query Added
  • Setup information Added

Metadata

Labels

  • Client — This issue points to a problem in the data-plane of the library.
  • HttpClient
  • Storage — Storage Service (Queues, Blobs, Files)
  • customer-reported — Issues that are reported by GitHub users external to the Azure organization.
  • feature-request — This issue requires a new behavior in the product in order to be resolved.
  • needs-team-attention — Workflow: This issue needs attention from Azure service team or SDK team
