Open
Description
Query/Question
Is there a way to configure the chunk size of the data stream?
As far as I can see, it returns 8 KB chunks and this can't be changed.
We would like to set the chunk size:

```kotlin
val chunkSize = DataSize.ofMegabytes(1).toBytes()
val data: Flux<ByteBuffer> = client
    .getFileSystemAsyncClient(fileSystemName)
    .getFileAsyncClient(path)
    .read(chunkSize) // <--- Set chunk size
```
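For context on the kind of behavior we're after: even without SDK support, the small chunks could in principle be coalesced downstream. The sketch below is a plain-Java illustration of that idea (`rechunk` is a hypothetical helper, not an SDK API); with Reactor, the same logic would be applied to the `Flux<ByteBuffer>` the client returns.

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class Rechunk {
    // Hypothetical helper: coalesce arbitrarily sized ByteBuffer chunks
    // (e.g. the ~8 KB buffers the client emits) into fixed-size chunks.
    static List<ByteBuffer> rechunk(List<ByteBuffer> source, int chunkSize) {
        List<ByteBuffer> out = new ArrayList<>();
        ByteBuffer current = ByteBuffer.allocate(chunkSize);
        for (ByteBuffer in : source) {
            while (in.hasRemaining()) {
                // copy as many bytes as fit into the current chunk
                int n = Math.min(in.remaining(), current.remaining());
                ByteBuffer slice = in.duplicate();
                slice.limit(slice.position() + n);
                current.put(slice);
                in.position(in.position() + n);
                if (!current.hasRemaining()) {
                    current.flip();
                    out.add(current);
                    current = ByteBuffer.allocate(chunkSize);
                }
            }
        }
        if (current.position() > 0) { // emit trailing partial chunk
            current.flip();
            out.add(current);
        }
        return out;
    }

    public static void main(String[] args) {
        // three 8 KB buffers re-chunked into 16 KB chunks
        List<ByteBuffer> input = new ArrayList<>();
        for (int i = 0; i < 3; i++) input.add(ByteBuffer.allocate(8 * 1024));
        List<ByteBuffer> chunks = rechunk(input, 16 * 1024);
        System.out.println(chunks.size());            // 2
        System.out.println(chunks.get(0).remaining()); // 16384
        System.out.println(chunks.get(1).remaining()); // 8192
    }
}
```

This works, but it adds an extra copy of every byte, which is why a configurable chunk size on `read` itself would be preferable.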
Why is this not a Bug or a feature Request?
I'm not sure whether such common functionality already exists. If it doesn't, it would be great to convert this into a Feature Request.
Setup (please complete the following information if applicable):
- OS: MacOS
- IDE: IntelliJ
- Library/Libraries: com.azure:azure-storage-file-datalake:12.23.0
Information Checklist
Kindly make sure that you have added all of the following information above and checked off the required fields; otherwise we will treat the issue as an incomplete report
- Query Added
- Setup information Added
Metadata
Labels
- This issue points to a problem in the data-plane of the library.
- Storage Service (Queues, Blobs, Files)
- Issues that are reported by GitHub users external to the Azure organization.
- This issue requires a new behavior in the product in order to be resolved.
- Workflow: This issue needs attention from Azure service team or SDK team