Description
While exploring zarr.js performance on big datasets, I realized that there are some limitations when loading and decoding lots of chunks, since for now everything happens in the main thread (#33 mentioned this issue before). I played around a bit and found that using threads.js and its pool implementation speeds up loading dramatically, since decoding can happen in a web worker (threads.js also provides the same abstraction for running on Node). However, this required a rewrite of `getBasicSelection` in `ZarrArray`; a minimal sketch of the pool approach is shown below.
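For reference, this is roughly what the worker-pool decoding could look like with threads.js. It is only a sketch: the `./decode-worker` module, its `decodeChunk` method, and the pool size are assumptions for illustration, not existing zarr.js APIs.

```ts
// Main-thread side: fetch raw chunk bytes, but hand decoding to a worker pool.
import { spawn, Pool, Worker } from "threads";

// Hypothetical worker module exposing a decodeChunk(rawBytes) method.
const pool = Pool(() => spawn(new Worker("./decode-worker")), 4 /* pool size */);

async function getDecodedChunks(
  store: { getItem(key: string): Promise<ArrayBuffer> },
  chunkKeys: string[]
): Promise<Uint8Array[]> {
  const decoded = await Promise.all(
    chunkKeys.map(key =>
      pool.queue(async worker => {
        const raw = await store.getItem(key); // retrieval still happens in the main thread here
        return (await worker.decodeChunk(raw)) as Uint8Array; // decoding happens in the worker
      })
    )
  );
  await pool.terminate();
  return decoded;
}
```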
I understood from #33 that this might be mission creep, but maybe it would be an option to allow offloading the decoding to the store (perhaps by extending `DecodingStore`); the store implementation could then handle retrieval and decoding in a worker pool. (There are a few gotchas around what you can pass to and from a web worker, but one option is to send only the codec config and the chunk key to the worker; see the sketch after this paragraph.)
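A rough sketch of the worker side under that store-level scheme, assuming an HTTP-backed store so the chunk can be fetched inside the worker. The module name, the `retrieveAndDecode` method, and the base URL are placeholders; a real implementation would also rebuild the actual codec from its config (e.g. via a numcodecs-style registry) instead of only handling the uncompressed case:

```ts
// store-decode-worker.ts: the worker only ever receives structured-cloneable data
// (the codec config and the chunk key), which sidesteps most postMessage gotchas.
import { expose } from "threads/worker";

type CodecConfig = { id: string; [option: string]: unknown } | null;

// Assumed HTTP-backed store; a real DecodingStore would plug in its own retrieval here.
const STORE_BASE_URL = "https://example.com/data.zarr/";

async function fetchChunk(chunkKey: string): Promise<ArrayBuffer> {
  const response = await fetch(STORE_BASE_URL + chunkKey);
  if (!response.ok) throw new Error(`Failed to fetch chunk ${chunkKey}`);
  return response.arrayBuffer();
}

expose({
  // Retrieval and decoding both happen off the main thread.
  async retrieveAndDecode(codecConfig: CodecConfig, chunkKey: string): Promise<Uint8Array> {
    const raw = new Uint8Array(await fetchChunk(chunkKey));
    if (codecConfig === null) {
      return raw; // uncompressed chunk
    }
    // A real implementation would reconstruct the codec from codecConfig
    // and return codec.decode(raw) here.
    throw new Error(`Codec ${codecConfig.id} not wired up in this sketch`);
  },
});
```

On the main thread, the store could then queue `retrieveAndDecode(codecConfig, chunkKey)` on the pool instead of fetching and decoding inline.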