Feature Description
Allow more specificity in requested chunk_dims for get_image_dask_data (_read_delayed).
Use Case
Dask best practices for chunk sizing: https://docs.dask.org/en/latest/delayed-best-practices.html
Currently, bioio only lets you choose which dimension axes to chunk on, and each chunk always spans the entire range of a chosen dimension.
When reading a large file with, say, T=500, C=2, Z=150, Y=1000, X=2000 (each XY slice is about 4 MB), there is no way to chunk by just a few, or even half, of the Z slices.
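To make the difference concrete, here is a minimal sketch in plain dask (not the bioio API) contrasting today's axis-based chunking with the numeric chunking this issue asks for, using the array shape from the example above:

import numpy as np
import dask.array as da

shape = (500, 2, 150, 1000, 2000)  # T, C, Z, Y, X from the example above

# Today's behavior: chunking by axis means every chunk spans the FULL
# extent of Y and X, so the smallest readable unit is a whole 4 MB plane
# (1000 * 2000 * 2 bytes for uint16).
by_axis = da.zeros(shape, dtype=np.uint16, chunks=(1, 1, 1, 1000, 2000))

# Requested behavior: explicit numeric chunk sizes, e.g. half a plane
# and ten Z slices per chunk rather than all 150.
numeric = da.zeros(shape, dtype=np.uint16, chunks=(1, 1, 10, 500, 1000))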
Solution
The exact API is open to discussion, but one option is to accept a chunk_size parameter that takes explicit numeric values per dimension.
As an example, the user might write:
im = BioImage(path)
dims = im.dims
im.get_image_dask_data(chunk_size=[1, 1, 1, dims.Y // 2, dims.X // 2])  # (defaults to something sensible?)
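For reference, a partial workaround exists today using dask's rechunk, though it only re-slices chunks after the fact; assuming BioImage exposes get_image_dask_data and dims as in current bioio, a sketch:

from bioio import BioImage

im = BioImage(path)
data = im.get_image_dask_data("TCZYX")
# Rechunk to half-planes. Note this does not change what _read_delayed
# loads per task: each underlying read still pulls a full plane before
# dask re-slices it, so the I/O granularity is unchanged.
half = data.rechunk((1, 1, 1, im.dims.Y // 2, im.dims.X // 2))

A native chunk_size parameter would improve on this by controlling the read granularity itself, not just the downstream chunk layout.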