Incorrect error raised for chunked Zarr region write #9557
Comments
Right, sorry, I guess the logic for checking chunks isn't quite right. Thanks a lot for spotting. This passes if we're only writing to the first chunk rather than the second:

```python
import xarray
import numpy as np

data = np.random.RandomState(0).randn(2920, 25, 53)
ds = xarray.Dataset({'temperature': (('time', 'lat', 'lon'), data)})
chunks = {'time': 1000, 'lat': 25, 'lon': 53}

store = 'testing.zarr'
ds.chunk(chunks).to_zarr(store, compute=False)

# region = {'time': slice(1000, 2000, 1)}
region = {'time': slice(0, 1000, 1)}

chunk = ds.isel(region)
chunk = chunk.chunk()  # triggers error
chunk.chunk().to_zarr(store, region=region)
```

...so somehow there's a small logic error there.
Hi, I will take a look at this case. As a workaround you can use `mode="a"`. I think the problem is that the algorithm is detecting that the region covers the last chunk, which is incorrect for this case.
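A minimal sketch of that workaround, reusing the names from the reproduction above; whether `mode="a"` avoids the check for every case is as stated in this comment, not verified here:

```python
# Assuming `store`, `region`, and `chunk` as defined in the snippet above,
# the suggested workaround is to write the region in append mode instead of
# the default region-write mode:
chunk.chunk().to_zarr(store, region=region, mode="a")
```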
Indeed, it was an error in the validation of the last chunk, sorry. I sent a new PR to fix the bug: #9559
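For context, the rule such a validation is meant to enforce is roughly: every Dask chunk of the region must start on a Zarr chunk boundary, and a chunk whose size is not a multiple of the Zarr chunk size is only acceptable when it is the final chunk and reaches the end of the on-disk array. The following is a simplified, hypothetical sketch of that rule, not xarray's actual implementation; per this thread, the bug was in how the "last chunk" case was identified:

```python
def region_chunks_are_safe(dask_chunks, zarr_chunk, region_start, dim_size):
    """Simplified illustration of a region-write alignment check along one dimension.

    dask_chunks: chunk sizes of the in-memory region along this dimension
    zarr_chunk: Zarr chunk size of the store along this dimension
    region_start: offset of the region within the on-disk array
    dim_size: total length of the on-disk array along this dimension
    """
    offset = region_start
    for i, size in enumerate(dask_chunks):
        is_last = i == len(dask_chunks) - 1
        if offset % zarr_chunk != 0:
            # Chunk does not start on a Zarr chunk boundary.
            return False
        if size % zarr_chunk != 0 and not (is_last and offset + size == dim_size):
            # A partial chunk is only safe if it truly ends the stored array;
            # the reported bug was that a region ending mid-array was treated
            # as if it covered the store's final chunk.
            return False
        offset += size
    return True
```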
What happened?
Writing a chunk with `to_zarr()` on a specific region incorrectly fails if a variable is chunked with Dask, even if the variable's chunks are compatible with the Zarr store.
What did you expect to happen?
This code path is used by Xarray-Beam. In particular, this test in Xarray-Beam fails with the latest development version of Xarray.
Minimal Complete Verifiable Example
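The MVCE from the original report is not preserved here; based on the failing case discussed in the comments above (writing the second `time` chunk), a reproduction might look like the following sketch:

```python
import numpy as np
import xarray

data = np.random.RandomState(0).randn(2920, 25, 53)
ds = xarray.Dataset({'temperature': (('time', 'lat', 'lon'), data)})
chunks = {'time': 1000, 'lat': 25, 'lon': 53}

store = 'testing.zarr'
ds.chunk(chunks).to_zarr(store, compute=False)

# Write the second 1000-element chunk along time; the Dask chunks of this
# region line up exactly with the Zarr chunks, so this should be accepted,
# but it raises an error on the affected versions.
region = {'time': slice(1000, 2000, 1)}
ds.isel(region).chunk().to_zarr(store, region=region)
```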
MVCE confirmation
Relevant log output
Anything else we need to know?
These error messages were introduced by #9527
Environment