Multiscale #15
Comments
Related (it was working before):

```python
from funlib.persistence import open_ds

path = "/nrs/cellmap/data/jrc_cos7-1a/jrc_cos7-1a.zarr/recon-1/em/fibsem-uint8/s2"
ds = open_ds(path, 'r')
ds.voxel_size
```

`ds.voxel_size` result: …
Our old zarr file (no multiscale) has this metadata:

```json
{
  "offset": [16900, 17156, 21164],
  "resolution": [2, 2, 2]
}
```

The new version of funlib.persistence can't detect the voxel_size from `resolution`.
You can set the default metadata in a couple of ways. If set in the …
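As a workaround for legacy data, the old `resolution`/`offset` attributes can be read out and passed to `open_ds` as explicit keyword overrides (the `voxel_size`/`offset` keyword names match the `open_ds` call shown later in this thread; treat the helper below as a hypothetical sketch, not part of funlib.persistence):

```python
# Legacy attributes as stored in the old zarr's .zattrs (values from the post above).
legacy = {"offset": [16900, 17156, 21164], "resolution": [2, 2, 2]}

def legacy_to_open_ds_kwargs(attrs: dict) -> dict:
    """Map old-style `resolution`/`offset` attributes onto the keyword
    overrides accepted by the new open_ds signature."""
    return {
        "voxel_size": tuple(attrs["resolution"]),
        "offset": tuple(attrs["offset"]),
    }

kwargs = legacy_to_open_ds_kwargs(legacy)
# ds = open_ds(path, "r", **kwargs)
print(kwargs)  # {'voxel_size': (2, 2, 2), 'offset': (16900, 17156, 21164)}
```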
Thank you. I fixed the old single-resolution data, but the multiscale bug remains:
```json
{
  "multiscales": [
    {
      "axes": [
        {"name": "z", "type": "space", "unit": "nanometer"},
        {"name": "y", "type": "space", "unit": "nanometer"},
        {"name": "x", "type": "space", "unit": "nanometer"}
      ],
      "coordinateTransformations": [
        {"scale": [1.0, 1.0, 1.0], "type": "scale"}
      ],
      "datasets": [
        {"path": "s0", "coordinateTransformations": [
          {"scale": [4.0, 4.0, 4.0], "type": "scale"},
          {"translation": [0.0, 0.0, 0.0], "type": "translation"}
        ]},
        {"path": "s1", "coordinateTransformations": [
          {"scale": [8.0, 8.0, 8.0], "type": "scale"},
          {"translation": [2.0, 2.0, 2.0], "type": "translation"}
        ]},
        {"path": "s2", "coordinateTransformations": [
          {"scale": [16.0, 16.0, 16.0], "type": "scale"},
          {"translation": [6.0, 6.0, 6.0], "type": "translation"}
        ]},
        {"path": "s3", "coordinateTransformations": [
          {"scale": [32.0, 32.0, 32.0], "type": "scale"},
          {"translation": [14.0, 14.0, 14.0], "type": "translation"}
        ]},
        {"path": "s4", "coordinateTransformations": [
          {"scale": [64.0, 64.0, 64.0], "type": "scale"},
          {"translation": [30.0, 30.0, 30.0], "type": "translation"}
        ]},
        {"path": "s5", "coordinateTransformations": [
          {"scale": [128.0, 128.0, 128.0], "type": "scale"},
          {"translation": [62.0, 62.0, 62.0], "type": "translation"}
        ]}
      ],
      "name": "/recon-1/em/fibsem-uint8",
      "version": "0.4"
    }
  ],
  "name": "fibsem-uint16"
}
```
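For metadata of this shape, the effective voxel size of each pyramid level is the elementwise product of the group-level `scale` transform and that dataset's `scale` transform. A minimal sketch (the `attrs` dict below is abbreviated from the metadata above; `voxel_size_for` is a hypothetical helper, not part of funlib.persistence):

```python
# Abbreviated copy of the multiscale attributes above (only s0 and s2 shown).
attrs = {
    "multiscales": [{
        "coordinateTransformations": [{"scale": [1.0, 1.0, 1.0], "type": "scale"}],
        "datasets": [
            {"path": "s0", "coordinateTransformations": [
                {"scale": [4.0, 4.0, 4.0], "type": "scale"},
                {"translation": [0.0, 0.0, 0.0], "type": "translation"},
            ]},
            {"path": "s2", "coordinateTransformations": [
                {"scale": [16.0, 16.0, 16.0], "type": "scale"},
                {"translation": [6.0, 6.0, 6.0], "type": "translation"},
            ]},
        ],
    }]
}

def voxel_size_for(attrs: dict, path: str) -> tuple:
    """Effective voxel size of one level: elementwise product of the
    group-level scale (if present) and that dataset's scale."""
    ms = attrs["multiscales"][0]
    group = [t["scale"] for t in ms.get("coordinateTransformations", [])
             if t["type"] == "scale"]
    for ds in ms["datasets"]:
        if ds["path"] == path:
            scale = next(t["scale"] for t in ds["coordinateTransformations"]
                         if t["type"] == "scale")
            if group:
                scale = [g * s for g, s in zip(group[0], scale)]
            return tuple(scale)
    raise KeyError(path)

print(voxel_size_for(attrs, "s2"))  # (16.0, 16.0, 16.0)
```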
For working with OME metadata I would recommend using an OME-NGFF-specific tool to read and write the metadata. Something like:

```python
offset, voxel_size, axis_names, units = read_ome_metadata("/path/to/pyramid", scale_level=0)
open_ds("path/to/pyramid/0", offset=offset, voxel_size=voxel_size, axis_names=axis_names, units=units)
```

The OME-NGFF metadata spec is quite extensive, so it is probably better to use a library dedicated to it rather than writing a custom implementation in … If you have a good library for reading/writing OME-NGFF metadata let me know; I plan on adding a …
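The `read_ome_metadata` function above is hypothetical. As a rough sketch of what such a reader could do for version-0.4 `multiscales` attributes (taking the already-loaded attributes dict rather than a path, to keep it self-contained; a real implementation should handle the spec's many variants):

```python
def read_ome_metadata(attrs: dict, scale_level: int = 0):
    """Extract (offset, voxel_size, axis_names, units) for one scale level
    from an OME-NGFF 0.4 `multiscales` attributes dict."""
    ms = attrs["multiscales"][0]
    axis_names = [a["name"] for a in ms["axes"]]
    units = [a.get("unit") for a in ms["axes"]]
    voxel_size = [1.0] * len(axis_names)
    offset = [0.0] * len(axis_names)
    # Group-level transforms apply to every dataset; dataset-level follow.
    transforms = list(ms.get("coordinateTransformations", []))
    transforms += ms["datasets"][scale_level]["coordinateTransformations"]
    for t in transforms:
        if t["type"] == "scale":
            voxel_size = [v * s for v, s in zip(voxel_size, t["scale"])]
        elif t["type"] == "translation":
            offset = [o + s for o, s in zip(offset, t["translation"])]
    return offset, voxel_size, axis_names, units
```

With the metadata posted earlier in this thread, scale level 2 would come back as voxel_size `[16.0, 16.0, 16.0]` and offset `[6.0, 6.0, 6.0]`, which could then be forwarded to `open_ds` as keyword overrides.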
Is there any way to have a …

Not at the moment, but that would be a good feature to add.
Hi @pattonw, did you encounter any issues when testing …
I tried looking into the ome-zarr-py repo. Perhaps it makes sense to utilize …
Update: after the funlib.persistence changes, DaCapo + funlib.show.neuroglancer no longer work with any CellMap data.
Add basic support for multiscale:
- array -> multiscale helper function
- open array from an OME-NGFF compliant pyramid