should reductions return dense arrays? #99
Comments
It's not possible. If there's a missing row or missing column (which is totally possible per the spec) the result is required to be sparse: if a (sub)graph has no edges for a node, then we have to return no entry for that node. If there are no missing entries, then it will return a "dense" GBMatrix. That can then be unpacked or converted by a memcopy to a Matrix. I'm working on a new library that should support something like this, but it's a ways off, and we still want to support the graph method.
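The point about missing rows can be sketched with a toy model in Python (illustrative only; `rowreduce` and the dict-of-entries layout are hypothetical, not this library's API): reducing each row over stored entries only means a row with no stored entries yields no result at all, so the output must itself be sparse.

```python
from functools import reduce
from collections import defaultdict

def rowreduce(entries, op):
    """entries: dict mapping (i, j) -> value; returns dict i -> reduced value."""
    rows = defaultdict(list)
    for (i, _j), v in entries.items():
        rows[i].append(v)
    # Rows absent from `rows` simply produce no output entry.
    return {i: reduce(op, vs) for i, vs in rows.items()}

entries = {(0, 1): 1, (2, 0): 5, (2, 2): 7}   # row 1 has no stored entries
result = rowreduce(entries, lambda a, b: a + b)
# result == {0: 1, 2: 12}; there is no entry for row 1
```

There is no universally correct value to emit for the empty row (an arbitrary monoid need not have a usable identity), which is why the reduction cannot unconditionally densify.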
The semantics of `GBMatrix` do not seem entirely consistent with those of an `AbstractMatrix`:

```julia
julia> x = GBMatrix([1, 2], [2, 3], [1, 2], fill=100000)
2x3 GraphBLAS int64_t matrix, bitmap by row
2 entries, memory: 264 bytes
(1,2) 1
(2,3) 2

julia> x[1,1]
100000

julia> sum(x)
3
```

and so we have this weird thing where manually computing the sum with a for loop gives a different value compared to `sum`.
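The mismatch can be reproduced with a toy model in Python (a sketch with 0-based indices; `stored`, `getindex`, and `fill` are illustrative names, not the library's API):

```python
# Model of the 2x3 matrix above: two stored entries plus a fill value
# that indexing returns for every unstored position.
stored = {(0, 1): 1, (1, 2): 2}
fill = 100_000
nrows, ncols = 2, 3

def getindex(i, j):
    return stored.get((i, j), fill)

stored_sum = sum(stored.values())   # reduction over stored entries only: 3

loop_sum = sum(getindex(i, j) for i in range(nrows) for j in range(ncols))
# the elementwise loop sees the fill at all 4 unstored positions:
# 1 + 2 + 4 * 100_000 = 400_003
```

The reduction sees only the stored entries, while elementwise indexing sees the fill, so the two disagree whenever `fill != 0`.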
Yes, so there's a significant issue here I've yet to resolve. The main problem is that GraphBLAS operations act only on stored entries, so the fill value is invisible to them. This is solvable for reductions and elementwise operations, but then you get to multiplication and I don't have a great answer. The performance impact of densifying would be immense.
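For a reduction with a monoid like `+`, the fill can be respected without densifying by folding it in analytically: every unstored index contributes exactly `fill`, so an O(nnz) correction suffices. A hedged sketch (hypothetical names, not the library's implementation); note this trick does not extend to matrix multiplication, which is where the comment above has no good answer:

```python
# Fill-aware sum that never materializes the dense matrix.
# `stored` maps (i, j) -> value; every other index holds `fill`.
def filled_sum(stored, fill, nrows, ncols):
    n_unstored = nrows * ncols - len(stored)
    return sum(stored.values()) + fill * n_unstored

total = filled_sum({(0, 1): 1, (1, 2): 2}, 100_000, 2, 3)
# 1 + 2 + 4 * 100_000 = 400_003, matching the dense elementwise sum
```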
Keep in mind also that the use of this library as a SparseArray is mostly secondary to its use for graph algorithms.
Reductions on `SparseMatrixCSC` return dense arrays. Should the same happen for `GBMatrix`?

Current behavior is:
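For comparison (this is SciPy, not the `GBMatrix` output elided above): SciPy's sparse matrices show the same convention the issue describes for `SparseMatrixCSC`, with reductions producing a dense result.

```python
import numpy as np
from scipy import sparse

A = sparse.csr_matrix(np.array([[0, 1, 0],
                                [0, 0, 2]]))
col_sums = A.sum(axis=0)          # dense, not sparse
assert not sparse.issparse(col_sums)
assert isinstance(col_sums, np.ndarray)
```

This convention is unambiguous when implicit entries are zero; the thread above is about whether it can survive a nonzero `fill`.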