ideas/discussion - "sparsity aware element feedback"
A use case would be learning rules for AI using sparse matrices, e.g. steps of backprop modifying a matrix of weights, or whatever else AI researchers may imagine (e.g. for backprop, 'vec_a' would be the previous layer's activations and 'vec_b' the error values). It also further bridges the gap between a "sparse matrix lib" and 'graph processing'.
Is there an existing interface in any matrix library that supports this functionality?
Are there other ways of expressing this (like "a Hadamard product of a sparse matrix and (the tensor product of vec_a, vec_b)")?
Is this already possible?
My preferred option would be an interface that takes a lambda function; people could then use it to apply whatever operations they liked (e.g. expressing a product operation). (Personally I also think it would be interesting to implement matrix multiply via a traversal taking generalised element combination & reduction functions as well.)
This is trivial enough for dense matrices, fairly trivial for COO against dense vectors, trickier for any compressed sparse format against sparse vectors, and where it would get extremely useful (and difficult) is in threaded implementations of these.
One may also want to consider different permutations of what to do with empty elements (e.g. should the operation apply only where 'a[i]', 'b[j]', and m[i][j] are all occupied, or at any occupied m[i][j] where either a[i] or b[j] is occupied?).
The Hadamard product is rather simple, but requires the left and right sides to have the same sparsity structure, in which case you can iterate over vec.data() of the lhs and rhs.
For the tensor product we have some code, as this is simply an Nx1 by 1xM matrix product. You can have a look at the `smmp` part of the code to get an idea of how to parallelize this. The serial part requires figuring out the sparsity pattern of the output, but this section could be optimized in the case of vectors.
mulimoen changed the title from 'element feedback' to "Arbitrary tensor products" on Apr 26, 2021