
Arbitrary tensor products #287

Open
experiment9123 opened this issue Apr 26, 2021 · 1 comment

Comments

@experiment9123
Copy link

ideas/discussion - "sparsity-aware element feedback"
A use case would be learning rules for AI using sparse matrices, e.g. steps of backprop modifying a matrix of weights, or whatever else AI researchers may imagine (e.g. for backprop, 'vec_a' would be the previous layer's activations and 'vec_b' would be the error values). It would also further bridge the gap between a "sparse matrix lib" and 'graph processing'.

  • is there an existing interface in any existing matrix libraries that supports this functionality?
  • are there other ways of expressing this (like "a Hadamard product of a sparse matrix and (the tensor product of vec_a, vec_b)")?
  • is this already possible?
    My preferred option would be an interface that takes a lambda function; then people could use that to apply whatever operations they liked (e.g. expressing a product operation). (Personally I also think it would be interesting to implement matrix multiply via a traversal taking generalised element combination & reduction functions as well.)

This is trivial for dense matrices, fairly trivial for COO against dense vectors, trickier for any compressed sparse format against sparse vectors, and where it would get extremely useful (and difficult) is threaded implementations of these.
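For the simplest case above (a COO matrix against dense vectors), the closure-based interface could look roughly like the sketch below. The function name and the triplet-array representation are illustrative assumptions, not part of any existing library API.

```rust
// Hypothetical sketch: apply a user closure f(m_ij, a_i, b_j) to every
// stored (occupied) entry of a COO matrix, reading from dense vectors.
// `rows`, `cols`, `vals` are parallel triplet arrays; this covers the
// "only occupied m[i][j]" occupancy rule discussed in the issue.
fn update_occupied<F>(
    rows: &[usize],
    cols: &[usize],
    vals: &mut [f64],
    a: &[f64],
    b: &[f64],
    mut f: F,
) where
    F: FnMut(f64, f64, f64) -> f64,
{
    for k in 0..vals.len() {
        vals[k] = f(vals[k], a[rows[k]], b[cols[k]]);
    }
}

fn main() {
    // 2x2 matrix with stored entries at (0,0)=1.0 and (1,1)=2.0
    let rows = [0usize, 1];
    let cols = [0usize, 1];
    let mut vals = [1.0, 2.0];
    let a = [10.0, 20.0]; // e.g. previous-layer activations
    let b = [1.0, 2.0]; // e.g. error values
    // A gradient-style update: m_ij += a_i * b_j, only where m_ij is stored.
    update_occupied(&rows, &cols, &mut vals, &a, &b, |m, ai, bj| m + ai * bj);
    assert_eq!(vals, [11.0, 42.0]);
    println!("{:?}", vals);
}
```

Because the closure is arbitrary, the same traversal expresses a masked Hadamard product (`|_, ai, bj| ai * bj`) or any other per-element rule.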

One may also want to consider different permutations of what to do with empty elements (e.g. would we apply the function only where 'a[i]', 'b[j]', and m[i][j] are all occupied, or at any occupied m[i][j] where either a[i] or b[j] is occupied?).

@mulimoen
Collaborator

The Hadamard product is rather simple, but requires left and right to have the same sparsity structure. In that case you can iterate over vec.data() of lhs and rhs.
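As a sketch of why the shared-structure case is simple: once both operands agree on the sparsity pattern, only the value arrays need to be combined pairwise. Plain slices stand in for the data buffers here; this is an illustration, not the library's API.

```rust
// Same-structure Hadamard product: with identical sparsity patterns,
// the k-th stored value of lhs and rhs refer to the same (i, j) position,
// so a pairwise product over the value arrays is the whole operation.
fn hadamard_same_structure(lhs: &[f64], rhs: &[f64]) -> Vec<f64> {
    assert_eq!(lhs.len(), rhs.len(), "operands must share a sparsity pattern");
    lhs.iter().zip(rhs).map(|(x, y)| x * y).collect()
}

fn main() {
    let prod = hadamard_same_structure(&[1.0, 2.0, 3.0], &[4.0, 0.5, 6.0]);
    assert_eq!(prod, vec![4.0, 1.0, 18.0]);
    println!("{:?}", prod);
}
```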

For the tensor product we have some code, as this is simply an Nx1 by 1xM matrix product. You can have a look at the `smmp` part of the code to get an idea of how to parallelize this. The serial part requires figuring out the sparsity pattern of the output, but this section could be optimized in the case of vectors.
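A minimal serial sketch of that optimization, under the assumption that both vectors are stored as (indices, values) pairs: for an outer product, every pair of stored entries yields exactly one stored output entry, so the output pattern is known up front as nnz(a) * nnz(b) triplets with no symbolic pass needed. The function name is hypothetical.

```rust
// Sparse outer (tensor) product of an Nx1 vector a and a 1xM vector b,
// both in (index, value) form, emitting COO triplets (i, j, a_i * b_j).
// The output length is exactly a_val.len() * b_val.len(), so the
// sparsity pattern of the result requires no separate symbolic phase.
fn sparse_outer(
    a_idx: &[usize],
    a_val: &[f64],
    b_idx: &[usize],
    b_val: &[f64],
) -> Vec<(usize, usize, f64)> {
    let mut out = Vec::with_capacity(a_val.len() * b_val.len());
    for (&i, &av) in a_idx.iter().zip(a_val) {
        for (&j, &bv) in b_idx.iter().zip(b_val) {
            out.push((i, j, av * bv));
        }
    }
    out
}

fn main() {
    // a = [2, 0, 3] stored at indices {0, 2}; b = [0, 5] stored at index {1}
    let t = sparse_outer(&[0, 2], &[2.0, 3.0], &[1], &[5.0]);
    assert_eq!(t, vec![(0, 1, 10.0), (2, 1, 15.0)]);
    println!("{:?}", t);
}
```

Since the outer loop's iterations are independent and each row's output slot can be precomputed from the fixed pattern, this is also the shape one would parallelize, as the comment suggests.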

@mulimoen mulimoen changed the title 'element feedback' Arbitrary tensor products Apr 26, 2021