
Would it be possible to do a non-negative partial Tucker factorization? #498

Open
MelanieN opened this issue Apr 3, 2023 · 8 comments
Labels: new feature (Adding a new feature/method/algorithm)
@MelanieN commented Apr 3, 2023

Is your feature request related to a problem? Please describe.

I have non-negative data and only want to decompose it with respect to one mode.

Describe the solution you'd like

An algorithm combining non-negative Tucker and partial Tucker.

Describe alternatives you've considered

Trying to combine them myself

@cohenjer
Contributor

Hi @MelanieN,
What you are asking for is something we have implemented for CP and nonnegative CP, but not yet for Tucker.
If I understood your issue correctly, you would want to fit a nonnegative Tucker decomposition, say with factors (U,V,W) and a core G, but e.g. U and V are already known and fixed?
For CP we have an option skip_mode which allows you to not update some of the modes. It would be fairly easy, in my opinion, to add this functionality to Tucker, if this is what you are looking for.
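As an illustration of that CP mechanism (a minimal sketch, not necessarily the exact option named above: it assumes the fixed_modes keyword of non_negative_parafac_hals and made-up shapes), fixing the first CP factor might look like:

import tensorly as tl
from tensorly.cp_tensor import CPTensor
from tensorly.decomposition import non_negative_parafac_hals

data = tl.abs(tl.randn([5, 5, 5]))

# mode-0 factor assumed known and fixed; the other two factors start random
U0 = tl.abs(tl.randn([5, 3]))
cp_init = CPTensor((tl.ones(3), [U0, tl.abs(tl.randn([5, 3])), tl.abs(tl.randn([5, 3]))]))

# only modes 1 and 2 are updated; mode 0 keeps U0
cp_out = non_negative_parafac_hals(data, 3, init=cp_init, fixed_modes=[0])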

@JeanKossaifi
Member

Thanks @cohenjer. I think @MelanieN refers to partial_tucker, in which case we don't have any projection matrix at all along some of the modes.

We could use a full Tucker along with a skip_mode param and create a (fixed) identity for that mode, though ideally we'd just want to make an NN partial_tucker and skip computation entirely along the skipped modes.

@cohenjer
Contributor

@JeanKossaifi I see, sorry I misunderstood the issue. I don't ever use partial Tucker :)

We could definitely implement a partial nonnegative Tucker. I am working on nonnegative Tucker these days, so I might add a skip_mode option first; if I find the time I will also do the nonnegative_partial_tucker.

@JeanKossaifi
Member

JeanKossaifi commented Apr 12, 2023

Awesome! As a side note, going the other way may be easier: tucker is just partial_tucker with no skipped mode.
Typically the main difference is in the unfolding of the tensor along the mode being optimized, where the skipped modes are simply left out:

$$X_{[n]} = U_n \left(\text{Core} \times_{i \neq n,\ i \,\notin\, \text{skipped modes}} U_i \right)_{[n]}$$

which is a more efficient way to compute the expression with the Kronecker product, taking $U_k = \mathrm{Id}$ for the skipped modes:

$$X_{[n]} = U_n \, \text{Core}_{[n]} \left(U_1 \otimes \cdots \otimes U_{n-1} \otimes U_{n+1} \otimes \cdots \otimes U_N\right)^\top$$
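As a quick numerical illustration of this identity (a minimal sketch, assuming the default NumPy backend and tenalg.multi_mode_dot with its skip argument; no modes are skipped here besides the mode being solved for):

import tensorly as tl
from tensorly import tenalg

core = tl.randn([2, 3, 4])
U = [tl.randn([5, 2]), tl.randn([6, 3]), tl.randn([7, 4])]
n = 1  # mode being optimized

X = tenalg.multi_mode_dot(core, U)  # full reconstruction: core multiplied by every factor
lhs = tl.unfold(X, n)               # X_[n]
# multiply the core by all factors except U_n, unfold along n, then apply U_n
rhs = U[n] @ tl.unfold(tenalg.multi_mode_dot(core, U, skip=n), n)

print(tl.max(tl.abs(lhs - rhs)))  # should be numerically zero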

@MelanieN
Author

Hi, thanks for the reply! It is indeed as @JeanKossaifi mentions, but I am not so familiar with tensor theory, just applying it. I look forward to any updates on this topic!

@cohenjer
Contributor

Also, I just remembered: we do have a skip-mode option in non_negative_tucker_hals, which is another algorithm for computing NTD that I would recommend trying.

So to solve your issue, although this is not ideal performance-wise, you can do the following:
1/ use non_negative_tucker_hals() to compute NTD
2/ in the init= argument, pass a Tucker tensor with identity matrices along the modes that should not be decomposed.
3/ in the fixed_modes= argument, pass a list with the indices of the modes that should not be decomposed.
This should work just fine. Here is some tentative example code, if you want to decompose a tensor along modes 1 and 2 but keep mode 0 fixed:

import tensorly as tl
from tensorly.decomposition import non_negative_tucker_hals

# non-negative 5 x 5 x 5 data tensor
data = tl.abs(tl.randn([5, 5, 5]))
# init: identity factor on mode 0 (kept fixed), random non-negative factors on modes 1 and 2
tucker_init = tl.tucker_tensor.TuckerTensor(
    (tl.abs(tl.randn([5, 3, 3])),
     [tl.eye(5), tl.abs(tl.randn([5, 3])), tl.abs(tl.randn([5, 3]))])
)
# rank 5 along the fixed mode 0, rank 3 along the decomposed modes 1 and 2
out = non_negative_tucker_hals(data, [5, 3, 3], init=tucker_init, fixed_modes=[0])
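As a follow-up sanity check (a sketch building on the snippet above; it assumes the returned TuckerTensor unpacks into a core and a list of factors, and uses tucker_to_tensor for the reconstruction):

core, factors = out
print(tl.max(tl.abs(factors[0] - tl.eye(5))))  # mode-0 factor should be left untouched
rel_error = tl.norm(data - tl.tucker_to_tensor(out)) / tl.norm(data)
print(rel_error)  # relative reconstruction error of the rank-[5, 3, 3] NTD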

@MelanieN
Author

Thank you, I'll give this a try!

JeanKossaifi added the new feature (Adding a new feature/method/algorithm) label on May 25, 2023
@JeanKossaifi
Member

@MelanieN did this work for you?
