
Example showing standard linear algebra decompositions and solver #1003

Open
lessthanoptimal opened this issue Sep 29, 2020 · 11 comments

@lessthanoptimal

Issue Description

I'm in the process of adding ND4J to a Java linear algebra benchmark, but I'm having trouble adding decompositions such as Cholesky, QR, SVD, Eigen, etc., or solving square or rectangular systems. Could examples be added for these operations? I was able to find example code for matrix multiplication and element-wise operations just fine. I can see the JavaDoc for these operations, but it's fairly opaque. Thanks.

@saudet
Contributor

saudet commented Sep 29, 2020

/cc @rcorbish

@lessthanoptimal
Author

bump. I'll be updating the benchmark soon and would like to include this library. Thanks

@Lundez

Lundez commented Jan 19, 2021

@lessthanoptimal all the docs have been gathered under Deeplearning4J (DL4J).
API: https://deeplearning4j.org/api/latest/
"Guide": https://deeplearning4j.konduit.ai/operation-namespaces/linalg

It's a bit underwhelming, but the API isn't too hard to figure out with the guide (even though it might not be the best-documented guide, it does show the methods).

As you can see, Cholesky, QR, SVD (svd) and more exist in the linalg guide.
Not sure whether eigen actually exists (see: deeplearning4j/deeplearning4j#6764)
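
As a rough sketch, and an assumption on my part: I'm assuming the linalg namespace from that guide is exposed as Nd4j.linalg() in recent versions, so double-check the method names and signatures against the API docs above. Cholesky would then look something like this:

import org.nd4j.linalg.factory.Nd4j

fun main() {
    // A small symmetric positive-definite matrix
    val a = Nd4j.create(doubleArrayOf(4.0, 2.0, 2.0, 3.0), intArrayOf(2, 2))

    // Assumption: the linalg namespace is reachable via Nd4j.linalg();
    // verify the exact method names/signatures against the linked API docs.
    val chol = Nd4j.linalg().cholesky(a)
    println(chol)
}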

@lessthanoptimal
Author

I'm working on this now, and even after looking at those links I would consider this to be excessively difficult. So let's turn this discussion into documentation for ND4J. Could someone (maybe @Lundez) provide a simple code example of how to compute a Cholesky decomposition?

To make this easier for your users I would add these simple examples to https://deeplearning4j.konduit.ai/operation-namespaces/linalg . Right now, by looking at the functions in Cholesky, I can see where I specify the inputs and get the outputs, but I don't see a function that screams "compute", nor do I know whether the inputs are modified, as they are in some implementations of Cholesky.

@Lundez

Lundez commented Feb 1, 2021

@lessthanoptimal hi again,
I'll take a look later this week to confirm how it's done.

I know ND4J and benchmarking are not always the easiest: results can sometimes be improved by using row-major rather than column-major ordering and so on, on top of the usual JVM considerations like GC and warm-up.
I think it makes sense to use the default operations while benchmarking, because the benchmark should reflect how a typical user uses the library, not a professional ND4J user, right? 😄
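
For what it's worth, here's a minimal sketch of how ordering shows up in the API; I'm assuming the shape-plus-ordering overload is available in your ND4J version:

import org.nd4j.linalg.factory.Nd4j

fun main() {
    // Default ordering for new arrays: 'c' (row-major) or 'f' (column-major)
    println(Nd4j.order())

    // Assumption: this overload (shape plus ordering char) exists in your ND4J version
    val colMajor = Nd4j.create(intArrayOf(3, 3), 'f')
    println(colMajor.ordering())
}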

@Lundez

Lundez commented Feb 6, 2021

@lessthanoptimal Hi,

Sorry for taking so long, but I've been busy at work (recently joined a new team). 😅

Anyhow, ND4J is not always the clearest: it uses Executioners to execute operations on INDArrays, at least when it comes to transformations such as Cosine Distance, SVD and Cholesky.
All of these implement XYZOp, which in turn builds on BaseOp and ultimately Op.

In Transforms they've got some wrappers for common use cases, such as cosineSim, to make things simpler.
Unfortunately, Cholesky is not included, so you need to call the executioner yourself:

import org.nd4j.linalg.factory.Nd4j
// plus the import for the Cholesky op class; its package depends on the ND4J version

// A symmetric positive-definite input matrix
val someMatrix = Nd4j.createFromArray(arrayOf(
    arrayOf(19f, -3f, -3f),
    arrayOf(-3f, 19f, 15f),
    arrayOf(-3f, 15f, 13f)
))

// Build the op and hand it to the executioner; toList() just makes the output print nicely
val choleskyOp = Cholesky(someMatrix)
val choleskyDecomposition = Nd4j.getExecutioner().exec(choleskyOp).toList()
println(choleskyDecomposition)

Please note that this is Kotlin, not Java, so some minor changes might be required. The toList() is added to make the printout nicer (a raw Array/[] is not pretty 😅)


Finally, ND4J has started adding a new API through SameDiff and its SDVariables.
I haven't really wrapped my head around it yet, but my feeling is that INDArray is still as relevant as ever and that SDVariable is meant more for neural networks and integration with TensorFlow.
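
For completeness, here's a tiny sketch of that SameDiff/SDVariable style (just a matrix multiply, nothing benchmark-specific; treat it as an illustration rather than the recommended path):

import org.nd4j.autodiff.samediff.SameDiff
import org.nd4j.linalg.factory.Nd4j

fun main() {
    // Build a tiny graph from SDVariables and evaluate it
    val sd = SameDiff.create()
    val a = sd.var("a", Nd4j.rand(2, 3))   // wrap INDArrays as graph variables
    val b = sd.var("b", Nd4j.rand(3, 2))
    val c = a.mmul(b)                      // matrix multiply as a graph op

    // eval() runs the graph and returns a plain INDArray
    println(c.eval())
}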

I hope this was helpful 😄

@lessthanoptimal
Author

Thanks! I've got more of it filled out with a bunch of educated guesses. There's a numerical stability test (which needs to be fixed) that should tell me if I coded it up correctly. As you mentioned before, the only op I couldn't figure out was eigenvalue decomposition. That means ND4J will get dinged in the summary plots.

Do you know if there's any way to programmatically get the library's version? I'm hard-coding it by hand right now.

@Lundez

Lundez commented Feb 17, 2021

Do you know if there's any way to programmatically get the library's version?

What do you mean? I always use Gradle to build/include libraries.
I found the latest version through https://mvnrepository.com/artifact/org.nd4j/nd4j-native
I guess I've hard-coded it too; not sure how else you'd do it?
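
One generic (non-ND4J-specific) option, as a sketch: read the Implementation-Version from the jar manifest, which may come back null if the artifact doesn't set that attribute.

import org.nd4j.linalg.factory.Nd4j

fun main() {
    // Generic JVM approach: read Implementation-Version from the jar's manifest.
    // This can print null if the jar doesn't populate that manifest attribute.
    println(Nd4j::class.java.getPackage()?.implementationVersion)
}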

@lessthanoptimal
Author

Some libraries provide a way to get the version at runtime from the jar. In EJML I use a library I wrote that auto-generates the code from Gradle:

https://github.com/lessthanoptimal/gversion-plugin

Having said that, I guess I could write some Gradle code to dump the versions of all its dependencies at compile time instead of trying to determine them at runtime.
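
A sketch of what that Gradle code could look like (build.gradle.kts; the task name and output path are made up for illustration):

// build.gradle.kts — hypothetical task that writes resolved dependency versions
// to a text file at build time, which the benchmark could read back in
tasks.register("dumpDependencyVersions") {
    val outFile = layout.buildDirectory.file("generated/dependency-versions.txt")
    outputs.file(outFile)
    doLast {
        val lines = configurations.getByName("runtimeClasspath")
            .resolvedConfiguration.resolvedArtifacts
            .map { a -> "${a.moduleVersion.id.group}:${a.moduleVersion.id.name}:${a.moduleVersion.id.version}" }
            .sorted()
        outFile.get().asFile.apply {
            parentFile.mkdirs()
            writeText(lines.joinToString("\n"))
        }
    }
}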

@saudet
Contributor

saudet commented Feb 18, 2021

@lessthanoptimal I believe this is what VersionCheck.getVersions() is for: https://github.com/eclipse/deeplearning4j/blob/master/nd4j/nd4j-backends/nd4j-api-parent/nd4j-api/src/main/java/org/nd4j/versioncheck/VersionCheck.java#L271

@lessthanoptimal
Author

lessthanoptimal commented Jan 6, 2022

Working on the benchmark again and just posted new results, but I needed to leave out ND4J since it's still incomplete. Does anyone know how to do a Cholesky inverse of an SPD matrix or an eigenvalue decomposition? I think that's all that's missing now.
