
Implemented Cholesky decomposition to numpy and tensorflow backends #890

Open · wants to merge 29 commits into base: master

Conversation

LuiGiovanni (Contributor)

Added the decomposition function following the idea of having a pivot on the tensor, in reference to issue #852: we reshape the tensor into a matrix for the Cholesky function used in NumPy's linear algebra module, which you may find here for reference.

I will start working on adding the function to the missing backend, which is symmetric since Jax uses NumPy's decomposition file. I await any feedback you may have, and thank you for the help!
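The pivot-and-reshape idea described above can be sketched in plain NumPy. This is a minimal illustration under assumed names (the function signature is simplified and is not the PR's actual code):

```python
import numpy as np

def cholesky(np_module, tensor, pivot_axis):
    """Sketch: split the tensor's axes at pivot_axis, flatten each side
    into one matrix dimension, factor the matrix, then restore the
    left dimensions on the resulting lower-triangular factor."""
    left_dims = tensor.shape[:pivot_axis]
    right_dims = tensor.shape[pivot_axis:]
    matrix = np_module.reshape(
        tensor,
        (int(np_module.prod(left_dims)), int(np_module.prod(right_dims))))
    L = np_module.linalg.cholesky(matrix)
    # Reshape the left index of L back into the original left dimensions.
    return np_module.reshape(L, list(left_dims) + [L.shape[1]])

# Example: a 2x2x4 tensor whose matricization is positive definite.
np.random.seed(10)
A = np.random.rand(4, 4)
A = A @ A.T  # symmetric positive definite
L = cholesky(np, A.reshape(2, 2, 4), 2)
assert np.allclose(L.reshape(4, 4) @ L.reshape(4, 4).T, A)
```

The key point is that only the matricized form is factored; the tensor structure is reattached afterwards on the left index.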

@google-cla google-cla bot added the cla: yes label Dec 23, 2020
@LuiGiovanni (Contributor, Author)

@alewis @mganahl Hello! I implemented the Cholesky decomposition in various backends following the idea of a pivot value for the tensors, with their respective tests, where I defined a matrix that is assured to be positive-definite and also Hermitian:

[1, 0]
[0, 1]

I assumed we won't be validating whether the matrices are Hermitian or positive-definite, since there could be some performance cost in doing so (at least for the positive-definiteness check); the addition of a Hermitian check would be nice to have.

A Pylint error has been occurring in the test runs, but it is in code I haven't made changes to. The file in question is tensornetwork/tn_keras/conv2d_mpo.py, which seems to be affected only on versions 3.6 and 3.7. This issue is also occurring in my other PR #889. Freezing the version might fix this, but that is a lazy solution and would not solve it for all cases, so I will wait for your feedback to see what can be done here. Here is an image of the error for reference.

[screenshot: Pylint error output]



def cholesky(
tf: Any,
Collaborator

Rename tf to np (the user passes the numpy module here)

Collaborator

ya

@@ -52,6 +52,13 @@ def test_qr(self):
q, r = decompositions.qr(np, random_matrix, 1, non_negative_diagonal)
self.assertAllClose(q.dot(r), random_matrix)

def test_cholesky(self):
#Assured positive-definite hermitian matrixs
random_matrix = [[1.0, 0], [0, 1.0]]
Collaborator

Using the identity matrix as "random" matrix is not an ideal test case. You can generate a positive definite matrix e.g. by

A = np.random.rand(D, D)
A = A @ A.T.conj()

Collaborator

btw. also test this for all supported dtypes

Collaborator

do np.random.rand(D, D).astype(dtype) to get your fav type. if it's imaginary you might need to initialize the real and imaginary parts separately.

put np.random.seed(10) (or any other fixed integer) before the rand so that the matrix is fixed.
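The two suggestions above (a fixed seed plus per-dtype casting, with separate real and imaginary initialization for complex types) could be combined roughly like this; random_positive_definite is a hypothetical helper name, not part of the PR:

```python
import numpy as np

def random_positive_definite(D, dtype):
    """Hypothetical test helper: deterministic Hermitian positive-definite
    matrix of the given dtype."""
    np.random.seed(10)  # fixed seed so the test matrix is reproducible
    A = np.random.rand(D, D).astype(dtype)
    if np.issubdtype(dtype, np.complexfloating):
        # Initialize the imaginary part separately for complex dtypes.
        A += 1j * np.random.rand(D, D).astype(dtype)
    # A @ A^H is Hermitian and (almost surely) positive definite.
    return A @ A.T.conj()

# Exercise all supported dtypes, as suggested.
for dtype in (np.float32, np.float64, np.complex64, np.complex128):
    M = random_positive_definite(4, dtype)
    L = np.linalg.cholesky(M)
    assert np.allclose(L @ L.T.conj(), M, rtol=1e-4, atol=1e-5)
```

The looser tolerances account for single-precision dtypes.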

Contributor Author

There is no dtype defined in any of the tests, should I define one myself?

[tf.reduce_prod(left_dims),
tf.reduce_prod(right_dims)])
L = tf.linalg.cholesky(tensor)
if non_negative_diagonal:
Collaborator

This is unnecessary. The cholesky decomposition should already return a lower-triangular matrix with a real, positive diagonal.
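A quick NumPy check of this property (TensorFlow's cholesky is expected to behave the same way): the factor is already lower-triangular with a strictly positive diagonal, so no sign-fixing pass is needed.

```python
import numpy as np

np.random.seed(10)
A = np.random.rand(5, 5)
A = A @ A.T  # symmetric positive definite

L = np.linalg.cholesky(A)
# The Cholesky factor's diagonal is real and strictly positive by
# construction, and the factor is lower-triangular.
assert np.all(np.diag(L) > 0)
assert np.allclose(L, np.tril(L))
```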

tf: Any,
tensor: Tensor,
pivot_axis: int,
non_negative_diagonal: bool
Collaborator

remove non_negative argument

Collaborator

(it only makes sense for QR)
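For contrast, a sketch of why QR does need such an option: the QR factorization is only unique up to the signs of R's diagonal, which a non_negative_diagonal flag can fix by absorbing the signs into Q (illustrative only, not the backend code):

```python
import numpy as np

np.random.seed(10)
A = np.random.rand(4, 4)
q, r = np.linalg.qr(A)

# diag(r) may contain negative entries; multiply each column of q and
# the corresponding row of r by the sign so the product is unchanged.
phases = np.sign(np.diag(r))
q2 = q * phases          # scales columns of q
r2 = phases[:, None] * r  # scales rows of r
assert np.all(np.diag(r2) >= 0)
assert np.allclose(q2 @ r2, A)
```

For Cholesky the diagonal is already positive, so the flag has nothing to do.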

torch: Any,
tensor: Tensor,
pivot_axis: int,
non_negative_diagonal: bool = False
Collaborator

remove non_negative_diagonal (see above)

@@ -42,6 +42,13 @@ def test_expected_shapes_rq():
assert r.shape == (2, 3, 6)
assert q.shape == (6, 4, 5)

def test_cholesky():
#Assured positive-definite hermitian matrix
random_matrix = np.array([[1.0, 0], [0, 1.0]])
Collaborator

use better test matrix (see above)

#Assured positive-definite hermitian matrix
random_matrix = np.array([[1.0, 0], [0, 1.0]])
random_matrix = torch.from_numpy(random_matrix)
for non_negative_diagonal in [True, False]:
Collaborator

see above

tf: Any,
tensor: Tensor,
pivot_axis: int,
non_negative_diagonal: bool
Collaborator

see above

[tf.reduce_prod(left_dims),
tf.reduce_prod(right_dims)])
L = tf.linalg.cholesky(tensor)
if non_negative_diagonal:
Collaborator

see above

@@ -54,6 +54,13 @@ def test_qr(self):
q, r = decompositions.qr(tf, random_matrix, 1, non_negative_diagonal)
self.assertAllClose(tf.tensordot(q, r, ([1], [0])), random_matrix)

def test_cholesky(self):
#Assured positive-definite hermitian matrix
random_matrix = [[1.0, 0], [0, 1.0]]
Collaborator

modify test, see above

@mganahl (Collaborator)

mganahl commented Dec 28, 2020

> A Pylint error has been occurring in the test runs, but it is somewhere in the code where I haven't made changes to. [...] Here is an image of the error for reference.
Thanks for pointing this out. Likely this is due to a new tensorflow release. I'll fix it.

@codecov-io

codecov-io commented Dec 28, 2020

Codecov Report

Merging #890 (92c931a) into master (f669725) will increase coverage by 0.00%.
The diff coverage is 100.00%.

Impacted file tree graph

@@           Coverage Diff           @@
##           master     #890   +/-   ##
=======================================
  Coverage   98.00%   98.00%           
=======================================
  Files         129      129           
  Lines       22625    22665   +40     
=======================================
+ Hits        22173    22213   +40     
  Misses        452      452           
Impacted Files Coverage Δ
tensornetwork/backends/numpy/decompositions.py 100.00% <100.00%> (ø)
...ensornetwork/backends/numpy/decompositions_test.py 98.75% <100.00%> (+0.11%) ⬆️
tensornetwork/backends/pytorch/decompositions.py 100.00% <100.00%> (ø)
...sornetwork/backends/pytorch/decompositions_test.py 100.00% <100.00%> (ø)
...ensornetwork/backends/tensorflow/decompositions.py 100.00% <100.00%> (ø)
...network/backends/tensorflow/decompositions_test.py 98.96% <100.00%> (+0.08%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f669725...92c931a. Read the comment docs.



def cholesky(
tf: Any,
Collaborator

ya

tf: Any,
tensor: Tensor,
pivot_axis: int,
non_negative_diagonal: bool
Collaborator

(it only makes sense for QR)

@@ -52,6 +52,13 @@ def test_qr(self):
q, r = decompositions.qr(np, random_matrix, 1, non_negative_diagonal)
self.assertAllClose(q.dot(r), random_matrix)

def test_cholesky(self):
#Assured positive-definite hermitian matrixs
random_matrix = [[1.0, 0], [0, 1.0]]
Collaborator

do np.random.rand(D, D).astype(dtype) to get your fav type. if it's imaginary you might need to initialize the real and imaginary parts separately.

put np.random.seed(10) (or any other fixed integer) before the rand so that the matrix is fixed.

L = phases[:, None] * L
center_dim = tf.shape(L)[1]
L = tf.reshape(L, tf.concat([left_dims, [center_dim]], axis=-1))
return L
Collaborator

add a newline here (at the end of the file)

@LuiGiovanni (Contributor, Author)

Hi there @mganahl @alewis. In the end I implemented the test following your requested changes and tried them out; I'd like your opinion on them. The biggest difference being:

[screenshot: updated test code]

torch.from_numpy(random_matrix), since it wasn't being read as a tensor. Let me know what you think. Thank you!

#Assured positive-definite hermitian matrixs
random_matrix = np.random.rand(10, 10)
random_matrix = random_matrix @ random_matrix.T.conj()
L = decompositions.cholesky(tf, random_matrix, 1)
Collaborator

you are passing the tensorflow module here, but it should be numpy

left_dims = np.shape(tensor)[:pivot_axis]
right_dims = np.shape(tensor)[pivot_axis:]
tensor = np.reshape(tensor,
[np.reduce_prod(left_dims),
Collaborator

np is the numpy module. numpy does not have reduce_prod; that's a tensorflow routine. This only passes the test because in decompositions_test.py you are erroneously passing the tensorflow module instead of the numpy module to numpy.decompositions.cholesky. Please fix this.
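For reference, a NumPy-only version of the reshape step would use np.prod, the NumPy counterpart of tf.reduce_prod (matricize is a hypothetical helper name for illustration):

```python
import numpy as np

def matricize(tensor, pivot_axis):
    """Flatten the axes before pivot_axis into one row dimension and the
    remaining axes into one column dimension, using np.prod
    (there is no np.reduce_prod)."""
    left_dims = tensor.shape[:pivot_axis]
    right_dims = tensor.shape[pivot_axis:]
    return np.reshape(tensor, (np.prod(left_dims, dtype=int),
                               np.prod(right_dims, dtype=int)))

t = np.zeros((2, 3, 4))
assert matricize(t, 1).shape == (2, 12)
assert matricize(t, 2).shape == (6, 4)
```

Passing dtype=int keeps the empty-product case (np.prod(()) == 1.0) an integer, which reshape requires.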

right_dims = list(tensor.shape)[pivot_axis:]

tensor = torch.reshape(tensor, (np.prod(left_dims), np.prod(right_dims)))
L = np.linalg.cholesky(tensor)
Collaborator

you have to call the torch version of cholesky here, not numpy's.

@mganahl (Collaborator)

mganahl commented Jan 18, 2021

Hi @LuiGiovanni, once the comments are fixed we can merge this
