
parsing files into tensor networks #5

aneksteind opened this issue May 15, 2020 · 5 comments

@aneksteind

I was wondering whether cotengra, quimb, or opt_einsum has functionality to parse a file purely for calculating contraction costs, without needing or caring about the elements contained within the tensors themselves.

For instance, networkx has a read_edgelist function and kahypar has a way to parse files in the hMetis format to build hypergraphs. One or both of these could be used to construct tensor networks with arbitrary elements where the aim is to optimize contraction costs.

If something like this doesn't exist, I'd be interested in working on it. Please advise.

@jcmgray (Owner) commented May 15, 2020

So opt_einsum has the contract_path function, which just returns information about the path. It has a shapes=True argument if you just want to specify shapes rather than arrays.

import opt_einsum as oe

# generate a random 10-tensor equation and its matching shapes
eq, shapes = oe.helpers.rand_equation(10, 3)
# find a contraction path from the shapes alone - no arrays needed
path, info = oe.contract_path(eq, *shapes, shapes=True)

This is the function that quimb/cotengra calls in the example lines like info = tn.contract(all, optimize=opt, get='path-info').
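
For concreteness, a minimal sketch of that pattern (assuming quimb's TN_rand_reg random-graph builder and a default cotengra HyperOptimizer purely as placeholders):

import quimb.tensor as qtn
import cotengra as ctg

# small random tensor network: 3-regular graph on 20 nodes, bond dimension 2
tn = qtn.TN_rand_reg(20, 3, D=2)

# any opt_einsum-compatible optimizer works here
opt = ctg.HyperOptimizer()

# retrieve only the opt_einsum PathInfo; no contraction is actually performed
info = tn.contract(all, optimize=opt, get='path-info')
print(info.opt_cost, info.largest_intermediate)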

opt_einsum also has contract_expression, which builds a reusable function you can call with different sets of arrays (with different elements). quimb indeed caches these, so that once a path is found for a particular equation and set of shapes the path finding is not run again. In a recent commit I added caching these to disk.
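
For example (a minimal sketch; the equation and shapes here are arbitrary):

import numpy as np
import opt_einsum as oe

# build the expression (and find the path) once, from shapes alone
expr = oe.contract_expression("ab,bc->ac", (10, 10), (10, 10))

# reuse it with different arrays of the same shapes
x, y = np.random.rand(10, 10), np.random.rand(10, 10)
out1 = expr(x, y)
out2 = expr(2 * x, y + 1)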

If there is a specific filetype you want to work with, hopefully it is just a matter of converting it into the eq, shapes format?

@aneksteind (Author) commented May 15, 2020

I think some sort of file format that can be parsed into something like the *shapes argument, along with the shared indices among tensors, would be perfect; it doesn't need to be an intermediate data structure like qtn.TensorNetwork. It could be as simple as

((index_1_dim, index_1_name), ..., (index_N_dim, index_N_name))

for each tensor. That way there's no need to assume the full expression in the event that only a partial contraction of the network is desired.
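
Something along these lines, say (a rough sketch; tuples_to_eq_shapes is a hypothetical name, and indices appearing on only one tensor are kept as outputs):

import opt_einsum as oe

def tuples_to_eq_shapes(tensors):
    # hypothetical converter: each entry of 'tensors' is
    # ((index_1_dim, index_1_name), ..., (index_N_dim, index_N_name))
    symbols = {}

    def symbol_for(name):
        # map each index name to a single einsum symbol
        if name not in symbols:
            symbols[name] = oe.get_symbol(len(symbols))
        return symbols[name]

    terms, shapes = [], []
    for tensor in tensors:
        dims, names = zip(*tensor)
        terms.append("".join(symbol_for(name) for name in names))
        shapes.append(tuple(dims))

    # indices appearing on a single tensor only are treated as outputs
    counts = {}
    for term in terms:
        for s in term:
            counts[s] = counts.get(s, 0) + 1
    output = "".join(s for s in symbols.values() if counts[s] == 1)

    return ",".join(terms) + "->" + output, shapes

# e.g. two matrices sharing index 'j' -> equivalent of "ij,jk->ik"
eq, shapes = tuples_to_eq_shapes([((2, "i"), (3, "j")), ((3, "j"), (4, "k"))])
path, info = oe.contract_path(eq, *shapes, shapes=True)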

The hMetis format would still cover conventional graphs, so that might be a good candidate. That would translate well into this library's best-performing heuristic too, I believe.

@jcmgray (Owner) commented May 15, 2020

Is your initial thought to easily load 'hmetis' files to find contraction orders for, or are you wanting to try hmetis itself as a hypergraph partitioner?

In the latter case, cotengra has HyperGraph.to_hmetis_file, which might be useful. I haven't done any tests myself, but my understanding is that kahypar is more modern and outperforms hmetis - at least in terms of partitioning. It would be interesting to check though!

In the former case, it would be very simple to write a function that does the opposite, translating the hmetis format to eq, shapes.
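
Roughly like so (a hedged sketch under assumptions: load_from_hmetis is a hypothetical name, each hyperedge is taken to be one shared index with its weight, if present, used as the index dimension, and the equation contracts everything down to a scalar):

import opt_einsum as oe

def load_from_hmetis(fname, default_dim=2):
    # hypothetical loader: one vertex per tensor, one hyperedge per shared index
    with open(fname) as f:
        lines = [ln.split() for ln in f if ln.strip() and not ln.lstrip().startswith("%")]

    header = lines[0]
    num_edges, num_verts = int(header[0]), int(header[1])
    fmt = header[2] if len(header) > 2 else "0"
    has_edge_weights = fmt in ("1", "11")

    inds = [[] for _ in range(num_verts)]   # index labels for each tensor
    size_dict = {}
    for e, line in enumerate(lines[1:1 + num_edges]):
        symbol = oe.get_symbol(e)
        if has_edge_weights:
            size_dict[symbol] = int(line[0])   # edge weight assumed to be the index dimension
            verts = line[1:]
        else:
            size_dict[symbol] = default_dim
            verts = line
        for v in verts:
            inds[int(v) - 1].append(symbol)    # hMetis vertices are 1-indexed

    eq = ",".join("".join(ix) for ix in inds) + "->"
    shapes = [tuple(size_dict[s] for s in ix) for ix in inds]
    return eq, shapes

# then e.g.: path, info = oe.contract_path(eq, *shapes, shapes=True)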

(By the way, opt_einsum also accepts input like:

contract_path(shapes[0], inds[0], shapes[1], inds[1], ..., shapes=True)

)

Or possibly you are suggesting a canonical file format for these libraries...? Along these lines it might be best to open an issue in opt_einsum simply suggesting load_from_hmetis and save_to_hmetis utility functions for translating to the 'oe' format.

@aneksteind (Author) commented May 15, 2020

> Is your initial thought to easily load 'hmetis' files to find contraction orders for, or are you wanting to try hmetis itself as a hypergraph partitioner?

More the former, indirectly, to prepare input as you've outlined for contract_path. I think the suggestions you've made for load_from_hmetis and save_to_hmetis are exactly the kind of thing I'm talking about (although it doesn't need to be canonical, perhaps just one of many options), so I can open a similar issue there if you think that's the more appropriate location over cotengra or quimb. Feel free to close the issue if you think that's the case.

@jcmgray (Owner) commented May 15, 2020

I think opt_einsum is the place for this kind of general, low-level functionality, since it essentially defines the contraction format. cotengra is more aimed at advanced optimizers, and quimb more at manipulating tensor networks.

On the other hand, if it is deemed not a good fit in opt_einsum, then I'm happy to have these utilities added to cotengra!
