Future: Just in time rendering #45

Open
josiahseaman opened this issue Jan 10, 2018 · 1 comment

Comments

@josiahseaman
Owner

This feature is something I'll put off for now, but I know it's possible.

Given a fasta file, you could skip the image write and zoom stack steps entirely, which constitute about 80% of the total compute time. This would mean you could open a very large fasta file and immediately start browsing interactively.

HDF5 is a random-access file format: read in the file, build an index of where to look, and render only what's necessary. We could intercept OpenSeadragon's tile requests and render those layers just in time. This would require sparse sampling of points (or keeping the whole fasta in memory?).
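
A minimal sketch of what that could look like, assuming the sequence is stored as an HDF5 dataset of raw base bytes and the image lays out one base per pixel, row-major. The dataset name, tile-addressing math, and colour palette here are illustrative assumptions, not taken from this repository:

```python
# Hypothetical just-in-time tile renderer: read only the bases covered by one
# Deep Zoom tile from an HDF5 dataset, instead of pre-computing the zoom stack.
import h5py
import numpy as np
from PIL import Image

TILE_SIZE = 256
# Illustrative nucleotide -> RGB palette (dataset assumed to hold ASCII codes)
BASE_COLORS = {
    ord('A'): (0, 255, 0),
    ord('C'): (255, 255, 0),
    ord('G'): (255, 0, 0),
    ord('T'): (0, 0, 255),
}

def render_tile(h5_path, row, col, width_in_tiles):
    """Rasterize a single tile by reading only the sequence slice it covers."""
    with h5py.File(h5_path, 'r') as f:
        seq = f['sequence']               # assumed 1-D uint8 dataset of bases
        row_px = width_in_tiles * TILE_SIZE   # pixels per image row
        start = (row * TILE_SIZE) * row_px + col * TILE_SIZE
        pixels = np.zeros((TILE_SIZE, TILE_SIZE, 3), dtype=np.uint8)
        for y in range(TILE_SIZE):
            line_start = start + y * row_px
            chunk = seq[line_start:line_start + TILE_SIZE]  # random-access read
            for x, base in enumerate(chunk):
                pixels[y, x] = BASE_COLORS.get(int(base), (128, 128, 128))
    return Image.fromarray(pixels, 'RGB')
```

A server endpoint answering OpenSeadragon's tile URLs could call something like this per request, so only the visible tiles ever get rendered.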

Realistically, I think this feature is a lower priority than many types of annotation support.

@hyanwong

For what it's worth, we are trying to move away from HDF5 and have started to use zarr (see e.g. http://alimanfoo.github.io/2016/04/14/to-hdf5-and-beyond.html). The author of zarr works just across the room from me, and is a very good programmer, so I trust this library.
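
For comparison, a minimal sketch of the same random-access read against a zarr store; the store path and array layout are assumptions for illustration, and only the chunks overlapping the requested slice are read from disk:

```python
# Hypothetical zarr equivalent of the HDF5 read above.
# 'sequence.zarr' is an assumed on-disk store of base bytes, one per position.
import zarr

seq = zarr.open('sequence.zarr', mode='r')   # chunked, compressed array on disk
window = seq[1_000_000:1_000_256]            # reads only the overlapping chunks
print(window[:10])
```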
