
memory issue #5

Open
SooLee opened this issue Jul 6, 2017 · 5 comments

Comments

SooLee (Member) commented Jul 6, 2017

We had an occasion where >15 GB of memory was required for a 749.2 MB .hic file. Could it be optimized a bit more?

sameet commented Sep 29, 2017

As a follow-up, what are the general memory requirements for such a conversion?

carlvitzthum (Collaborator) commented

@sameet I have released version 0.4.0, which takes big steps towards improving memory usage and overall runtime. In my tests, .hic files of up to 10 GB consumed at most 2 GB of memory locally. I will leave this issue open for now, since I have further memory and speed optimizations in mind for a future version.
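
For anyone calling the converter from Python rather than from the command line, here is a minimal sketch. It assumes the hic2cool_convert entry point that appears in the traceback further down this thread (infile, outfile, resolution, ...); the file names are illustrative and the exact signature may differ between versions.

# Minimal sketch, assuming hic2cool_convert(infile, outfile, resolution, ...)
# as seen in the 0.5.x traceback below; file names are illustrative.
from hic2cool import hic2cool_convert

# A resolution of 0 is assumed here to mean "convert all resolutions in the
# file" into a multi-resolution cooler; pass a specific bin size for one only.
hic2cool_convert('input.hic', 'output.cool', 0)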

Phlya commented Apr 19, 2019

I'm just trying to convert the biggest files from Rao et al. 2014. I am doing it on a cluster where I request 32 GB of memory, but the job dies with the following memory error:

Traceback (most recent call last):
  File "/exports/igmm/eddie/wendy-lab/ilia/condaenvs/dotfinder/bin/hic2cool", line 11, in <module>
    load_entry_point('hic2cool==0.5.1', 'console_scripts', 'hic2cool')()
  File "/exports/igmm/eddie/wendy-lab/ilia/condaenvs/dotfinder/lib/python3.6/site-packages/hic2cool/__main__.py", line 80, in main
    hic2cool_convert(args.infile, args.outfile, args.resolution, args.warnings, args.silent)
  File "/exports/igmm/eddie/wendy-lab/ilia/condaenvs/dotfinder/lib/python3.6/site-packages/hic2cool/hic2cool_utils.py", line 870, in hic2cool_convert
    buf = mmap.mmap(req.fileno(), 0, access=mmap.ACCESS_READ)
OSError: [Errno 12] Cannot allocate memory

After requesting 64 GB it seems to be running, and the maxvmem it uses is 38 GB! That is almost the size of the .hic file (~40 GB). This seems a bit extreme! Is this normal?

$ hic2cool --version
hic2cool 0.5.1
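
For context on where the error comes from: the traceback shows that hic2cool 0.5.x memory-maps the entire input file (the mmap.mmap(req.fileno(), 0, ...) call in hic2cool_utils.py). A length of 0 maps the whole file, so the mapping alone is roughly the size of the .hic file in virtual memory, which matches the ~38 GB maxvmem above. On schedulers that enforce the request as an address-space limit (e.g. SGE's h_vmem, presumably what is in use here), mmap fails with errno 12 when the mapping exceeds the remaining limit, even though pages are only read lazily. A minimal sketch of that pattern, with an illustrative file name:

# Minimal sketch of the whole-file mapping seen in the traceback above.
# A mapping length of 0 maps the entire file read-only; the mapping counts
# toward the process's virtual address space before any page is touched.
import mmap

with open('rao2014_combined.hic', 'rb') as req:  # illustrative file name
    buf = mmap.mmap(req.fileno(), 0, access=mmap.ACCESS_READ)
    magic = buf[:4]  # only pages that are actually read become resident
    buf.close()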

carlvitzthum (Collaborator) commented

Thanks for reporting this! You may be the first to use hic2cool to convert a file of that size, and your feedback is helpful. I will look into why the memory requirement is so high when I get a chance.

Best,
Carl

Phlya commented Apr 19, 2019

Maybe the format has changed, but that's the file size of the original Rao et al. 2014 .hic files, and I'm sure people have converted them to coolers... I just got them from here: https://www.ncbi.nlm.nih.gov/geo/query/acc.cgi?acc=GSE63525
