Simulators usually generate reads with a much higher error rate. The peak memory of the error corrector is sensitive to the error rate. This is a known issue with fermikit.
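To see roughly why peak memory tracks the error rate: each sequencing error can introduce up to k novel k-mers that the corrector must keep in its table, so the number of distinct k-mers grows roughly linearly with the error rate. The sketch below is a back-of-envelope illustration under that assumption; the function and constants are hypothetical, not fermikit internals.

```python
def novel_error_kmers(genome_size, coverage, k, err_rate):
    """Rough upper-bound estimate of distinct erroneous k-mers.

    Assumes errors are independent and each error spawns up to k
    novel k-mers. Illustrative only; not fermikit's actual model.
    """
    n_bases = genome_size * coverage
    n_errors = n_bases * err_rate   # expected number of sequencing errors
    return int(n_errors * k)        # up to k novel k-mers per error

# Human-sized genome at 50x coverage, k = 23:
# going from a 0.1% to a 2% error rate inflates the erroneous
# k-mer count (and hence memory pressure) by roughly 20x.
for e in (0.001, 0.02):
    print(f"err_rate={e}: ~{novel_error_kmers(3_000_000_000, 50, 23, e):.2e} novel k-mers")
```

This is why reads simulated with an unrealistically high error profile can push the corrector's memory far beyond what real Illumina data would require.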
Hi,
I have 50x simulated 1000 Genomes data generated with ART (http://www.niehs.nih.gov/research/resources/software/biostatistics/art/). I'm trying to run fermikit on this data, and our cluster kills the job once it exceeds 170 GB of RAM. Do you have any suggestions for decreasing memory usage?
The only thing in the log is the following:
I'm using the NA12877 VCF from GIAB, converted to a FASTA reference, and then simulating with ART using the following characteristics: