[ERROR] Task Node(0-rawreads/build) failed with exit-code=256 #709
Something is wrong with your environment: is something altering your PATH? See these two issues for reference:
Hi, Thank you very much for your prompt response. I am getting some assistance setting up Falcon to run on a Slurm cluster, and I will pass along that information to see if it can fix the dependency issue. I also saw a message about the FASTA files in my e-mail, but I am not currently seeing that response in this issue. Is there a way to specify in the configuration file that I am providing FASTQ files instead of FASTA files? Thank You,
I commented first on the FASTA file error, but then I saw the fasta2DB command not found, which supersedes the invalid FASTA file message, so I deleted that original response. As long as the FASTA is present and valid, it should be fine; the larger issue here is why the environment is unable to find the fasta2DB command.
Hi Greg, That is OK, except I provided FASTQ files (so the path should be correct, but the filetype is different). Can I provide FASTQ files, or do I need to provide FASTA files (which I believe are also prepared in another folder, but lack quality scores)? Thank You,
The FALCON part of the pipeline does require the input data to be in FASTA format, so FASTQ will not work; you should definitely fix that after you solve the environment issue.
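Since the pipeline only accepts FASTA, one way to convert is to drop the quality lines from each FASTQ record. A tool such as seqtk (`seqtk seq -a in.fastq > out.fasta`) does this if installed; as a minimal illustration (the function name here is made up for the example), the same idea in plain Python:

```python
def fastq_to_fasta(fastq_lines):
    """Convert FASTQ records (4 lines each) to FASTA records (2 lines each)."""
    fasta = []
    for i in range(0, len(fastq_lines), 4):
        header, seq = fastq_lines[i], fastq_lines[i + 1]
        # '@name ...' becomes '>name ...'; the '+' line and quality scores are dropped
        fasta.append(">" + header[1:])
        fasta.append(seq)
    return fasta

# Example: one FASTQ record becomes one FASTA record
record = ["@read1", "ACGT", "+", "IIII"]
print("\n".join(fastq_to_fasta(record)))
```

For real data you would stream the file line by line rather than holding it in memory, but the record layout is the same.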
Thank you very much - I will do that after I fix the environment issue. Thanks again!
I am not sure if this should be a separate ticket, but IT support tried to fix the problem by using a different version of Falcon. There are now multiple stderr files, so I am not sure which I should attach. However, this is the output that I see:
and this is the configuration file that I am using:
I am not sure if this is the most important part of the output.
Great, looks like you solved the environment issue!
You need to check for the stderr file in this directory.
I will try to re-run the analysis with more memory. In the meantime, I don't think I see a "Killed" message, but this is what I can see (before I delete the files for the re-run): Contents of
Linked stderr file (from
Thank you!
Looks like the version of falcon you have installed doesn't have that parameter.
Thank you very much. I have removed that parameter from the configuration file. I apologize that it is hard for me to tell what has gone wrong. There are 14 "jobs" directories, so I am again not sure which I should attach. However, here is similar information as before. The main output:
and this is the configuration file that I am using:
Thanks again!
falcon is definitely a little tricky to troubleshoot until you get the hang of it. In general, you want to take a look at the *.stderr file in the directory that shows [ERROR] in the master log file.
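As a rough illustration of that workflow (the directory layout and file contents below are mocked up for the example, not taken from this run), the relevant stderr files can be located and inspected from the shell:

```shell
# Mock up a job directory so the commands below have something to find.
mkdir -p demo/0-rawreads/build
echo "fasta2DB: command not found" > demo/0-rawreads/build/run.sh.stderr

# List every *.stderr under the run directory and show the last lines of each;
# in a real run you would start from the task flagged [ERROR] in the master log.
find demo -name "*.stderr" -exec tail -n 5 {} +
```

The same `find` pattern works under `mypwatcher/jobs/` when the run was submitted through a job scheduler.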
I am not sure if it was what I was supposed to do, but I also tested removing that parameter. Assuming that I should still be looking for the stderr for the same task: Please let me know if you need any additional information.
Oh interesting, I'm not sure, but I suspect you are running an older version of falcon. At one point, falcon_sense_option had a bug (later fixed) where the command-line arguments were specified with an underscore instead of a dash.
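For illustration, the underscore-versus-dash difference would look something like this in the fc_run.cfg (the option values here are generic examples, not the ones from this thread; check the flags your installed version actually accepts):

```ini
; older falcon releases parsed underscore-style flags:
falcon_sense_option = --output_multi --min_idt 0.70 --min_cov 4 --max_n_read 200

; newer releases expect dash-style flags:
falcon_sense_option = --output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200
```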
Yes - that is correct: I first changed the format for the newest version, but the name to load in the module system (Falcon/2018.03.12) made me think that an earlier version of Falcon was being used. I also think that I saw that in the logs. I apologize that I didn't think about the formatting issue before. If I understand this tutorial correctly, then I don't think Falcon finished running successfully. I am not sure if I should upload the same stderr file, but here it is just in case. Also, here is the main output:
and here is the configuration file that I am using:
Please let me know if you need any additional information. Thanks again! |
I think I misread the "done" counts in the completed jobs section. I noticed some FASTA files, but I think they were smaller than expected. For example, these are the files in the
I think the largest sequence within the 4 FASTA files was 30,000 bp. Am I looking in the right place? If so, maybe you can help explain what that name means? For example, I think 111296 bp is somewhat close to the expected size, but that is not the length of the sequence. Also, these are BAC sequences, so they are actually circular. However, if you are saying this represents a complete assembly without error messages, then I will close this ticket.
For the separate BAC assembly question, I see these discussions: However, I apologize, but I am still not quite sure how to tell Falcon to expect circular assemblies. Is the only option to let it auto-recognize the assembly type? I can post this as a separate question if there is a way to do some separate troubleshooting for this.
Falcon exited successfully if you have data written in the final output.
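To sanity-check a result like this, the contig lengths in the output FASTA can be summarized directly rather than read off the headers. A small sketch (the function name is made up for the example; point it at whichever assembly FASTA your falcon version wrote):

```python
def contig_lengths(fasta_lines):
    """Return {contig_name: sequence_length} for a FASTA given as a list of lines."""
    lengths, name = {}, None
    for line in fasta_lines:
        line = line.strip()
        if line.startswith(">"):
            name = line[1:].split()[0]  # keep only the identifier before any whitespace
            lengths[name] = 0
        elif name is not None:
            lengths[name] += len(line)  # sequence may be wrapped over several lines
    return lengths

# Example with a tiny made-up FASTA
fasta = [">000000F extra info", "ACGTACGT", "ACGT", ">000001F", "AC"]
print(contig_lengths(fasta))  # {'000000F': 12, '000001F': 2}
```

Comparing these totals against the expected BAC size is a quicker check than inspecting header names, since the header fields are not sequence lengths.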
There is no parameter to specify whether you expect a circular assembly or not; the algorithm just classifies what it detects one way or the other.
You can post it as a separate question if you're still interested, but as stated in my other reply, there is no way to specify that. BAC CLR data is tricky due to its high-coverage nature and how the FALCON algorithm was designed.
Hi Greg, Thank you very much for your help! In this situation, the earliest HGAP3 assembly was similar in size to what was expected. I more recently ran Canu, which produced an assembly that was smaller than HGAP3 but larger than what I described for Falcon. When I ran Circlator on the Canu result and corrected reads, it said that the intermediate-size assembly could be confirmed to be circular (Circlator didn't add back the ~20,000 bp of extra sequence in the HGAP3 assembly). I will take a look at the amplicon link that you described. I am trying to run Falcon via the command line because this is RS II data that won't work with the version of SMRT Link that we have configured. I think we will also eventually send PacBio an e-mail regarding troubleshooting some BAC samples, but there is additional analysis and discussion that we would like to have first. Thanks yet again! Sincerely,
Hi,
I am trying to use FALCON to run something similar to HGAP for RSII data (since the version of SMRT Link that I have access to is version 8.0, and not compatible with RS II subreads).
I see that there have been other postings with similar error messages:
#496
#549
#645
It looks like one recommendation was to check the stderr file within the 0-rawreads folder. While not precisely in that location, I did see a link to a stderr file under mypwatcher/jobs/P8d2bcec4643311. That file has a relatively large number of lines, so I have attached the file with a modified extension to allow the upload, rather than showing all of the contents.
I also modified the file paths for the public upload.
In general, this is the output that I am seeing:
And these are the configurations that I am using:
Can you please help me troubleshoot this problem?
Thank You,
Charles