This repository has been archived by the owner on Mar 16, 2022. It is now read-only.

[ERROR]Task Node(0-rawreads/build) failed with exit-code=256 #709

Closed · cwarden45 opened this issue Sep 8, 2020 · 21 comments

@cwarden45

Hi,

I am trying to use FALCON to run something similar to HGAP for RS II data (the version of SMRT Link that I have access to is 8.0, which is not compatible with RS II subreads).

I see that there have been other postings with similar error messages:

#496
#549
#645

It looks like one recommendation was to check the stderr file within the 0-rawreads folder.

While not precisely in that location, I did find a stderr file linked under mypwatcher/jobs/P8d2bcec4643311. That file is relatively long, so I have attached it with a modified extension to allow the upload (rather than pasting all of the contents).

stderr

I also modified the file paths for the public upload.
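
(For reference, the per-job stderr files under mypwatcher can be located with a find from the run directory; a minimal sketch:)

find . -path "*mypwatcher/jobs/*" -name stderr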

In general, this is the output that I am seeing:

falcon-kit 1.8.1 (pip thinks "falcon-kit 1.8.1")
pypeflow 2.3.0
[INFO]Setup logging from file "None".
[INFO]$ lfs setstripe -c 12 /path/to/BAC >
[INFO]Apparently '/path/to/BAC' is not in lustre filesystem, which is fine.
[INFO]fc_run started with configuration BAC_fa_subread-Apollo.cfg
[INFO]cfg=
{
  "General": {
    "LA4Falcon_preload": false,
    "avoid_text_file_busy": true,
    "bestn": 12,
    "dazcon": false,
    "falcon_sense_greedy": false,
    "falcon_sense_option": "--output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16",
    "falcon_sense_skip_contained": false,
    "fc_ovlp_to_graph_option": " --min-len 10000",
    "genome_size": 0,
    "input_fofn": "102b15_FQ.fofn",
    "input_type": "raw",
    "length_cutoff": "10000",
    "length_cutoff_pr": "10000",
    "overlap_filtering_setting": "--max-diff 120 --max-cov 120 --min-cov 2 --n-core 16",
    "ovlp_DBdust_option": "",
    "ovlp_DBsplit_option": "-s200",
    "ovlp_HPCdaligner_option": "-v -l500",
    "ovlp_daligner_option": "-v -B4 -t32 -h60 -e.96 -l500 -s1000",
    "pa_DBdust_option": "",
    "pa_DBsplit_option": "-a -x500 -s200",
    "pa_HPCTANmask_option": "-l500",
    "pa_HPCdaligner_option": "-v",
    "pa_REPmask_code": "0,300/0,300/0,300",
    "pa_daligner_option": "-v -B4 -t16 -e.70 -l1000 -s1000",
    "pa_dazcon_option": "-j 4 -x -l 500",
    "pa_fasta_filter_option": "streamed-internal-median",
    "pa_subsample_coverage": 0,
    "pa_subsample_random_seed": 12345,
    "pa_subsample_strategy": "random",
    "seed_coverage": 20,
    "skip_checks": false,
    "target": "assembly"
  },
  "job.defaults": {
    "job_type": "local",
    "njobs": "16",
    "pwatcher_type": "fs_based",
    "use_tmpdir": false
  },
  "job.step.asm": {},
  "job.step.cns": {
    "njobs": "8"
  },
  "job.step.da": {},
  "job.step.dust": {},
  "job.step.la": {},
  "job.step.pda": {},
  "job.step.pla": {}
}
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2.2.4/lib/python3.8/site-packages/pwatcher/fs_based.py'>
[INFO]job_type='local', (default)job_defaults={'njobs': '16', 'job_type': 'local', 'pwatcher_type': 'fs_based', 'use_tmpdir': False}, use_tmpdir=False, squash=False, job_name_style=0
[INFO]Setting max_jobs to 16; was None
[INFO]Num unsatisfied: 2, graph: 2
[INFO]About to submit: Node(0-rawreads/build)
[INFO] starting job Job(jobid='P8d2bcec4643311', cmd='/bin/bash run.sh', rundir='/path/to/BAC/0-rawreads/build', options={'njobs': '16', 'job_type': 'local', 'pwatcher_type': 'fs_based', 'use_tmpdir': False, 'NPROC': 1, 'MB': 4000}) w/ job_type=LOCAL
[INFO]dir: '/path/to/BAC/mypwatcher/jobs/P8d2bcec4643311'
CALL:
 '/bin/bash /path/to/BAC/mypwatcher/wrappers/run-P8d2bcec4643311.bash 1>|stdout 2>|stderr & '
[INFO]pid=21253 pgid=21242 sub-pid=21287
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P8d2bcec4643311', cmd='/bin/bash run.sh', rundir='/path/to/BAC/0-rawreads/build', options={'njobs': '16', 'job_type': 'local', 'pwatcher_type': 'fs_based', 'use_tmpdir': False, 'NPROC': 1, 'MB': 4000}), lang_exe='/bin/bash'))
[INFO](slept for another 0.0s -- another 1 loop iterations)
[INFO](slept for another 0.30000000000000004s -- another 2 loop iterations)
[INFO](slept for another 1.2000000000000002s -- another 3 loop iterations)
[ERROR]Task Node(0-rawreads/build) failed with exit-code=256
[ERROR]Some tasks are recently_done but not satisfied: {Node(0-rawreads/build)}
[ERROR]ready: set()
	submitted: set()
[ERROR]Failed to kill job for heartbeat 'heartbeat-P8d2bcec4643311' (which might mean it was already gone): FileNotFoundError(2, 'No such file or directory')
Traceback (most recent call last):
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pypeflow/simple_pwatcher_bridge.py", line 278, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pypeflow/simple_pwatcher_bridge.py", line 362, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: {Node(0-rawreads/build)}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pwatcher/fs_based.py", line 611, in delete_heartbeat
    bjob.kill(state, heartbeat)
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pwatcher/fs_based.py", line 273, in kill
    with open(heartbeat_fn) as ifs:
FileNotFoundError: [Errno 2] No such file or directory: '/path/to/BAC/mypwatcher/heartbeats/heartbeat-P8d2bcec4643311'
Traceback (most recent call last):
  File "/opt/Falcon/2.2.4/bin/fc_run.py", line 33, in <module>
    sys.exit(load_entry_point('falcon-kit==1.8.1', 'console_scripts', 'fc_run.py')())
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/falcon_kit/mains/run1.py", line 706, in main
    main1(argv[0], args.config, args.logger)
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/falcon_kit/mains/run1.py", line 71, in main1
    run(wf, config,
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/falcon_kit/mains/run1.py", line 219, in run
    gen_parallel_tasks(
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/falcon_kit/pype.py", line 106, in gen_parallel_tasks
    wf.refreshTargets()
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pypeflow/simple_pwatcher_bridge.py", line 278, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/opt/Falcon/2.2.4/lib/python3.8/site-packages/pypeflow/simple_pwatcher_bridge.py", line 362, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: {Node(0-rawreads/build)}

And this is the configuration that I am using:

[General]
input_fofn = 102b15_FQ.fofn
input_type = raw
length_cutoff = 10000
length_cutoff_pr = 10000

pa_daligner_option = -v -B4 -t16 -e.70 -l1000 -s1000
ovlp_daligner_option = -v -B4 -t32 -h60 -e.96 -l500 -s1000

pa_DBsplit_option = -a -x500 -s200
ovlp_DBsplit_option = -s200

falcon_sense_option = --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16

overlap_filtering_setting = --max-diff 120 --max-cov 120 --min-cov 2 --n-core 16

[job.defaults]
njobs = 16
job_type = local

[job.step.cns]
njobs = 8

Can you please help me troubleshoot this problem?

Thank You,
Charles

@gconcepcion

gconcepcion commented Sep 8, 2020

Something is wrong with your environment:
build_db.sh: line 14: fasta2DB: command not found

Is something altering your PATH?

See these two issues for reference:
PacificBiosciences/FALCON-integrate#131
#361
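
To confirm whether the shell that launches fc_run.py can actually see the binary, something like this (a minimal sketch) can help:

command -v fasta2DB || echo "fasta2DB not on PATH"
echo "$PATH"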

@cwarden45

cwarden45 commented Sep 9, 2020

Hi,

Thank you very much for your prompt response.

I am getting some assistance setting up Falcon to run on a Slurm cluster. I will pass along the information to see if that can fix the dependency issue (for fasta2DB).

I also see a message about the FASTA files in my e-mail, but I am not currently seeing that response in this issue.

For the Exception: Invalid FASTA file None error, I had specified relative paths (and the files were FASTQ, not FASTA). I tested full paths, but I am still getting the same error message.

Is there a way to specify that I am providing FASTQ files instead of FASTA files in the configuration file?

Thank You,
Charles

@gconcepcion

I commented first on the FASTA file error, but then I saw the fasta2DB command-not-found error, which supersedes the invalid FASTA file message, so I deleted that original response. As long as the FASTA is present and valid, it should be fine; the larger issue here is why the environment is unable to find the fasta2DB binary.

@cwarden45

Hi Greg,

That is OK, except that I provided FASTQ files (so the path should be correct, but the file type is different).

Can I provide FASTQ files, or do I need to provide FASTA files (which I believe are also prepared in another folder, but lack quality scores)?

Thank You,
Charles

@gconcepcion

The FALCON part of the pipeline does require the input data to be in FASTA format, so FASTQ will not work. You should definitely fix that after you solve the fasta2DB environment issue.
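
If conversion is needed, a minimal sketch using seqtk (assuming seqtk is installed; the filenames are placeholders, and any FASTQ-to-FASTA converter would do):

seqtk seq -a reads.fastq > reads.fasta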

@cwarden45

Thank you very much - I will do that. After I fix the fasta2DB issue, I will either close the ticket or report my progress.

Thanks again!

@cwarden45

I am not sure if this should be a separate ticket, but IT support tried to fix the problem by using a different version of Falcon.

There are now multiple stderr files, so I am not sure which one I should attach.

However, this is the output that I see:

[2043]$(u'lfs setstripe -c 12 //path/to/BAC')
sh: lfs: command not found
[2043]WARNING: Call u'lfs setstripe -c 12 //path/to/BAC' returned 32512.
[INFO]Setup logging from file "None".
[INFO]fc_run started with configuration 102b15_fa_subread-Apollo.cfg
[INFO] No target specified, assuming "assembly" as target 
/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/run_support.py:462: UserWarning: Unexpected keys in input config: set(['pa_daligner_option', 'ovlp_daligner_option'])
  warnings.warn("Unexpected keys in input config: %s" % repr(unused))
[WARNING]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]job_type='local', job_queue=u'', sge_option='-pe smp 4 -q bigmem', use_tmpdir=False, squash=False, job_name_style=0
[INFO]u'Serializing 52 records'
[INFO]u"  Opened for write: u'./config.json'"
[INFO]u'  Serialized to 0.0MB as json'
[INFO]u"Deserializing from u'//path/to/BAC/config.json'"
[INFO]u"  Opened for read: u'//path/to/BAC/config.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 52 records'
[INFO]gen_task(python -m falcon_kit.mains.build_rdb --input-fofn-fn={input.raw_reads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --length-cutoff-fn={output.length_cutoff} --job-done-fn={output.db_build_done}
touch {output.db_build_done}

	inputs={u'raw_reads_fofn': '102b15_FA.fofn', u'config': u'config.json'},
	outputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'run_jobs': u'0-rawreads/run_jobs.sh', u'length_cutoff': u'0-rawreads/length_cutoff', u'db_build_done': u'0-rawreads/rdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'db': u'0-rawreads/raw_reads.db'},
	outputs={u'split': u'0-rawreads/daligner-split/all-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 2, graph: 2
[INFO]About to submit: Node(0-rawreads)
[INFO] starting job Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]recently_satisfied: set([Node(0-rawreads)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/daligner-split)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/daligner-split"
[INFO] starting job Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'}
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[WARNING]  k=results, v=u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/daligner-split/all-units-of-work.json'},
	outputs={u'one': u'//path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'//path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'},
	outputs={u'results': u'0-rawreads/daligner-runs/j_0000/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/daligner-gathered"
[INFO]u"  Opened for write: u'0-rawreads/daligner-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobj_0000': u'//path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobj_0000': u'//path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json', u'result_fn_list': u'0-rawreads/daligner-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_gather_las_list --gathered-fn={input.gathered} --las-paths-fn={output.las_paths}

	inputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json'},
	outputs={u'las_paths': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --run-jobs-fn={input.run_jobs} --gathered-las-fn={input.gathered_las} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'gathered_las': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'},
	outputs={u'split': u'0-rawreads/las-merge-split/all-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 7
[INFO]About to submit: Node(0-rawreads/daligner-chunks/j_0000)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/daligner-chunks/j_0000"
[INFO] starting job Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-chunks/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/daligner-runs/j_0000)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/daligner-runs/j_0000"
[INFO] starting job Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[INFO]sleep 1.3s
[INFO]sleep 1.4s
[INFO]sleep 1.5s
[INFO]sleep 1.6s
[INFO]sleep 1.7s
[INFO]sleep 1.8s
[INFO]sleep 1.9s
[INFO]sleep 2.0s
[INFO]sleep 2.1s
[INFO]sleep 2.2s
[INFO]sleep 2.3s
[INFO]sleep 2.4s
[INFO]sleep 2.5s
[INFO]sleep 2.6s
[INFO]sleep 2.7s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-runs/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/daligner-gathered)
[INFO] starting job Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/daligner-intermediate-gathered-las)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/daligner-intermediate-gathered-las"
[INFO] starting job Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-intermediate-gathered-las)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/las-merge-split)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/las-merge-split"
[INFO] starting job Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 20 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 20 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'}
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[WARNING]  k=results, v=u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/las-merge-split/all-units-of-work.json'},
	outputs={u'one': u'//path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'//path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'},
	outputs={u'results': u'0-rawreads/las-merge-runs/m_00001/some-las-paths.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/las-merge-gathered"
[INFO]u"  Opened for write: u'0-rawreads/las-merge-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobm_00001': u'//path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobm_00001': u'//path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'},
	outputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json', u'result_fn_list': u'0-rawreads/las-merge-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_gather --gathered-fn={input.gathered} --p-id2las-fn={output.p_id2las} --las-fn={output.las}

	inputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json'},
	outputs={u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'las': u'0-rawreads/las-gather/las_fofn.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_split --wildcards={params.wildcards} --p-id2las-fn={input.p_id2las} --db-fn={input.raw_reads_db} --length-cutoff-fn={input.length_cutoff} --config-fn={input.config} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'config': u'config.json', u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'length_cutoff': u'0-rawreads/length_cutoff'},
	outputs={u'split': u'0-rawreads/cns-split/split.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 12
[INFO]About to submit: Node(0-rawreads/las-merge-chunks/m_00001)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/las-merge-chunks/m_00001"
[INFO] starting job Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-chunks/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/las-merge-runs/m_00001)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/las-merge-runs/m_00001"
[INFO] starting job Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 20 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 20 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-runs/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/las-merge-gathered)
[INFO] starting job Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/las-gather)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/las-gather"
[INFO] starting job Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-gather)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/cns-split)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/cns-split"
[INFO] starting job Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/cns-split/split.json'"
[INFO]u"  Opened for read: u'0-rawreads/cns-split/split.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'}
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[WARNING]  k=results, v=u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/cns-split/split.json'},
	outputs={u'one': u'//path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'//path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'},
	outputs={u'results': u'0-rawreads/cns-runs/cns_00001/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/cns-gather"
[INFO]u"  Opened for write: u'0-rawreads/cns-gather/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobcns_00001_cns_00001': u'//path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobcns_00001_cns_00001': u'//path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/cns-gather/gathered.json', u'result_fn_list': u'0-rawreads/cns-gather/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_gather_fasta_fofn --gathered-fn={input.gathered} --preads-fofn-fn={output.preads_fofn}

	inputs={u'gathered': u'0-rawreads/cns-gather/gathered.json'},
	outputs={u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'})
[INFO]gen_task(python -m falcon_kit.mains.task_report_pre_assembly --config-fn={input.config} --length-cutoff-fn={input.length_cutoff} --raw-reads-db-fn={input.raw_reads_db} --preads-fofn-fn={input.preads_fofn} --pre-assembly-report-fn={output.pre_assembly_report}

	inputs={u'length_cutoff': u'0-rawreads/length_cutoff', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn', u'config': u'config.json', u'raw_reads_db': u'0-rawreads/raw_reads.db'},
	outputs={u'pre_assembly_report': u'0-rawreads/report/pre_assembly_stats.json'})
[INFO]gen_task(python -m falcon_kit.mains.build_pdb --input-fofn-fn={input.preads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --job-done-fn={output.db_build_done}
# TODO: Verify that input.preads_db exists.
touch {output.db_build_done}

	inputs={u'config': u'config.json', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'},
	outputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'preads_db': u'1-preads_ovl/preads.db', u'db_build_done': u'1-preads_ovl/pdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'db': u'1-preads_ovl/preads.db'},
	outputs={u'split': u'1-preads_ovl/daligner-split/all-units-of-work.json', u'bash_template': u'1-preads_ovl/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 7, graph: 19
[INFO]About to submit: Node(0-rawreads/cns-chunks/cns_00001)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/cns-chunks/cns_00001"
[INFO] starting job Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-chunks/cns_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 6
[INFO]About to submit: Node(0-rawreads/cns-runs/cns_00001)
[INFO]mkdir -p "//path/to/BAC/0-rawreads/cns-runs/cns_00001"
[INFO] starting job Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'//path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[ERROR]Task Node(0-rawreads/cns-runs/cns_00001) failed with exit-code=256
[INFO]recently_satisfied: set([])
[INFO]Num satisfied in this iteration: 0
[INFO]Num still unsatisfied: 6
[ERROR]Some tasks are recently_done but not satisfied: set([Node(0-rawreads/cns-runs/cns_00001)])
[ERROR]ready: set([])
	submitted: set([])
Traceback (most recent call last):
  File "/opt/Falcon/2018.03.12/bin/fc_run.py", line 11, in <module>
    load_entry_point('falcon-kit==1.0+git.6bb3daa96931aece9bd3742bccc77ad257b7bb65', 'console_scripts', 'fc_run.py')()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 644, in main
    main1(argv[0], args.config, args.logger)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 159, in main1
    input_fofn_plf=input_fofn_plf,
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 520, in run
    dist=Dist(NPROC=4, MB=4000, sge_option=sge_option),
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/pype.py", line 89, in gen_parallel_tasks
    wf.refreshTargets()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/simple_pwatcher_bridge.py", line 284, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/simple_pwatcher_bridge.py", line 350, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: set([Node(0-rawreads/cns-runs/cns_00001)])

And this is the configuration file that I am using:

[General]
job_type = local
input_fofn = 102b15_FA.fofn
input_type = raw
length_cutoff = 10000
length_cutoff_pr = 10000

sge_option_da = -pe smp 4 -q bigmem
sge_option_la = -pe smp 20 -q bigmem
sge_option_pda = -pe smp 6 -q bigmem
sge_option_pla = -pe smp 16 -q bigmem
sge_option_fc = -pe smp 24 -q bigmem
sge_option_cns = -pe smp 8 -q bigmem

pa_daligner_option = -v -B4 -t16 -e.70 -l1000 -s1000
ovlp_daligner_option = -v -B4 -t32 -h60 -e.96 -l500 -s1000

pa_DBsplit_option = -a -x500 -s200
ovlp_DBsplit_option = -s200

falcon_sense_option = --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16

overlap_filtering_setting = --max-diff 120 --max-cov 120 --min-cov 2 --n-core 16

[job.defaults]
njobs = 16
job_type = local

[job.step.cns]
njobs = 8

I am not sure if this is the most important part of the output, but is lfs a specific PacBio dependency or something else?

@gconcepcion

gconcepcion commented Sep 10, 2020

Great, looks like you solved the environment issue!

lfs is for Lustre file systems and is not essential for the run, so it is not applicable in this particular case.

You need to check the stderr file in the 0-rawreads/cns-runs/cns_00001 directory and see what it says.
I'm guessing (but not positive) that you will see Killed in there, indicating a memory problem when trying to compute the consensus.
EDIT: Except I just noticed this is a local job, not sent to a job scheduler, so now I'm guessing some other error message...
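
(A quick way to check, assuming the directory layout shown earlier in this thread; the grep pattern is just an example:)

grep -iH "killed" mypwatcher/jobs/*/stderr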

@cwarden45

I will try to re-run the analysis with more memory.

In the meantime, I don't think I see a "Killed" message, but this is what I can see (before I delete the files and re-run with more memory):

Contents of the 0-rawreads/cns-runs/cns_00001 folder:

pwatcher.dir 
run.sh
task.json
task.sh
template.sh
uow-00
user_script.sh

Linked stderr file (from mypwatcher/jobs/P6efd1a5f93dbfa):

+ python2.7 -m pwatcher.mains.fs_heartbeat --directory=/path/to/BAC/0-rawreads/cns-runs/cns_00001 --heartbeat-file=/path/to/BAC/mypwatcher/heartbeats/heartbeat-P6efd1a5f93dbfa --exit-file=/path/to/BAC/mypwatcher/exits/exit-P6efd1a5f93dbfa --rate=10.0 /bin/bash run.sh
Namespace(command=['/bin/bash', 'run.sh'], directory='/path/to/BAC/0-rawreads/cns-runs/cns_00001', exit_file='/path/to/BAC/mypwatcher/exits/exit-P6efd1a5f93dbfa', heartbeat_file='/path/to/BAC/mypwatcher/heartbeats/heartbeat-P6efd1a5f93dbfa', rate=10.0)

cwd:'/path/to/BAC/0-rawreads/cns-runs/cns_00001'
hostname=ppxhpcnode27.coh.org
heartbeat_fn='/path/to/BAC/mypwatcher/heartbeats/heartbeat-P6efd1a5f93dbfa'
exit_fn='/path/to/BAC/mypwatcher/exits/exit-P6efd1a5f93dbfa'
sleep_s=10.0
before setpgid: pid=2430 pgid=2033
 after setpgid: pid=2430 pgid=2430
In cwd: /path/to/BAC/0-rawreads/cns-runs/cns_00001, Blocking call: '/bin/bash run.sh'
export PATH=$PATH:/bin
+ export PATH=/opt/Falcon/2018.03.12/bin:/home/cwarden/perl5/bin:/net/isi-dcnl/ifs/user_data/Seq/jre1.8.0_121/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/srun.x11:/home/cwarden/sw/cmake-3.9.0-rc5/bin:/home/cwarden/bin:/bin
+ PATH=/opt/Falcon/2018.03.12/bin:/home/cwarden/perl5/bin:/net/isi-dcnl/ifs/user_data/Seq/jre1.8.0_121/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/srun.x11:/home/cwarden/sw/cmake-3.9.0-rc5/bin:/home/cwarden/bin:/bin
cd /path/to/BAC/0-rawreads/cns-runs/cns_00001
+ cd /path/to/BAC/0-rawreads/cns-runs/cns_00001
/bin/bash task.sh
+ /bin/bash task.sh
2020-09-10 09:40:34,453 - root - DEBUG - Running "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py /path/to/BAC/0-rawreads/cns-runs/cns_00001/task.json"
2020-09-10 09:40:34,455 - root - DEBUG - Checking existence of '/path/to/BAC/0-rawreads/cns-runs/cns_00001/task.json' with timeout=30
2020-09-10 09:40:34,456 - root - DEBUG - Loading JSON from '/path/to/BAC/0-rawreads/cns-runs/cns_00001/task.json'
2020-09-10 09:40:34,456 - root - DEBUG - {u'bash_template_fn': u'template.sh',
 u'inputs': {u'bash_template': u'/path/to/BAC/0-rawreads/cns-split/consensus-bash-template.sh',
             u'units_of_work': u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'},
 u'outputs': {u'results': u'some-done-files.json'},
 u'parameters': {}}
2020-09-10 09:40:34,457 - root - WARNING - CD: '/path/to/BAC/0-rawreads/cns-runs/cns_00001' <- '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
2020-09-10 09:40:34,457 - root - DEBUG - Checking existence of u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json' with timeout=30
2020-09-10 09:40:34,458 - root - DEBUG - Checking existence of u'/path/to/BAC/0-rawreads/cns-split/consensus-bash-template.sh' with timeout=30
2020-09-10 09:40:34,458 - root - DEBUG - Checking existence of u'template.sh' with timeout=30
2020-09-10 09:40:34,459 - root - WARNING - CD: '/path/to/BAC/0-rawreads/cns-runs/cns_00001' <- '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
2020-09-10 09:40:34,463 - root - INFO - $('/bin/bash user_script.sh')
hostname
+ hostname
pwd
+ pwd
date
+ date
# Substitution will be similar to snakemake "shell".
python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn=/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json --bash-template-fn=/path/to/BAC/0-rawreads/cns-split/consensus-bash-template.sh --results-fn=some-done-files.json
+ python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn=/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json --bash-template-fn=/path/to/BAC/0-rawreads/cns-split/consensus-bash-template.sh --results-fn=some-done-files.json
INFO:root:u"Deserializing from '/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'"
INFO:root:u"  Opened for read: '/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'"
INFO:root:u'  Read 0.0MB as json'
INFO:root:u'  Deserialized 1 records'
WARNING:root:INPUT:{u'length_cutoff': u'/path/to/BAC/0-rawreads/length_cutoff', u'config': u'/path/to/BAC/config.json', u'db': u'/path/to/BAC/0-rawreads/raw_reads.db', u'las': u'/path/to/BAC/0-rawreads/cns-split/cns_00001/merged.cns_00001.las'}
WARNING:root:OUTPUT:{u'fasta': u'consensus.cns_00001.fasta'}
WARNING:root:PARAMS:{}
INFO:root:mkdir -p "uow-00"
WARNING:root:CD: u'uow-00' <- '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
INFO:root:$('/bin/bash user_script.sh')
hostname
+ hostname
pwd
+ pwd
date
+ date
python -m falcon_kit.mains.consensus_task --las-fn=/path/to/BAC/0-rawreads/cns-split/cns_00001/merged.cns_00001.las --db-fn=/path/to/BAC/0-rawreads/raw_reads.db --length-cutoff-fn=/path/to/BAC/0-rawreads/length_cutoff --config-fn=/path/to/BAC/config.json --fasta-fn=consensus.cns_00001.fasta
+ python -m falcon_kit.mains.consensus_task --las-fn=/path/to/BAC/0-rawreads/cns-split/cns_00001/merged.cns_00001.las --db-fn=/path/to/BAC/0-rawreads/raw_reads.db --length-cutoff-fn=/path/to/BAC/0-rawreads/length_cutoff --config-fn=/path/to/BAC/config.json --fasta-fn=consensus.cns_00001.fasta
INFO:root:u"Deserializing from '/path/to/BAC/config.json'"
INFO:root:u"  Opened for read: '/path/to/BAC/config.json'"
INFO:root:u'  Read 0.0MB as json'
INFO:root:u'  Deserialized 52 records'
INFO:root:$(u'rm -f consensus.cns_00001.fasta')
INFO:root:$(u'bash -vex run_consensus.sh')
module () {  eval $($LMOD_CMD bash "$@") && eval $(${LMOD_SETTARG_CMD:-:} -s sh)
}
ml () {  eval $($LMOD_DIR/ml_cmd "$@")
}
#!/bin/bash
# -*- shell-script -*-
#
########################################################################
# Start Lmod BASHENV
########################################################################

# Remember the XTRACE and VERBOSE states so that they are restored
# at the end of this script.
[[ ${-/x} != $- ]] && XTRACE_STATE="-x"  || XTRACE_STATE="+x"
+ [[ hvB != hvxB ]]
+ XTRACE_STATE=-x
[[ ${-/v} != $- ]] && VERBOSE_STATE="-v" || VERBOSE_STATE="+v"
+ [[ hxB != hvxB ]]
+ VERBOSE_STATE=-v
set +xv     # Turn off tracing for now.
+ set +xv
unset XTRACE_STATE  VERBOSE_STATE
+ unset XTRACE_STATE VERBOSE_STATE

########################################################################
# End Lmod BASHENV
########################################################################
#
# Local Variables:
# mode: shell-script
# indent-tabs-mode: nil
# End:
#!/bin/bash
set -vex
+ set -vex

set -o pipefail
+ set -o pipefail
CUTOFF=10000
+ CUTOFF=10000
LA4Falcon -H$CUTOFF -of /path/to/BAC/0-rawreads/raw_reads.db /path/to/BAC/0-rawreads/cns-split/cns_00001/merged.cns_00001.las | python -m falcon_kit.mains.consensus --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16 >| consensus.cns_00001.fasta.tmp
+ LA4Falcon -H10000 -of /path/to/BAC/0-rawreads/raw_reads.db /path/to/BAC/0-rawreads/cns-split/cns_00001/merged.cns_00001.las
+ python -m falcon_kit.mains.consensus --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16

about to go into loop with novl = 586859 /path/to/BAC/0-rawreads/raw_reads.db
before Read_Overlap record j = 0 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 1 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 2 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 3 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 4 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 5 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 6 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 7 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 8 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
before Read_Overlap record j = 9 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:34 2020
usage: consensus.py [-h] [--n_core N_CORE] [--min_cov MIN_COV]
                    [--min_cov_aln MIN_COV_ALN] [--max_cov_aln MAX_COV_ALN]
                    [--min_len_aln MIN_LEN_ALN] [--min_n_read MIN_N_READ]
                    [--max_n_read MAX_N_READ] [--trim] [--output_full]
                    [--output_multi] [--min_idt MIN_IDT]
                    [--edge_tolerance EDGE_TOLERANCE] [--trim_size TRIM_SIZE]
consensus.py: error: unrecognized arguments: --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16

completed loop record j = 586859 out of 586859 at /path/to/BAC/0-rawreads/raw_reads.db Thu Sep 10 09:40:41 2020

WARNING:root:Call u'bash -vex run_consensus.sh' returned 512.
Traceback (most recent call last):
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/consensus_task.py", line 103, in <module>
    main()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/consensus_task.py", line 99, in main
    run(**vars(args))
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/consensus_task.py", line 57, in run
    io.syscall('bash -vex {}'.format(script_fn))
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/io.py", line 29, in syscall
    raise Exception(msg)
Exception: Call u'bash -vex run_consensus.sh' returned 512.
WARNING:root:Call '/bin/bash user_script.sh' returned 256.
WARNING:root:CD: u'uow-00' -> '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
Traceback (most recent call last):
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/generic_run_units_of_work.py", line 95, in <module>
    main()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/generic_run_units_of_work.py", line 91, in main
    run(**vars(args))
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/generic_run_units_of_work.py", line 49, in run
    pypeflow.do_task.run_bash(script, inputs, outputs, params)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 173, in run_bash
    util.system(cmd)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/io.py", line 29, in syscall
    raise Exception(msg)
Exception: Call '/bin/bash user_script.sh' returned 256.
2020-09-10 09:40:41,595 - root - WARNING - Call '/bin/bash user_script.sh' returned 256.
2020-09-10 09:40:41,595 - root - WARNING - CD: '/path/to/BAC/0-rawreads/cns-runs/cns_00001' -> '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
2020-09-10 09:40:41,595 - root - WARNING - CD: '/path/to/BAC/0-rawreads/cns-runs/cns_00001' -> '/path/to/BAC/0-rawreads/cns-runs/cns_00001'
2020-09-10 09:40:41,596 - root - CRITICAL - Error in /opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py with args="{'json_fn': '/path/to/BAC/0-rawreads/cns-runs/cns_00001/task.json',\n 'timeout': 30,\n 'tmpdir': None}"
Traceback (most recent call last):
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/opt/Falcon/2018.03.12/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 250, in <module>
    main()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 242, in main
    run(**vars(parsed_args))
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 236, in run
    run_cfg_in_tmpdir(cfg, tmpdir)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 212, in run_cfg_in_tmpdir
    run_bash(bash_template, myinputs, myoutputs, parameters)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/do_task.py", line 173, in run_bash
    util.system(cmd)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/io.py", line 29, in syscall
    raise Exception(msg)
Exception: Call '/bin/bash user_script.sh' returned 256.

real	0m7.380s
user	0m6.485s
sys	0m0.441s
 returned: 256

Thank you!

@gconcepcion

gconcepcion commented Sep 10, 2020

Looks like the version of falcon you have installed doesn't have the --output-dformat option for consensus.py. Try removing --output-dformat from the config and restarting the workflow.

usage: consensus.py [-h] [--n_core N_CORE] [--min_cov MIN_COV]
                    [--min_cov_aln MIN_COV_ALN] [--max_cov_aln MAX_COV_ALN]
                    [--min_len_aln MIN_LEN_ALN] [--min_n_read MIN_N_READ]
                    [--max_n_read MAX_N_READ] [--trim] [--output_full]
                    [--output_multi] [--min_idt MIN_IDT]
                    [--edge_tolerance EDGE_TOLERANCE] [--trim_size TRIM_SIZE]
consensus.py: error: unrecognized arguments: --output-multi --output-dformat --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16
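
For reference, the edited config line would look something like this (the same options as before, minus --output-dformat):

falcon_sense_option = --output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 16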

@cwarden45

Thank you very much. I have removed that parameter from the configuration file.

I apologize that it is hard for me to tell what has gone wrong. There are 14 "jobs" directories, so I am again not sure which I should attach. However, here is similar information to before:
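
(To narrow down which of the 14 directories contains the actual failure, a grep across their stderr files can help; a sketch, assuming the same mypwatcher layout:)

grep -l "Traceback" mypwatcher/jobs/*/stderr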

The main output:

[3550]$(u'lfs setstripe -c 12 /path/to/BAC')
sh: lfs: command not found
[3550]WARNING: Call u'lfs setstripe -c 12 /path/to/BAC' returned 32512.
[INFO]Setup logging from file "None".
[INFO]fc_run started with configuration 102b15_fa_subread-Apollo.cfg
[INFO] No target specified, assuming "assembly" as target 
/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/run_support.py:462: UserWarning: Unexpected keys in input config: set(['pa_daligner_option', 'ovlp_daligner_option'])
  warnings.warn("Unexpected keys in input config: %s" % repr(unused))
[WARNING]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]job_type='local', job_queue=u'', sge_option='-pe smp 4 -q bigmem', use_tmpdir=False, squash=False, job_name_style=0
[INFO]u'Serializing 52 records'
[INFO]u"  Opened for write: u'./config.json'"
[INFO]u'  Serialized to 0.0MB as json'
[INFO]u"Deserializing from u'/path/to/BAC/config.json'"
[INFO]u"  Opened for read: u'/path/to/BAC/config.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 52 records'
[INFO]gen_task(python -m falcon_kit.mains.build_rdb --input-fofn-fn={input.raw_reads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --length-cutoff-fn={output.length_cutoff} --job-done-fn={output.db_build_done}
touch {output.db_build_done}

	inputs={u'raw_reads_fofn': '102b15_FA.fofn', u'config': u'config.json'},
	outputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'run_jobs': u'0-rawreads/run_jobs.sh', u'length_cutoff': u'0-rawreads/length_cutoff', u'db_build_done': u'0-rawreads/rdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'db': u'0-rawreads/raw_reads.db'},
	outputs={u'split': u'0-rawreads/daligner-split/all-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 2, graph: 2
[INFO]About to submit: Node(0-rawreads)
[INFO] starting job Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]recently_satisfied: set([Node(0-rawreads)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/daligner-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-split"
[INFO] starting job Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'}
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[WARNING]  k=results, v=u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/daligner-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'},
	outputs={u'results': u'0-rawreads/daligner-runs/j_0000/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/daligner-gathered"
[INFO]u"  Opened for write: u'0-rawreads/daligner-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobj_0000': u'/path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobj_0000': u'/path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json', u'result_fn_list': u'0-rawreads/daligner-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_gather_las_list --gathered-fn={input.gathered} --las-paths-fn={output.las_paths}

	inputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json'},
	outputs={u'las_paths': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --run-jobs-fn={input.run_jobs} --gathered-las-fn={input.gathered_las} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'gathered_las': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'},
	outputs={u'split': u'0-rawreads/las-merge-split/all-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 7
[INFO]About to submit: Node(0-rawreads/daligner-chunks/j_0000)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-chunks/j_0000"
[INFO] starting job Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-chunks/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/daligner-runs/j_0000)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-runs/j_0000"
[INFO] starting job Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[INFO]sleep 1.3s
[INFO]sleep 1.4s
[INFO]sleep 1.5s
[INFO]sleep 1.6s
[INFO]sleep 1.7s
[INFO]sleep 1.8s
[INFO]sleep 1.9s
[INFO]sleep 2.0s
[INFO]sleep 2.1s
[INFO]sleep 2.2s
[INFO]sleep 2.3s
[INFO]sleep 2.4s
[INFO]sleep 2.5s
[INFO]sleep 2.6s
[INFO]sleep 2.7s
[INFO]sleep 2.8s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-runs/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/daligner-gathered)
[INFO] starting job Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/daligner-intermediate-gathered-las)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las"
[INFO] starting job Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-intermediate-gathered-las)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/las-merge-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-split"
[INFO] starting job Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'}
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[WARNING]  k=results, v=u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/las-merge-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'},
	outputs={u'results': u'0-rawreads/las-merge-runs/m_00001/some-las-paths.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/las-merge-gathered"
[INFO]u"  Opened for write: u'0-rawreads/las-merge-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobm_00001': u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobm_00001': u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'},
	outputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json', u'result_fn_list': u'0-rawreads/las-merge-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_gather --gathered-fn={input.gathered} --p-id2las-fn={output.p_id2las} --las-fn={output.las}

	inputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json'},
	outputs={u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'las': u'0-rawreads/las-gather/las_fofn.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_split --wildcards={params.wildcards} --p-id2las-fn={input.p_id2las} --db-fn={input.raw_reads_db} --length-cutoff-fn={input.length_cutoff} --config-fn={input.config} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'config': u'config.json', u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'length_cutoff': u'0-rawreads/length_cutoff'},
	outputs={u'split': u'0-rawreads/cns-split/split.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 12
[INFO]About to submit: Node(0-rawreads/las-merge-chunks/m_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-chunks/m_00001"
[INFO] starting job Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-chunks/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/las-merge-runs/m_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-runs/m_00001"
[INFO] starting job Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-runs/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/las-merge-gathered)
[INFO] starting job Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/las-gather)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-gather"
[INFO] starting job Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-gather)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/cns-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-split"
[INFO] starting job Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/cns-split/split.json'"
[INFO]u"  Opened for read: u'0-rawreads/cns-split/split.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'}
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[WARNING]  k=results, v=u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/cns-split/split.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'},
	outputs={u'results': u'0-rawreads/cns-runs/cns_00001/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/cns-gather"
[INFO]u"  Opened for write: u'0-rawreads/cns-gather/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobcns_00001_cns_00001': u'/path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobcns_00001_cns_00001': u'/path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/cns-gather/gathered.json', u'result_fn_list': u'0-rawreads/cns-gather/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_gather_fasta_fofn --gathered-fn={input.gathered} --preads-fofn-fn={output.preads_fofn}

	inputs={u'gathered': u'0-rawreads/cns-gather/gathered.json'},
	outputs={u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'})
[INFO]gen_task(python -m falcon_kit.mains.task_report_pre_assembly --config-fn={input.config} --length-cutoff-fn={input.length_cutoff} --raw-reads-db-fn={input.raw_reads_db} --preads-fofn-fn={input.preads_fofn} --pre-assembly-report-fn={output.pre_assembly_report}

	inputs={u'length_cutoff': u'0-rawreads/length_cutoff', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn', u'config': u'config.json', u'raw_reads_db': u'0-rawreads/raw_reads.db'},
	outputs={u'pre_assembly_report': u'0-rawreads/report/pre_assembly_stats.json'})
[INFO]gen_task(python -m falcon_kit.mains.build_pdb --input-fofn-fn={input.preads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --job-done-fn={output.db_build_done}
# TODO: Verify that input.preads_db exists.
touch {output.db_build_done}

	inputs={u'config': u'config.json', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'},
	outputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'preads_db': u'1-preads_ovl/preads.db', u'db_build_done': u'1-preads_ovl/pdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'db': u'1-preads_ovl/preads.db'},
	outputs={u'split': u'1-preads_ovl/daligner-split/all-units-of-work.json', u'bash_template': u'1-preads_ovl/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 7, graph: 19
[INFO]About to submit: Node(0-rawreads/cns-chunks/cns_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-chunks/cns_00001"
[INFO] starting job Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-chunks/cns_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 6
[INFO]About to submit: Node(0-rawreads/cns-runs/cns_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-runs/cns_00001"
[INFO] starting job Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[ERROR]Task Node(0-rawreads/cns-runs/cns_00001) failed with exit-code=256
[INFO]recently_satisfied: set([])
[INFO]Num satisfied in this iteration: 0
[INFO]Num still unsatisfied: 6
[ERROR]Some tasks are recently_done but not satisfied: set([Node(0-rawreads/cns-runs/cns_00001)])
[ERROR]ready: set([])
	submitted: set([])
Traceback (most recent call last):
  File "/opt/Falcon/2018.03.12/bin/fc_run.py", line 11, in <module>
    load_entry_point('falcon-kit==1.0+git.6bb3daa96931aece9bd3742bccc77ad257b7bb65', 'console_scripts', 'fc_run.py')()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 644, in main
    main1(argv[0], args.config, args.logger)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 159, in main1
    input_fofn_plf=input_fofn_plf,
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/mains/run1.py", line 520, in run
    dist=Dist(NPROC=4, MB=4000, sge_option=sge_option),
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/pype.py", line 89, in gen_parallel_tasks
    wf.refreshTargets()
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/simple_pwatcher_bridge.py", line 284, in refreshTargets
    self._refreshTargets(updateFreq, exitOnFailure)
  File "/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pypeflow/simple_pwatcher_bridge.py", line 350, in _refreshTargets
    raise Exception(msg)
Exception: Some tasks are recently_done but not satisfied: set([Node(0-rawreads/cns-runs/cns_00001)])
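
As a side note, as far as I can tell the exit-code in these messages is the raw wait() status rather than the shell's exit status, so exit-code=256 most likely just means that run.sh exited with status 1:

echo $((256 >> 8))   # prints 1; a non-zero low byte would instead indicate termination by a signal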

And here is the configuration file that I am using:

[General]
job_type = local
input_fofn = 102b15_FA.fofn
input_type = raw
length_cutoff = 10000
length_cutoff_pr = 10000

sge_option_da = -pe smp 4 -q bigmem
sge_option_la = -pe smp 8 -q bigmem
sge_option_pda = -pe smp 6 -q bigmem
sge_option_pla = -pe smp 8 -q bigmem
sge_option_fc = -pe smp 8 -q bigmem
sge_option_cns = -pe smp 8 -q bigmem

pa_daligner_option = -v -B4 -t16 -e.70 -l1000 -s1000
ovlp_daligner_option = -v -B4 -t32 -h60 -e.96 -l500 -s1000

pa_DBsplit_option = -a -x500 -s200
ovlp_DBsplit_option = -s200

falcon_sense_option = --output-multi --min-idt 0.70 --min-cov 4 --max-n-read 200 --n-core 8

overlap_filtering_setting = --max-diff 120 --max-cov 120 --min-cov 2 --n-core 8

[job.defaults]
njobs = 8
job_type = local

[job.step.cns]
njobs = 8

Thanks again!

@gconcepcion

FALCON is definitely a little tricky to troubleshoot until you get the hang of it. In general, you want to look at the *.stderr file in the directory that shows [ERROR] in the master log. In this case it's 0-rawreads/cns-runs/cns_00001 again, so I'm assuming the changes didn't get picked up.
Try manually removing that directory and restarting the workflow:
rm -rf 0-rawreads/cns-runs/cns_00001
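
For reference, the full recover-and-rerun cycle looks something like this (just a sketch: the jobid is whatever the "starting job" line in the master log shows for the failing node, and the config name below is a placeholder for whatever you launched with):

# read the per-task stderr for the failing node (fs_based pwatcher layout, as in this run)
less mypwatcher/jobs/P6efd1a5f93dbfa/stderr

# clear the failed task's state so pypeflow re-runs it from scratch
rm -rf 0-rawreads/cns-runs/cns_00001

# restart with the same config; tasks that already succeeded are skipped
fc_run.py your_config.cfg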

@cwarden45
Author

I am not sure if this was what I was supposed to do, but I also tested removing --output-multi from the configuration file.

Assuming that I should still be looking at the stderr file for the 0-rawreads/cns-runs/cns_00001 folder (linked to mypwatcher/jobs/P6efd1a5f93dbfa), I have attached it (with a modified extension to allow the upload, as well as some file path modifications for the public upload).

stderr

Please let me know if you need any additional information.

@gconcepcion

Oh interesting. I'm not sure, but I suspect you are running an older version of FALCON. At one point falcon_sense_option had a bug, later fixed, where the command-line arguments were specified with underscores instead of dashes.
Like this: --output_multi --min_idt 0.70 --min_cov 4 --max_n_read 200 --n_core 8
Try replacing the dashes with underscores, as in the line above.
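
In the .cfg, that option line would then read (a sketch, keeping your other values unchanged):

falcon_sense_option = --output_multi --min_idt 0.70 --min_cov 4 --max_n_read 200 --n_core 8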

@cwarden45
Author

Yes, that is correct: I had first used the format for the newest version, but the name of the module to load (Falcon/2018.03.12) made me think that an earlier version of FALCON was being used. I also think I saw that in the logs. I apologize for not thinking about the formatting issue earlier.

If I understand this tutorial correctly, I don't think FALCON finished running successfully.

I am not sure if I should upload the same stderr file, but here it is (just in case, from 0-rawreads/cns-runs/cns_00001 --> mypwatcher/jobs/Pba6687a04c7046):

stderr

Also, here is the main output:

[12937]$(u'lfs setstripe -c 12 /path/to/BAC')
sh: lfs: command not found
[12937]WARNING: Call u'lfs setstripe -c 12 /path/to/BAC' returned 32512.
[INFO]Setup logging from file "None".
[INFO]fc_run started with configuration 102b15_fa_subread-Apollo.cfg
[INFO] No target specified, assuming "assembly" as target 
/opt/Falcon/2018.03.12/lib/python2.7/site-packages/falcon_kit/run_support.py:462: UserWarning: Unexpected keys in input config: set(['pa_daligner_option', 'ovlp_daligner_option'])
  warnings.warn("Unexpected keys in input config: %s" % repr(unused))
[WARNING]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]In simple_pwatcher_bridge, pwatcher_impl=<module 'pwatcher.fs_based' from '/opt/Falcon/2018.03.12/lib/python2.7/site-packages/pwatcher/fs_based.pyc'>
[INFO]job_type='local', job_queue=u'', sge_option='-pe smp 4 -q bigmem', use_tmpdir=False, squash=False, job_name_style=0
[INFO]u'Serializing 52 records'
[INFO]u"  Opened for write: u'./config.json'"
[INFO]u'  Serialized to 0.0MB as json'
[INFO]u"Deserializing from u'/path/to/BAC/config.json'"
[INFO]u"  Opened for read: u'/path/to/BAC/config.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 52 records'
[INFO]gen_task(python -m falcon_kit.mains.build_rdb --input-fofn-fn={input.raw_reads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --length-cutoff-fn={output.length_cutoff} --job-done-fn={output.db_build_done}
touch {output.db_build_done}

	inputs={u'raw_reads_fofn': '102b15_FA.fofn', u'config': u'config.json'},
	outputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'run_jobs': u'0-rawreads/run_jobs.sh', u'length_cutoff': u'0-rawreads/length_cutoff', u'db_build_done': u'0-rawreads/rdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'db': u'0-rawreads/raw_reads.db'},
	outputs={u'split': u'0-rawreads/daligner-split/all-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 2, graph: 2
[INFO]About to submit: Node(0-rawreads)
[INFO] starting job Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pba6687a04c7046', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]recently_satisfied: set([Node(0-rawreads)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/daligner-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-split"
[INFO] starting job Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P8d899ce890d2ee', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/daligner-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'}
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[WARNING]  k=results, v=u'0-rawreads/daligner-runs/{dal0_id}/some-done-files.json'
[WARNING] wildcards={u'dal0_id': u'j_0000'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/daligner-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000/some-units-of-work.json', u'bash_template': u'0-rawreads/daligner-split/daligner_bash_template.sh'},
	outputs={u'results': u'0-rawreads/daligner-runs/j_0000/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/daligner-gathered"
[INFO]u"  Opened for write: u'0-rawreads/daligner-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobj_0000': u'/path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobj_0000': u'/path/to/BAC/0-rawreads/daligner-runs/j_0000/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json', u'result_fn_list': u'0-rawreads/daligner-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_gather_las_list --gathered-fn={input.gathered} --las-paths-fn={output.las_paths}

	inputs={u'gathered': u'0-rawreads/daligner-gathered/gathered-done-files.json'},
	outputs={u'las_paths': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --run-jobs-fn={input.run_jobs} --gathered-las-fn={input.gathered_las} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'0-rawreads/run_jobs.sh', u'gathered_las': u'0-rawreads/daligner-intermediate-gathered-las/gathered-las.json'},
	outputs={u'split': u'0-rawreads/las-merge-split/all-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 7
[INFO]About to submit: Node(0-rawreads/daligner-chunks/j_0000)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-chunks/j_0000"
[INFO] starting job Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1ba7269bc88c5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-chunks/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/daligner-runs/j_0000)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-runs/j_0000"
[INFO] starting job Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P345e7d6b75e12a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[INFO]sleep 1.3s
[INFO]sleep 1.4s
[INFO]sleep 1.5s
[INFO]sleep 1.6s
[INFO]sleep 1.7s
[INFO]sleep 1.8s
[INFO]sleep 1.9s
[INFO]sleep 2.0s
[INFO]sleep 2.1s
[INFO]sleep 2.2s
[INFO]sleep 2.3s
[INFO]sleep 2.4s
[INFO]sleep 2.5s
[INFO]sleep 2.6s
[INFO]sleep 2.7s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-runs/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/daligner-gathered)
[INFO] starting job Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P65e280ba73ba44', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/daligner-intermediate-gathered-las)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las"
[INFO] starting job Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P4f5ed38f680cb5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/daligner-intermediate-gathered-las)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/las-merge-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-split"
[INFO] starting job Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1751a7fa2d519e', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'0-rawreads/las-merge-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'}
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[WARNING]  k=results, v=u'0-rawreads/las-merge-runs/{mer0_id}/some-las-paths.json'
[WARNING] wildcards={u'mer0_id': u'm_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/las-merge-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/las-merge-split/las-merge-bash-template.sh'},
	outputs={u'results': u'0-rawreads/las-merge-runs/m_00001/some-las-paths.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/las-merge-gathered"
[INFO]u"  Opened for write: u'0-rawreads/las-merge-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobm_00001': u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobm_00001': u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001/some-las-paths.json'},
	outputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json', u'result_fn_list': u'0-rawreads/las-merge-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_gather --gathered-fn={input.gathered} --p-id2las-fn={output.p_id2las} --las-fn={output.las}

	inputs={u'gathered': u'0-rawreads/las-merge-gathered/gathered.json'},
	outputs={u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'las': u'0-rawreads/las-gather/las_fofn.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_split --wildcards={params.wildcards} --p-id2las-fn={input.p_id2las} --db-fn={input.raw_reads_db} --length-cutoff-fn={input.length_cutoff} --config-fn={input.config} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'raw_reads_db': u'0-rawreads/raw_reads.db', u'config': u'config.json', u'p_id2las': u'0-rawreads/las-gather/p_id2las.json', u'length_cutoff': u'0-rawreads/length_cutoff'},
	outputs={u'split': u'0-rawreads/cns-split/split.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 12
[INFO]About to submit: Node(0-rawreads/las-merge-chunks/m_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-chunks/m_00001"
[INFO] starting job Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P45773280344a39', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-chunks/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/las-merge-runs/m_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-merge-runs/m_00001"
[INFO] starting job Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6b67acd634425c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-runs/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/las-merge-gathered)
[INFO] starting job Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7c2983404e21cc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-merge-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(0-rawreads/las-gather)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/las-gather"
[INFO] starting job Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P7cb073bb352017', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/las-gather)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(0-rawreads/cns-split)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-split"
[INFO] starting job Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P437edc68c958f9', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'0-rawreads/cns-split/split.json'"
[INFO]u"  Opened for read: u'0-rawreads/cns-split/split.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'}
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[WARNING]  k=results, v=u'0-rawreads/cns-runs/{cns0_id}/some-done-files.json'
[WARNING] wildcards={u'cns0_id2': u'cns_00001', u'cns0_id': u'cns_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'0-rawreads/cns-split/split.json'},
	outputs={u'one': u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001/some-units-of-work.json', u'bash_template': u'0-rawreads/cns-split/consensus-bash-template.sh'},
	outputs={u'results': u'0-rawreads/cns-runs/cns_00001/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "0-rawreads/cns-gather"
[INFO]u"  Opened for write: u'0-rawreads/cns-gather/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobcns_00001_cns_00001': u'/path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobcns_00001_cns_00001': u'/path/to/BAC/0-rawreads/cns-runs/cns_00001/some-done-files.json'},
	outputs={u'gathered': u'0-rawreads/cns-gather/gathered.json', u'result_fn_list': u'0-rawreads/cns-gather/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.consensus_gather_fasta_fofn --gathered-fn={input.gathered} --preads-fofn-fn={output.preads_fofn}

	inputs={u'gathered': u'0-rawreads/cns-gather/gathered.json'},
	outputs={u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'})
[INFO]gen_task(python -m falcon_kit.mains.task_report_pre_assembly --config-fn={input.config} --length-cutoff-fn={input.length_cutoff} --raw-reads-db-fn={input.raw_reads_db} --preads-fofn-fn={input.preads_fofn} --pre-assembly-report-fn={output.pre_assembly_report}

	inputs={u'length_cutoff': u'0-rawreads/length_cutoff', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn', u'config': u'config.json', u'raw_reads_db': u'0-rawreads/raw_reads.db'},
	outputs={u'pre_assembly_report': u'0-rawreads/report/pre_assembly_stats.json'})
[INFO]gen_task(python -m falcon_kit.mains.build_pdb --input-fofn-fn={input.preads_fofn} --config-fn={input.config} --run-jobs-fn={output.run_jobs} --job-done-fn={output.db_build_done}
# TODO: Verify that input.preads_db exists.
touch {output.db_build_done}

	inputs={u'config': u'config.json', u'preads_fofn': u'0-rawreads/preads/input_preads.fofn'},
	outputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'preads_db': u'1-preads_ovl/preads.db', u'db_build_done': u'1-preads_ovl/pdb_build_done'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --skip-checks={params.skip_checks} --pread-aln={params.pread_aln} --run-jobs-fn={input.run_jobs} --db-fn={input.db} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'db': u'1-preads_ovl/preads.db'},
	outputs={u'split': u'1-preads_ovl/daligner-split/all-units-of-work.json', u'bash_template': u'1-preads_ovl/daligner-split/daligner_bash_template.sh'})
[INFO]Num unsatisfied: 7, graph: 19
[INFO]About to submit: Node(0-rawreads/cns-chunks/cns_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-chunks/cns_00001"
[INFO] starting job Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pc3e580920b8168', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-chunks/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-chunks/cns_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 6
[INFO]About to submit: Node(0-rawreads/cns-runs/cns_00001)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/cns-runs/cns_00001"
[INFO] starting job Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P6efd1a5f93dbfa', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-runs/cns_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]sleep 1.0s
[INFO]sleep 1.1s
[INFO]sleep 1.2s
[INFO]sleep 1.3s
[INFO]sleep 1.4s
[INFO]sleep 1.5s
[INFO]sleep 1.6s
[INFO]sleep 1.7s
[INFO]sleep 1.8s
[INFO]sleep 1.9s
[INFO]sleep 2.0s
[INFO]sleep 2.1s
[INFO]sleep 2.2s
[INFO]sleep 2.3s
[INFO]sleep 2.4s
[INFO]sleep 2.5s
[INFO]sleep 2.6s
[INFO]sleep 2.7s
[INFO]sleep 2.8s
[INFO]sleep 2.9s
[INFO]sleep 3.0s
[INFO]sleep 3.1s
[INFO]sleep 3.2s
[INFO]sleep 3.3s
[INFO]sleep 3.4s
[INFO]sleep 3.5s
[INFO]sleep 3.6s
[INFO]sleep 3.7s
[INFO]sleep 3.8s
[INFO]sleep 3.9s
[INFO]sleep 4.0s
[INFO]sleep 4.1s
[INFO]sleep 4.2s
[INFO]sleep 4.3s
[INFO]sleep 4.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-runs/cns_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 5
[INFO]About to submit: Node(0-rawreads/cns-gather)
[INFO] starting job Job(jobid='Pf4944973980bfc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pf4944973980bfc', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/cns-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(0-rawreads/cns-gather)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(0-rawreads/preads)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/preads"
[INFO] starting job Job(jobid='P03898102ddf2f5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/preads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P03898102ddf2f5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/preads', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(0-rawreads/preads)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(0-rawreads/report)
[INFO]About to submit: Node(1-preads_ovl)
[INFO]mkdir -p "/path/to/BAC/0-rawreads/report"
[INFO] starting job Job(jobid='P447e6617b94fb1', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl', options={'job_queue': u'', 'sge_option': '-pe smp 6 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P447e6617b94fb1', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl', options={'job_queue': u'', 'sge_option': '-pe smp 6 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO] starting job Job(jobid='P72d1898067ce5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/report', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P72d1898067ce5a', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/0-rawreads/report', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]recently_satisfied: set([Node(0-rawreads/report)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(1-preads_ovl)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(1-preads_ovl/daligner-split)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/daligner-split"
[INFO] starting job Job(jobid='Pd0d5659d3dc337', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pd0d5659d3dc337', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/daligner-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'1-preads_ovl/daligner-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'1-preads_ovl/daligner-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'1-preads_ovl/daligner-runs/{dal1_id}/some-done-files.json'}
[WARNING] wildcards={u'dal1_id': u'j_0000'}
[WARNING]  k=results, v=u'1-preads_ovl/daligner-runs/{dal1_id}/some-done-files.json'
[WARNING] wildcards={u'dal1_id': u'j_0000'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'1-preads_ovl/daligner-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/1-preads_ovl/daligner-chunks/j_0000/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/1-preads_ovl/daligner-chunks/j_0000/some-units-of-work.json', u'bash_template': u'1-preads_ovl/daligner-split/daligner_bash_template.sh'},
	outputs={u'results': u'1-preads_ovl/daligner-runs/j_0000/some-done-files.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "1-preads_ovl/daligner-gathered"
[INFO]u"  Opened for write: u'1-preads_ovl/daligner-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobj_0000': u'/path/to/BAC/1-preads_ovl/daligner-runs/j_0000/some-done-files.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobj_0000': u'/path/to/BAC/1-preads_ovl/daligner-runs/j_0000/some-done-files.json'},
	outputs={u'gathered': u'1-preads_ovl/daligner-gathered/gathered-done-files.json', u'result_fn_list': u'1-preads_ovl/daligner-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.daligner_gather_las_list --gathered-fn={input.gathered} --las-paths-fn={output.las_paths}

	inputs={u'gathered': u'1-preads_ovl/daligner-gathered/gathered-done-files.json'},
	outputs={u'las_paths': u'1-preads_ovl/daligner-intermediate-gathered-las/gathered-las.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_split --wildcards={params.wildcards} --db-prefix={params.db_prefix} --run-jobs-fn={input.run_jobs} --gathered-las-fn={input.gathered_las} --split-fn={output.split} --bash-template-fn={output.bash_template}

	inputs={u'run_jobs': u'1-preads_ovl/run_jobs.sh', u'gathered_las': u'1-preads_ovl/daligner-intermediate-gathered-las/gathered-las.json'},
	outputs={u'split': u'1-preads_ovl/las-merge-split/all-units-of-work.json', u'bash_template': u'1-preads_ovl/las-merge-split/las-merge-bash-template.sh'})
[INFO]Num unsatisfied: 5, graph: 24
[INFO]About to submit: Node(1-preads_ovl/daligner-chunks/j_0000)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/daligner-chunks/j_0000"
[INFO] starting job Job(jobid='Pd4c79efef0bc75', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pd4c79efef0bc75', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-chunks/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/daligner-chunks/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(1-preads_ovl/daligner-runs/j_0000)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/daligner-runs/j_0000"
[INFO] starting job Job(jobid='Pd73d05dffcf7cf', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 6 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pd73d05dffcf7cf', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-runs/j_0000', options={'job_queue': u'', 'sge_option': '-pe smp 6 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]recently_satisfied: set([Node(1-preads_ovl/daligner-runs/j_0000)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(1-preads_ovl/daligner-gathered)
[INFO] starting job Job(jobid='P1557e5912898fe', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P1557e5912898fe', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(1-preads_ovl/daligner-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(1-preads_ovl/daligner-intermediate-gathered-las)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/daligner-intermediate-gathered-las"
[INFO] starting job Job(jobid='P5891b0b7c1343c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P5891b0b7c1343c', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/daligner-intermediate-gathered-las', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(1-preads_ovl/daligner-intermediate-gathered-las)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(1-preads_ovl/las-merge-split)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/las-merge-split"
[INFO] starting job Job(jobid='P56aaa57f7e2b67', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P56aaa57f7e2b67', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-split', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/las-merge-split)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0
[INFO]u"Deserializing from u'1-preads_ovl/las-merge-split/all-units-of-work.json'"
[INFO]u"  Opened for read: u'1-preads_ovl/las-merge-split/all-units-of-work.json'"
[INFO]u'  Read 0.0MB as json'
[INFO]u'  Deserialized 1 records'
[WARNING]PARALLEL OUTPUTS:{u'results': u'1-preads_ovl/las-merge-runs/{mer1_id}/some-las-paths.json'}
[WARNING] wildcards={u'mer1_id': u'm_00001'}
[WARNING]  k=results, v=u'1-preads_ovl/las-merge-runs/{mer1_id}/some-las-paths.json'
[WARNING] wildcards={u'mer1_id': u'm_00001'}
[INFO]gen_task(python -m falcon_kit.mains.generic_scatter_one_uow --all-uow-list-fn={input.all} --one-uow-list-fn={output.one} --split-idx={params.split_idx}

	inputs={u'all': u'1-preads_ovl/las-merge-split/all-units-of-work.json'},
	outputs={u'one': u'/path/to/BAC/1-preads_ovl/las-merge-chunks/m_00001/some-units-of-work.json'})
[INFO]gen_task(python -m falcon_kit.mains.generic_run_units_of_work --units-of-work-fn={input.units_of_work} --bash-template-fn={input.bash_template} --results-fn={output.results}

	inputs={u'units_of_work': u'/path/to/BAC/1-preads_ovl/las-merge-chunks/m_00001/some-units-of-work.json', u'bash_template': u'1-preads_ovl/las-merge-split/las-merge-bash-template.sh'},
	outputs={u'results': u'1-preads_ovl/las-merge-runs/m_00001/some-las-paths.json'})
[INFO]u'Serializing 1 records'
[INFO]mkdir -p "1-preads_ovl/las-merge-gathered"
[INFO]u"  Opened for write: u'1-preads_ovl/las-merge-gathered/result-fn-list.json'"
[INFO]u'  Serialized to 0.0MB as json'
[WARNING]gather_inputs:{u'jobm_00001': u'/path/to/BAC/1-preads_ovl/las-merge-runs/m_00001/some-las-paths.json'}
[INFO]gen_task(
python -m falcon_kit.mains.generic_unsplit --result-fn-list-fn={output.result_fn_list} --gathered-fn={output.gathered}

	inputs={u'jobm_00001': u'/path/to/BAC/1-preads_ovl/las-merge-runs/m_00001/some-las-paths.json'},
	outputs={u'gathered': u'1-preads_ovl/las-merge-gathered/gathered.json', u'result_fn_list': u'1-preads_ovl/las-merge-gathered/result-fn-list.json'})
[INFO]gen_task(python -m falcon_kit.mains.las_merge_gather --gathered-fn={input.gathered} --p-id2las-fn={output.p_id2las} --las-fn={output.las}

	inputs={u'gathered': u'1-preads_ovl/las-merge-gathered/gathered.json'},
	outputs={u'p_id2las': u'1-preads_ovl/las-gather/p_id2las.json', u'las': u'1-preads_ovl/las-gather/las_fofn.json'})
[INFO]gen_task(# Given preads.db,
# write preads4falcon.fasta (implicitly) in CWD.
time DB2Falcon -U {input.preads_db}
[ -f {output.preads4falcon} ] || exit 1
touch {output.job_done}

	inputs={u'preads_db': u'1-preads_ovl/preads.db', u'p_id2las': u'1-preads_ovl/las-gather/p_id2las.json'},
	outputs={u'job_done': u'1-preads_ovl/db2falcon/db2falcon_done', u'preads4falcon': u'1-preads_ovl/db2falcon/preads4falcon.fasta'})
[INFO]gen_task(# Given, las_fofn.json,
# write preads.ovl:

# mobs uses binwrappers, so it does not see our "entry-points".
# So, after dropping "src/py_scripts/*.py", we can call these via python -m:

time python -m falcon_kit.mains.ovlp_filter --db {input.db_file} --las-fofn {input.las_fofn} {params.overlap_filtering_setting} --min_len {params.length_cutoff_pr} --out-fn preads.ovl

ln -sf {input.preads4falcon_fasta} ./preads4falcon.fasta

# Given preads.ovl,
# write sg_edges_list, c_path, utg_data, ctg_paths.
time python -m falcon_kit.mains.ovlp_to_graph {params.fc_ovlp_to_graph_option} --overlap-file preads.ovl >| fc_ovlp_to_graph.log

# Given sg_edges_list, utg_data, ctg_paths, preads4falcon.fasta,
# write p_ctg.fa and a_ctg_all.fa,
# plus a_ctg_base.fa, p_ctg_tiling_path, a_ctg_tiling_path, a_ctg_base_tiling_path:
time python -m falcon_kit.mains.graph_to_contig

# Given a_ctg_all.fa, write a_ctg.fa:
time python -m falcon_kit.mains.dedup_a_tigs

# Generate a GFA of all assembly graph edges. This GFA can contain
# edges and nodes which are not part of primary and associate contigs.
time python -m falcon_kit.mains.gen_gfa_v1 >| asm.gfa

# Generate a GFA of all assembly graph edges. This GFA can contain
# edges and nodes which are not part of primary and associate contigs.
time python -m falcon_kit.mains.gen_gfa_v1 --add-string-graph >| sg.gfa

#rm -f ./preads4falcon.fasta

touch {output.falcon_asm_done}

	inputs={u'config': u'config.json', u'db2falcon_done': u'1-preads_ovl/db2falcon/db2falcon_done', u'db_file': u'1-preads_ovl/preads.db', u'las_fofn': u'1-preads_ovl/las-gather/las_fofn.json', u'preads4falcon_fasta': u'1-preads_ovl/db2falcon/preads4falcon.fasta'},
	outputs={u'falcon_asm_done': u'2-asm-falcon/falcon_asm_done'})
[INFO]Num unsatisfied: 6, graph: 30
[INFO]About to submit: Node(1-preads_ovl/las-merge-chunks/m_00001)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/las-merge-chunks/m_00001"
[INFO] starting job Job(jobid='Pc5ef4cdcc33a75', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pc5ef4cdcc33a75', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-chunks/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/las-merge-chunks/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 5
[INFO]About to submit: Node(1-preads_ovl/las-merge-runs/m_00001)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/las-merge-runs/m_00001"
[INFO] starting job Job(jobid='Pd979da0037e084', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pd979da0037e084', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-runs/m_00001', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/las-merge-runs/m_00001)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 4
[INFO]About to submit: Node(1-preads_ovl/las-merge-gathered)
[INFO] starting job Job(jobid='Paf06ce1f000fad', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Paf06ce1f000fad', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-merge-gathered', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(1-preads_ovl/las-merge-gathered)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 3
[INFO]About to submit: Node(1-preads_ovl/las-gather)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/las-gather"
[INFO] starting job Job(jobid='P973d4f4f25aec5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='P973d4f4f25aec5', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/las-gather', options={'job_queue': u'', 'sge_option': '-pe smp 4 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]recently_satisfied: set([Node(1-preads_ovl/las-gather)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 2
[INFO]About to submit: Node(1-preads_ovl/db2falcon)
[INFO]mkdir -p "/path/to/BAC/1-preads_ovl/db2falcon"
[INFO] starting job Job(jobid='Pfb45749544fbbb', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/db2falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pfb45749544fbbb', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/1-preads_ovl/db2falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]recently_satisfied: set([Node(1-preads_ovl/db2falcon)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 1
[INFO]About to submit: Node(2-asm-falcon)
[INFO] starting job Job(jobid='Pbbf14b5768bb24', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/2-asm-falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
[INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pbbf14b5768bb24', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/2-asm-falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
[INFO]sleep 0.1s
[INFO]sleep 0.2s
[INFO]sleep 0.3s
[INFO]sleep 0.4s
[INFO]sleep 0.5s
[INFO]sleep 0.6s
[INFO]sleep 0.7s
[INFO]sleep 0.8s
[INFO]sleep 0.9s
[INFO]recently_satisfied: set([Node(2-asm-falcon)])
[INFO]Num satisfied in this iteration: 1
[INFO]Num still unsatisfied: 0

and here is the configuration file that I am using:

[General]
job_type = local
input_fofn = 102b15_FA.fofn
input_type = raw
length_cutoff = 10000
length_cutoff_pr = 10000

sge_option_da = -pe smp 4 -q bigmem
sge_option_la = -pe smp 8 -q bigmem
sge_option_pda = -pe smp 6 -q bigmem
sge_option_pla = -pe smp 8 -q bigmem
sge_option_fc = -pe smp 8 -q bigmem
sge_option_cns = -pe smp 8 -q bigmem

pa_daligner_option =  -v -B4 -t16 -e.70 -l1000 -s1000
ovlp_daligner_option = -v -B4 -t32 -h60 -e.96 -l500 -s1000

pa_DBsplit_option = -a -x500 -s200
ovlp_DBsplit_option = -s200

falcon_sense_option = --output_multi --min_idt 0.70 --min_cov 4 --max_n_read 200 --n_core 8

overlap_filtering_setting = --max_diff 120 --max_cov 120 --min_cov 2 --n_core 8

[job.defaults]
njobs = 8
job_type = local

[job.step.cns]
njobs = 8

Please let me know if you need any additional information.

Thanks again!

@gconcepcion

> [INFO]About to submit: Node(2-asm-falcon)
> [INFO] starting job Job(jobid='Pbbf14b5768bb24', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/2-asm-falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}) w/ job_type=LOCAL
> [INFO]Submitted backgroundjob=MetaJobLocal(MetaJob(job=Job(jobid='Pbbf14b5768bb24', cmd='/bin/bash run.sh', rundir=u'/path/to/BAC/2-asm-falcon', options={'job_queue': u'', 'sge_option': '-pe smp 8 -q bigmem', 'job_type': 'local'}), lang_exe='/bin/bash'))
> [INFO]sleep 0.1s
> [INFO]sleep 0.2s
> [INFO]sleep 0.3s
> [INFO]sleep 0.4s
> [INFO]sleep 0.5s
> [INFO]sleep 0.6s
> [INFO]sleep 0.7s
> [INFO]sleep 0.8s
> [INFO]sleep 0.9s
> [INFO]recently_satisfied: set([Node(2-asm-falcon)])
> [INFO]Num satisfied in this iteration: 1
> [INFO]Num still unsatisfied: 0

2-asm-falcon is the last step in the pipeline; it looks to me like it completed successfully and exited. What makes you think it didn't finish?
Is there *.fa output in the 2-asm-falcon directory?
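For a quick check, something along these lines should confirm it (the run directory below is a placeholder; substitute your own):

# Minimal completion check for a FALCON run; RUN_DIR is a placeholder path.
RUN_DIR=/path/to/BAC
# The falcon_asm_done sentinel is only written when the final task succeeds.
[ -f "$RUN_DIR/2-asm-falcon/falcon_asm_done" ] && echo "sentinel: present" || echo "sentinel: missing"
# The primary and associate contig FASTAs should exist and be non-empty.
for fa in p_ctg.fa a_ctg.fa; do
    [ -s "$RUN_DIR/2-asm-falcon/$fa" ] && echo "$fa: non-empty" || echo "$fa: missing or empty"
done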

@cwarden45

cwarden45 commented Sep 11, 2020

I think I misread the "done" counts in the completed jobs section.

I noticed some FASTA files, but I think they were smaller than expected.

For example, these are the files in the 2-asm-falcon directory:

a_ctg_all.fa
a_ctg_base.fa
a_ctg_base_tiling_path
a_ctg.fa
a_ctg_tiling_path
asm.gfa
chimers_nodes
c_path
ctg_paths
falcon_asm_done
fc_ovlp_to_graph.log
p_ctg.fa
p_ctg_tiling_path
preads4falcon.fasta
preads.ovl
pwatcher.dir
run.sh
run.sh.done
sg_edges_list
sg.gfa
task.json
task.sh
template.sh
user_script.sh
utg_data

I think the largest sequence within the 4 FASTA files was 30,000 bp in p_ctg.fa, with a name of >000000F 000000216:B~000000216:B~000000222:B~000000360:B ctg_linear 16039 111296.

Am I looking in the right place? If so, could you help explain what that name means? For example, I think 111296 bp is somewhat close to the expected size, but that is not the length of the sequence.

Also, these are BAC sequences, so they are actually circular. Since I see ctg_linear, does that mean there is a parameter to tell Falcon to expect a circular assembly?

However, if you are saying this represents a complete assembly without error messages, then I will close this ticket.

@cwarden45

For the separate BAC assembly question, I see these discussions:

#585

#446

However, I apologize, but I am still not quite sure how to tell Falcon to expect circular assemblies. Is automatic recognition of the assembly type the only option?

I can post this as a separate question, if there is a way to do some separate troubleshooting for this.

@gconcepcion

Falcon exited successfully if data was written to the p_ctg.fa and a_ctg.fa files and the falcon_asm_done sentinel file is present.

> I think the largest sequence within the 4 FASTA files was 30,000 bp in p_ctg.fa, with a name of >000000F 000000216:B~000000216:B~000000222:B~000000360:B ctg_linear 16039 111296.
>
> Am I looking in the right place? If so, could you help explain what that name means? For example, I think 111296 bp is somewhat close to the expected size, but that is not the length of the sequence.

111296 is not a number of bases; it relates to the contig score and is irrelevant to what you're interested in.
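If it helps, the true sequence lengths can be computed directly from the FASTA rather than read off the header; a minimal awk sketch, run inside 2-asm-falcon:

# Print each contig name and its actual sequence length.
awk '/^>/{if(n)print n, len; n=$1; len=0; next}{len+=length($0)}END{if(n)print n, len}' p_ctg.fa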

> Also, these are BAC sequences, so they are actually circular. Since I see ctg_linear, does that mean there is a parameter to tell Falcon to expect a circular assembly?

There is no parameter to specify whether you expect a circular assembly or not; the algorithm simply classifies whatever it detects one way or the other.
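If you want to see how each contig was classified, the tag is in the FASTA header (the third field, per the header you quoted), so a quick listing is enough:

# List each contig and its ctg_linear/ctg_circular classification.
grep '^>' p_ctg.fa | awk '{print $1, $3}'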

@gconcepcion

> For the separate BAC assembly question, I see these discussions:
>
> #585
>
> #446
>
> However, I apologize, but I am still not quite sure how to tell Falcon to expect circular assemblies. Is automatic recognition of the assembly type the only option?
>
> I can post this as a separate question, if there is a way to do some separate troubleshooting for this.

You can post it as a separate question if you're still interested. As stated in my other reply, there is no way to tell FALCON what to expect. FALCON is no longer being developed, in favor of our new assembler IPA and the HiFi datatype it supports, so you're unlikely to see much support here beyond ensuring you can get the pipeline to execute.

BAC CLR data is tricky due to its high-coverage nature and the way the FALCON algorithm was designed.
See the FAQ for some suggestions:
https://github.com/PacificBiosciences/pb-assembly#why-does-falcon-have-trouble-assembling-my-amplicon-data
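One of the FAQ's suggestions amounts to downsampling the input. If your falcon-kit build exposes the subsampling options (they appear in the parsed cfg at the top of your log), something like this in the [General] section caps the seed coverage; the genome_size and coverage values below are placeholders only:

# Hypothetical subsampling settings; a genome_size estimate is needed to compute coverage.
genome_size = 120000
pa_subsample_coverage = 30
pa_subsample_strategy = random
pa_subsample_random_seed = 12345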

@cwarden45

cwarden45 commented Sep 11, 2020

Hi Greg,

Thank you very much for your help!

In this situation, the earliest HGAP3 assembly was similar in size to what was expected.

More recently, I ran Canu. This produced an assembly that was smaller than the HGAP3 assembly but larger than what I described for Falcon.

When I ran Circlator on the Canu result and the corrected reads, it reported that the intermediate-size assembly could be confirmed to be circular (Circlator didn't add back the ~20,000 bp of extra sequence present in the HGAP3 assembly).
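(For reference, the Circlator run was the standard all-in-one invocation, along these lines; the file names here are placeholders:)

# Circlator's "all" pipeline: merge contigs using corrected reads, then fix the start and circularize.
circlator all canu.contigs.fasta corrected_reads.fasta circlator_output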

I will take a look at the amplicon link that you described. I am trying to run Falcon via the command line because this is RS II data that won't work with the version of SMRT Link that we have configured.

I think we will also eventually send PacBio an e-mail regarding troubleshooting some BAC samples, but there is additional analysis and discussion that we would like to have first.

Thanks yet again!

Sincerely,
Charles
