Scheduler occasionally messes up in Snakemake 5.30.1 #771

Closed
bkohrn opened this issue Nov 25, 2020 · 19 comments
Labels
bug Something isn't working

Comments

@bkohrn

bkohrn commented Nov 25, 2020

Snakemake version
5.30.1 confirmed; does not occur in 5.25.0

Describe the bug
Occasionally, the scheduler decides to run a rule before all of its input files are present. The rule then fails because of the missing inputs, which brings down the whole pipeline. Restarting the pipeline sometimes fixes it, but I don't know whether it always will.

Logs
This is the snakemake log from the particular run. At this point, my log files don't contain anything; they're just there for when I decide to do something with them.

Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 2
Rules claiming more threads will be scaled down.
Job counts:
	count	jobs
	5	BLAST
	5	CountAmbig
	11	FinalFilter
	5	InsertSize
	7	MutsPerCycle
	6	PlotCoverage
	5	PlotInsertSize
	5	PostBlastProcessing1
	5	PostBlastProcessing2
	5	PreBlastFilter
	5	PreBlastProcessing1
	5	PreBlastProcessing2
	5	PreBlastProcessing3
	10	alignReads
	1	all
	10	clipAdapters
	6	compileReport
	10	endClip
	13	getFlagstats
	6	getOnTarget
	38	makeBai
	4	makeBufferedBed
	3	makeConsensus
	7	makeCountMuts
	7	makeDepth
	3	makeDirs
	10	makePreEndClip
	6	makeReport
	1	makeSummaryCSV
	1	makeSummaryDepth
	1	makeSummaryFamilySize
	1	makeSummaryInsertSize
	1	makeSummaryMutsByCycle
	10	makeTempBai
	7	make_final_VCF
	11	overlapClip
	5	postBlastRecovery
	7	summaizeDepth
	6	varDict
	7	varDict2VCF
	6	varDict_Ns
	272
Select jobs to execute...

[Wed Nov 25 13:13:49 2020]
rule overlapClip:
    input: testData/test3.sscs.clipped.bam, testData/test3.sscs.clipped.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test3.sscs.overlapClip.temp.bam, testData/test3.sscs.overlapClip.temp.bai, testData/Stats/data/test3.sscs.overlapClip.metrics.txt
    log: testData/logs/test3_overlapClip_sscs.log
    jobid: 239
    wildcards: runPath=testData, sample=test3, sampType=sscs


[Wed Nov 25 13:13:49 2020]
rule varDict2VCF:
    input: testData/test5.dcs.varDict.txt, testData/test5.dcs.varDict.Ns.txt, testData/Final/dcs/test5.dcs.final.bam, testData/Final/dcs/test5.dcs.final.bam.bai
    output: testData/test5.dcs.raw.vcf, testData/Final/dcs/test5.dcs.snps.vcf
    jobid: 150
    wildcards: runPath=testData, sample=test5, sampType=dcs

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test5.dcs.varDict.txt.
Removing temporary output file testData/test5.dcs.varDict.Ns.txt.
[Wed Nov 25 13:14:08 2020]
Finished job 150.
1 of 272 steps (0.37%) done
Select jobs to execute...

[Wed Nov 25 13:14:08 2020]
rule make_final_VCF:
    input: testData/test5.dcs.raw.vcf
    output: testData/Final/dcs/test5.dcs.vcf
    jobid: 149
    wildcards: runPath=testData, sampType=dcs, sample=test5

Removing temporary output file testData/test5.dcs.raw.vcf.
[Wed Nov 25 13:14:08 2020]
Finished job 149.
2 of 272 steps (0.74%) done
Select jobs to execute...

[Wed Nov 25 13:14:08 2020]
rule makeDirs:
    output: testData/.test2_dirsMade
    jobid: 7
    wildcards: runPath=testData, sample=test2

Touching output file testData/.test2_dirsMade.
[Wed Nov 25 13:14:08 2020]
Finished job 7.
3 of 272 steps (1%) done
Select jobs to execute...

[Wed Nov 25 13:14:09 2020]
rule makeConsensus:
    input: testData/testSeq1.fastq.gz, testData/testSeq2.fastq.gz, testData/.test2_dirsMade
    output: testData/Intermediate/ConsensusMakerOutputs/test2_read1_sscs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_read2_sscs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_read1_dcs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_read2_dcs.fq.gz, testData/test2.temp.sort.bam, testData/Stats/data/test2.tagstats.txt, testData/Stats/plots/test2_family_size.png, testData/Stats/plots/test2_fam_size_relation.png, testData/Intermediate/ConsensusMakerOutputs/test2_aln_seq1.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_aln_seq2.fq.gz, testData/Stats/data/test2_cmStats.txt
    log: testData/logs/test2_makeConsensus.log
    jobid: 6
    wildcards: runPath=testData, sample=test2
    priority: 50

Removing temporary output file testData/test3.sscs.clipped.bai.
Removing temporary output file testData/test3.sscs.clipped.bam.
[Wed Nov 25 13:14:12 2020]
Finished job 239.
4 of 272 steps (1%) done
Select jobs to execute...

[Wed Nov 25 13:14:12 2020]
rule FinalFilter:
    input: testData/test3.sscs.overlapClip.temp.bam, testData/test3.sscs.overlapClip.temp.bai
    output: testData/Final/sscs/test3.sscs.final.bam
    log: testData/logs/test3_finalFilter_sscs.log
    jobid: 238
    wildcards: runPath=testData, sampType=sscs, sample=test3

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test3.sscs.overlapClip.temp.bam.
Removing temporary output file testData/test3.sscs.overlapClip.temp.bai.
[Wed Nov 25 13:14:27 2020]
Finished job 238.
5 of 272 steps (2%) done
Select jobs to execute...

[Wed Nov 25 13:14:27 2020]
rule makeCountMuts:
    input: testData/Final/dcs/test5.dcs.vcf, testData/Final/dcs/test5.dcs.final.bam, testData/Final/dcs/test5.dcs.final.bam.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed
    output: testData/Final/dcs/test5.dcs.countmuts.csv
    jobid: 148
    wildcards: runPath=testData, sampType=dcs, sample=test5

Removing temporary output file testData/.test2_dirsMade.
[Wed Nov 25 13:14:29 2020]
Finished job 6.
6 of 272 steps (2%) done
Select jobs to execute...

[Wed Nov 25 13:14:29 2020]
rule getFlagstats:
    input: testData/test2.temp.sort.bam
    output: testData/Stats/data/test2.temp.sort.flagstats.txt
    jobid: 5
    wildcards: runPath=testData, fileBase=test2.temp.sort

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:14:42 2020]
Finished job 148.
7 of 272 steps (3%) done
Removing temporary output file testData/test2.temp.sort.bam.
[Wed Nov 25 13:14:43 2020]
Finished job 5.
8 of 272 steps (3%) done
Select jobs to execute...

[Wed Nov 25 13:14:43 2020]
rule clipAdapters:
    input: testData/Intermediate/ConsensusMakerOutputs/test6_read1_sscs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test6_read2_sscs.fq.gz
    output: testData/test6_read1_sscs.adaptClip.fq.gz, testData/test6_read2_sscs.adaptClip.fq.gz
    jobid: 37
    wildcards: runPath=testData, sample=test6, sampType=sscs


[Wed Nov 25 13:14:43 2020]
rule getOnTarget:
    input: testData/Intermediate/ConsensusMakerOutputs/test2_aln_seq1.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_aln_seq2.fq.gz, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed
    output: testData/test2_mem.aln.sort.bam, testData/test2_mem.aln.sort.bam.bai, testData/Stats/data/test2_onTargetCount.txt
    jobid: 218
    wildcards: runPath=testData, sample=test2

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test2_mem.aln.sort.bam.
Removing temporary output file testData/test2_mem.aln.sort.bam.bai.
[Wed Nov 25 13:14:58 2020]
Finished job 218.
9 of 272 steps (3%) done
[Wed Nov 25 13:14:58 2020]
Finished job 37.
10 of 272 steps (4%) done
Select jobs to execute...

[Wed Nov 25 13:14:58 2020]
rule makeBufferedBed:
    input: /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test1.vardictBed.bed, testData/test1.ref.genome
    jobid: 86
    wildcards: runPath=testData, sample=test1


[Wed Nov 25 13:14:58 2020]
rule alignReads:
    input: testData/test6_read1_sscs.adaptClip.fq.gz, testData/test6_read2_sscs.adaptClip.fq.gz, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test6_mem.sscs.sort.bam, testData/test6_mem.sscs.sort.bam.bai, testData/test6.sscs.alignReads.samtoolsTemp
    log: testData/logs/test6_bwa_sscs.log
    jobid: 36
    wildcards: runPath=testData, sample=test6, sampType=sscs
    priority: 45

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test1.ref.genome.
[Wed Nov 25 13:15:12 2020]
Finished job 86.
11 of 272 steps (4%) done
Touching output file testData/test6.sscs.alignReads.samtoolsTemp.
Removing temporary output file testData/test6_read1_sscs.adaptClip.fq.gz.
Removing temporary output file testData/test6_read2_sscs.adaptClip.fq.gz.
Removing temporary output file testData/test6_mem.sscs.sort.bam.bai.
Removing temporary output file testData/test6.sscs.alignReads.samtoolsTemp.
[Wed Nov 25 13:15:12 2020]
Finished job 36.
12 of 272 steps (4%) done
Select jobs to execute...

[Wed Nov 25 13:15:12 2020]
rule getFlagstats:
    input: testData/test6_mem.sscs.sort.bam
    output: testData/Stats/data/test6_mem.sscs.sort.flagstats.txt
    jobid: 35
    wildcards: runPath=testData, fileBase=test6_mem.sscs.sort


[Wed Nov 25 13:15:12 2020]
rule makePreEndClip:
    input: testData/test6_mem.sscs.sort.bam
    output: testData/test6.sscs.prevar.temp.bam
    jobid: 63
    wildcards: runPath=testData, sample=test6, sampType=sscs

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:15:27 2020]
Finished job 63.
13 of 272 steps (5%) done
Removing temporary output file testData/test6_mem.sscs.sort.bam.
[Wed Nov 25 13:15:27 2020]
Finished job 35.
14 of 272 steps (5%) done
Select jobs to execute...

[Wed Nov 25 13:15:27 2020]
rule makeTempBai:
    input: testData/test6.sscs.prevar.temp.bam
    output: testData/test6.sscs.prevar.temp.bam.bai
    jobid: 64
    wildcards: runPath=testData, fileBase=test6.sscs.prevar


[Wed Nov 25 13:15:27 2020]
rule getOnTarget:
    input: testData/Intermediate/ConsensusMakerOutputs/test3_aln_seq1.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test3_aln_seq2.fq.gz, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed
    output: testData/test3_mem.aln.sort.bam, testData/test3_mem.aln.sort.bam.bai, testData/Stats/data/test3_onTargetCount.txt
    jobid: 237
    wildcards: runPath=testData, sample=test3

Select jobs to execute...
Failed to solve scheduling problem with ILP solver. Falling back to greedy solver.
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:15:41 2020]
Finished job 64.
15 of 272 steps (6%) done
Select jobs to execute...

[Wed Nov 25 13:15:41 2020]
rule endClip:
    input: testData/test6.sscs.prevar.temp.bam, testData/test6.sscs.prevar.temp.bam.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test6.sscs.clipped.bam, testData/test6.sscs.clipped.bai, testData/Stats/data/test6.sscs.endClip.metrics.txt
    log: testData/logs/test6_endClip_sscs.log
    jobid: 62
    wildcards: runPath=testData, sample=test6, sampType=sscs

Removing temporary output file testData/test3_mem.aln.sort.bam.
Removing temporary output file testData/test3_mem.aln.sort.bam.bai.
[Wed Nov 25 13:15:41 2020]
Finished job 237.
16 of 272 steps (6%) done
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Touching output file testData/Stats/data/test6.sscs.endClip.metrics.txt.
Removing temporary output file testData/test6.sscs.prevar.temp.bam.
Removing temporary output file testData/test6.sscs.prevar.temp.bam.bai.
[Wed Nov 25 13:15:56 2020]
Finished job 62.
17 of 272 steps (6%) done
Select jobs to execute...

[Wed Nov 25 13:15:56 2020]
rule overlapClip:
    input: testData/test6.sscs.clipped.bam, testData/test6.sscs.clipped.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test6.sscs.overlapClip.temp.bam, testData/test6.sscs.overlapClip.temp.bai, testData/Stats/data/test6.sscs.overlapClip.metrics.txt
    log: testData/logs/test6_overlapClip_sscs.log
    jobid: 61
    wildcards: runPath=testData, sample=test6, sampType=sscs


[Wed Nov 25 13:15:56 2020]
rule MutsPerCycle:
    input: /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, testData/Final/dcs/test5.dcs.final.bam, testData/Final/dcs/test5.dcs.final.bam.bai, testData/Final/dcs/test5.dcs.vcf
    output: testData/Final/dcs/test5.dcs.mutated.bam, testData/Stats/plots/test5.dcs_BasePerPosInclNs.png, testData/Stats/plots/test5.dcs_BasePerPosWithoutNs.png, testData/Stats/data/test5.dcs_MutsPerCycle.dat.csv, testData/Stats/data/test5.dcs.mutsPerRead.txt, testData/Stats/plots/test5.dcs.mutsPerRead.png
    log: testData/logs/test5_stats_dcs.log
    jobid: 184
    wildcards: runPath=testData, sampType=dcs, sample=test5

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test6.sscs.clipped.bai.
Removing temporary output file testData/test6.sscs.clipped.bam.
[Wed Nov 25 13:16:16 2020]
Finished job 61.
18 of 272 steps (7%) done
Select jobs to execute...

[Wed Nov 25 13:16:17 2020]
rule FinalFilter:
    input: testData/test6.sscs.overlapClip.temp.bam, testData/test6.sscs.overlapClip.temp.bai
    output: testData/Final/sscs/test6.sscs.final.bam
    log: testData/logs/test6_finalFilter_sscs.log
    jobid: 60
    wildcards: runPath=testData, sampType=sscs, sample=test6

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:16:20 2020]
Finished job 184.
19 of 272 steps (7%) done
Select jobs to execute...

[Wed Nov 25 13:16:20 2020]
rule makeBufferedBed:
    input: /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test2.vardictBed.bed, testData/test2.ref.genome
    jobid: 106
    wildcards: runPath=testData, sample=test2

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test6.sscs.overlapClip.temp.bam.
Removing temporary output file testData/test6.sscs.overlapClip.temp.bai.
[Wed Nov 25 13:16:31 2020]
Finished job 60.
20 of 272 steps (7%) done
Select jobs to execute...

[Wed Nov 25 13:16:31 2020]
rule clipAdapters:
    input: testData/Intermediate/ConsensusMakerOutputs/test5_read1_sscs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test5_read2_sscs.fq.gz
    output: testData/test5_read1_sscs.adaptClip.fq.gz, testData/test5_read2_sscs.adaptClip.fq.gz
    jobid: 34
    wildcards: runPath=testData, sample=test5, sampType=sscs

Removing temporary output file testData/test2.ref.genome.
[Wed Nov 25 13:16:35 2020]
Finished job 106.
21 of 272 steps (8%) done
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:16:43 2020]
Finished job 34.
22 of 272 steps (8%) done
Select jobs to execute...

[Wed Nov 25 13:16:43 2020]
rule makeBufferedBed:
    input: /home/kohrnb/Duplex-Seq-Pipeline/test/testTarget/test.bed, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test3.vardictBed.bed, testData/test3.ref.genome
    jobid: 126
    wildcards: runPath=testData, sample=test3


[Wed Nov 25 13:16:43 2020]
rule alignReads:
    input: testData/test5_read1_sscs.adaptClip.fq.gz, testData/test5_read2_sscs.adaptClip.fq.gz, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test5_mem.sscs.sort.bam, testData/test5_mem.sscs.sort.bam.bai, testData/test5.sscs.alignReads.samtoolsTemp
    log: testData/logs/test5_bwa_sscs.log
    jobid: 33
    wildcards: runPath=testData, sample=test5, sampType=sscs
    priority: 45

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test3.ref.genome.
[Wed Nov 25 13:16:57 2020]
Finished job 126.
23 of 272 steps (8%) done
Touching output file testData/test5.sscs.alignReads.samtoolsTemp.
Removing temporary output file testData/test5_read1_sscs.adaptClip.fq.gz.
Removing temporary output file testData/test5_read2_sscs.adaptClip.fq.gz.
Removing temporary output file testData/test5_mem.sscs.sort.bam.bai.
Removing temporary output file testData/test5.sscs.alignReads.samtoolsTemp.
[Wed Nov 25 13:16:57 2020]
Finished job 33.
24 of 272 steps (9%) done
Select jobs to execute...

[Wed Nov 25 13:16:57 2020]
rule makePreEndClip:
    input: testData/test5_mem.sscs.sort.bam
    output: testData/test5.sscs.prevar.temp.bam
    jobid: 274
    wildcards: runPath=testData, sample=test5, sampType=sscs


[Wed Nov 25 13:16:57 2020]
rule getFlagstats:
    input: testData/test5_mem.sscs.sort.bam
    output: testData/Stats/data/test5_mem.sscs.sort.flagstats.txt
    jobid: 32
    wildcards: runPath=testData, fileBase=test5_mem.sscs.sort

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:17:11 2020]
Finished job 32.
25 of 272 steps (9%) done
Removing temporary output file testData/test5_mem.sscs.sort.bam.
[Wed Nov 25 13:17:11 2020]
Finished job 274.
26 of 272 steps (10%) done
Select jobs to execute...

[Wed Nov 25 13:17:12 2020]
rule clipAdapters:
    input: testData/Intermediate/ConsensusMakerOutputs/test2_read1_dcs.fq.gz, testData/Intermediate/ConsensusMakerOutputs/test2_read2_dcs.fq.gz
    output: testData/test2_read1_dcs.adaptClip.fq.gz, testData/test2_read2_dcs.adaptClip.fq.gz
    jobid: 43
    wildcards: runPath=testData, sample=test2, sampType=dcs


[Wed Nov 25 13:17:12 2020]
rule makeTempBai:
    input: testData/test5.sscs.prevar.temp.bam
    output: testData/test5.sscs.prevar.temp.bam.bai
    jobid: 275
    wildcards: runPath=testData, fileBase=test5.sscs.prevar

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
[Wed Nov 25 13:17:26 2020]
Finished job 275.
27 of 272 steps (10%) done
Select jobs to execute...

[Wed Nov 25 13:17:26 2020]
rule endClip:
    input: testData/test5.sscs.prevar.temp.bam, testData/test5.sscs.prevar.temp.bam.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test5.sscs.clipped.bam, testData/test5.sscs.clipped.bai, testData/Stats/data/test5.sscs.endClip.metrics.txt
    log: testData/logs/test5_endClip_sscs.log
    jobid: 273
    wildcards: runPath=testData, sample=test5, sampType=sscs

[Wed Nov 25 13:17:26 2020]
Finished job 43.
28 of 272 steps (10%) done
Select jobs to execute...

[Wed Nov 25 13:17:26 2020]
rule alignReads:
    input: testData/test2_read1_dcs.adaptClip.fq.gz, testData/test2_read2_dcs.adaptClip.fq.gz, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test2_mem.dcs.sort.bam, testData/test2_mem.dcs.sort.bam.bai, testData/test2.dcs.alignReads.samtoolsTemp
    log: testData/logs/test2_bwa_dcs.log
    jobid: 42
    wildcards: runPath=testData, sample=test2, sampType=dcs
    priority: 45

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Touching output file testData/test2.dcs.alignReads.samtoolsTemp.
Removing temporary output file testData/test2_read2_dcs.adaptClip.fq.gz.
Removing temporary output file testData/test2_read1_dcs.adaptClip.fq.gz.
Removing temporary output file testData/test2.dcs.alignReads.samtoolsTemp.
[Wed Nov 25 13:17:41 2020]
Finished job 42.
29 of 272 steps (11%) done
Select jobs to execute...

[Wed Nov 25 13:17:41 2020]
rule PreBlastFilter:
    input: testData/test2_mem.dcs.sort.bam, testData/test2_mem.dcs.sort.bam.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, /home/kohrnb/Duplex-Seq-Pipeline/test/testBlastDb/testBlastDb.nal
    output: testData/test2_mem.dcs.nonSecSup.bam, testData/test2_mem.dcs.nonSecSup.bam.bai, testData/test2_mem.dcs.SecSup.bam
    log: testData/logs/test2_PreBlastFilter_dcs.log
    jobid: 100
    wildcards: runPath=testData, sample=test2

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Touching output file testData/Stats/data/test5.sscs.endClip.metrics.txt.
Removing temporary output file testData/test5.sscs.prevar.temp.bam.
Removing temporary output file testData/test5.sscs.prevar.temp.bam.bai.
[Wed Nov 25 13:17:45 2020]
Finished job 273.
30 of 272 steps (11%) done
Select jobs to execute...

[Wed Nov 25 13:17:46 2020]
rule overlapClip:
    input: testData/test5.sscs.clipped.bam, testData/test5.sscs.clipped.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test5.sscs.overlapClip.temp.bam, testData/test5.sscs.overlapClip.temp.bai, testData/Stats/data/test5.sscs.overlapClip.metrics.txt
    log: testData/logs/test5_overlapClip_sscs.log
    jobid: 272
    wildcards: runPath=testData, sample=test5, sampType=sscs

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test2_mem.dcs.SecSup.bam.
[Wed Nov 25 13:17:57 2020]
Finished job 100.
31 of 272 steps (11%) done
Select jobs to execute...

[Wed Nov 25 13:17:57 2020]
rule PreBlastProcessing1:
    input: testData/test2_mem.dcs.nonSecSup.bam, testData/test2_mem.dcs.nonSecSup.bam.bai, /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa
    output: testData/test2_dcs.vars.vcf, testData/test2_dcs.vars.vcf_depth.txt
    log: testData/logs/test2_preBlast1_dcs.log
    jobid: 102
    wildcards: runPath=testData, sample=test2
    priority: 43

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test5.sscs.clipped.bam.
Removing temporary output file testData/test5.sscs.clipped.bai.
[Wed Nov 25 13:18:08 2020]
Finished job 272.
32 of 272 steps (12%) done
Select jobs to execute...

[Wed Nov 25 13:18:09 2020]
rule FinalFilter:
    input: testData/test5.sscs.overlapClip.temp.bam, testData/test5.sscs.overlapClip.temp.bai
    output: testData/Final/sscs/test5.sscs.final.bam
    log: testData/logs/test5_finalFilter_sscs.log
    jobid: 271
    wildcards: runPath=testData, sampType=sscs, sample=test5

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test2_dcs.vars.vcf_depth.txt.
[Wed Nov 25 13:18:14 2020]
Finished job 102.
33 of 272 steps (12%) done
Select jobs to execute...

[Wed Nov 25 13:18:14 2020]
rule PreBlastProcessing2:
    input: testData/test2_dcs.vars.vcf
    output: testData/test2_dcs.Markedvars.vcf, testData/test2_dcs.snps.vcf
    log: testData/logs/test2_preBlast2_dcs.log
    jobid: 101
    wildcards: runPath=testData, sample=test2
    priority: 43

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test5.sscs.overlapClip.temp.bai.
Removing temporary output file testData/test5.sscs.overlapClip.temp.bam.
[Wed Nov 25 13:18:24 2020]
Finished job 271.
34 of 272 steps (12%) done
Select jobs to execute...

[Wed Nov 25 13:18:25 2020]
rule InsertSize:
    input: /home/kohrnb/Duplex-Seq-Pipeline/test/testRef/testRef.fa, testData/test2_mem.dcs.sort.bam, testData/test2_mem.dcs.sort.bam.bai
    output: testData/Stats/data/test2.dcs.iSize_Metrics.txt, testData/test2.dcs.iSize_Histogram.pdf
    log: testData/logs/test2_stats_dcs.log
    jobid: 227
    wildcards: runPath=testData, sample=test2, sampType=dcs

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test2_dcs.vars.vcf.
Removing temporary output file testData/test2_dcs.Markedvars.vcf.
[Wed Nov 25 13:18:30 2020]
Finished job 101.
35 of 272 steps (13%) done
Select jobs to execute...

[Wed Nov 25 13:18:30 2020]
rule PreBlastProcessing3:
    input: testData/test2_mem.dcs.nonSecSup.bam, testData/test2_mem.dcs.nonSecSup.bam.bai, testData/test2_dcs.snps.vcf
    output: testData/Intermediate/postBlast/test2_dcs.preBlast.unmutated.bam, testData/Intermediate/postBlast/test2_dcs.preBlast.mutated.bam, testData/Intermediate/postBlast/test2_dcs.snpFiltered_MutsPerCycle.dat.csv
    log: testData/logs/test2_preBlast3_dcs.log
    jobid: 99
    wildcards: runPath=testData, sample=test2

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Touching output file testData/Stats/data/test2.dcs.iSize_Metrics.txt.
Touching output file testData/test2.dcs.iSize_Histogram.pdf.
Removing temporary output file testData/test2_mem.dcs.sort.bam.bai.
Removing temporary output file testData/test2.dcs.iSize_Histogram.pdf.
[Wed Nov 25 13:18:43 2020]
Finished job 227.
36 of 272 steps (13%) done
Select jobs to execute...

[Wed Nov 25 13:18:43 2020]
rule getFlagstats:
    input: testData/test2_mem.dcs.sort.bam
    output: testData/Stats/data/test2_mem.dcs.sort.flagstats.txt
    jobid: 41
    wildcards: runPath=testData, fileBase=test2_mem.dcs.sort

Removing temporary output file testData/test2_mem.dcs.nonSecSup.bam.bai.
Removing temporary output file testData/test2_mem.dcs.nonSecSup.bam.
Removing temporary output file testData/test2_dcs.snps.vcf.
Removing temporary output file testData/Intermediate/postBlast/test2_dcs.snpFiltered_MutsPerCycle.dat.csv.
[Wed Nov 25 13:18:46 2020]
Finished job 99.
37 of 272 steps (14%) done
Select jobs to execute...

[Wed Nov 25 13:18:47 2020]
rule PostBlastProcessing1:
    input: testData/Intermediate/postBlast/test2_dcs.preBlast.mutated.bam, testData/Intermediate/postBlast/test2_dcs.blast.xml
    output: testData/test2_dcs.speciesLabeled.bam
    log: testData/logs/test2_postBlast1_dcs.log
    jobid: 98
    wildcards: runPath=testData, sample=test2
    priority: 41

Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Activating conda environment: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
Removing temporary output file testData/test2_mem.dcs.sort.bam.
[Wed Nov 25 13:18:57 2020]
Finished job 41.
38 of 272 steps (14%) done
[Wed Nov 25 13:19:01 2020]
Error in rule PostBlastProcessing1:
    jobid: 98
    output: testData/test2_dcs.speciesLabeled.bam
    log: testData/logs/test2_postBlast1_dcs.log (check log file(s) for error message)
    conda-env: /home/kohrnb/Duplex-Seq-Pipeline/.snakemake/7e527c78
    shell:
        
        set -x
        cd testData
        python3 /home/kohrnb/Duplex-Seq-Pipeline/scripts/blastFilter.py         Intermediate/postBlast/test2_dcs.preBlast.mutated.bam         Intermediate/postBlast/test2_dcs.blast.xml         test2_dcs
        cd ../
        
        (one of the commands exited with non-zero exit code; note that snakemake uses bash strict mode!)

Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home/kohrnb/Duplex-Seq-Pipeline/test/.snakemake/log/2020-11-25T131348.426672.snakemake.log

The pipeline crashed because the python script could not find the file "Intermediate/postBlast/test2_dcs.blast.xml"; that file had not yet been produced when the rule was started.
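For reference, the dependency that the scheduler appears to violate looks roughly like the sketch below. The rule and file names are taken from the log above, but the rule bodies and the BLAST rule's inputs are hypothetical placeholders, not the pipeline's actual code:

```
# Hypothetical Snakefile sketch (not the pipeline's real rules): because
# PostBlastProcessing1 declares the blast.xml as an input, the scheduler
# should not start it until the BLAST rule has written that file.
rule BLAST:
    input:
        "Intermediate/postBlast/{sample}_dcs.preBlast.mutated.bam"
    output:
        "Intermediate/postBlast/{sample}_dcs.blast.xml"
    shell:
        "run_blast {input} > {output}"  # placeholder command

rule PostBlastProcessing1:
    input:
        bam="Intermediate/postBlast/{sample}_dcs.preBlast.mutated.bam",
        xml="Intermediate/postBlast/{sample}_dcs.blast.xml"
    output:
        "{sample}_dcs.speciesLabeled.bam"
    shell:
        "python3 scripts/blastFilter.py {input.bam} {input.xml} {wildcards.sample}"
```

Given declared inputs like these, starting PostBlastProcessing1 before the xml exists is a scheduler bug rather than a workflow-definition problem.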

Minimal example
See the test cases for https://github.com/Kennedy-Lab-UW/Duplex-Seq-Pipeline/tree/v2.0.0_prerelease; I was testing a new version of my pipeline when I came across this issue. The test cases work fine with snakemake=5.25.0 (at least, the pipeline runs to completion; whether the data are correct is not at issue here). Both times I tried running them with snakemake=5.30.1, they crashed (one of those times, I restarted the pipeline and it ran the rest of the way through).

Additional context
I'm testing on WSL1 Ubuntu, but a colleague of mine has also observed this on WSL2 Ubuntu. It doesn't seem to be related to the Python version (it happened with both python=3.8 and python=3.7).

@bkohrn bkohrn added the bug Something isn't working label Nov 25, 2020
@nh13
Contributor

nh13 commented Nov 29, 2020

FWIW I have also seen this behavior on version 5.29.0. This happens reproducibly after a cluster job fails (due to a missing output file).

@bkohrn
Author

bkohrn commented Jun 21, 2021

Does anyone know whether this issue was ever addressed? While it is less common in v5.25.0, I still see it occasionally.

Edit: Actually, it's not exactly the same error. I'll put together a new issue for that. I'd still like to know whether anyone ever figured this out.

@bkohrn
Author

bkohrn commented Jun 21, 2021

Just checked, and I'm still seeing this issue (or something similar to it) in the most recent version of snakemake (6.4.1). I've also started seeing solver issues in v5.25.0, which is the version I've been using to avoid this issue. The solver issue I see in 5.25.0 doesn't seem to occur in 6.4.1, but it puts me in an odd spot where I will probably have to tell my users that they can't use Mac OS X (the second solver issue doesn't seem to happen on Linux).

The second solver issue seems to be independent of this issue, and results in the following showing up periodically.

Traceback (most recent call last):
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/__init__.py", line 735, in snakemake
    keepincomplete=keep_incomplete,
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/workflow.py", line 972, in execute
    success = scheduler.schedule()
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/scheduler.py", line 406, in schedule
    else self.job_selector_ilp(needrun)
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/scheduler.py", line 625, in job_selector_ilp
    prob.solve()
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/pulp/pulp.py", line 1737, in solve
    status = solver.actualSolve(self, **kwargs)
  File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/pulp/apis/glpk_api.py", line 81, in actualSolve
    raise PulpSolverError("PuLP: Error while trying to execute "+self.path)
pulp.apis.core.PulpSolverError: PuLP: Error while trying to execute glpsol

As best I can tell, glpsol seems to be installed on the system in question (Mac OS X Sierra). Again, this issue only shows up in v5.25.0 (which I know isn't the most recent version, but avoids the scheduler issue that originated this thread).
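One quick way to check what the Snakemake process itself will see (as opposed to what `which` reports in an interactive shell, since PATH can differ between the two) is a short Python sketch along these lines; glpsol here is the GLPK binary that PuLP shells out to:

```python
import shutil
import subprocess

# Resolve glpsol the same way a spawned subprocess would: via PATH lookup.
path = shutil.which("glpsol")
if path is None:
    print("glpsol is not on PATH for this process")
else:
    # Found on PATH; confirm the binary actually executes.
    result = subprocess.run([path, "--version"], capture_output=True, text=True)
    first_line = result.stdout.splitlines()[0] if result.stdout else f"exit code {result.returncode}"
    print(f"{path} -> {first_line}")
```

If this reports glpsol as missing even though `which glpsol` succeeds in a terminal, the discrepancy usually points at environment activation (e.g. conda) not carrying over into the process that launches the solver.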

@johanneskoester
Contributor

> Just checked, and I'm still seeing this issue (or something similar to it) in the most recent version of snakemake (6.4.1). I've also started seeing solver issues in v5.25.0, which is the version I've been using to avoid this issue. The solver issue I see in 5.25.0 doesn't seem to occur in 6.4.1, but it puts me in an odd spot where I will probably have to tell my users that they can't use Mac OS X (the second solver issue doesn't seem to happen on Linux).
>
> The second solver issue seems to be independent of this issue, and results in the following showing up periodically.
>
> Traceback (most recent call last):
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/__init__.py", line 735, in snakemake
>     keepincomplete=keep_incomplete,
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/workflow.py", line 972, in execute
>     success = scheduler.schedule()
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/scheduler.py", line 406, in schedule
>     else self.job_selector_ilp(needrun)
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/snakemake/scheduler.py", line 625, in job_selector_ilp
>     prob.solve()
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/pulp/pulp.py", line 1737, in solve
>     status = solver.actualSolve(self, **kwargs)
>   File "/Users/loeblabm11/miniconda3/lib/python3.7/site-packages/pulp/apis/glpk_api.py", line 81, in actualSolve
>     raise PulpSolverError("PuLP: Error while trying to execute "+self.path)
> pulp.apis.core.PulpSolverError: PuLP: Error while trying to execute glpsol
>
> As best I can tell, glpsol seems to be installed on the system in question (Mac OS X Sierra). Again, this issue only shows up in v5.25.0 (which I know isn't the most recent version, but it avoids the scheduler issue that originated this thread).

So this error here should be fixed in the latest versions. We don't have the resources to backport such fixes.

However, regarding your real issue here (the one reported at the top): can you reproduce that when setting --scheduler greedy?

@bkohrn
Author

bkohrn commented Jun 22, 2021

Error 2 seems to have been fixed in the latest versions of Snakemake (my suspicion is that there was something wrong in the conda setup script for Mac OS X).

With regards to the first:
My basic run command is:

snakemake \
-s /home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/Snakefile \
--use-conda --keep-going -j 12 \
--conda-prefix /home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/.snakemake \
--config samples="${inConfig}"

Step 1: Set up a test environment running the latest version. The error still happens (although, since I now use --keep-going, it isn't as big an issue: the last steps it runs before it stops are the steps that generate the files whose absence caused the error). I get the following error from the python script the rule calls:

Traceback (most recent call last):
  File "/home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/scripts/blastFilter.py", line 174, in <module>
    main()
  File "/home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/scripts/blastFilter.py", line 136, in main
    myIterator = multiparser(o.inBam, o.inXML)
  File "/home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/scripts/blastFilter.py", line 10, in __init__
    self.xml = NCBIXML.parse(open(inXml))
FileNotFoundError: [Errno 2] No such file or directory: 'Intermediate/postBlast/test1_dcs.blast.xml'

Step 2: Re-run the test with --scheduler greedy.
New run command:

snakemake \
-s /home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/Snakefile \
--use-conda --keep-going -j 12 --scheduler greedy \
--conda-prefix /home/kohrnb/bioinformatics/Duplex-Seq-Pipeline-v2.0.0/.snakemake \
--config samples="${inConfig}"

Result: it still fails. I get the following error (from the scheduler; it doesn't end up in the .snakemake log file):

Traceback (most recent call last):
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/__init__.py", line 701, in snakemake
    success = workflow.execute(
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/workflow.py", line 1036, in execute
    success = scheduler.schedule()
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/scheduler.py", line 470, in schedule
    run = self.job_selector(needrun)
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/scheduler.py", line 800, in job_selector_greedy
    c = list(map(self.job_reward, jobs))  # job rewards
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/scheduler.py", line 883, in job_reward
    input_size = job.inputsize
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/jobs.py", line 382, in inputsize
    self._inputsize = sum(f.size for f in self.input)
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/jobs.py", line 382, in <genexpr>
    self._inputsize = sum(f.size for f in self.input)
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/io.py", line 242, in wrapper
    return func(self, *args, **kwargs)
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/io.py", line 257, in wrapper
    return func(self, *args, **kwargs)
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/io.py", line 560, in size
    return self.size_local
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/io.py", line 565, in size_local
    self.check_broken_symlink()
  File "/home/kohrnb/miniconda3/envs/snakemake_6_4_1/lib/python3.9/site-packages/snakemake/io.py", line 570, in check_broken_symlink
    if not self.exists_local and os.lstat(self.file):
FileNotFoundError: [Errno 2] No such file or directory: 'testData/Intermediate/postBlast/test4_dcs.blast.xml'

Just for good measure, the rule that keeps failing due to this is:

rule PostBlastProcessing1:
    params:
        basePath = sys.path[0]
    priority: 41
    input:
        inBam1="{runPath}/Intermediate/postBlast/{sample}_dcs.preBlast.mutated.bam",
        inXML = ancient("{runPath}/Intermediate/postBlast/{sample}_dcs.blast.xml"),
    output:
        tempBam3 = temp("{runPath}/{sample}_dcs.speciesLabeled.bam"),
    conda:
        "envs/DS_env_full.yaml"
    log:
        "{runPath}/logs/{sample}_dcs_postBlast1.log"
    shell:
        """
        set -e
        set -o pipefail
        set -x
        {{
        cd {wildcards.runPath}
        python3 {params.basePath}/scripts/blastFilter.py \
        Intermediate/postBlast/{wildcards.sample}_dcs.preBlast.mutated.bam \
        Intermediate/postBlast/{wildcards.sample}_dcs.blast.xml \
        {wildcards.sample}_dcs
        cd ../
        }} 2>&1 | tee -a {log}
        """

and the rule that would generate that file is:

rule BLAST:
    params:
        basePath = sys.path[0],
        db = get_blast_db
    priority: 42
    threads: config["maxCores"]
    input:
        inBam1="{runPath}/Intermediate/postBlast/{sample}_dcs.preBlast.mutated.bam",
        inBlastDbPath = get_blast_db_path
    output:
        outXML = "{runPath}/Intermediate/postBlast/{sample}_dcs.blast.xml",
    conda:
        "envs/DS_env_full.yaml"
    log:
        "{runPath}/logs/{sample}_dcs_blast.log"
    shell:
        """
        set -e
        set -o pipefail
        set -x
        {{
        cd {wildcards.runPath}

        samtools fasta Intermediate/postBlast/{wildcards.sample}_dcs.preBlast.mutated.bam | \
        blastn -task blastn \
        -db {params.db} \
        -outfmt 16 \
        -max_hsps 2 \
        -max_target_seqs 2 \
        -num_threads {threads} \
        | python3 {params.basePath}/scripts/blastMonitor.py \
        > Intermediate/postBlast/{wildcards.sample}_dcs.blast.xml

        cd ../
        }} 2>&1 | tee -a {log}
        """

@johanneskoester
Contributor

And just to make sure, the rule BLAST is not executed for the respective sample before the failing rule?

@bkohrn
Author

bkohrn commented Jun 23, 2021

I've attached the complete logs generated by Snakemake for both the ILP and greedy solvers. The ILP solver eventually gets around to running the rule BLAST; the greedy solver doesn't, and stops shortly after the error. In addition, there was some output to the terminal from the greedy-solver run that didn't make it to the log file for whatever reason, so I also copied the terminal output from that run. The error in the terminal output version happens at line 4455.

ILP solver:
Duplex-Seq-Pipeline-ILP-solver.log

greedy solver:
Duplex-Seq-Pipeline-greedy-solver.log
Duplex-Seq-Pipeline-greedy-solver-terminal.log

@johanneskoester
Contributor

Do you have any checkpoints or dynamic() in that workflow?

@johanneskoester
Contributor

I'm asking because that must be quite an extreme corner case. I have never seen anything like it in hundreds of real-world applications.

@bkohrn
Author

bkohrn commented Jun 24, 2021

I don't use the checkpoint keyword anywhere at the moment, though I was considering adding checkpoints to see if that fixed anything. I'm running the test case set from https://github.com/Kennedy-Lab-UW/Duplex-Seq-Pipeline to reproduce this right now.

@johanneskoester
Contributor

Ah, wait. I think I have spotted it! The input file is marked as ancient in PostBlastProcessing1.

@johanneskoester
Contributor

Usually, ancient() was intended for files that are not generated by other jobs. So the behavior you are seeing here is kind of undefined. May I ask what behavior you intend to achieve by marking the file as ancient?
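For context, the typical documented use of ancient() is an input that pre-exists the workflow and is produced by no rule, for example a reference file. A hypothetical sketch (the rule, paths, and aligner command are invented for illustration):

```
rule align:
    input:
        reads="{sample}.fastq",
        # The reference pre-exists the workflow and is produced by no rule;
        # ancient() keeps its timestamp from ever triggering a rerun.
        ref=ancient("reference/genome.fa"),
    output:
        bam="{sample}.bam",
    shell:
        "aligner {input.ref} {input.reads} > {output.bam}"
```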

@johanneskoester
Contributor

Of course, Snakemake should either forbid that or behave predictably in some way. I'm just not yet sure in what way, hence I am asking.

@bkohrn
Author

bkohrn commented Jun 28, 2021

I believe my logic when I defined it that way was that Snakemake was occasionally rerunning that step when I didn’t think it needed to. My intention was something like “If the file exists, treat it as if it does not need updating; if the file doesn’t exist, create it”. I’d sort of expect that, even if the file is marked as ancient, the pipeline would still check for existence and generate it from other rules if necessary.

@johanneskoester
Contributor

Yes, I agree. This makes sense.

@johanneskoester
Contributor

With a small example, I cannot reproduce the problem though. Could you please remove the ancient flag and see if the problem goes away?

@bkohrn
Author

bkohrn commented Jun 29, 2021

It seems to (I removed ancient, it ran without issue; restored ancient, and it failed again), and the issue I was having previously with the BLAST step rerunning inappropriately doesn't seem to be happening any more (probably due to other changes I made in the pipeline). Thanks for the help!

@scottrk

scottrk commented Jul 5, 2021

Looks like others are running into this issue, too: #946

@pvandyken
Contributor

Closing as the issue is stale and appears to be resolved. Please comment if this is a mistake.
