
fatal error: concurrent map writes #1605

Open
trim21 opened this issue Apr 19, 2024 · 5 comments · May be fixed by #1608
Labels
type: bug Something not working as intended.

Comments

trim21 (Contributor) commented Apr 19, 2024

  • Task version: v3.36.0
  • Operating system: Windows
  • Experiments enabled: no

I'm running a job with --watch.

fatal error: concurrent map writes

goroutine 10 [running]:
github.com/go-task/task/v3/internal/omap.(*OrderedMap[...]).Set(...)
        github.com/go-task/task/v3@v3.36.0/internal/omap/orderedmap.go:67
github.com/go-task/task/v3.(*Executor).GetTask(0xc00010f7c0, 0xc000071020)
        github.com/go-task/task/v3@v3.36.0/task.go:425 +0x2e8
github.com/go-task/task/v3.(*Executor).compiledTask(0xc00010f7c0, 0xc0001d3e08?, 0x1)
        github.com/go-task/task/v3@v3.36.0/variables.go:30 +0x39
github.com/go-task/task/v3.(*Executor).CompiledTask(...)
        github.com/go-task/task/v3@v3.36.0/variables.go:21
github.com/go-task/task/v3.(*Executor).registerWatchedFiles.func1(0xc000037280?)
        github.com/go-task/task/v3@v3.36.0/watch.go:127 +0x5e
github.com/go-task/task/v3.(*Executor).registerWatchedFiles(0xc00010f7c0, 0xc000037280, {0xc00006c298, 0x1, 0x0?})
        github.com/go-task/task/v3@v3.36.0/watch.go:176 +0xb3
github.com/go-task/task/v3.(*Executor).watchTasks.func3()
        github.com/go-task/task/v3@v3.36.0/watch.go:95 +0x72
created by github.com/go-task/task/v3.(*Executor).watchTasks in goroutine 1
        github.com/go-task/task/v3@v3.36.0/watch.go:92 +0x515

goroutine 1 [sleep]:
time.Sleep(0x12a05f200)
        runtime/time.go:195 +0x126
github.com/radovskyb/watcher.(*Watcher).Start(0xc000037280, 0x12a05f200)
        github.com/radovskyb/watcher@v1.0.7/watcher.go:608 +0x105
github.com/go-task/task/v3.(*Executor).watchTasks(0xc00010f7c0, {0xc00006c298, 0x1, 0x1})
        github.com/go-task/task/v3@v3.36.0/watch.go:102 +0x525
github.com/go-task/task/v3.(*Executor).Run(0xc00010f7c0, {0xef9150, 0x121a620}, {0xc00006c280, 0x1, 0x1})
        github.com/go-task/task/v3@v3.36.0/task.go:139 +0x39d
main.run()
        github.com/go-task/task/v3@v3.36.0/cmd/task/task.go:193 +0xadb
main.main()
        github.com/go-task/task/v3@v3.36.0/cmd/task/task.go:25 +0x1f

goroutine 6 [runnable]:
syscall.Environ()
        syscall/env_windows.go:95 +0x1ff
os.Environ(...)
        os/env.go:140
github.com/go-task/task/v3/internal/compiler.GetEnviron()
        github.com/go-task/task/v3@v3.36.0/internal/compiler/env.go:14 +0x45
github.com/go-task/task/v3/internal/compiler.(*Compiler).getVariables(0xc00007e1e0, 0xc0001c45a0, 0xc000071020, 0x0)
        github.com/go-task/task/v3@v3.36.0/internal/compiler/compiler.go:49 +0x45
github.com/go-task/task/v3/internal/compiler.(*Compiler).FastGetVariables(...)
        github.com/go-task/task/v3@v3.36.0/internal/compiler/compiler.go:45
github.com/go-task/task/v3.(*Executor).compiledTask(0xc00010f7c0, 0x0?, 0x0)
        github.com/go-task/task/v3@v3.36.0/variables.go:39 +0xbe
github.com/go-task/task/v3.(*Executor).FastCompiledTask(...)
        github.com/go-task/task/v3@v3.36.0/variables.go:26
github.com/go-task/task/v3.(*Executor).RunTask(0xc00010f7c0, {0xef9210, 0xc000100910}, 0xc000071020)
        github.com/go-task/task/v3@v3.36.0/task.go:163 +0x5a
github.com/go-task/task/v3.(*Executor).watchTasks.func1()
        github.com/go-task/task/v3@v3.36.0/watch.go:36 +0x2f
created by github.com/go-task/task/v3.(*Executor).watchTasks in goroutine 1
        github.com/go-task/task/v3@v3.36.0/watch.go:35 +0x205

goroutine 7 [syscall]:
os/signal.signal_recv()
        runtime/sigqueue.go:152 +0x29
os/signal.loop()
        os/signal/signal_unix.go:23 +0x13
created by os/signal.Notify.func1.1 in goroutine 1
        os/signal/signal.go:151 +0x1f

goroutine 8 [chan receive]:
github.com/go-task/task/v3.closeOnInterrupt.func1()
        github.com/go-task/task/v3@v3.36.0/watch.go:117 +0x25
created by github.com/go-task/task/v3.closeOnInterrupt in goroutine 1
        github.com/go-task/task/v3@v3.36.0/watch.go:116 +0xc9

goroutine 9 [select]:
github.com/go-task/task/v3.(*Executor).watchTasks.func2()
        github.com/go-task/task/v3@v3.36.0/watch.go:62 +0x10f
created by github.com/go-task/task/v3.(*Executor).watchTasks in goroutine 1
        github.com/go-task/task/v3@v3.36.0/watch.go:60 +0x479
@task-bot task-bot added the state: needs triage Waiting to be triaged by a maintainer. label Apr 19, 2024
pd93 (Member) commented Apr 21, 2024

@trim21 This can likely be fixed by adding a mutex to the OrderedMap implementation. However, it would be really useful if there was a reproducible test for this. Is this something that you see consistently? If so, are you able to provide a reproducible example?

@pd93 pd93 added type: bug Something not working as intended. and removed state: needs triage Waiting to be triaged by a maintainer. labels Apr 21, 2024
trim21 (Contributor, Author) commented Apr 21, 2024

@trim21 This can likely be fixed by adding a mutex to the OrderedMap implementation. However, it would be really useful if there was a reproducible test for this. Is this something that you see consistently? If so, are you able to provide a reproducible example?

I'm using a Taskfile with glob patterns in sources and generates, but I'm not sure how to reproduce this.
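For context, a Taskfile of roughly this shape is the setup being described: glob patterns in `sources`/`generates`, run under `--watch`. This is a hypothetical example, not the reporter's actual file; task names, paths, and commands are made up.

```yaml
# Illustrative Taskfile.yml, not the reporter's actual file.
# `task build --watch` registers the glob sources with the watcher,
# which is the code path in the stack trace above.
version: '3'

tasks:
  build:
    sources:
      - src/**/*.ts
    generates:
      - dist/**/*.js
    cmds:
      - npm run build
```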

@pd93 pd93 linked a pull request Apr 21, 2024 that will close this issue
pd93 (Member) commented Apr 21, 2024

@trim21 No problem. Hopefully #1608 will fix your issue.

vergenzt (Contributor) commented
I'm getting the same issue at the moment. It's been happening fairly often. It seems to happen less frequently when I limit the concurrency via -C1 (or even -C2 sometimes), or when I run my task for just a single target.

I'm now experiencing it on a single task which runs a complicated nested loop in Bash.

  • Task version: v3.37.0 (h1:9752Ej9q4RlG2vkbMKMX89EvrZCSyReBQ6N1hGniiCE=)
  • OS: macOS 14.3
  • No experiments enabled I think (that can only happen via a TASK_X_* env var, right?)
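To answer the parenthetical: per the Task documentation, experiments are toggled via `TASK_X_*` environment variables (they can also come from a `.taskrc.yml` / env file), so a quick check of the current shell is something like:

```shell
# List any Task experiment flags currently set; experiments are
# enabled via TASK_X_* environment variables.
env | grep '^TASK_X_' || echo "no experiments enabled"
```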

Stack traces:

$ task import SHORTCUT_PATH=Archive/Archive\ on\ Wayback\ Machine
task: [import:export-from-account] shortcuts run 'Export Shortcuts'
task: [import:extract] tmp=$(mktemp -d); cat '/Users/tim_vergenz/Library/Mobile Documents/iCloud~is~workflow~my~workflows/Documents'/'Archive/Archive on Wayback Machine'.shortcut | tee build/'Archive/Archive on Wayback Machine'.shortcut | fq -r -d bytes '.[((.[0x8:0xC] | explode | reverse | tobytes | tonumber) + 0x495c):] | print' | lzfse -decode | aa extract -d "$tmp" && mv "$tmp"/Shortcut.wflow build/'Archive/Archive on Wayback Machine'.wflow && cat build/'Archive/Archive on Wayback Machine'.wflow | plutil -convert json -o - - | jq --sort-keys '(.WFWorkflowTypes, .WFWorkflowOutputContentItemClasses) |= sort' > build/'Archive/Archive on Wayback Machine'-in.json
task: [import:default] jsonnetfmt build/'Archive/Archive on Wayback Machine'-in.json > src/'Archive/Archive on Wayback Machine'.jsonnet
task: [import:default] phase=1; seen=(); while find lib/ast-grep/rules/$phase-* -quit 2>/dev/null; do while ! printf '%s\n' "${seen[@]}" | grep -qF -e "${next_sha:=next}"; do seen+=("${next_sha:-}"); find lib/ast-grep/rules/$phase-* -exec sg scan --update-all --rule='{}' src/'Archive/Archive on Wayback Machine'.jsonnet \; && jsonnetfmt -i src/'Archive/Archive on Wayback Machine'.jsonnet; next_sha=$(sha1sum src/'Archive/Archive on Wayback Machine'.jsonnet | awk '{print $1}'); done; phase=$((phase+1)); done;
fatal error: concurrent map read and map write

goroutine 64 [running]:
mvdan.cc/sh/v3/interp.(*overlayEnviron).Get(0xc000398280, {0x14c308a, 0x3})
        mvdan.cc/sh/v3@v3.8.0/interp/vars.go:29 +0x87
mvdan.cc/sh/v3/interp.(*overlayEnviron).Get(0xc000398500, {0x14c308a, 0x3})
        mvdan.cc/sh/v3@v3.8.0/interp/vars.go:32 +0x110
mvdan.cc/sh/v3/interp.(*Runner).lookupVar(0xc000004900, {0x14c308a, 0x3})
        mvdan.cc/sh/v3@v3.8.0/interp/vars.go:159 +0x3ed
mvdan.cc/sh/v3/interp.expandEnv.Get({0x0?}, {0x14c308a?, 0x0?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:227 +0x4b
mvdan.cc/sh/v3/expand.prepareConfig(0x0?)
        mvdan.cc/sh/v3@v3.8.0/expand/expand.go:113 +0xae
mvdan.cc/sh/v3/expand.Fields(0x3?, {0xc000010030, 0x3, 0x3?})
        mvdan.cc/sh/v3@v3.8.0/expand/expand.go:447 +0x32
mvdan.cc/sh/v3/interp.(*Runner).fields(0xc000004900, {0xc000010030?, 0xc00039e5a0?, 0x13?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:196 +0x27
mvdan.cc/sh/v3/interp.(*Runner).cmd(0xc000004900, {0x157a790?, 0xc00039e5a0}, {0x1579c98?, 0xc0001e55f0?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:368 +0x24ab
mvdan.cc/sh/v3/interp.(*Runner).stmtSync(0xc000004900, {0x157a790, 0xc00039e5a0}, 0xc00029a1b8)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:314 +0x285
mvdan.cc/sh/v3/interp.(*Runner).stmt(0xc000004900, {0x157a790?, 0xc00039e5a0}, 0xc00029a1b8)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:295 +0x159
mvdan.cc/sh/v3/interp.(*Runner).cmd.func1()
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:454 +0x39
created by mvdan.cc/sh/v3/interp.(*Runner).cmd in goroutine 1
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:453 +0x476

goroutine 1 [runnable]:
mvdan.cc/sh/v3/expand.(*Config).paramExp(0xc0001e8140, 0xc000428af0)
        mvdan.cc/sh/v3@v3.8.0/expand/param.go:336 +0x1602
mvdan.cc/sh/v3/expand.(*Config).wordField(0xc0001e8140, {0xc00004aaf0?, 0x1, 0x0?}, 0x1)
        mvdan.cc/sh/v3@v3.8.0/expand/expand.go:544 +0x3b1
mvdan.cc/sh/v3/expand.(*Config).wordFields(0xc0001e8140, {0xc0002541f8?, 0x1, 0x1018bc8?})
        mvdan.cc/sh/v3@v3.8.0/expand/expand.go:670 +0x48b
mvdan.cc/sh/v3/expand.Fields(0x4?, {0xc000398560, 0x4, 0x4?})
        mvdan.cc/sh/v3@v3.8.0/expand/expand.go:457 +0x208
mvdan.cc/sh/v3/interp.(*Runner).fields(0xc000004600, {0xc000398560?, 0xc00039e5a0?, 0x0?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:196 +0x27
mvdan.cc/sh/v3/interp.(*Runner).cmd(0xc000004600, {0x157a790?, 0xc00039e5a0}, {0x1579c98?, 0xc0001e5640?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:368 +0x24ab
mvdan.cc/sh/v3/interp.(*Runner).stmtSync(0xc000004600, {0x157a790, 0xc00039e5a0}, 0xc00029a210)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:314 +0x285
mvdan.cc/sh/v3/interp.(*Runner).stmt(0xc000004600, {0x157a790?, 0xc00039e5a0}, 0xc00029a210)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:295 +0x159
mvdan.cc/sh/v3/interp.(*Runner).cmd(0xc000004600, {0x157a790?, 0xc00039e5a0}, {0x1579d88?, 0xc000398120?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:458 +0x49f
mvdan.cc/sh/v3/interp.(*Runner).stmtSync(0xc000004600, {0x157a790, 0xc00039e5a0}, 0xc00029a268)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:314 +0x285
mvdan.cc/sh/v3/interp.(*Runner).stmt(0xc000004600, {0x157a790?, 0xc00039e5a0}, 0xc00029a268)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:295 +0x159
mvdan.cc/sh/v3/interp.(*Runner).stmts(...)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:795
mvdan.cc/sh/v3/interp.(*Runner).cmd(0xc000004600, {0x157a790?, 0xc00039e5a0}, {0x1579d28?, 0xc00016c180?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:485 +0x205f
mvdan.cc/sh/v3/interp.(*Runner).stmtSync(0xc000004600, {0x157a790, 0xc00039e5a0}, 0xc00029a160)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:314 +0x285
mvdan.cc/sh/v3/interp.(*Runner).stmt(0xc000004600, {0x157a790?, 0xc00039e5a0}, 0xc00029a160)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:295 +0x159
mvdan.cc/sh/v3/interp.(*Runner).loopStmtsBroken(0xc000004600, {0x157a790, 0xc00039e5a0}, {0xc0002b09c0, 0x2, 0x0?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:895 +0xbb
mvdan.cc/sh/v3/interp.(*Runner).cmd(0xc000004600, {0x157a790?, 0xc00039e5a0}, {0x1579d28?, 0xc00016c100?})
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:490 +0x201f
mvdan.cc/sh/v3/interp.(*Runner).stmtSync(0xc000004600, {0x157a790, 0xc00039e5a0}, 0xc00029a0b0)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:314 +0x285
mvdan.cc/sh/v3/interp.(*Runner).stmt(0xc000004600, {0x157a790?, 0xc00039e5a0}, 0xc00029a0b0)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:295 +0x159
mvdan.cc/sh/v3/interp.(*Runner).stmts(...)
        mvdan.cc/sh/v3@v3.8.0/interp/runner.go:795
mvdan.cc/sh/v3/interp.(*Runner).Run(0xc000004600, {0x157a790, 0xc00039e5a0}, {0x15785a8?, 0xc00004a9c0})
        mvdan.cc/sh/v3@v3.8.0/interp/api.go:749 +0x2d6
github.com/go-task/task/v3/internal/execext.RunCommand({0x157a790, 0xc00039e5a0}, 0xc0001f7650)
        github.com/go-task/task/v3@v3.37.0/internal/execext/exec.go:90 +0x8ab
github.com/go-task/task/v3.(*Executor).runCommand(0xc0001e9680, {0x157a790, 0xc00039e5a0}, 0xc00017e000, 0xc00007c6c0?, 0x10?)
        github.com/go-task/task/v3@v3.37.0/task.go:357 +0x89a
github.com/go-task/task/v3.(*Executor).RunTask.func1({0x157a790, 0xc00039e5a0})
        github.com/go-task/task/v3@v3.37.0/task.go:251 +0x7bd
github.com/go-task/task/v3.(*Executor).startExecution(0xc0001e9680, {0x157a790, 0xc00039e5a0}, 0xe?, 0xc0001f79d8)
        github.com/go-task/task/v3@v3.37.0/task.go:387 +0x382
github.com/go-task/task/v3.(*Executor).RunTask(0xc0001e9680, {0x157a790, 0xc00039e5a0}, 0xc00007c6c0)
        github.com/go-task/task/v3@v3.37.0/task.go:196 +0x34d
github.com/go-task/task/v3.(*Executor).Run(0xc0001e9680, {0x157a6d0, 0x186c500}, {0xc000070220, 0x1, 0x1})
        github.com/go-task/task/v3@v3.37.0/task.go:129 +0x359
main.run()
        github.com/go-task/task/v3@v3.37.0/cmd/task/task.go:194 +0xb5d
main.main()
        github.com/go-task/task/v3@v3.37.0/cmd/task/task.go:24 +0x1f

goroutine 8 [syscall]:
os/signal.signal_recv()
        runtime/sigqueue.go:149 +0x25
os/signal.loop()
        os/signal/signal_unix.go:23 +0x13
created by os/signal.Notify.func1.1 in goroutine 1
        os/signal/signal.go:151 +0x1f

goroutine 33 [chan receive]:
github.com/go-task/task/v3.(*Executor).InterceptInterruptSignals.func1()
        github.com/go-task/task/v3@v3.37.0/signals.go:20 +0x65
created by github.com/go-task/task/v3.(*Executor).InterceptInterruptSignals in goroutine 1
        github.com/go-task/task/v3@v3.37.0/signals.go:18 +0xc9

goroutine 65 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 14
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 61 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 15
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 59 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 97
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 85 [chan receive]:
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:115 +0x2f
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1 in goroutine 1
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:114 +0x34b

goroutine 56 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 113
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 60 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 81
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 57 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 38
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 58 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 39
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 54 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 42
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 55 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 53
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 98 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 52
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 84 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 41
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 22 [sleep]:
time.Sleep(0x37e11d600)
        runtime/time.go:195 +0x125
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:126 +0x25
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1 in goroutine 83
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:125 +0x7f

goroutine 86 [chan receive]:
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:115 +0x2f
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1 in goroutine 1
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:114 +0x34b

goroutine 63 [chan receive]:
github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1.1()
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:115 +0x2f
created by github.com/go-task/task/v3/internal/execext.execHandler.DefaultExecHandler.func1 in goroutine 1
        mvdan.cc/sh/v3@v3.8.0/interp/handler.go:114 +0x34b

vergenzt (Contributor) commented

And of course, right after posting #1605 (comment), the individual task succeeded on a simple rerun. 🙂

4 participants