Unable to run tasks under Windows #4081
Comments
FWIW, I worked around this by using the eventlet pool implementation (the "-P eventlet" command-line option).
@drewdogg's solution should be mentioned in the tutorial.
I can confirm: this bug appears when running the worker command, and the following workaround works: it is enough to define the FORKED_BY_MULTIPROCESSING=1 environment variable for the worker instance.
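The workaround above can be applied without touching the shell by setting the variable at the top of the module that defines the Celery app, before the app (and its worker pool) is created. The variable is consumed by billiard, Celery's multiprocessing fork. A minimal sketch:

```python
import os

# Workaround for the Windows crash in this issue: set this before the
# Celery app is created so billiard bootstraps its spawned worker
# processes the way the prefork pool expects.
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')

print(os.environ['FORKED_BY_MULTIPROCESSING'])  # → 1
```

Alternatively, set the variable in the shell before starting the worker (e.g. `set FORKED_BY_MULTIPROCESSING=1` in cmd.exe).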
@auvipy Works for me, thanks.
@auvipy It really solves the problem :) 👍
Maybe this should be mentioned in the docs? @wonderfulsuccess, care to send a pull request?
Thanks so much.
Thanks, it worked!
@auvipy If this is only one line of code to fix, then why not just fix it within Celery, instead of using the docs to recommend that users implement a workaround? Why is a completely platform-breaking bug with such a simple fix still a problem after nearly two years?
Where do you want Celery to put this code? I believe this is well suited for Windows-specific instructions. If you want to fix it at the code level, come up with an appropriate PR.
You are awesome, thanks a ton!
@auvipy I have been searching for an answer to this problem and spent a lot of time trying to fix it; thank you so much.
Will this not disable concurrency? Since I am planning to use Celery only for concurrency, as a replacement for threads, should I still go for this solution?
You should, but in practice I would suggest moving to a Unix-like system.
Celery 4.x starts (with the fixes from #4078) but all tasks crash
Steps to reproduce
Use First Steps tutorial (http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html)
celery -A tasks worker --loglevel=info
add.delay(2,2)
Expected behavior
Task is executed and a result of 4 is produced
Actual behavior
Celery crashes.
"C:\Program Files\Python36\Scripts\celery.exe" -A perse.celery worker -l info
-------------- celery@PETRUS v4.0.2 (latentcall)
---- **** -----
--- * *** * -- Windows-10-10.0.14393-SP0 2017-06-08 15:31:22
-- * - **** ---
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. perse.tasks.celery_add
[2017-06-08 15:31:22,685: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-06-08 15:31:22,703: INFO/MainProcess] mingle: searching for neighbors
[2017-06-08 15:31:23,202: INFO/SpawnPoolWorker-5] child process 5124 calling self.run()
[2017-06-08 15:31:23,207: INFO/SpawnPoolWorker-4] child process 10848 calling self.run()
[2017-06-08 15:31:23,208: INFO/SpawnPoolWorker-10] child process 5296 calling self.run()
[2017-06-08 15:31:23,214: INFO/SpawnPoolWorker-1] child process 5752 calling self.run()
[2017-06-08 15:31:23,218: INFO/SpawnPoolWorker-3] child process 11868 calling self.run()
[2017-06-08 15:31:23,226: INFO/SpawnPoolWorker-11] child process 9544 calling self.run()
[2017-06-08 15:31:23,227: INFO/SpawnPoolWorker-6] child process 16332 calling self.run()
[2017-06-08 15:31:23,229: INFO/SpawnPoolWorker-8] child process 3384 calling self.run()
[2017-06-08 15:31:23,234: INFO/SpawnPoolWorker-12] child process 8020 calling self.run()
[2017-06-08 15:31:23,241: INFO/SpawnPoolWorker-9] child process 15612 calling self.run()
[2017-06-08 15:31:23,243: INFO/SpawnPoolWorker-7] child process 9896 calling self.run()
[2017-06-08 15:31:23,245: INFO/SpawnPoolWorker-2] child process 260 calling self.run()
[2017-06-08 15:31:23,730: INFO/MainProcess] mingle: all alone
[2017-06-08 15:31:23,747: INFO/MainProcess] celery@PETRUS ready.
[2017-06-08 15:31:49,412: INFO/MainProcess] Received task: perse.tasks.celery_add[524d788e-e024-493d-9ed9-4b009315fea3]
[2017-06-08 15:31:49,416: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "c:\program files\python36\lib\site-packages\billiard\pool.py", line 359, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\program files\python36\lib\site-packages\celery\app\trace.py", line 518, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
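The ValueError itself is easy to reproduce in isolation. `_fast_trace_task` expects the module-level `_loc` tuple to have been populated by the pool's worker initializer, but the `SpawnPoolWorker` names in the log show the children are spawned rather than forked, so that module-level state starts out empty and the unpack fails. A minimal illustration (no Celery required):

```python
# In a spawned (not forked) child process, module-level state set up by
# the parent is never inherited, so the cache the traceback unpacks is
# still an empty tuple.
_loc = ()

try:
    tasks, accept, hostname = _loc  # the line that fails in trace.py
except ValueError as exc:
    message = str(exc)
    print(message)  # → not enough values to unpack (expected 3, got 0)
```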
Fix
See pull request #4078