
[Bug]: Polars import (and other libraries) crashes Jupyterlab Kernel under the MLRun Docker Environment #4512

gilad-rubin opened this issue Oct 31, 2023 · 2 comments


MLRun Version checks

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of MLRun CE.

Reproducible Example

```
pip install polars
import polars
```

Issue Description

The JupyterLab kernel crashes upon importing polars.

Expected Behavior

Importing polars without the kernel crashing :)

Installation OS

Mac

Installation Method

Docker

Python Version

'3.9.13'

MLRun Version

'1.5.0'

Additional Information

This happens with the `daft` package as well (`pip install -U getdaft`, then `import daft`).


liranbg commented Oct 31, 2023

Hi @gilad-rubin
Thank you for raising this issue.
Could you attach logs from the Jupyter pod or the Jupyter kernel? Also, did you use Docker or Kubernetes? Which MLRun CE version? 🙏

@gilad-rubin (Author)

I put the information about my environment in the issue: I used Docker on my M1 Mac, with MLRun 1.5.0. The error was also present on version 1.4.0.

Here are the logs from the Jupyter kernel:

```
giladrubin-jupyter-1 | [I 2023-11-02 06:42:27.454 ServerApp] Kernel restarted: a7bada15-6c81-44dc-9ab2-672656cbfcfc
giladrubin-jupyter-1 | [IPKernelApp] ERROR | No such comm target registered: jupyter.widget.control
giladrubin-jupyter-1 | [IPKernelApp] WARNING | No such comm: 5e7542aa-a021-4824-9b0a-b7919a25490f
giladrubin-jupyter-1 | [I 2023-11-02 06:42:36.446 ServerApp] AsyncIOLoopKernelRestarter: restarting kernel (1/5), keep random ports
giladrubin-jupyter-1 | [W 2023-11-02 06:42:36.447 ServerApp] kernel a7bada15-6c81-44dc-9ab2-672656cbfcfc restarted
giladrubin-jupyter-1 | [W 2023-11-02 06:42:36.450 ServerApp] kernel a7bada15-6c81-44dc-9ab2-672656cbfcfc restarted
giladrubin-jupyter-1 | Exception in callback <TaskWakeupMethWrapper object at 0x7ffff9d3a4f0>(<Future finis...t=b'\x88\x80'>)
giladrubin-jupyter-1 | handle: <Handle <TaskWakeupMethWrapper object at 0x7ffff9d3a4f0>(<Future finis...t=b'\x88\x80'>)>
giladrubin-jupyter-1 | Traceback (most recent call last):
giladrubin-jupyter-1 |   File "/opt/conda/lib/python3.9/asyncio/events.py", line 80, in _run
giladrubin-jupyter-1 |     self._context.run(self._callback, *self._args)
giladrubin-jupyter-1 | RuntimeError: Cannot enter into task <Task pending name='Task-593' coro=<RequestHandler._execute() running at /opt/conda/lib/python3.9/site-packages/tornado/web.py:1786> wait_for=<Future finished result=b'\x88\x80'> cb=[_HandlerDelegate.execute.<locals>.<lambda>() at /opt/conda/lib/python3.9/site-packages/tornado/web.py:2434]> while another task <Task pending name='Task-377' coro=<PeriodicCallback._run() running at /opt/conda/lib/python3.9/site-packages/tornado/ioloop.py:919> cb=[IOLoop.add_future.<locals>.<lambda>() at /opt/conda/lib/python3.9/site-packages/tornado/ioloop.py:692]> is being executed.
giladrubin-jupyter-1 | [W 2023-11-02 06:42:36.499 ServerApp] Replacing stale connection: a7bada15-6c81-44dc-9ab2-672656cbfcfc:54940a3d-730c-45bb-9b6c-6b832af27859
giladrubin-jupyter-1 | [W 2023-11-02 06:42:37.246 ServerApp] zmq message arrived on closed channel
giladrubin-jupyter-1 | [W 2023-11-02 06:42:37.251 ServerApp] zmq message arrived on closed channel
```
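Note that these messages all come from the Jupyter server reacting to the kernel disappearing; the kernel process itself dies before it can log anything. A way to surface the real failure is to run the import in a plain Python subprocess and inspect the exit status: if the process was killed by a signal, POSIX reports a negative return code, and a `SIGILL` would suggest the wheel contains CPU instructions not supported in the current environment (for example under x86_64 emulation on Apple Silicon). This is a diagnostic sketch under that assumption, not something confirmed in the thread; `probe_import` is a hypothetical helper, not part of MLRun or Jupyter.

```python
import signal
import subprocess
import sys

def probe_import(module: str) -> str:
    """Import `module` in a fresh interpreter and report how the process ended.

    A Jupyter kernel dies silently when a native extension hits a fatal
    signal; importing in a plain subprocess makes that signal visible.
    """
    proc = subprocess.run(
        [sys.executable, "-c", f"import {module}"],
        capture_output=True,
        text=True,
    )
    if proc.returncode == 0:
        return f"{module}: imported cleanly"
    if proc.returncode < 0:  # on POSIX, a negative code means "killed by signal"
        name = signal.Signals(-proc.returncode).name
        return f"{module}: killed by {name}"
    return f"{module}: exited with code {proc.returncode} ({proc.stderr.strip()})"

# Inside the MLRun Jupyter container you would probe the suspect packages:
#     probe_import("polars")
#     probe_import("daft")
print(probe_import("json"))  # a stdlib module, as a sanity check
```

If the output reads `polars: killed by SIGILL`, the crash is in the compiled extension rather than in Jupyter itself, which would also explain why `daft` (another Rust-based package) fails the same way.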
