
MultiProcess problem #110

Open
muzhi1991 opened this issue Dec 11, 2023 · 5 comments
Labels
enhancement New feature or request

Comments

muzhi1991 commented Dec 11, 2023

While reading the chat-streaming code, I noticed that this variable seems to be shared by multiple processes: I also found that it is assigned in another process. But `is_end` is an ordinary (per-process) attribute. Is there a problem with this code, or is it designed this way?

```python
while chat_thread.is_alive() or len(stream_handler.for_display) > 0:
    # print(memory_pool, err_pool, "out")
    if stream_handler.is_end:
        # The end of the stream is marked by the is_end variable from
        # AgentStreamingStdOutCallbackHandler in agent_streaming.py
        break
    if len(stream_handler.for_display) == 0:
        # first time display list is empty
```

I am also confused because, if this variable were made shareable, the logic would seem to break. The author's design appears to stop the stream when `chat_thread` (a Python process) is no longer alive. When I tested on macOS (M1, version 14.1.2), `is_alive` did not behave as expected; that may be another, separate problem.
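A minimal sketch of the behaviour described above (the class and function names here are hypothetical stand-ins, not the project's actual code): a plain attribute like `is_end` is copied into the child process, so an assignment made there never reaches the parent's copy.

```python
import multiprocessing as mp


class StreamHandler:
    def __init__(self):
        self.is_end = False   # ordinary attribute: each process gets its own copy


def chat_worker(handler):
    handler.is_end = True     # mutates only the child's copy of the handler


if __name__ == "__main__":
    handler = StreamHandler()
    p = mp.Process(target=chat_worker, args=(handler,))
    p.start()
    p.join()
    print(handler.is_end)     # False: the parent never sees the child's assignment
```

This holds under both `fork` (the child gets a copy-on-write copy) and `spawn` (the handler is pickled into the child), which is why a loop in the parent polling `stream_handler.is_end` cannot observe a flag set by the child.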


This issue is stale because it has been open 3 days with no activity. Remove stale label or comment or this will be closed in 4 days.

@github-actions github-actions bot added the Stale label Dec 15, 2023
@Timothyxxx Timothyxxx added enhancement New feature or request and removed Stale labels Dec 15, 2023
@Timothyxxx
Contributor

Hi, I believe we designed it this way. Could you point out what problems it could lead to?

muzhi1991 (Author) commented Dec 19, 2023

On my macOS, when tool selection is enabled, the LLM chain raises an exception (tracing it, the exception comes from this macOS Python fork problem), so I switched to spawn mode at the beginning of main.py (inside the `if __name__ == "__main__"` block). After that modification, requests work normally.
However, another problem occurs: the subprocess never ends, so `chat_thread.is_alive()` is always true.
Finally I found that when `CODE_EXECUTION_MODE == 'docker'`, the subprocess always starts the `start_kernel_publisher` thread, and that thread never stops. If I disable this mode, the child process exits normally.
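A hedged sketch of this second problem (`start_kernel_publisher` here is a stand-in body for the thread named above, not the project's implementation): a Python process only exits once all of its non-daemon threads return, so a publisher thread that loops forever keeps the child alive and `chat_thread.is_alive()` stays true in the parent. Marking the thread as a daemon (or giving it a stop condition) lets the child exit.

```python
import multiprocessing as mp
import threading
import time


def start_kernel_publisher(stop_event):
    # Stand-in for the publisher loop: without a stop condition (or
    # daemon=True on its thread), this would block process exit forever.
    while not stop_event.is_set():
        time.sleep(0.1)


def child_main():
    stop_event = threading.Event()
    t = threading.Thread(
        target=start_kernel_publisher,
        args=(stop_event,),
        daemon=True,  # daemon threads do not block interpreter shutdown
    )
    t.start()
    # ... the process's real work would happen here ...
    # Returning now is fine: the daemon thread is torn down with the process.


if __name__ == "__main__":
    p = mp.Process(target=child_main)
    p.start()
    p.join(timeout=10)
    print(p.is_alive())  # False: with daemon=True the child exits normally
```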

@isperfee
In this method, `multiprocess.Process` copies the `executor` argument for the child to run with, so the child holds a different object at a different address, i.e. `id()` of the executor in the child differs from `id(executor)` in the parent.
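Since the child only ever sees a copy, one way to make a flag actually cross the process boundary (a sketch, assuming a shared `is_end` is what is wanted, not the project's chosen design) is `multiprocessing.Value`, which lives in shared memory: an assignment in the child is then visible to the parent.

```python
import multiprocessing as mp


def chat_worker(is_end):
    is_end.value = True    # written to shared memory, visible to the parent


if __name__ == "__main__":
    is_end = mp.Value("b", False)   # 'b': a signed char used as a boolean
    p = mp.Process(target=chat_worker, args=(is_end,))
    p.start()
    p.join()
    print(bool(is_end.value))       # True
```

Unlike a plain attribute, a `Value` passed through `Process` args works under both the `fork` and `spawn` start methods.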

@isperfee

On macOS, set `multiprocess.set_start_method("fork", True)`.
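The workaround above, sketched with the stdlib `multiprocessing` module (the comment quotes the third-party `multiprocess` package, whose API matches): macOS defaults to the `spawn` start method since Python 3.8, and forcing `fork` restores sharing-by-inheritance of parent objects. Note that `fork` is considered unsafe on macOS in combination with some system frameworks and threads, which is why the default changed in the first place.

```python
import multiprocessing

if __name__ == "__main__":
    # force=True (the second parameter) overrides an already-set start method.
    # This must run before any Process objects are created.
    multiprocessing.set_start_method("fork", force=True)
    # ... create multiprocessing.Process instances only after this point ...
```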
