How to close a stream using chat_engine.stream_chat #13503
Comments
Hey @cxycxm! Nice to see you back 😊. Diving into your stream closing conundrum right now, expect an update in a flash!
I actually don't think there is a way besides either exhausting the stream, or never continuing the stream
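For illustration, those two options look something like this (a minimal sketch assuming a `chat_engine` has already been built; `response_gen` is the token generator on `StreamingAgentChatResponse`):

```python
# Option 1: exhaust the stream by reading every token to the end.
response = chat_engine.stream_chat("Tell me a long story")
for token in response.response_gen:
    print(token, end="")

# Option 2: stop consuming and abandon the generator. This frees the
# Python-side iterator, but the underlying HTTP connection to the LLM
# provider is not actively closed.
response = chat_engine.stream_chat("Tell me another long story")
for i, token in enumerate(response.response_gen):
    print(token, end="")
    if i >= 10:
        break
```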
Can the client connection of this chat_engine be actively disconnected, similar to httpclient.close()?
Maybe? I think it really depends on the LLM being used too. You'd have to do some digging into the API docs there
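For example, with the OpenAI Python client the underlying httpx response can be closed directly, which is the same trick the patch sketched below relies on (model name and cutoff are illustrative):

```python
from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tell me a long story"}],
    stream=True,
)
for i, chunk in enumerate(stream):
    print(chunk.choices[0].delta.content or "", end="")
    if i >= 10:
        # Close the underlying httpx response to stop the stream early.
        stream.response.close()
        break
```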
To actively disconnect or manage the lifecycle of a stream, there is currently no dedicated API on the streaming response object. In summary, while there's no explicit close method, the stream's lifecycle is effectively managed by consuming it to completion or simply abandoning it.
Maybe we can add a close method to `StreamingAgentChatResponse`:

```python
class StreamingAgentChatResponse:
    ...

    def close(self) -> None:
        if self.chat_stream is not None:
            # Here we assume that `self.chat_stream` is a generator,
            # so this will trigger a GeneratorExit in the generator function.
            self.chat_stream.close()
```

Then we also need to handle GeneratorExit in the relevant LLM's `_stream_chat`:

```python
def _stream_chat(
    self, messages: Sequence[ChatMessage], **kwargs: Any
) -> ChatResponseGen:
    ...

    def gen() -> ChatResponseGen:
        ...
        stream = client.chat.completions.create(
            messages=message_dicts,
            stream=True,
            **self._get_model_kwargs(**kwargs),
        )
        for response in stream:
            try:
                ...
                yield ChatResponse(
                    message=ChatMessage(
                        role=role,
                        content=content,
                        additional_kwargs=additional_kwargs,
                    ),
                    delta=content_delta,
                    raw=response,
                    additional_kwargs=self._get_response_token_counts(response),
                )
            except GeneratorExit:
                # Interrupt the stream by closing the connection.
                # See https://github.com/openai/openai-python/issues/969#issuecomment-1857158754
                stream.response.close()
                # Then exit gracefully.
                return

    return gen()
```
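If something like that landed, interrupting a stream part-way through might look like this (a sketch; `close()` here is the proposed method, not an existing API):

```python
response = chat_engine.stream_chat("Tell me a long story")
for i, token in enumerate(response.response_gen):
    print(token, end="")
    if i >= 10:
        break

# Proposed: raises GeneratorExit inside the LLM's generator, which in
# turn closes the provider's HTTP stream via the except branch above.
response.close()
```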
This would need to be implemented for every streaming LLM 😅
But, makes sense
Yes, both the sync and async versions need to be modified. Perhaps we can leverage some decorators, although they may not be helpful in all cases.
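One possible shape for that is a small wrapper generator rather than a decorator proper, since a plain decorator cannot reach the `stream` object created inside the function body. A sketch, where `with_cleanup` is a hypothetical helper and not an existing LlamaIndex utility:

```python
from typing import Callable, Generator, TypeVar

T = TypeVar("T")

def with_cleanup(
    gen: Generator[T, None, None], cleanup: Callable[[], None]
) -> Generator[T, None, None]:
    """Yield from `gen`, running `cleanup` exactly once, whether the
    stream is exhausted, raises, or is closed early via GeneratorExit."""
    try:
        yield from gen
    finally:
        cleanup()
```

Each provider's `_stream_chat` could then return `with_cleanup(inner_gen(), stream.response.close)` instead of handling GeneratorExit inline; the async versions would need an `AsyncGenerator` twin built with `async for` and `await`.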
thanks |