Whether chat_engine can call stream_complete()? #13527
Comments
Hey there, @cxycxm! 🚀 Good to see you back with another intriguing question. Just diving into the depths to fetch your answer. Hang tight!
The method you are looking for is `stream_chat()`. To directly address your question:

```python
response = chat_engine.stream_chat("Hello!")
for token in response.response_gen:
    print(token, end="", flush=True)
```

This approach allows you to process and print each token received from the chat engine in real time. If you have any more questions or need further clarification, feel free to ask!
But I need `stream_complete()`; my purpose is to complete the conversation.
That's not how it works with chat engines, sadly. Completion only makes sense if you have a single string, not an alternating chat history. You can prompt it with the chat history and take the LLM's response as a completion of the previous message if you want.
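A minimal sketch of the workaround described above: flatten the alternating chat history into a single prompt string that a completion-style API can accept. The `format_history_as_prompt` helper and the `(role, content)` tuple shape are illustrative assumptions, not part of the llama_index API.

```python
def format_history_as_prompt(history: list[tuple[str, str]]) -> str:
    """Render (role, content) pairs as one completion prompt string."""
    lines = [f"{role}: {content}" for role, content in history]
    # Leave a trailing "assistant:" cue so the LLM continues the conversation.
    lines.append("assistant:")
    return "\n".join(lines)

history = [
    ("user", "Hello!"),
    ("assistant", "Hi, how can I help?"),
    ("user", "Tell me a joke."),
]
prompt = format_history_as_prompt(history)
print(prompt)
```

The resulting string could then be streamed through the LLM object directly (for example via `llm.stream_complete(prompt)` on a LlamaIndex LLM instance), rather than through the chat engine.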
Question Validation
Question
Whether chat_engine can call stream_complete()?