
BrokenResponseError: Can not build a coherent char history after a broken streaming response (See the previous Exception fro details). To inspect the last response object, use chat.last.To remove the last request/response Content objects from the chat call last_send, last_received = chat.rewind() and continue without it #307

Open
reloginn opened this issue Apr 30, 2024 · 3 comments
Labels: component:python sdk · status:triaged · type:bug

@reloginn commented Apr 30, 2024

Traceback (most recent call last):
File "", line 1, in
File "/home/asahi/Проекты/Python/gemini/gemini/init.py", line 5, in main
run()
File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 81, in run
process_directory(SRC, DST)
File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 73, in process_directory
translate_file(src_path=src_path, dst_path=dst_path)
File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 48, in translate_file
translated = translate(src_path)
^^^^^^^^^^^^^^^^^^^
File "/home/asahi/Проекты/Python/gemini/gemini/main.py", line 36, in translate
second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/asahi/.cache/pypoetry/virtualenvs/gemini-UM_oiH3g-py3.12/lib/python3.12/site-packages/google/generativeai/generative_models.py", line 467, in send_message
history = self.history[:]
^^^^^^^^^^^^
File "/home/asahi/.cache/pypoetry/virtualenvs/gemini-UM_oiH3g-py3.12/lib/python3.12/site-packages/google/generativeai/generative_models.py", line 686, in history
raise generation_types.BrokenResponseError(
google.generativeai.types.generation_types.BrokenResponseError: Can not build a coherent char history after a broken streaming response (See the previous Exception fro details). To inspect the last response object, use chat.last.To remove the last request/response Content objects from the chat call last_send, last_received = chat.rewind() and continue without it.

code:

# --snip--
chat = model.start_chat()
PART_LEN = len(content) // 2
first_part = content[:PART_LEN]
second_part = content[PART_LEN:]
first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
for chunk in first_response:
    chunks.append(chunk)
second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
for chunk in second_response:
    chunks.append(chunk)
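
One way to make the break surface at the first call rather than on the second send is to drain each stream before continuing. A minimal sketch, assuming the same chat object and MESSAGE_FOR_* constants as above; resolve() is the SDK's method for consuming a streaming GenerateContentResponse:

first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
first_response.resolve()  # drain the stream; a mid-stream failure raises here, not on the next send
first_text = first_response.text  # .text raises with details if the candidate was blocked
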
@singhniraj08 added the type:bug, status:triaged, and component:python sdk labels on May 2, 2024
@MarkDaoust (Collaborator) commented

Typo fix: #313

It's trying to tell you that the first stream broke, and it's not sure what to do.
Is the first part not raising an error?
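
The message itself spells out the recovery path. A sketch of that flow, untested against this exact failure; the except clause assumes BrokenResponseError is importable from google.generativeai.types, as the traceback's module path suggests:

from google.generativeai.types import BrokenResponseError

try:
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
except BrokenResponseError:
    print(chat.last)  # inspect the partial response that broke the stream
    last_send, last_received = chat.rewind()  # drop the broken request/response pair
    # retry, now without the broken first exchange in the history
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)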

@reloginn (Author) commented May 3, 2024

The first part doesn't raise any errors; the second part does. Everything worked previously with identical code (I automate translation, and I was able to translate about 15 .md files), then this error just started appearing, and nothing helps.
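
That symptom is consistent with the first stream ending early without raising: the chunk loop can complete even when generation stopped for a non-STOP reason (safety or recitation blocking, for example), and the failure only surfaces when the next send_message tries to rebuild the history. A quick check after the first loop would confirm it; a sketch using the response fields the SDK exposes:

for chunk in first_response:
    chunks.append(chunk)
print(first_response.prompt_feedback)  # was the prompt itself blocked?
for candidate in first_response.candidates:
    print(candidate.finish_reason)  # anything other than STOP means the stream ended early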

@reloginn (Author) commented May 3, 2024

def translate(path: str) -> str:
    file = io.FileIO(file=path)
    content = str(file.read())
    model = genai.GenerativeModel(model_name=MODEL, generation_config=GENERATION_CONFIG)
    chunks = []
    chat = model.start_chat()
    PART_LEN = len(content) // 2
    first_part = content[:PART_LEN]
    second_part = content[PART_LEN:]
    first_response = chat.send_message(content=MESSAGE_FOR_FIRST_PART + first_part, stream=True)
    for chunk in first_response:
        chunks.append(chunk)
    second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
    for chunk in second_response:
        chunks.append(chunk)
    result = "".join(part.text for chunk in chunks for candidate in chunk.candidates for part in candidate.content.parts)
    return result

The error occurs here:

second_response = chat.send_message(content=MESSAGE_FOR_SECOND_PART + second_part, stream=True)
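
Two things are worth flagging in the posted function, with a rewritten sketch below. First, str(file.read()) on an io.FileIO object stringifies raw bytes, so the model actually receives a "b'...'" literal with escaped newlines, which can itself provoke odd generations. Second, resolving each stream before the next send pins any break to the part that caused it. MODEL, GENERATION_CONFIG, and the MESSAGE_FOR_* constants are assumed from the original code:

import google.generativeai as genai

def translate(path: str) -> str:
    # Read as text; str() on bytes would embed the b'...' repr in the prompt.
    with open(path, encoding="utf-8") as f:
        content = f.read()
    model = genai.GenerativeModel(model_name=MODEL, generation_config=GENERATION_CONFIG)
    chat = model.start_chat()
    half = len(content) // 2
    parts = [MESSAGE_FOR_FIRST_PART + content[:half],
             MESSAGE_FOR_SECOND_PART + content[half:]]
    texts = []
    for part in parts:
        response = chat.send_message(content=part, stream=True)
        response.resolve()  # finish the stream here so a break is attributed to the right part
        texts.append(response.text)  # raises with details if the candidate was blocked
    return "".join(texts)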
