
Seems like we didn't handle error chunk here? #25

Open
shezhangzhang opened this issue Mar 6, 2023 · 3 comments

Comments

@shezhangzhang

parser.feed(decoder.decode(chunk));

If the chunk is error data ({"error":{}}), the onParse() function will NOT be invoked. We need an error handler here.
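One way to surface such errors is to check the HTTP status before streaming at all. A minimal hypothetical sketch (not the project's actual code), assuming OpenAI error responses arrive as a plain JSON body like {"error": {...}} rather than as SSE events:

```typescript
// Hypothetical helper: fail fast on a non-2xx response so the error JSON
// body is reported instead of being silently dropped by the SSE parser.
function assertOk(status: number, bodyText: string): void {
  if (status < 200 || status >= 300) {
    let message = "unknown error";
    try {
      // Error bodies typically look like {"error": {"message": "..."}}
      message = JSON.parse(bodyText)?.error?.message ?? message;
    } catch {
      // Body was not JSON; keep the generic message.
    }
    throw new Error(`OpenAI request failed (${status}): ${message}`);
  }
}
```

Calling this with the response status (and, on failure, the response text) before feeding the body to the parser would cover the non-streaming error case.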

@shezhangzhang
Author

shezhangzhang commented Mar 7, 2023

@Nutlope Hello, sorry to bother you. I'm not sure what "fragmented into multiple chunks" means exactly, because when I tested the fetch API locally, I never saw the chunks get truncated. Why does it happen after deploying to Vercel's edge function (without using this code)?

// stream response (SSE) from OpenAI may be fragmented into multiple chunks
// this ensures we properly read chunks and invoke an event for each SSE event stream
const parser = createParser(onParse);
// https://web.dev/streams/#asynchronous-iteration
for await (const chunk of res.body as any) {
  parser.feed(decoder.decode(chunk));
}

localhost: [screenshot]

deployed on Vercel's edge function: [screenshot]

@smaeda-ks
Contributor

Hi @shezhangzhang,

> If the chunk is error data ({"error":{}}), the onParse() function will NOT be invoked. We need an error handler here.

Are you saying that there may be a case where the completion API returns a JSON that indicates some sort of internal error in the middle of the SSE stream? Do you have any links to documentation that mentions that?
But in any case, as long as the JSON is sent over SSE (server-sent events) and complies with the SSE data format, the callback function does get invoked. So it's more a lack of error handling inside the callback function, if I'm not mistaken.
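That callback-level guard could look something like the following hypothetical sketch (field names mirror OpenAI's streaming format, but this is illustrative, not the repo's actual callback):

```typescript
// Hypothetical handler for one SSE event's data payload: detect an error
// object delivered as an event instead of treating it as a completion chunk.
function handleEventData(data: string): string | null {
  if (data === "[DONE]") return null; // end-of-stream sentinel
  const json = JSON.parse(data);
  if (json.error) {
    throw new Error(`OpenAI stream error: ${json.error.message ?? "unknown"}`);
  }
  // For streamed chat completions the text lives under choices[0].delta.content.
  return json.choices?.[0]?.delta?.content ?? "";
}
```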

Or, if you're talking about general API errors where the request as a whole fails with a non-successful status code (e.g., rate limit, 500 error), then yes, this example also doesn't check the response status code. But that can easily be added.

For the second question, that's exactly why we're using the eventsource-parser library, as the code comment suggests. The reason you don't observe fragmented chunks on your local machine is that fragmentation largely depends on the size of the data and the network path between the parties (source and destination), the latter being the more important factor. Data may get fragmented as it passes through multiple hops on the internet (e.g., differing MTUs). Different locations take different paths. This isn't a bug of any sort, just a difference in how packets are delivered.
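The reassembly described above can be sketched as a tiny buffering parser. This is a simplified stand-in for what eventsource-parser does internally (assuming events are delimited by a blank line, per the SSE format), not its actual implementation:

```typescript
// Minimal sketch of SSE reassembly: buffer incoming text and only emit an
// event once its blank-line delimiter ("\n\n") has arrived, so an event
// split across several network chunks is still delivered exactly once.
function createSimpleParser(onEvent: (data: string) => void) {
  let buffer = "";
  return {
    feed(chunk: string): void {
      buffer += chunk;
      let idx: number;
      while ((idx = buffer.indexOf("\n\n")) !== -1) {
        const rawEvent = buffer.slice(0, idx);
        buffer = buffer.slice(idx + 2); // keep any incomplete tail buffered
        const data = rawEvent
          .split("\n")
          .filter((line) => line.startsWith("data: "))
          .map((line) => line.slice("data: ".length))
          .join("\n");
        if (data) onEvent(data);
      }
    },
  };
}
```

Feeding one event split across two chunks still yields a single, complete event, which is exactly the behavior a naive per-chunk JSON.parse would break on.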

@shezhangzhang
Author

@smaeda-ks Thank you very much for clarifying this! I checked the eventsource-parser library: it handles every chunk, checks whether each event is complete, and ensures the onParse function is invoked with the correct data. Yes, this is a great solution!
