
gpt-4-vision-preview does not work as expected. #573

Open · 1 task done
iterprise opened this issue Dec 15, 2023 · 7 comments
Labels
openai api Related to underlying OpenAI API

Comments

@iterprise

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

The response from ChatGPT unexpectedly cuts off when using streaming. The response via the API does not match the same request made through the chat UI; through the API I receive only the beginning of the response before it cuts off. I think this is related to the bug below.
#499

To Reproduce

openai.beta.chat.completions.stream with image_url
I use the following image.
[attached image: 1.png]

From the API I got only:

The instructions are asking for a modification of the SQL CREATE TABLE statement for

From the chat UI I got much more.

Code snippets

const testVision = async () => {
    const stream = await openai.beta.chat.completions.stream({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
        stream: true,
    });
    stream.on('content', (delta, snapshot) => {
        process.stdout.write(delta)
    });
    stream.finalChatCompletion().then( () => {
        process.stdout.write('\n')
    } );
}

OS

Linux

Node version

Node v18.16.0

Library version

openai 4.22.0

@iterprise iterprise added the bug Something isn't working label Dec 15, 2023
@iterprise
Author

It doesn't work without streaming either.

@rattrayalex
Collaborator

Hmm, it may be that your program is exiting because you're not waiting for the stream to complete. Try this:

const testVision = async () => {
    const stream = await openai.beta.chat.completions.stream({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
        stream: true,
    });
    stream.on('content', (delta, snapshot) => {
        process.stdout.write(delta)
    });
    
    await stream.finalChatCompletion();
    console.log();
}

Does that help?

@rattrayalex rattrayalex removed the bug Something isn't working label Dec 16, 2023
@iterprise
Author

iterprise commented Dec 16, 2023

I tried this one with the same result.

const testVision = async () => {
    const stream = await openai.chat.completions.create({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
    });
    console.log('Not a stream', stream.choices[0].message.content);
}

I also tested your version of my code, with the same result.

@rattrayalex
Collaborator

rattrayalex commented Dec 16, 2023 via email

@iterprise
Author

@rattrayalex
Why did you remove the bug label? Is there a problem in my code?

@rattrayalex
Collaborator

rattrayalex commented Dec 19, 2023

It's probably a problem with the underlying API, not a bug in the Node SDK.

@rattrayalex rattrayalex added the openai api Related to underlying OpenAI API label Dec 19, 2023
@athyuttamre
Collaborator

athyuttamre commented Dec 19, 2023

Hi @iterprise, thanks for the report. We can confirm this is an issue in the API and are working on addressing it. As a workaround, you can manually set the value of max_tokens to a higher value (it is currently defaulting to 16 when it should not, which is why the responses are getting cut off).
