
[Bug]: Model from OpenAI gpt-3.5-turbo-instruct does not work #9109

Open
martyna-mindsdb opened this issue Apr 22, 2024 · 1 comment
Labels: bug (Something isn't working)


@martyna-mindsdb (Collaborator)

Short description of current behavior

The gpt-3.5-turbo-instruct model is listed among the supported chat models here, but querying it returns the error below.

CREATE MODEL text_classify
PREDICT response
USING
  engine = 'openai_engine',
  max_tokens = 128,
  temperature = 0.5,
  model_name = 'gpt-3.5-turbo-instruct',
  prompt_template = '{{payload}}';

DESCRIBE text_classify;

SELECT response FROM text_classify WHERE payload = 'my prompt';

The error message says:

[openai_engine/text_classify]: Exception: Error status 404 raised by OpenAI API: This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?
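The 404 suggests the handler sends every model to the chat completions endpoint, while gpt-3.5-turbo-instruct is a legacy completion model that only accepts v1/completions. A minimal sketch of the kind of routing that would avoid this (the function and model set below are hypothetical illustrations, not MindsDB's actual handler code):

```python
# Hypothetical sketch: route a model name to the OpenAI endpoint it accepts.
# gpt-3.5-turbo-instruct (like the babbage-002/davinci-002 base models) is a
# completion-only model; chat models must use /v1/chat/completions instead.
# The set of completion-only models here is an assumption for illustration.

COMPLETION_ONLY_MODELS = {
    "gpt-3.5-turbo-instruct",
    "babbage-002",
    "davinci-002",
}

def endpoint_for(model_name: str) -> str:
    """Return the API path a given model must be called through."""
    if model_name in COMPLETION_ONLY_MODELS:
        return "/v1/completions"
    return "/v1/chat/completions"
```

With routing like this, the query in the report would go to v1/completions and the 404 would not occur; chat models such as gpt-4 would still use v1/chat/completions.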

Meanwhile, other models from this list work fine:

(screenshot of the supported models list attached)

Video or screenshots

No response

Expected behavior

No response

How to reproduce the error

No response

Anything else?

No response

@martyna-mindsdb martyna-mindsdb added the bug Something isn't working label Apr 22, 2024
@martyna-mindsdb (Collaborator, Author)

@paxcema
Please reassign if required.
