
Regression in support of customized "role" in OpenAI compatible API (v0.4.2) #4755

Closed · simon-mo opened this issue May 10, 2024 · Discussed in #4745 · 4 comments · Fixed by #4758

Labels: good first issue

Comments

@simon-mo (Collaborator)

Discussed in #4745

Originally posted by tanliboy May 10, 2024
Hi vLLM team,

We have been using vLLM to serve models, and it has gone really well. We use the OpenAI-compatible API along with customized "role" values for different entities. However, after upgrading to v0.4.2, we found that customized roles are no longer supported; the role is now limited to "system", "user", and "assistant".
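
For illustration, the kind of request that broke looks roughly like the sketch below (the custom role "context", the model name, and the endpoint are placeholders, not the actual values):

from openai import OpenAI

# Point the client at a vLLM OpenAI-compatible server (URL is a placeholder).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="my-finetuned-model",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        # Customized role: accepted before v0.4.2, now rejected with a
        # validation error because role must be system/user/assistant.
        {"role": "context", "content": "Background notes for this session."},
        {"role": "user", "content": "Summarize the background notes."},
    ],
)
print(response.choices[0].message.content)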

I understand that this aligns tightly with OpenAI's chat completion role definition; however, it limits the ability to customize roles alongside fine-tuning. Moreover, we have also seen a trend (including the recent Llama 3 chat template) toward supporting additional roles for multi-agent interactions.

Could you bring back the previous support for customized roles in the OpenAI-compatible chat completion API?

Thanks,
Li

@simon-mo added the good first issue label on May 10, 2024
@DarkLight1337 (Contributor) commented May 11, 2024

This is likely the result of #4355, which made ChatCompletionRequest.messages more strict to avoid unrecognized attributes. Would this interface be sufficient for defining custom roles?

from typing import List, Union

import openai.types.chat
from pydantic import ConfigDict
from typing_extensions import Required, TypedDict


class CustomChatCompletionContentPartParam(TypedDict, total=False):
    __pydantic_config__ = ConfigDict(extra="allow")  # type: ignore

    type: Required[str]
    """The type of the content part."""


ChatCompletionContentPartParam = Union[
    openai.types.chat.ChatCompletionContentPartParam,
    CustomChatCompletionContentPartParam]


class CustomChatCompletionMessageParam(TypedDict, total=False):
    """Enables custom roles in the Chat Completion API."""
    role: Required[str]
    """The role of the message's author."""

    content: Union[str, List[ChatCompletionContentPartParam]]
    """The contents of the message."""

    name: str
    """An optional name for the participant.

    Provides the model information to differentiate between participants of the
    same role.
    """


ChatCompletionMessageParam = Union[
    openai.types.chat.ChatCompletionMessageParam,
    CustomChatCompletionMessageParam]


# OpenAIBaseModel is vLLM's pydantic base class for the OpenAI-compatible
# server protocol definitions.
class ChatCompletionRequest(OpenAIBaseModel):
    messages: List[ChatCompletionMessageParam]
    ... # The rest is the same as OpenAI API
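
If it helps, here is a rough sketch of how a payload with a custom role would validate against that schema (it assumes the definitions above and vLLM's OpenAIBaseModel are in scope; the role "observer" and the model name are just placeholders):

# Pydantic resolves the messages union: a standard message validates
# against the openai.types.chat TypedDicts, while an unknown role falls
# through to CustomChatCompletionMessageParam instead of being rejected.
request = ChatCompletionRequest.model_validate({
    "model": "my-model",  # placeholder
    "messages": [
        {"role": "user", "content": "hello"},
        # Not one of the OpenAI roles, but matches
        # CustomChatCompletionMessageParam, so validation succeeds.
        {"role": "observer", "content": "a customized role"},
    ],
})
print([m["role"] for m in request.messages])  # ['user', 'observer']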

@Tostino (Contributor) commented May 12, 2024

Thank you for the PR, @DarkLight1337. I was wondering why my data pipeline stopped working when I upgraded vLLM.

@tanliboy commented

Thank you, @simon-mo, @DarkLight1337, and @Tostino! Is the PR ready to merge and be incorporated into the upcoming release?

@simon-mo (Collaborator, Author) commented

Merged.
