
feat: system prompt for playground page #591

Merged
merged 3 commits into containers:main from feat-525/playground-parameters on Mar 20, 2024

Conversation

@feloy feloy (Contributor) commented Mar 20, 2024

What does this PR do?

  • Adds the settings block (using the same layout as the Recipe details block on the recipe page)
  • Adds the system prompt

Screenshot / video of UI

playground-system-prompt.mp4

What issues does this PR fix or reference?

Part of #525
Fixes #543

How to test this PR?

Set instructions in the system prompt textarea, and check that they are used for the following prompts. (NOTE: in the video above, the model frequently did not take the system prompt into account for the second and subsequent prompts.)

@feloy feloy requested a review from a team as a code owner March 20, 2024 07:56
@feloy feloy marked this pull request as draft March 20, 2024 07:56
@feloy feloy force-pushed the feat-525/playground-parameters branch from 79e3803 to 686756c on March 20, 2024 08:04
Signed-off-by: Philippe Martin <phmartin@redhat.com>
@feloy feloy force-pushed the feat-525/playground-parameters branch from 686756c to ecd1e9d on March 20, 2024 08:27
Signed-off-by: Philippe Martin <phmartin@redhat.com>
```diff
@@ -87,7 +87,7 @@ export class PlaygroundV2Manager extends Publisher<PlaygroundV2[]> implements Di
    * @param userInput the user input
    * @param options the model configuration
    */
-  async submit(playgroundId: string, userInput: string, options?: ModelOptions): Promise<void> {
+  async submit(playgroundId: string, userInput: string, systemPrompt: string, options?: ModelOptions): Promise<void> {
```
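For context, a minimal sketch of how a systemPrompt argument like this can be turned into the message list sent to the model. This is illustrative only (ChatMessage and buildMessages are hypothetical names, not code from this PR); it assumes the OpenAI-style chat message shape, where the system prompt is a message with the role system:

```ts
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Hypothetical helper: prepend the system prompt (when set) before the history and the new user input.
function buildMessages(history: ChatMessage[], userInput: string, systemPrompt?: string): ChatMessage[] {
  const messages: ChatMessage[] = [];
  if (systemPrompt && systemPrompt.trim().length > 0) {
    messages.push({ role: 'system', content: systemPrompt });
  }
  return [...messages, ...history, { role: 'user', content: userInput }];
}
```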
Contributor

The systemPrompt is not something we can/should change once the conversation is started, is it?

The conversation history should not be changed: editing it while there are already existing messages would lead to incoherence.

Looking at how ChatGPT and similar tools work, the system prompt is always defined at the beginning of the conversation, in the same way that the model id is defined at conversation creation.

The systemPrompt should either be provided in the creation request and added to the conversation, or have its own endpoint to change it; once the conversation is started, you can no longer edit it (the textarea would be disabled).

We could imagine something like:

```ts
setConversationSystemPrompt(conversationId: string, content: string): void {
  const conversation = this.conversations.get(conversationId);
  // ensure the conversation exists
  if (conversation === undefined) throw new Error(`conversation ${conversationId} does not exist`);
  // refuse to change the system prompt once the conversation has started
  if (conversation.length > 1) throw new Error('cannot change the system prompt of a started conversation');
  // set the system prompt value
}
```

@feloy feloy (Contributor, Author) Mar 20, 2024

It seems to me that this is not what @slemeur defined in the spec (where the system prompt is one of the settings):

> How does it work?
> - The settings are applied to the NEXT call to the model to be made and will be sent in the payload made to the model.
> - That means that, each time we do a call to the model, we read the state of the settings, get the values and append that in the API call.
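As an illustration of that behaviour, a minimal sketch (PlaygroundSettings and snapshotSettings are hypothetical names, not code from this PR): the settings state is read again at submit time, so an edit made between two prompts only affects the next call.

```ts
interface PlaygroundSettings {
  systemPrompt?: string;
  temperature?: number;
}

// Hypothetical sketch: copy the settings values at submit time, so later edits
// in the UI only apply to the NEXT call to the model.
function snapshotSettings(current: PlaygroundSettings): PlaygroundSettings {
  return { ...current };
}

const settings: PlaygroundSettings = { systemPrompt: 'reply in french', temperature: 0.7 };
const forThisCall = snapshotSettings(settings);
settings.systemPrompt = 'reply in italian'; // picked up by the next call only
console.log(forThisCall.systemPrompt);      // 'reply in french'
```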

Contributor

IMO the quoted statement does not reflect how a system prompt works in an LLM conversation @slemeur

Contributor

The goal is to let the user iterate with the different "settings" for how the prompt is configured.

  • Say the user tries one set of settings.
  • They get the answer and want to see whether adjusting the settings makes the answer any better.
    Hence the need to apply the settings only to the next prompt sent to the model, so the user can keep tuning.

@axel7083 axel7083 (Contributor) Mar 20, 2024

I think there is a misunderstanding: changing a setting should only apply to the next messages, but the system prompt is something the model receives at the start of the conversation. It is not a setting.

If you change the system prompt, you are changing the first message (the system prompt is simply a message with the role system).

In ChatGPT, when you change a message in a conversation, all the messages below it are deleted:

chagtp.mp4

You cannot expect to be able to change the system prompt in an existing conversation, since the system prompt is the first message of the conversation. Otherwise you should delete all the messages already sent.

I am not saying that a chatbot is deterministic, but it is at least coherent: if you have an existing conversation, you change the system prompt, and then you send another message, the existing conversation is based on a message that is not there anymore. It does not make sense.
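To make the point concrete, a minimal sketch of the message structure being described (illustrative only; the content strings are made up):

```ts
type Role = 'system' | 'user' | 'assistant';
interface Message { role: Role; content: string }

// The "system prompt" is simply the first message of the conversation, with the role 'system'.
const conversation: Message[] = [
  { role: 'system', content: 'Reply in french.' },
  { role: 'user', content: 'Hello' },
  { role: 'assistant', content: 'Bonjour !' },
];

// Editing conversation[0] after the fact means the assistant replies above were produced
// from a system message that no longer exists, hence the suggestion to either lock the
// field once the conversation has started, or delete everything below the edited message.
```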

image

What I am describing is how all ChatGPT-like tools work; here is the explanation from the ChatGPT website.

Contributor Author

In this conversation, I set the system prompt to 'reply in french' at the beginning, then changed it to 'reply in italian' before sending the second prompt. The model replies as expected.

reply-french-italian

@feloy feloy marked this pull request as ready for review March 20, 2024 09:27
@jeffmaury (Contributor)

I can't make it work; I have timeout errors even without the system prompt.

@feloy feloy (Contributor, Author) commented Mar 20, 2024

> I can't make it work; I have timeout errors even without the system prompt.

OK, I'll look at this timeout problem

@jeffmaury jeffmaury (Contributor) left a comment

LGTM

@feloy feloy force-pushed the feat-525/playground-parameters branch from 81e08f2 to 2d6e1a5 on March 20, 2024 13:35
@jeffmaury (Contributor)

> I can't make it work; I have timeout errors even without the system prompt.
>
> OK, I'll look at this timeout problem

I think you can forget about it, as it seems it was related to the health of my Podman machine. Restarting it made it work again.

Signed-off-by: Philippe Martin <phmartin@redhat.com>
@feloy feloy force-pushed the feat-525/playground-parameters branch from 3098934 to 1c41030 on March 20, 2024 13:41
@feloy feloy (Contributor, Author) commented Mar 20, 2024

> I can't make it work; I have timeout errors even without the system prompt.
>
> OK, I'll look at this timeout problem
>
> I think you can forget about it, as it seems it was related to the health of my Podman machine. Restarting it made it work again.

I also get a timeout sometimes (and so does @axel7083). I looked at the durations of the different calls, and completions.create takes some time (generally 2s on my system, but up to 45s during my tests).

The latest commit on this PR should fix the problem.
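For reference, one plausible shape for such a fix, assuming the openai Node client is used to reach the local inference server (this is a hypothetical sketch only; the actual commit may differ): raise the request timeout so a slow completions.create call is not reported as an error.

```ts
import OpenAI from 'openai';

// Hypothetical sketch only: allow slow completions (observed up to ~45s) instead of timing out.
const client = new OpenAI({
  baseURL: 'http://localhost:8080/v1', // assumed local inference endpoint
  apiKey: 'sk-dummy',                  // local servers typically ignore the key
  timeout: 120_000,                    // request timeout in milliseconds
});
```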

@feloy feloy merged commit 05dbb39 into containers:main Mar 20, 2024
4 checks passed
Successfully merging this pull request may close these issues.

Add "System Prompt" into the configuration options in the playground
4 participants