
Several questions. What's the difference between "tprompt" and "prompt" in config.yaml? #219

Open
sungh66 opened this issue Aug 2, 2023 · 1 comment

Comments

@sungh66

sungh66 commented Aug 2, 2023

I want to add some custom operations, such as local file read/write modules. Is this possible? I ask because I think AutoGPT is too flexible.
Also, where in the code are the steps planned by the LLM matched against the model descriptions from Hugging Face?
What's the difference between "tprompt" and "prompt" in config.yaml?

@spyd3rweb

@sungh66

What's the difference between "tprompt" and "prompt" in config.yaml?

From the config.yaml, the "tprompt", in combination with the few-shot examples in "demos_or_presteps", is used as part of the system message/prompt to "guide an AI system's behavior and improve system performance" [1]. The "prompt" is used to wrap the user's chat as part of the user message/prompt and adds instructions/requests specific to the stage.

References:
[1] https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/system-message
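To make the distinction concrete, here is a minimal sketch of how those config fields could be assembled into a chat-completion message list. The field names ("tprompt", "demos_or_presteps", "prompt") come from the config.yaml discussed above; the assembly logic and the example values are assumptions for illustration, not the repo's actual implementation.

```python
# Hypothetical sketch: "tprompt" + few-shot demos become the system-side
# priming, while "prompt" wraps the user's input into the user message.

def build_messages(config: dict, user_input: str) -> list[dict]:
    # "tprompt" guides the AI system's behavior -> system message.
    messages = [{"role": "system", "content": config["tprompt"]}]
    # "demos_or_presteps" are few-shot examples prepended before the
    # real user turn (assumed to be stored as role/content dicts here).
    messages.extend(config.get("demos_or_presteps", []))
    # "prompt" wraps the user's chat and adds stage-specific instructions.
    messages.append({
        "role": "user",
        "content": config["prompt"].format(input=user_input),
    })
    return messages

# Example values (made up for illustration):
config = {
    "tprompt": "You are an AI assistant that plans tasks from user requests.",
    "demos_or_presteps": [
        {"role": "user", "content": "example request"},
        {"role": "assistant", "content": "example task plan"},
    ],
    "prompt": "Parse the following request into tasks: {input}",
}
msgs = build_messages(config, "describe this image")
```

Under this sketch, the system message and demos stay fixed per stage, while only the final user message changes with each chat turn.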
