We have a framework that generates prompts on the fly (https://github.com/microsoft/genaiscript), which means the prompts are built at runtime and then sent to the LLM. (This works great with the custom JavaScript provider model, and we were able to integrate nicely with promptfoo.)
This is problematic for the prompt source, because it always ends up being the JavaScript source instead of the generated prompt, which disables a number of assertions that rely on the input.
Would there be a way to also give the API provider the opportunity to return the prompt it actually used as part of the response? With a couple of pointers, I would be happy to produce a PR.
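To make the request concrete, here is a minimal sketch of what such a custom provider might look like. The `id()`/`callApi()` shape follows promptfoo's custom JavaScript provider interface; the `prompt` field on the returned object is the *proposed* extension from this issue, not an existing promptfoo API, and the prompt-generation and model-call logic are hypothetical stand-ins for what genaiscript would do.

```javascript
// Sketch: a custom promptfoo provider that builds its prompt at call
// time (as genaiscript does) and reports the generated prompt back in
// the response so prompt-based assertions can run against it.
class GeneratedPromptProvider {
  id() {
    return 'generated-prompt-provider';
  }

  // `rawPrompt` is the JavaScript source that promptfoo read from disk;
  // this provider ignores it and builds the real prompt dynamically.
  async callApi(rawPrompt, context) {
    // Stand-in for genaiscript's on-the-fly prompt generation.
    const generatedPrompt = `Summarize: ${context.vars.topic}`;

    const output = await this.callModel(generatedPrompt);

    return {
      output,
      // Proposed field: the prompt actually sent to the LLM, so that
      // assertions relying on the input see this instead of the
      // JavaScript source.
      prompt: generatedPrompt,
    };
  }

  // Stub standing in for the real LLM call.
  async callModel(prompt) {
    return `(model response to: ${prompt})`;
  }
}

module.exports = GeneratedPromptProvider;
```

With a hook like this, promptfoo could substitute the returned `prompt` for the file-based one before evaluating assertions, leaving providers that don't return a prompt unaffected.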