Langchain Prompt Format #198
Comments
Hi @0xDigest! Thank you for bringing this issue to our attention. Upon verification, it has been confirmed that the issue is reproducible. We are currently investigating the behavior of specifying the prompt as a list. Additionally, does langchain really send multi-prompt requests in practice?
Thanks for the quick reply. Honestly, I've just started with langchain, and I haven't seen any instances of multiple values in the prompt list yet. I went ahead and patched parse_options to add list to the dtypes tuple, and changed the returned value to return the first value if it's a list. So far, everything is working well. In case I'm not being clear:
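A rough sketch of the patch described above, assuming a simplified `parse_options` (this is illustrative, not Basaran's actual implementation): accept `list` alongside the scalar dtypes and unwrap to the first element.

```python
# Hypothetical sketch of the workaround, not Basaran's actual parse_options:
# a list-valued option is unwrapped to its first element before type checking.
def parse_options(schema, payload):
    """Keep payload values whose types match the schema, unwrapping lists."""
    options = {}
    for key, dtypes in schema.items():
        if key not in payload:
            continue
        value = payload[key]
        if isinstance(value, list) and value:
            value = value[0]  # workaround: only the first prompt is used
        if isinstance(value, dtypes):
            options[key] = value
    return options

schema = {"prompt": (str,), "max_tokens": (int,)}
print(parse_options(schema, {"prompt": ["Hello"], "max_tokens": 16}))
# -> {'prompt': 'Hello', 'max_tokens': 16}
```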
We've added a temporary workaround in v0.18.1: currently only the first prompt in the list will be used. Complete support for multi-prompt requests will be added in upcoming versions.
Thanks for the quick turnaround. I've updated and can confirm this works as expected.
Example responses for some multi-prompt requests: multi-prompts.txt |
I have been working to integrate langchain with basaran and I am encountering an issue that I believe has to do with the prompt format. It seems that when langchain is posting to basaran, the prompt is a list and not a string. For example:
returns
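To make the shape concrete, here is a sketch of the kind of body langchain's OpenAI wrapper posts (the prompt text and `max_tokens` value are illustrative): because completions are batched, `prompt` arrives as a JSON array rather than a string.

```python
import json

# Illustrative only: langchain batches completion requests, so the POST body
# carries `prompt` as a JSON array of strings, not a single string.
body = {"prompt": ["Say this is a test"], "max_tokens": 16}
print(json.dumps(body))
# -> {"prompt": ["Say this is a test"], "max_tokens": 16}
```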
It sees the prompt as a single token and doesn't return anything. I am able to replicate the issue by changing the example to use a list. The model seems to take the single (empty?) token and generate text. For instance:
returns
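A minimal sketch of the suspected failure mode, assuming a str-only type check (`get_prompt` is a hypothetical stand-in, not Basaran's actual code): the list-valued prompt is silently rejected, so generation starts from an empty prompt, which would explain the model generating from a single empty token.

```python
# Hypothetical stand-in for the server's prompt handling: a str-only type
# check silently drops a list-valued prompt, leaving an empty prompt.
def get_prompt(payload, dtypes=(str,)):
    value = payload.get("prompt", "")
    return value if isinstance(value, dtypes) else ""

print(get_prompt({"prompt": "hello"}))    # -> "hello" (string prompt passes)
print(get_prompt({"prompt": ["hello"]}))  # -> "" (list dropped, prompt empty)
```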
Would you consider this a langchain issue, given that the OpenAI API supports this call, or am I missing something in my basaran setup?