Yes, log probabilities can be retrieved through llm.achat by setting logprobs=True and top_logprobs=20 when configuring the LLM. This requests the log probabilities of the top 20 candidate tokens at each generated position; after the request completes, response.logprobs contains them.
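As a rough sketch of what the returned payload looks like, here is how the per-token log probabilities can be read out. This uses a mocked response dict shaped like the chat-completions logprobs structure in the OpenAI API reference linked in the question (no live API call is made, and the helper name extract_top_logprobs is illustrative, not part of any library):

```python
# Sketch: reading top-k token log probabilities from a chat-completions
# response. The field names (choices, logprobs, content, top_logprobs)
# follow the OpenAI API reference; the response below is mocked.

def extract_top_logprobs(response: dict) -> list[dict[str, float]]:
    """For each generated token, map candidate token -> log probability."""
    content = response["choices"][0]["logprobs"]["content"]
    return [
        {cand["token"]: cand["logprob"] for cand in tok["top_logprobs"]}
        for tok in content
    ]

# Mocked fragment of a response produced by a request sent with
# logprobs=True and top_logprobs=2 (top_logprobs=20 works the same way,
# just with 20 candidates per position).
mock_response = {
    "choices": [{
        "logprobs": {
            "content": [
                {"token": "Hello", "logprob": -0.01,
                 "top_logprobs": [
                     {"token": "Hello", "logprob": -0.01},
                     {"token": "Hi", "logprob": -4.2},
                 ]},
            ]
        }
    }]
}

print(extract_top_logprobs(mock_response))
# → [{'Hello': -0.01, 'Hi': -4.2}]
```

Note that a log probability of 0.0 corresponds to probability 1.0, and more negative values mean less likely tokens.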
Question
I wonder whether log probabilities can be retrieved through llm.achat (see https://platform.openai.com/docs/api-reference/chat#chat-create-logprobs for logprobs in the OpenAI API). If so, is there example code?