This repository has been archived by the owner on Jan 24, 2024. It is now read-only.

feat(model): return logprobs for prompt (also for max_tokens=0) #84

Open · wants to merge 3 commits into master
Conversation

nicpopovic

Hi,

Cool project!

OpenAI's API lets you get logprobs for a prompt without generating any text (max_tokens=0, logprobs=1).
So far, your API returns zero as the logprob for input tokens.

This PR is a quick implementation of this feature (not particularly elegant, but maybe you'll find it useful).
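For context, a minimal sketch of the idea behind prompt logprobs (toy logits and function names are illustrative, not Basaran's actual code): the model is run over the full prompt, the logits at each position are log-softmaxed, and token i is scored by the distribution produced at position i-1. The first token has no preceding context, so it gets no logprob.

```python
import math

def log_softmax(logits):
    """Numerically stable log-softmax over a list of logits."""
    m = max(logits)
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - lse for x in logits]

def prompt_logprobs(token_ids, logits_per_position):
    """Logprob of each prompt token under the model.

    logits_per_position[i] holds the (toy) logits the model produced
    after reading tokens 0..i, so token i+1 is scored by position i.
    The first token has no preceding context, hence None.
    """
    out = [None]
    for i in range(1, len(token_ids)):
        lp = log_softmax(logits_per_position[i - 1])
        out.append(lp[token_ids[i]])
    return out

# Toy example with a 4-token vocabulary and a 3-token prompt:
logits = [[2.0, 0.0, 0.0, 0.0],   # distribution after token 0
          [0.0, 3.0, 0.0, 0.0]]   # distribution after token 1
print(prompt_logprobs([0, 1, 1], logits))
```

In a real implementation the logits come from a single forward pass over the prompt, so no extra generation step is needed when max_tokens=0.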

Cheers,
Nicholas

@peakji
Member

peakji commented Mar 23, 2023

Thanks, @nicpopovic ! Please give us some time to test and review the changes. 😉

@nicpopovic
Author

Whoops, looks like I didn't test with encoder_decoder models, sorry about that...

Tests are passing for me now.

@peakji peakji requested a review from fardeon March 23, 2023 15:46
@peakji
Member

peakji commented Mar 28, 2023

@nicpopovic Sorry for the late reply. We carefully reviewed your proposed changes and fully understand the purpose.

However, merging this pull request as-is poses some difficulties: it moves the echo=True logic from __call__ into generate. User feedback we have collected asks for stop sequences, presence_penalty, and frequency_penalty, just like in the OpenAI API. Clearly, these features should only take effect on completion tokens, not prompt tokens. We therefore want to keep the generate method more independent so that we can add new features and further improve compatibility with the OpenAI API.
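The distinction matters because, under OpenAI's documented semantics, presence_penalty and frequency_penalty adjust logits based only on tokens that have already been generated, never on prompt tokens. A minimal sketch of that rule (function and parameter names are hypothetical, not Basaran's actual code):

```python
def apply_penalties(logits, completion_token_ids,
                    presence_penalty=0.0, frequency_penalty=0.0):
    """Penalize logits of tokens already present in the completion.

    Following OpenAI's documented semantics: a token's logit is reduced
    by presence_penalty once if it has appeared at all, plus
    frequency_penalty times its occurrence count. Prompt tokens are
    deliberately excluded from the counts.
    """
    counts = {}
    for t in completion_token_ids:
        counts[t] = counts.get(t, 0) + 1
    adjusted = list(logits)
    for t, n in counts.items():
        adjusted[t] -= presence_penalty + frequency_penalty * n
    return adjusted
```

Keeping generate free of prompt-echo logic means this kind of adjustment can be applied inside the sampling loop without accidentally penalizing prompt tokens.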

We are planning to refactor StreamModel or generate. Can we keep this pull request open for now and figure out a way to merge it later?

@peakji peakji added the enhancement New feature or request label Mar 28, 2023
@peakji peakji changed the title feat(model): Return logprobs for prompt (also for max_tokens=0) feat(model): return logprobs for prompt (also for max_tokens=0) Apr 19, 2023
Labels: enhancement (New feature or request)

2 participants