
calculation of the perplexity score #5464

Open

deven367 opened this issue Mar 18, 2024 · 0 comments

deven367 commented Mar 18, 2024

❓ Questions and Help

Before asking:

  1. search the issues. → couldn't find an answer
  2. search the docs. → couldn't find an answer

What is your question?

Why is the perplexity score calculated as (2**avg_nll_loss) instead of the usual exp(avg_nll_loss)?

"perplexity": 2**avg_nll_loss,

Was this a deliberate choice made by the fairseq team, or is there some other reason behind it?

cc @b-dickson @zorant
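
For reference, the two formulas coincide whenever the base of the exponential matches the base of the logarithm used to accumulate the loss: if avg_nll_loss is averaged in bits (base-2 log) rather than nats, then 2**avg_nll_loss equals exp of the same loss measured in nats. A minimal numeric sketch (the per-token NLL values below are made up purely for illustration):

```python
import math

# Hypothetical per-token negative log-likelihoods, in nats (natural log).
nll_nats = [2.1, 1.7, 2.4]
avg_nll_nats = sum(nll_nats) / len(nll_nats)

# Standard definition: perplexity = exp(average NLL measured in nats).
ppl_from_nats = math.exp(avg_nll_nats)

# The same averaged loss expressed in bits (base-2 logarithm).
avg_nll_bits = avg_nll_nats / math.log(2)

# A base-2 exponential of the bits-based loss recovers the identical perplexity.
ppl_from_bits = 2 ** avg_nll_bits

assert math.isclose(ppl_from_nats, ppl_from_bits)
print(ppl_from_nats, ppl_from_bits)
```

So 2**avg_nll_loss only disagrees with exp(avg_nll_loss) if the averaged loss is in nats; whether avg_nll_loss is already base-2 at that point in the code is exactly what this question hinges on.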

What's your environment?

  • fairseq Version (e.g., 1.0 or main): 0.12.2
  • PyTorch Version (e.g., 1.0)
  • OS (e.g., Linux):
  • How you installed fairseq (pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information: