
[Bug]: unable to generate key for LLM inference endpoint #9203

Open
dusvyat opened this issue May 13, 2024 · 0 comments
Labels
bug Something isn't working

Comments

@dusvyat
Contributor

dusvyat commented May 13, 2024

Short description of current behavior

API key generation is hanging.

Video or screenshots

https://www.loom.com/share/dec3ff3896b1456c969e66c92db325fc

Expected behavior

No response

How to reproduce the error

No response

Anything else?

No response

@dusvyat added the bug label May 13, 2024