
Out of context answers #453

Open
DravidVaishnav opened this issue Mar 15, 2024 · 0 comments
Comments

@DravidVaishnav

[Screenshot attached: 2024-03-15 19-03-23]
I tried using a llama model (6.9 GB) via the CLI as well as from code, but it generates out-of-context answers, and often even outputs questions instead of answers. Is there any way to solve this?
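A common cause of this symptom (off-topic answers, the model generating new questions) is feeding the raw question to an instruct/chat model without its expected prompt template and without stop sequences, so the model just continues the text. A minimal sketch of one possible fix, assuming a Llama-2-chat style model (the `build_prompt` helper and the system message are hypothetical, not from this project):

```python
# Hypothetical sketch: wrap the user question in the Llama-2 chat
# template so an instruct-tuned model answers instead of continuing text.

def build_prompt(question: str,
                 system: str = "Answer concisely using only the given context.") -> str:
    """Wrap a question in the Llama-2 [INST] chat template (assumed format)."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{question} [/INST]"

# Stop sequences keep generation from running past the answer and
# echoing or inventing new questions.
STOP_SEQUENCES = ["</s>", "[INST]"]

prompt = build_prompt("What does the uploaded document say about billing?")
```

With a runner such as llama-cpp-python, the prompt and stop sequences would then be passed to the model call (e.g. its `stop=` parameter); the exact template depends on which llama variant the 6.9 GB file actually is, so check the model card before adopting this format.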
