I downloaded codellama-7B and set up Continue's config.json like this:

```json
{
  "title": "LocalServer",
  "provider": "openai",
  "model": "codellama-7b-Instruct",
  "apiBase": "http://localhost:8000/v1/"
}
```

Then I ran llamacpp_mock_api.py. CodeLlama runs correctly on my machine: the server receives the POST JSON from Continue and generates the LLM output correctly. But when I return the JSON, Continue can't recognize the format and shows nothing. Where is the JSON format that Continue expects defined? I see the code adds "onesix" to the front of the JSON, but I can't find a JSON format definition in Continue's docs. Is it possible that the Continue plugin updated the format? The current JSON-generating code is:

```python
"onesix" + jsonify({"choices": [{"delta": {"role": "assistant", "content": response}}]}).get_data(as_text=True)
```

How can I generate a JSON response that Continue will display?
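For comparison, here is a minimal sketch of what a standard OpenAI-compatible streaming endpoint (which Continue's `openai` provider is presumably built against) emits: Server-Sent Events lines prefixed with `data: ` rather than an arbitrary string, each carrying a `chat.completion.chunk` object, followed by a `data: [DONE]` terminator. The field values below (`id`, model name) are placeholders for a mock server, not taken from Continue's code:

```python
import json

def sse_chunk(content: str, model: str = "codellama-7b-Instruct") -> str:
    """Format one streaming chunk the way an OpenAI-compatible
    /v1/chat/completions endpoint does: a Server-Sent Events line
    prefixed with "data: " and terminated by a blank line."""
    payload = {
        "id": "chatcmpl-mock",  # any stable id is fine for a mock server
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [
            {
                "index": 0,
                "delta": {"role": "assistant", "content": content},
                "finish_reason": None,
            }
        ],
    }
    return "data: " + json.dumps(payload) + "\n\n"

def sse_done() -> str:
    """Stream terminator that OpenAI-compatible APIs send after the last chunk."""
    return "data: [DONE]\n\n"
```

In Flask, these strings could be yielded from a generator passed to `Response(generate(), mimetype="text/event-stream")` instead of using `jsonify` with a string prefix.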