Commit 01dfccb

Merge pull request #11 from haseeb-heaven/bug/local-model-env-check
Update offline model configuration and skip client initialization for…
haseeb-heaven committed Mar 8, 2024
2 parents b7d42af + f2bdaae commit 01dfccb
Showing 2 changed files with 7 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md

@@ -108,8 +108,8 @@ This Interpreter supports offline models via **LM Studio** so to download it fro
 - Download any model from **LM Studio** like _Phi-2,Code-Llama,Mistral_.
 - Then in the app go to **Local Server** option and select the model.
 - Start the server and copy the **URL**.
-- Open config file `configs/offline-model.config` and paste the **URL** in the `api_base` field.
-- Now you can use the model with the interpreter set the model name to `offline-model` and run the interpreter.</br>
+- Open config file `configs/local-model.config` and paste the **URL** in the `api_base` field.
+- Now you can use the model with the interpreter set the model name to `local-model` and run the interpreter.</br>

 4. Run the interpreter with Python:</br>
 ### Running with Python.
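For reference, a minimal sketch of what `configs/local-model.config` might contain after these steps. Only the `api_base` field is named in this diff; the key=value layout, the port, and everything else are assumptions for illustration (LM Studio's local server typically listens on `http://localhost:1234/v1`).

```
# Hypothetical contents of configs/local-model.config -- format assumed,
# only api_base is confirmed by the diff. Paste the URL copied from
# LM Studio's Local Server tab here:
api_base = http://localhost:1234/v1
```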
5 changes: 5 additions & 0 deletions libs/interpreter_lib.py

@@ -103,6 +103,11 @@ def initialize_client(self):
         self.INTERPRETER_MODEL = str(self.config_values.get('HF_MODEL', self.INTERPRETER_MODEL))
         hf_model_name = self.INTERPRETER_MODEL.strip().split("/")[-1]

+        # skip init client for local models.(Bug#10 https://github.com/haseeb-heaven/code-interpreter/issues/10)
+        if 'local' in self.INTERPRETER_MODEL:
+            self.logger.info(f"Skipping client initialization for local model.")
+            return
+
         self.logger.info(f"Using model {hf_model_name}")

         model_api_keys = {
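To make the fix concrete, here is a minimal standalone sketch of the guard's behavior. The real method lives on a class with `self.INTERPRETER_MODEL` and `self.logger`; this flattened version is an illustration under those assumptions, not the repository's actual code.

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("interpreter")

def initialize_client(interpreter_model: str) -> None:
    """Flattened illustration of initialize_client after this commit."""
    hf_model_name = interpreter_model.strip().split("/")[-1]

    # Local models are served by LM Studio's local server, so there is no
    # hosted API client to set up (and no API key to look up): return early.
    if 'local' in interpreter_model:
        logger.info("Skipping client initialization for local model.")
        return

    logger.info(f"Using model {hf_model_name}")
    # ... hosted-model client setup (API key lookup, etc.) continues here ...

initialize_client("local-model")    # -> "Skipping client initialization for local model."
initialize_client("gpt-3.5-turbo")  # -> "Using model gpt-3.5-turbo"
```

Because the guard is a substring check, any model name containing `local` (such as the `local-model` name the updated README tells users to set) takes the early-return path and never reaches the API-key lookup that Bug #10 reported.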
