Adding the 'Computer' destroyed Open Interpreter, which was the best product I used #1255
Comments
Instructions for setting up a Modelfile with Ollama: you can address some of the current system issues by creating and using a custom model file. Here's a simple example of how to set one up with Ollama; run these commands in your terminal:

```shell
# Pull your chosen base model from Ollama
ollama pull <base_model_name>

# Create a Modelfile specifying the base model and a custom system message
echo "FROM <base_model_name>" > Modelfile
echo "SYSTEM You are a friendly assistant. You embody the characteristics of a helpful, knowledgeable assistant with a strong emphasis on user interaction and problem-solving." >> Modelfile

# Create the new model from the Modelfile
ollama create <your_model_name> -f Modelfile

# Push the new model to your repository
ollama push <your_model_name>
```

Customization instructions: this approach helps bypass issues related to system-message length and dependency errors by letting you configure a simplified, efficient setup tailored to your needs. Running these commands creates a more manageable environment, potentially reducing operational costs and improving performance stability.

Additional tips: by following these steps, you can create a more streamlined and effective model setup that addresses specific bugs and improves overall system responsiveness.
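The generated Modelfile can also set decoding parameters via `PARAMETER` directives, which is one way to tame overly long or unstable outputs. A minimal sketch (the parameter values below are illustrative choices, not part of the original suggestion):

```
FROM <base_model_name>
SYSTEM You are a friendly, concise assistant.
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
```

`temperature` controls sampling randomness and `num_ctx` sets the context-window size; both are standard Ollama Modelfile parameters.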
I see now the working format is `:parameter:`; that's good to know, and the size of the prompt looks totally rational. I wish the base came with an extra example on top of the fast, empty, OS, and way-too-long ones it comes with.
Honestly, Ollama is the only provider I'm able to use with a CLI call, using `-ab ip:11434 -o -m=ollama/model`.
When called from Python, for example when driven by CrewAI, it works much better than the way I use it from the CLI. Maybe the solution, instead of the aliases I have for every model in the CLI, is to have Python code for each, although if I have Python code for each, then instead of me distributing it we should merge it into the base model, as initially suggested.
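A stdlib-only sketch of what such a per-model Python helper could look like, assuming a local Ollama server on its default port 11434 (the function names and payload defaults here are illustrative, not Open Interpreter's or CrewAI's API):

```python
import json
import urllib.request

# Default endpoint of a locally running `ollama serve` instance (assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama server and return its reply.

    Requires `ollama serve` to be running; not executed in this sketch.
    """
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (with a server running):
# reply = ask_ollama("<your_model_name>", "Hello!")
```

Each shell alias could then collapse into one call like `ask_ollama("<your_model_name>", prompt)`, keeping model selection in one place instead of scattered across CLI aliases.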
I wish the code were as strict as the rules :))
Absolutely, I resonate with your points on CLI usability across different models. I primarily use oh-my-zsh and spaceship for Ollama models, which simplifies things, but I've also noticed inconsistencies when trying to apply similar CLI formats with Cohere or Hugging Face. It often feels like, instead of leveraging the interpreter as an extension of my capabilities, I'm stuck configuring endlessly.

I completely agree that having a uniform syntax for model interaction could vastly improve usability. It would reduce the learning curve and make tool integration more seamless across different platforms. Regarding config files, they sometimes help but often add another layer of complexity. Simplifying CLI interaction to make it as intuitive as using Python scripts directly might be a better approach for consistency and efficiency.

Perhaps advocating for integrating these scripts into the base model, as you suggested, could be a step towards standardizing model interactions. I wish the CLI interactions were as strict and standardized as coding best practices, making our work much more straightforward!
Describe the bug
Reproduce
Expected behavior
Screenshots
No response
Open Interpreter version
0.2.0 and above
Python version
3.11
Operating System name and version
WSL2 on Windows 10
Additional context
I can gladly help; I dug very deep into all the core files to try to fix these issues.
Thank you.
@6rz6