OllamaChat: a user-friendly GUI for chatting with the llama2 and llama2-uncensored AI models, hosted locally with Python and KivyMD. Requires Ollama to be installed (e.g. Ollama for Windows). For more, visit Ollama on GitHub.

inno-waluza/OllamaChat

Ollama Chatbot

Ollama Chatbot is a conversational agent powered by AI that allows users to interact with an AI assistant through either a graphical user interface (GUI) or a console interface.

Features

  • Graphical User Interface (GUI): Provides a user-friendly interface for interacting with the AI assistant.
  • Console Interface: Allows interaction with the AI assistant directly from the command line.

AI Models

The AI models used in this chatbot are hosted and served locally through Ollama. The available models are:

  • Llama2: A pre-trained AI model for conversation.
  • Llama2-Uncensored: A variant of the Llama2 model without content filtering.

These models can be downloaded and run locally on any machine powerful enough to host them.
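Assuming Ollama itself is already installed, the two models above can be fetched and sanity-checked from the command line (`ollama pull` and `ollama run` are standard Ollama CLI commands; model availability depends on the Ollama registry):

```shell
# Download the models this chatbot uses
ollama pull llama2
ollama pull llama2-uncensored

# Optional: chat with a model directly in the terminal to verify it works
ollama run llama2
```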

Requirements

  • Python 3.x
  • Required Python libraries:
    • Kivy
    • KivyMD
    • Requests
    • pyttsx3

Installation

  1. Clone the repository:

    git clone https://github.com/innowaluza/ollama-chatbot.git
  2. Install the required Python libraries:

    pip install -r requirements.txt

Usage

Graphical User Interface (GUI)

  1. Run the Ollamachat_GUI.py script:

    python Ollamachat_GUI.py
  2. Type your message in the input field and press "Send" to receive responses from the AI assistant.

Console Interface

  1. Run the Ollamachat_console.py script:

    python Ollamachat_console.py
  2. Type your message in the console and press Enter to send it to the AI assistant. Type "exit" to end the conversation.
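For readers curious how the console flow maps onto Ollama under the hood, here is a minimal sketch using the Requests library from the requirements list. It targets Ollama's documented local HTTP endpoint (`/api/generate` on port 11434); the helper names `build_payload` and `ask` are illustrative, not functions from this repository:

```python
import requests

# Ollama's default local endpoint for single-turn text generation
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for one non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send one prompt to the local Ollama server and return the reply text."""
    resp = requests.post(OLLAMA_URL, json=build_payload(model, prompt), timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # Console loop mirroring the steps above: Enter sends, "exit" quits.
    while True:
        user_input = input("You: ")
        if user_input.strip().lower() == "exit":
            break
        print("AI:", ask("llama2", user_input))
```

With `stream` set to `True` instead, Ollama returns the reply as a sequence of JSON lines, which is how a GUI can display the answer as it is generated.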

Contributors

License

This project is licensed under the MIT License.
