
julep

Start your project with conversation history, support for any LLM, agentic workflows, integrations & more.


Explore the docs »

Report Bug · Request Feature · Join Our Discord · X · LinkedIn



Why Julep?

We've built a lot of AI apps and understand how difficult it is to evaluate hundreds of tools, techniques, and models, and then make them work well together.

The Problem

Even for simple apps you have to:

  • pick the right language model for your use case
  • pick the right framework
  • pick the right embedding model
  • choose the vector store and RAG pipeline
  • build integrations
  • tweak all of the parameters (temp, penalty, max tokens, similarity thresholds, chunk size, and so on)
  • write and iterate on prompts for them to work

The Solution: Julep reduces the effort and time it takes to get any AI app up and running.

  • Statefulness By Design: Build AI apps without writing code to embed, save, and retrieve conversation history. Julep manages the context window using CozoDB, a transactional, relational-graph-vector database.
  • Use and switch between any LLMs anytime: Swap between LLM providers and models, self-hosted or otherwise, by changing a single line of code (see the sketch after this list).
  • Automatic Function Calling: No need to handle function calling manually. Julep calls the function, parses the response, retries on failure, and passes the result back into the context.
  • Production-ready: Julep comes ready to be deployed to production using Docker Compose. Support for k8s coming soon!
  • 90+ tools built-in: Connect your AI app to 150+ third-party applications natively using Composio.
  • *GitHub Actions-like workflows for tasks: Define agentic workflows that execute asynchronously without worrying about timeouts.

(*) Features coming soon!
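
For example, switching providers comes down to changing the model argument when you create an agent. A minimal sketch (client is the Julep client set up in the Quickstart below, and the model identifiers are illustrative; use whatever your providers expose):

# The same agent definition pointed at two different models.
# Only the `model` string changes; the rest of the app stays the same.
agent = client.agents.create(
    name="Jessica",
    model="gpt-4",            # hosted OpenAI model
    tools=[],
)

# Swap to another provider or a self-hosted model by editing one line:
agent = client.agents.create(
    name="Jessica",
    model="claude-3-opus",    # illustrative identifier for a different provider
    tools=[],
)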

Quickstart

Option 1: Install and run Julep locally

  • Download the docker-compose.yml file along with the .env file to configure and run the Julep platform locally:
# Add the docker compose to your project dir
wget https://raw.githubusercontent.com/julep-ai/julep/dev/deploy/docker-compose.yml
# Add the .env file to your project dir
wget https://raw.githubusercontent.com/julep-ai/julep/dev/deploy/.env.example -O .env
# Pull the latest images
docker compose pull
# Start the services (in detached mode)
docker compose up -d
  • The API will now be available at: http://0.0.0.0:8080

  • Next, add your OpenAI API key to the .env file

  • Set your environment variables

export JULEP_API_KEY=myauthkey
export JULEP_API_URL=http://0.0.0.0:8080

Option 2: Use the Julep Cloud

export JULEP_API_KEY=your_julep_api_key
export JULEP_API_URL=https://api-alpha.julep.ai

Installation

pip install julep

Setting up the client

from julep import Client
from pprint import pprint
import textwrap
import os

base_url = os.environ.get("JULEP_API_URL")
api_key = os.environ.get("JULEP_API_KEY")

client = Client(api_key=api_key, base_url=base_url)

Create an agent

An agent is the object to which LLM settings, such as the model and temperature, and tools are scoped.

agent = client.agents.create(
    name="Jessica",
    model="gpt-4",
    tools=[]    # Tools defined here
)

Create a user

A user is the object that represents the user of the application.

Memories are formed and saved for each user and many users can talk to one agent.

user = client.users.create(
    name="Anon",
    about="Average nerdy techbro/girl spending 8 hours a day on a laptop",
)

Create a session

A "user" and an "agent" communicate in a "session". System prompt goes here. Conversation history and summary are stored in a "session" which saves the conversation history.

The session paradigm allows for; many users to interact with one agent and allow separation of conversation history and memories.

situation_prompt = """You are Jessica. You're a stuck up Cali teenager. 
You basically complain about everything. You live in Bel-Air, Los Angeles and drag yourself to Curtis High School when you must.
"""
session = client.sessions.create(
    user_id=user.id, agent_id=agent.id, situation=situation_prompt
)

Start a stateful conversation

session.chat controls the communication between the "agent" and the "user".

It has two important arguments:

  • recall: Retrieves the previous conversations and memories.
  • remember: Saves the current conversation turn into the memory store.

To keep the session stateful, both need to be True.

user_msg = "hey. what do u think of starbucks"
response = client.sessions.chat(
    session_id=session.id,
    messages=[
        {
            "role": "user",
            "content": user_msg,
            "name": "Anon",
        }
    ],
    recall=True,
    remember=True,
)

print("\n".join(textwrap.wrap(response.response[0][0].content, width=100)))
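
Because recall and remember are both enabled, a follow-up message in the same session builds on the earlier turns. A minimal sketch of a second turn (the follow-up text is just an example):

follow_up = "ok but be honest, is their coffee actually good?"
response = client.sessions.chat(
    session_id=session.id,   # same session, so earlier turns are recalled
    messages=[
        {
            "role": "user",
            "content": follow_up,
            "name": "Anon",
        }
    ],
    recall=True,
    remember=True,
)

print("\n".join(textwrap.wrap(response.response[0][0].content, width=100)))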

API and SDKs

To use the API directly, or to look at request and response formats, authentication, available endpoints, and more, please refer to the API Documentation.

You can also use the Postman Collection for reference.

Python SDK

To install the Python SDK, run:

pip install julep

For more information on using the Python SDK, please refer to the Python SDK documentation.

TypeScript SDK

To install the TypeScript SDK using npm, run:

npm install @julep/sdk

For more information on using the TypeScript SDK, please refer to the TypeScript SDK documentation.


Examples

You can explore different examples of apps built with Julep in the example app docs.

  1. Simple Conversational Bot
  2. Discord Bot with Long-Term Memory
  3. AI Dungeon Master
  4. Community Feedback Agent

Deployment

Check out the self-hosting guide to host the platform yourself.

If you want to deploy Julep to production, let's hop on a call!

We'll help you customise the platform and get set up with:

  • Multi-tenancy
  • Reverse proxy along with authentication and authorisation
  • Self-hosted LLMs
  • & more

Contributing

We welcome contributions from the community to help improve and expand the Julep AI platform. See CONTRIBUTING.md


License

Julep AI is released under the Apache 2.0 License. By using, contributing to, or distributing the Julep AI platform, you agree to the terms and conditions of this license.


Contact and Support

If you have any questions, need assistance, or want to get in touch with the Julep AI team, please use the following channels:

  • Discord: Join our community forum to discuss ideas, ask questions, and get help from other Julep AI users and the development team.
  • GitHub Issues: For technical issues, bug reports, and feature requests, please open an issue on the Julep AI GitHub repository.
  • Email Support: If you need direct assistance from our support team, send an email to diwank@julep.ai, and we'll get back to you as soon as possible.
  • Follow for updates on X & LinkedIn
  • Hop on a call: We wanna know what you're building and how we can tweak and tune Julep to help you build your next AI app.