Developed using Python-Flask, Next.js, Pinecone, Typescript and Langchain

GilmoreGirls_Augenblick

LLM-Powered Conversational Customer Care Assistant - SahAIta

Welcome to our conversational customer care assistant, a web application powered by large language models (LLMs) with sentiment analysis capabilities. The application provides a seamless interface for customer service interactions: offering accurate product information, answering customer queries, and handling complaints efficiently. The chatbot is integrated with sentiment analysis to keep customer interactions positive and professional.

Use Case

Our application is tailored to providing details about laptops. The conversational assistant helps users with tasks such as looking up product features, pricing, and availability, offering troubleshooting guidance, and addressing complaints.

Features

  • Real-time chat interface for customers to interact with the conversational assistant.
  • Sentiment analysis to gauge and respond to customer emotions effectively.
  • Frictionless user experience with accurate and timely information.
  • Dashboard comparing cases handled by human agents with those resolved by the AI assistant.
  • Retrieval-augmented generation (RAG) for querying the knowledge base of products and services.
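The RAG feature boils down to three steps: embed the query, retrieve the most similar chunks from the vector store, and pass them to the LLM as context. A minimal, dependency-free sketch of that flow, where toy bag-of-words vectors and an in-memory list stand in for the real embedding model and Pinecone (the product texts below are made-up examples, not from the actual knowledge base):

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase word counts. A real deployment would
    # call an embedding model and store the vectors in Pinecone.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical knowledge-base chunks (e.g. extracted from product PDFs).
KNOWLEDGE_BASE = [
    "The AeroBook 14 has 16 GB RAM and a 512 GB SSD.",
    "The AeroBook 14 is priced at 999 USD.",
    "For battery issues, update the firmware and recalibrate.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank chunks by similarity to the query; return the top-k as context.
    q = embed(query)
    ranked = sorted(KNOWLEDGE_BASE, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    # Retrieved chunks are prepended so the LLM answers from the knowledge base.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In the actual stack, LangChain's retriever abstractions wire these steps together against the Pinecone index, so the Flask backend only builds the prompt and calls the LLM.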

Technologies Used

  • Next.js: Next.js is a React framework for building server-side rendered and statically generated applications. It provides an efficient and flexible development experience.
  • TypeScript: TypeScript adds static typing to JavaScript, enhancing code quality and developer productivity.
  • Python and Flask: Flask is a lightweight web framework for Python, allowing rapid development of web applications. We use Python for backend functionalities such as sentiment analysis and integrating with retrieval augmented generation (RAG) capabilities.
  • LangChain: LangChain streamlines interaction with LLMs, providing abstractions for prompts, chains, and retrieval.
  • Pinecone vector DB: Pinecone stores vector embeddings generated from product PDFs, enabling similarity search for RAG.
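The sentiment analysis mentioned above classifies each incoming message so the assistant can adapt its tone. The repo does not specify which model is used, so the sketch below is a deliberately simple lexicon-based stand-in; a real deployment would use a trained classifier:

```python
import re

# Tiny illustrative lexicons; a production system would use a trained model.
POSITIVE = {"great", "good", "love", "excellent", "happy", "thanks"}
NEGATIVE = {"bad", "broken", "angry", "terrible", "refund", "complaint"}

def sentiment(message: str) -> str:
    # Score = positive hits minus negative hits over the tokenized message.
    tokens = re.findall(r"[a-z']+", message.lower())
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def tone_for(message: str) -> str:
    # The chatbot adapts its reply style to the detected sentiment,
    # e.g. apologizing or escalating on negative messages.
    return {"negative": "empathetic",
            "positive": "friendly",
            "neutral": "professional"}[sentiment(message)]
```

The Flask backend can run this classification on each message before prompting the LLM, injecting the chosen tone into the system prompt.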

Getting Started

Prerequisites

  • Node.js (>=14.x)
  • npm (>=7.x)
  • Python (>=3.6)

Installation

  1. Clone the repository:

```bash
git clone https://github.com/your-username/llm-customer-care-assistant.git
```

Built By

  • Surabhi Waingankar 😎
  • Tanisha Kanal 🥳
