# promptengineering

Here are 19 public repositories matching this topic...

LLMOps with Prompt Flow is an "LLMOps template and guidance" to help you build LLM-infused apps using Prompt Flow. It offers features including centralized code hosting, lifecycle management, variant and hyperparameter experimentation, A/B deployment, and reporting for all runs and experiments.

  • Updated May 9, 2024
  • Python
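A/B deployment, one of the features listed above, typically routes a fixed share of traffic to a new variant. As a minimal sketch (not the template's actual implementation), a deterministic hash of the user id keeps each user on the same variant across requests:

```python
import hashlib

def pick_variant(user_id: str, b_percent: int = 20) -> str:
    """Deterministically route a user to variant 'A' or 'B'.

    Hashing the user id (rather than choosing randomly per request)
    keeps a given user on the same variant for every call.
    """
    bucket = int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100
    return "B" if bucket < b_percent else "A"
```

Because the bucket is derived from the id alone, no session state is needed to keep assignments stable.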

Prompt engineering is the process of designing and refining input queries to generative AI models, such as OpenAI's GPT variants, to achieve the desired output. It involves optimizing the phrasing, context, and structure of prompts to improve the model's understanding while maintaining high-quality, creative results that meet specific application requirements.

  • Updated May 20, 2023
  • Jupyter Notebook
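The phrasing, context, and structure mentioned above are often combined in a fixed template. A minimal sketch (the field names here are illustrative, not from the repository):

```python
def build_prompt(role: str, context: str, task: str, constraints: list[str]) -> str:
    """Assemble a structured prompt: persona, context, task, then constraints."""
    lines = [f"You are {role}.", "", f"Context:\n{context}", "", f"Task: {task}"]
    if constraints:
        lines += ["", "Constraints:"] + [f"- {c}" for c in constraints]
    return "\n".join(lines)
```

Keeping each component in its own labeled section makes variants easy to compare: you can swap the context or tighten a constraint without rewriting the whole prompt.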

The AI Chat Bot project combines a LangChain agent, OpenAI models, and retrieval-augmented generation (RAG), offering a user-friendly Streamlit interface for seamless communication. It serves diverse functions such as customer service and information retrieval, staying at the forefront of conversational AI through continuous refinement.

  • Updated Apr 30, 2024
  • Python
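The core RAG loop is: retrieve the documents most relevant to a query, then fold them into the prompt as context. A minimal dependency-free sketch, using word overlap as a stand-in for the embedding similarity a real LangChain pipeline would use:

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query and return the top-k.

    Word overlap is a toy stand-in for embedding similarity.
    """
    q = set(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_rag_prompt(query: str, documents: list[str]) -> str:
    """Inline the retrieved passages as grounding context for the model."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The resulting prompt is what gets sent to the chat model; grounding the answer in retrieved text is what lets the bot handle customer-service queries over its own knowledge base.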
