Can we get a cookbook with Groq and LanceDB? Would I still need a local embedding model? If so, can I get an example of how this Streamlit app could be hosted with a proper Dockerfile that installs all packages, including Ollama, and pulls the nomic-embed-text embedding model?
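For the Dockerfile part, here is a minimal single-container sketch. It assumes the build context contains an `app.py` and a `requirements.txt`, and it uses Ollama's official install script; pulling the model at build time needs the server running briefly, hence the backgrounded `ollama serve`. This is an illustration, not a hardened image (it will be large because the model weights are baked in, and it runs embeddings on CPU):

```dockerfile
# Sketch: Python deps + Ollama + nomic-embed-text in one image.
# Assumes app.py and requirements.txt exist in the build context.
FROM python:3.11-slim

# curl is needed for the Ollama install script
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*

# Install Ollama via the official install script
RUN curl -fsSL https://ollama.com/install.sh | sh

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Pull the embedding model at build time so the container starts ready.
# `ollama pull` needs a running server, hence the background start.
RUN ollama serve & sleep 5 && ollama pull nomic-embed-text

COPY . .

EXPOSE 8501
# Start the Ollama server, then the Streamlit app
CMD ollama serve & sleep 5 && streamlit run app.py --server.port 8501 --server.address 0.0.0.0
```

In production you would likely replace the `sleep 5` with a readiness check against `http://localhost:11434`, or run Ollama as a separate service and point the app at it.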
Goal: I want to be able to upload a PDF file through Streamlit and have a summarizer assistant, with a specific prompt, summarize the document. I need this combined with a RAG chatbot input at the bottom of the main Streamlit window, below the summarized output (rendered in markdown). All of this without my customer needing to run a pgvector Docker container; in other words, using a local, embedded database such as LanceDB.
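The flow described above (upload → embed into LanceDB → summarize → chat over retrieved chunks) could be sketched roughly as below. All names are illustrative; it assumes `ollama serve` is reachable on localhost, `GROQ_API_KEY` is set in the environment, and the chosen Groq model name is a placeholder you'd swap for whatever is currently hosted. The third-party imports are deferred into `main()` so the pure `chunk_text` helper stands alone; in a real `app.py`, the body of `main()` would sit at module top level, since Streamlit re-executes the script on each interaction.

```python
"""Sketch: PDF upload -> summary -> RAG chat, using LanceDB as the local
embedded vector store and Ollama's nomic-embed-text for embeddings."""
import json
import urllib.request


def chunk_text(text: str, size: int = 800, overlap: int = 100) -> list[str]:
    """Split text into overlapping character windows for embedding."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text), 1), step):
        chunk = text[start:start + size]
        if chunk:
            chunks.append(chunk)
    return chunks


def embed(text: str, host: str = "http://localhost:11434") -> list[float]:
    """Call the local Ollama embeddings endpoint (assumes `ollama serve` is up)."""
    req = urllib.request.Request(
        f"{host}/api/embeddings",
        data=json.dumps({"model": "nomic-embed-text", "prompt": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]


def main() -> None:
    # Deferred imports: streamlit, lancedb, pypdf, groq go in requirements.txt
    import streamlit as st
    import lancedb
    from pypdf import PdfReader
    from groq import Groq

    client = Groq()  # reads GROQ_API_KEY from the environment
    db = lancedb.connect("./lancedb")  # file-based: no server container needed

    st.title("PDF Summarizer + RAG Chat")
    uploaded = st.file_uploader("Upload a PDF", type="pdf")
    if uploaded:
        text = "\n".join(p.extract_text() or "" for p in PdfReader(uploaded).pages)
        table = db.create_table(
            "doc",
            data=[{"vector": embed(c), "text": c} for c in chunk_text(text)],
            mode="overwrite",
        )
        summary = client.chat.completions.create(
            model="llama-3.3-70b-versatile",  # placeholder Groq model name
            messages=[{"role": "user",
                       "content": f"Summarize this document:\n\n{text[:12000]}"}],
        ).choices[0].message.content
        st.markdown(summary)

        # RAG chat input below the summary, as requested
        if question := st.chat_input("Ask about the document"):
            hits = table.search(embed(question)).limit(4).to_list()
            context = "\n\n".join(h["text"] for h in hits)
            answer = client.chat.completions.create(
                model="llama-3.3-70b-versatile",
                messages=[{"role": "user",
                           "content": f"Context:\n{context}\n\nQuestion: {question}"}],
            ).choices[0].message.content
            st.markdown(answer)
```

Because LanceDB is embedded and writes to a local directory, the only external dependency left at runtime is the Groq API; everything else (vector store, embeddings) lives inside the container.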