API HTTP code: 500, "error":"failed to generate embedding with langchain #4509

Open
buaa39055211 opened this issue May 18, 2024 · 0 comments
Labels: bug (Something isn't working)

Comments

@buaa39055211

What is the issue?

Since Ollama version 0.1.32, the embeddings API has consistently failed. The embedding models I used are "smartcreation/bge-large-zh-v1.5" and "dztech/bge-large-zh:v1.5", both pulled from Ollama.
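To confirm whether the 500 comes from the Ollama server itself rather than from LangChain, the endpoint can be called directly. This is a minimal sketch, assuming the default local server and the `/api/embeddings` route exposed by the 0.1.x releases; the model name matches the one used below:

```python
# Sketch (assumption): hit the Ollama embeddings endpoint directly with the
# `requests` library, bypassing LangChain, to see where the HTTP 500 originates.
import requests

resp = requests.post(
    "http://127.0.0.1:11434/api/embeddings",
    json={"model": "dztech/bge-large-zh:v1.5", "prompt": "test sentence"},
)
print(resp.status_code)  # a 500 here would point at the server, not LangChain
print(resp.text)         # e.g. {"error":"failed to generate embedding"}
```

The full LangChain + Qdrant reproduction is below: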

```python
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.document_loaders import (
    CSVLoader,
    UnstructuredWordDocumentLoader,
)
from langchain_community.vectorstores import Qdrant
from qdrant_client import QdrantClient

base_url = "http://127.0.0.1:11434"
embeddings = OllamaEmbeddings(model="dztech/bge-large-zh:v1.5", base_url=base_url)

LOADER_MAPPING = {
    ".csv": (CSVLoader, {}),
    # ".docx": (Docx2txtLoader, {}),
    ".doc": (UnstructuredWordDocumentLoader, {"mode": "elements"}),
    ".docx": (UnstructuredWordDocumentLoader, {}),
}

def split(uploaded_file_name):
    # Create embeddings from the uploaded documents
    print("Creating new vectorstore")
    texts = process_documents(uploaded_file_name)  # helper defined elsewhere in my project
    print("Creating embeddings. May take some minutes...")
    db = Qdrant.from_documents(
        texts,
        embedding=embeddings,
        url="localhost:7541",
        collection_name=uploaded_file_name,
    )
    print(uploaded_file_name)
    query = "insert"
    docs = db.similarity_search(query)
    print(docs[0].page_content)
```
File "/Users/mac/anaconda3/envs/ag2/lib/python3.11/site-packages/langchain_community/vectorstores/qdrant.py", line 2037, in _embed_texts
embeddings = self.embeddings.embed_documents(list(texts))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/mac/anaconda3/envs/ag2/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 204, in embed_documents
embeddings = self._embed(instruction_pairs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/mac/anaconda3/envs/ag2/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 192, in _embed
return [self.process_emb_response(prompt) for prompt in iter]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/mac/anaconda3/envs/ag2/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 192, in
return [self.process_emb_response(prompt) for prompt in iter]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/mac/anaconda3/envs/ag2/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 166, in _process_emb_response
raise ValueError(
ValueError: Error raised by inference API HTTP code: 500, {"error":"failed to generate embedding"}
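The failure does not appear to be specific to Qdrant; it can be isolated to the embedding call itself. A minimal sketch, assuming the same model and local server, that should surface the same ValueError if the server-side embedding call is failing:

```python
# Sketch (assumption): embed a single string with the same OllamaEmbeddings
# configuration; embed_query() raises ValueError when the API returns HTTP 500.
from langchain_community.embeddings import OllamaEmbeddings

embeddings = OllamaEmbeddings(
    model="dztech/bge-large-zh:v1.5",
    base_url="http://127.0.0.1:11434",
)

vector = embeddings.embed_query("test sentence")
print(len(vector))  # only reached if the embedding succeeds
```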

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.33-0.1.38
