
Is it not possible to use in AZURE? #25

Open
sonnydfa opened this issue Mar 29, 2024 · 1 comment

Comments

@sonnydfa

I tried with the following configuration, but it doesn't work.

import os

os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://proxy url"
os.environ["OPENAI_API_KEY"] = "key"
os.environ["OPENAI_API_VERSION"] = "version"
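(For reference: with openai>=1.0, Azure access is normally configured by constructing the client explicitly rather than through the legacy OPENAI_API_TYPE-style environment variables. A minimal sketch of how the settings above map onto the openai>=1.0 AzureOpenAI constructor; the values are placeholders, and the helper name is hypothetical:)

```python
import os

def azure_client_kwargs(env) -> dict:
    """Map the legacy (openai<1.0) Azure environment variables onto the
    keyword arguments that openai.AzureOpenAI expects."""
    return {
        "azure_endpoint": env["OPENAI_API_BASE"],
        "api_key": env["OPENAI_API_KEY"],
        "api_version": env["OPENAI_API_VERSION"],
    }

# Sketch of the actual client construction (requires the openai package):
#   client = openai.AzureOpenAI(**azure_client_kwargs(os.environ))
```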

Traceback (most recent call last):
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_transports\default.py", line 69, in map_httpcore_exceptions
    yield
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_transports\default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_sync\connection_pool.py", line 216, in handle_request
    raise exc from None
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_sync\connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
    raise exc
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_sync\connection.py", line 154, in _connect
    stream = stream.start_tls(**kwargs)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_backends\sync.py", line 168, in start_tls
    raise exc
  File "C:\Users\10021782\AppData\Local\Programs\Python\Python39\lib\contextlib.py", line 135, in __exit__
    self.gen.throw(type, value, traceback)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1123)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 858, in _request
    response = self._client.send(request, auth=self.custom_auth, stream=stream)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_client.py", line 914, in send
    response = self._send_handling_auth(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_transports\default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\10021782\AppData\Local\Programs\Python\Python39\lib\contextlib.py", line 135, in __exit__
    self.gen.throw(type, value, traceback)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1123)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\GPT\raptor\raptor-39\lib\site-packages\tenacity\__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "c:\GPT\raptor\raptor-39\raptor\EmbeddingModels.py", line 26, in create_embedding
    self.client.embeddings.create(input=[text], model=self.model)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\resources\embeddings.py", line 105, in create
    return self._post(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 1055, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 834, in request
    return self._request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 890, in _request
    return self._retry_request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 925, in _retry_request
    return self._request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 890, in _request
    return self._retry_request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 925, in _retry_request
    return self._request(
  File "c:\GPT\raptor\raptor-39\lib\site-packages\openai\_base_client.py", line 897, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\GPT\raptor\raptor-39\rag.py", line 28, in <module>
    RA.add_documents(text)
  File "c:\GPT\raptor\raptor-39\raptor\RetrievalAugmentation.py", line 220, in add_documents
    self.tree = self.tree_builder.build_from_text(text=docs, use_multithreading=False)
  File "c:\GPT\raptor\raptor-39\raptor\tree_builder.py", line 280, in build_from_text
    _, node = self.create_node(index, text)
  File "c:\GPT\raptor\raptor-39\raptor\tree_builder.py", line 175, in create_node
    embeddings = {
  File "c:\GPT\raptor\raptor-39\raptor\tree_builder.py", line 176, in <dictcomp>
    model_name: model.create_embedding(text)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\tenacity\__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\tenacity\__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "c:\GPT\raptor\raptor-39\lib\site-packages\tenacity\__init__.py", line 326, in iter
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x1c7ff8743d0 state=finished raised APIConnectionError>]
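(Note: the root cause in the trace above is not the Azure configuration itself but TLS verification failing against the proxy's self-signed certificate. One common fix is to build an SSL context that trusts the corporate CA and hand it to the SDK's HTTP client. A minimal sketch, assuming you can export the proxy's root CA as a PEM bundle; the path and helper name are hypothetical:)

```python
import ssl
from typing import Optional

# Hypothetical path to the proxy's root CA, exported in PEM format.
CA_BUNDLE = "corp-root-ca.pem"

def make_verify_context(cafile: Optional[str] = None) -> ssl.SSLContext:
    """Build a client-side TLS context that trusts an extra CA bundle.

    With cafile=None this is just the system default trust store,
    which is exactly what rejects the proxy's self-signed certificate
    in the traceback above.
    """
    return ssl.create_default_context(cafile=cafile)

# Sketch of wiring it into the OpenAI SDK (requires httpx and openai):
#   http_client = httpx.Client(verify=make_verify_context(CA_BUNDLE))
#   client = openai.AzureOpenAI(..., http_client=http_client)
```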

@parthsarthi03
Copy link
Owner

Hi, I'm not very familiar with the Azure setup, but it should be easy to set up RAPTOR with any inference backend. You can define your own inference code for Azure by implementing a custom SummarizationModel, QAModel, and/or embedding model. We have a tutorial on how to extend to custom models here, and you can do something similar for Azure.
