
Support for LLM observability #62

Open
srinisubramanian opened this issue Oct 13, 2023 · 3 comments

Comments

@srinisubramanian

Any ideas for providing LLM observability options from frameworks like LangChain?

A simple approach (given the OpenTelemetry integration) could be to use openllmetry. More details on an SDK for integration are here: https://github.com/traceloop/openllmetry

@MikeShi42
Contributor

Definitely familiar with openllmetry - met with Nir a couple of times!

We should be supported already, though I'll want to test it out myself and submit a PR for their docs and ours if it works successfully. In theory you should be able to set it up and provide the following configs (assuming localhost; for cloud, the base URL would be https://in-otel.hyperdx.io):

TRACELOOP_BASE_URL=http://localhost:4318
TRACELOOP_HEADERS="authorization=<YOUR_HYPERDX_API_KEY_HERE>"

Any chance you already have openllmetry setup with HyperDX and want to give that a spin?
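For anyone trying this, a minimal setup sketch might look like the following. This is an illustrative assumption based on the two env vars above, not a verified integration from this thread; the API key placeholder and cloud URL swap are as described in the comment.

```shell
# Sketch: point the Traceloop SDK's OTLP export at a HyperDX OTel endpoint.
# Assumes a local collector listening on the default OTLP/HTTP port 4318.
export TRACELOOP_BASE_URL=http://localhost:4318
export TRACELOOP_HEADERS="authorization=<YOUR_HYPERDX_API_KEY_HERE>"

# For HyperDX cloud, swap the base URL instead:
# export TRACELOOP_BASE_URL=https://in-otel.hyperdx.io

# Then run your instrumented app as usual, e.g.:
# python my_langchain_app.py
```

This is a configuration fragment only; whether traces arrive end-to-end is exactly what the commenters below set out to verify.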

@srinisubramanian
Author

@MikeShi42 I haven't tried this yet, but since you say it should work, I will try it and let you know for sure.

@MikeShi42
Contributor

@srinisubramanian Traceloop has published an integration page about us, we'll hopefully do the same on our end soon!

https://www.traceloop.com/docs/openllmetry/integrations/hyperdx
