feat: base opentelemetry integration #363

Open
wants to merge 4 commits into master
Conversation

@PacoDu PacoDu (Contributor) commented Dec 14, 2020

This PR adds a base mercurius span to the fastifyGraphql decorator, along with a cached attribute.

[screenshot]
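
For context, a minimal sketch of how an application could surface the span this PR adds; this setup is not part of the PR, it just assumes the 0.13-era SDK packages and prints spans to the console:

const { BasicTracerProvider, ConsoleSpanExporter, SimpleSpanProcessor } = require('@opentelemetry/tracing')

// Register a global tracer provider so the tracer that mercurius asks
// @opentelemetry/api for is backed by a real implementation.
const provider = new BasicTracerProvider()
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()))
provider.register()

// Register fastify/mercurius after this; the base mercurius span (with its
// cached attribute) would then appear in the console exporter's output.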

@mcollina mcollina (Collaborator) left a comment:

I would generally prefer it if we provided hooks (similar to fastify, or even simpler) for APMs to plug into. This is totally ok, but it's not scalable as there are a few different implementations for this.

@PacoDu PacoDu (Contributor, Author) commented Dec 14, 2020

I would generally prefer it if we provided hooks (similar to fastify, or even simpler) for APMs to plug into. This is totally ok, but it's not scalable as there are a few different implementations for this.

As stated in #346 (comment), this is just a cleanup of the previous PR with the tracer declaration and the base mercurius span. Unfortunately, I don't have time to push this implementation further for now.

But I'm not sure I understand what you mean by providing hooks, as OpenTelemetry already provides an API for the end user to extend the tracing and control how it is plugged in. This implementation is based on the OpenTelemetry recommendations for library authors: https://open-telemetry.github.io/opentelemetry-js/#library-authors
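
For reference, the library-author pattern linked above boils down to the library asking the global API for a tracer, while the application chooses the SDK and exporter behind it. A minimal sketch (the 'mercurius' tracer name and the helper function are illustrative, not taken from this PR):

const { trace } = require('@opentelemetry/api')

// Returns a no-op tracer unless the application registered a real
// TracerProvider, so the library adds no hard dependency on an SDK.
const tracer = trace.getTracer('mercurius')

// Hypothetical helper showing where a library would open and close a span.
function traceStep (name, fn) {
  const span = tracer.startSpan(name)
  try {
    return fn()
  } finally {
    span.end()
  }
}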

@mcollina mcollina (Collaborator) replied:

But I'm not sure I understand what you mean by providing hooks, as OpenTelemetry already provides an API for the end user to extend the tracing and control how it is plugged in. This implementation is based on the OpenTelemetry recommendations for library authors: https://open-telemetry.github.io/opentelemetry-js/#library-authors

Some APMs do not implement OpenTelemetry but they have similar concepts/libraries. They'd need to put logic in the same places. Instead of calling opentracing directly, we could do:

app.graphql.setTracing({
  ... // all the methods for those points
})
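
Purely as an illustration of the idea (none of these method names exist in mercurius; they only sketch the kind of callbacks an APM could plug in):

app.graphql.setTracing({
  onRequestStart (context) {
    // an APM would start its own span/transaction here
  },
  onError (context, error) {
    // record the error however the APM prefers
  },
  onRequestEnd (context) {
    // close whatever was opened in onRequestStart
  }
})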

@Sytten Sytten commented Feb 4, 2021

I also support hooks. I am trying to build an SDK for graphmetrics.io, and we are not openmetrics compliant for now (we will consider it once the spec stabilizes a bit).
I think we can get quite far with the new 6.12.0 hooks, but we are missing a hook at the field level to measure resolution time (not applicable to gateway/federation, obviously).
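
As a rough sketch of what the existing hooks already allow (this assumes mercurius's addHook API with the preExecution/onResolution hooks; the __executionStart property is just an ad-hoc field for the example):

app.graphql.addHook('preExecution', async (schema, document, context) => {
  // remember when execution started for this request
  context.__executionStart = process.hrtime.bigint()
})

app.graphql.addHook('onResolution', async (execution, context) => {
  const elapsedNs = process.hrtime.bigint() - context.__executionStart
  // report elapsedNs to the metrics backend of your choice; per-field timing
  // would still need a dedicated hook, as noted above
})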

@mcollina mcollina (Collaborator) commented Feb 5, 2021

@Sytten this is probably not the right place to discuss that. Maybe you can open a new issue? Or just send a PR.

@airhorns airhorns (Contributor) commented Feb 10, 2021

FWIW I think the aspiration of opentelemetry is to avoid having different APM products each maintain their own interfaces, and instead give library authors one generally accepted API to integrate against. We'd build integrations against @opentelemetry/api, which defines one interface for tracing, and then Datadog, Honeycomb, Lightstep, and others all support the otel ecosystem with exporters or network endpoints for it to talk to. They are all collaborating on the client libraries explicitly so that we don't all have to try to build support for each individually, or none at all.

The idea is that if you want to export telemetry from your system to whatever new APM product comes along, you write a new opentelemetry exporter instead of a new integration for each piece of your system. I don't know if it will work and it is early days yet, but at least in the kubernetes space a lot of communities have standardized on opentelemetry as the way to get telemetry data out of their systems, and users use whatever exporter they want (or the opentelemetry collector, which is like a network-attached exporter that can redirect to yet more places).

@@ -28,6 +28,7 @@
 "@graphql-tools/merge": "^6.2.4",
 "@graphql-tools/schema": "^6.2.4",
 "@graphql-tools/utils": "^6.2.4",
+"@opentelemetry/api": "^0.13.0",
Review comment (Contributor):

@opentelemetry/api version 0.16.0 changes the API significantly. If we do want to get this in, we should try to build against the most up-to-date tracer API and the active-context-getting functions they have available (the getSpan / setSpan stuff).
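
A sketch of the newer context-based API referred to here; the exact export locations moved around between 0.x releases, so the names below are illustrative rather than tied to a specific version:

const { context, setSpan, getSpan, trace } = require('@opentelemetry/api')

const tracer = trace.getTracer('mercurius')
const span = tracer.startSpan('mercurius - operation')

// Make the span active for everything executed inside the callback,
// so nested code can recover it without it being passed around.
context.with(setSpan(context.active(), span), () => {
  const active = getSpan(context.active())
  active.setAttribute('mercurius.cached', false)
})

span.end()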

@@ -397,6 +404,7 @@ const plugin = fp(async function (app, opts) {
 // this query errored
 const err = new MER_ERR_GQL_VALIDATION()
 err.errors = cachedError.validationErrors
+span.end()
Review comment (Contributor):

Usually, on errors, error info is added to the span as a status code:

const { SpanStatusCode } = require('@opentelemetry/api')

span.setStatus({
  code: SpanStatusCode.ERROR,
  message: error.message,
});

On success, an OK status is set as well:

span.setStatus({
  code: SpanStatusCode.OK,
});

Also, to avoid multiple span.end() calls, I prefer the following pattern:

const result = tracer.startActiveSpan(
  `<span_name>`,
  { attributes: <spanAttributes> },
  async (span) => {
    try {
      // do something
      span.setStatus({ code: SpanStatusCode.OK });
      // return if needed
    } catch (error) {
      span.setStatus({
        code: SpanStatusCode.ERROR,
        message: error.message,
      });
      throw error;
    } finally {
      span.end();
    }
  },
);
return result;

But it might be a little bit less performant. Also, this is only available in the most recent OpenTelemetry API; I'm not sure about the version in package.json.
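
For older API versions that only expose startSpan (such as the ^0.13.0 range in this PR's package.json), roughly the same shape can be written manually. This is a sketch only; the exact name of the status enum also depends on the 0.x release in use:

const { SpanStatusCode } = require('@opentelemetry/api')

// Hypothetical wrapper: runs fn inside a span and always ends the span.
async function traced (tracer, name, attributes, fn) {
  const span = tracer.startSpan(name, { attributes })
  try {
    const result = await fn(span)
    span.setStatus({ code: SpanStatusCode.OK })
    return result
  } catch (error) {
    span.setStatus({ code: SpanStatusCode.ERROR, message: error.message })
    throw error
  } finally {
    span.end()
  }
}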


5 participants