Branched off #784
Hi @solnic,
Here's the draft, with a few suggestions from actually using the OTel tracing with Sentry.
From testing this, I wanted the traces to stay closer to the actual OTel traces rather than being normalized into Sentry events. This also seems to be in line with how the other Sentry SDKs consume OTel spans.
The big piece here is that I switched over to letting OTel do most of the work, with minimal adjustments afterwards. It doesn't tie itself to a specific instrumentation library; instead I just look at the semantic convention attributes to see if we can infer additional context, e.g. whether it's an HTTP or DB type span. It defaults to just returning the name as `op` if we don't have any other info.

So if we want to set the source for Oban tasks to `task`, we can figure it out from the attributes by picking up on `messaging.system` and maybe `oban.job.worker`; same for the `view` source, where we may want to check if there's a `phoenix.plug` set. Right now I just set all sources to `custom`.

I've been pushing a few upstream fixes for otel erlang to make the attributes richer, which will help us here: https://github.com/open-telemetry/opentelemetry-erlang-contrib/pulls
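To make the idea concrete, here's a minimal sketch of the attribute-based inference described above. The module and function names are hypothetical (not this PR's actual API), and the attribute keys are the OTel semantic convention names mentioned above:

```elixir
defmodule SpanInference do
  @moduledoc """
  Hypothetical sketch: infer a Sentry-style {op, source} pair from
  OTel semantic convention attributes, falling back to the span name
  and "custom" when nothing recognizable is present.
  """

  def infer(span_name, attributes) when is_map(attributes) do
    cond do
      # Oban jobs: messaging.* plus an oban.job.worker attribute
      Map.has_key?(attributes, "messaging.system") and
          Map.has_key?(attributes, "oban.job.worker") ->
        {"queue.task", "task"}

      # Phoenix requests carry a phoenix.plug attribute
      Map.has_key?(attributes, "phoenix.plug") ->
        {"http.server", "view"}

      # Plain HTTP semantic conventions
      Map.has_key?(attributes, "http.method") ->
        {"http", "custom"}

      # DB semantic conventions
      Map.has_key?(attributes, "db.system") ->
        {"db", "custom"}

      # No known attributes: just return the name as op
      true ->
        {span_name, "custom"}
    end
  end
end
```

For example, an Oban span with `%{"messaging.system" => "oban", "oban.job.worker" => "MyApp.Worker"}` would come back as `{"queue.task", "task"}`, while an unrecognized span keeps its name as the `op` with the `custom` source.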
Lmk what you think.