## Problem
To keep driving performance improvements in the SDK and avoid major
regressions, we need to track performance over time. The first step is
figuring out how to send test data to Datadog via ddtrace so we can
build a historical view of how performance is changing.
Thanks to @ssmith-pc for pointing me in the right direction on how to
get started with this.
## Solution
- Add ddtrace as a dev dependency
- Adjust CI configurations to set the environment variables expected by ddtrace (a rough sketch of what this might look like follows below)
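For illustration, here is a minimal sketch of the kind of CI configuration change involved, assuming GitHub Actions and ddtrace's pytest plugin. The job name, secret name, `DD_SERVICE`/`DD_ENV` values, and install command are placeholders, not the actual values in our CI.

```yaml
# Hypothetical GitHub Actions job -- names and secret keys are placeholders.
# ddtrace's pytest plugin (enabled with --ddtrace) reports test runs to
# Datadog CI Visibility using the DD_* environment variables below.
jobs:
  test:
    runs-on: ubuntu-latest
    env:
      DD_CIVISIBILITY_AGENTLESS_ENABLED: "true"  # send results directly, no local agent
      DD_API_KEY: ${{ secrets.DD_API_KEY }}      # assumed secret name
      DD_SITE: datadoghq.com
      DD_ENV: ci
      DD_SERVICE: our-sdk                        # placeholder service name
    steps:
      - uses: actions/checkout@v4
      - run: pip install -e ".[dev]"             # assumes ddtrace is in the dev extras
      - run: pytest --ddtrace tests/
```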
Follow-up work will likely include:
- Developing benchmark tests that specifically exercise the areas where
performance is most critical.
- Setting up dashboards on the Datadog side where we can easily find and
interpret the data.
I'm not sure whether this link is visible to others, but here is where I
can see data flowing into Datadog:
https://app.datadoghq.com/ci/test-runs?saved-view-id=3068550
## Type of Change
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing
functionality to not work as expected)
- [ ] This change requires a documentation update
- [x] Infrastructure change (CI configs, etc)
- [ ] Non-code change (docs, etc)
- [ ] None of the above: (explain here)