fix(core): Align Vercel embedding spans with semantic conventions #19795
nicohrubec wants to merge 7 commits into develop
Conversation
size-limit report 📦
node-overhead report 🧳 (Note: this is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.)
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
```js
};
// ...
await createRunner().expect({ transaction: expectedTransaction }).start().completed();
});
```
Test doesn't assert absence of PII when disabled
Low Severity
The test "creates embedding related spans with sendDefaultPii: false" uses expect.objectContaining but never asserts that GEN_AI_EMBEDDINGS_INPUT_ATTRIBUTE is absent from span data. The companion sendDefaultPii: true test explicitly checks for the presence of this attribute, implying the false case intends to verify it's not included. Without an explicit assertion of absence (e.g. checking the attribute is undefined), a regression that leaks PII when sendDefaultPii is false would go undetected.
Triggered by project rule: PR Review Guidelines for Cursor Bot
This is valid, but we don't do this anywhere right now. I opened a follow-up issue to fix this across all our integrations: #19801


Embedding calls were incorrectly emitting an `invoke_agent` span alongside the actual embeddings span. Now we emit a single `gen_ai.embeddings` span per embed call. Additionally, this fixes some attributes to align with our semantic conventions, and fixes the embeddings tests for Vercel in the testing framework.
Closes #19793