Adding Langsmith trace processor introduces huge latency to chat #529

Open
kmariunas opened this issue Apr 16, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@kmariunas

kmariunas commented Apr 16, 2025

Describe the bug

Hey, when we add a LangSmith trace processor, our latency goes through the roof. We suspect that tracing is not done asynchronously. Is there a way to make it async?

This is before and after we removed langsmith (but kept openai) tracing:
[Image: latency comparison before and after removing the LangSmith trace processor]

Debug information

  • Agents SDK version: (e.g. v0.0.6)
  • Python version (e.g. Python 3.12.8)

Repro steps

from agents import (
    add_trace_processor,
)
from langsmith.wrappers import OpenAIAgentsTracingProcessor
add_trace_processor(OpenAIAgentsTracingProcessor())

Expected behavior

Tracing does not introduce any latency to the system.

kmariunas added the bug (Something isn't working) label on Apr 16, 2025
@rm-openai
Collaborator

@baskaryan, you added this integration - any ideas?

Tracing isn't synchronous: it gathers events on the same thread/task, but the expectation is that exports happen in the background.
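One possible workaround while this is investigated, as a minimal sketch only: wrap the LangSmith processor so its callbacks just enqueue work and a daemon thread performs the actual export. `BackgroundProcessor` is a hypothetical name, and the sketch assumes the Agents SDK invokes the usual processor hooks (on_trace_start, on_trace_end, on_span_start, on_span_end, shutdown, force_flush); adjust to the real interface if it differs.

import queue
import threading

from agents import add_trace_processor
from langsmith.wrappers import OpenAIAgentsTracingProcessor


class BackgroundProcessor:
    """Hypothetical wrapper: forwards trace/span callbacks to the wrapped
    processor on a daemon worker thread so slow exports never block the
    chat request path."""

    def __init__(self, inner):
        self._inner = inner
        self._queue = queue.Queue()
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def _drain(self):
        while True:
            method_name, arg = self._queue.get()
            try:
                getattr(self._inner, method_name)(arg)
            except Exception:
                pass  # never let an export error escape the worker thread

    # Assumption: these are the hooks the Agents SDK calls on a processor.
    def on_trace_start(self, trace):
        self._queue.put(("on_trace_start", trace))

    def on_trace_end(self, trace):
        self._queue.put(("on_trace_end", trace))

    def on_span_start(self, span):
        self._queue.put(("on_span_start", span))

    def on_span_end(self, span):
        self._queue.put(("on_span_end", span))

    def shutdown(self):
        self._inner.shutdown()

    def force_flush(self):
        self._inner.force_flush()


add_trace_processor(BackgroundProcessor(OpenAIAgentsTracingProcessor()))

If the extra latency disappears with this in place, that would suggest the LangSmith processor is exporting inline inside its callbacks.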
