Cannot get the last tool_call_output event in stream_events when MaxTurnsExceeded #526

Open
MartinDai opened this issue Apr 16, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@MartinDai

Please read this first

  • Have you read the docs? Agents SDK docs
  • Have you searched for related issues? Others may have faced similar issues.

Describe the bug

Here is my code:

# NOTE: imports assumed for this snippet; module paths may vary by SDK version.
# logger, create_chunk, create_block, context_manager, conversation, MODEL_NAME,
# and MODEL_PROVIDER are application-specific helpers defined elsewhere.
import json

from agents import ItemHelpers, MaxTurnsExceeded, RunConfig, Runner, trace
from openai.types.responses import EasyInputMessageParam, ResponseTextDeltaEvent

with trace("my-agents", group_id=context_manager.conversation_id):
    context_manager.input_items.append(EasyInputMessageParam(content=message, role="user"))
    result = Runner.run_streamed(
        context_manager.current_agent,
        context_manager.input_items,
        context=context_manager.context,
        max_turns=1,
        run_config=RunConfig(model_provider=MODEL_PROVIDER)
    )

    try:
        async for event in result.stream_events():
            if event.type == "raw_response_event":
                if isinstance(event.data, ResponseTextDeltaEvent):
                    final_chunk = create_chunk(conversation_id=context_manager.conversation_id,
                                               content=event.data.delta,
                                               role="assistant", model=MODEL_NAME)
                    yield f"data: {json.dumps(final_chunk)}\n\n"
                else:
                    continue
            elif event.type == "agent_updated_stream_event":
                logger.info(f"Handed off to {event.new_agent.name}")
            elif event.type == "run_item_stream_event":
                if event.item.type == "tool_call_item":
                    content = f"tool call: {event.item.raw_item.name} arguments: {event.item.raw_item.arguments}"
                    logger.info(content)
                    block = create_block(conversation_id=context_manager.conversation_id,
                                         content=content,
                                         role="agent",
                                         model=MODEL_NAME)
                    yield f"data: {json.dumps(block)}\n\n"
                elif event.item.type == "tool_call_output_item":
                    content = f"tool call output: {event.item.output}"
                    logger.info(content)
                    block = create_block(conversation_id=context_manager.conversation_id,
                                         content=content,
                                         role="tool",
                                         model=MODEL_NAME)
                    yield f"data: {json.dumps(block)}\n\n"
                elif event.item.type == "message_output_item":
                    logger.info(f"AI: {ItemHelpers.text_message_output(event.item)}")
                else:
                    continue
            else:
                continue
    except MaxTurnsExceeded:
        logger.warning(f"Max turns exceeded in conversation {context_manager.conversation_id}")
        error_chunk = create_chunk(
            conversation_id=context_manager.conversation_id,
            content="The session has exceeded the limit. The session has been reset. Please start again.",
            role="assistant",
            model=MODEL_NAME
        )
        yield f"data: {json.dumps(error_chunk)}\n\n"
    except Exception as e:
        logger.error(f"Unexpected error: {str(e)}")
        error_chunk = create_chunk(
            conversation_id=context_manager.conversation_id,
            content="An unknown error occurred during processing. Please try again later.",
            role="assistant",
            model=MODEL_NAME
        )
        yield f"data: {json.dumps(error_chunk)}\n\n"

    context_manager.input_items = result.to_input_list()
    context_manager.current_agent = result.last_agent
    conversation.save_conversation(context_manager)

When I start a conversation and the LLM responds with a tool call, the code never enters the event.item.type == "tool_call_output_item" branch, even though the tool call was actually invoked; I can find the tool call result in result.to_input_list().
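
For reference, the missing output can still be recovered after the stream finishes by scanning result.to_input_list(). This is only a minimal workaround sketch; it assumes the Responses-style item dicts (with "type" and "output" keys) that to_input_list() returns, so verify the keys against your SDK version:

# Workaround sketch: recover tool outputs that never arrived as stream events.
# Assumes the Responses-style item dicts returned by result.to_input_list();
# check the exact keys against your SDK version.
for item in result.to_input_list():
    if isinstance(item, dict) and item.get("type") == "function_call_output":
        logger.info(f"recovered tool call output: {item.get('output')}")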

Debug information

  • Agents SDK version: 0.0.10
  • Python version: 3.12.0

Repro steps

Use the code I provided above

Expected behavior

The tool_call_output event should be pushed to stream_events before MaxTurnsExceeded is raised.

@MartinDai MartinDai added the bug Something isn't working label Apr 16, 2025
@MartinDai MartinDai changed the title Cannot get last tool_call_output event in stream_events when MaxTurnsExceeded Cannot get the last tool_call_output event in stream_events when MaxTurnsExceeded Apr 16, 2025
@rm-openai
Collaborator

Hmm, yeah, the bug here is within stream_events. Specifically, the new items for tool calls etc. are enqueued, but before they are yielded, we check the current_turn and raise an error.

Instead, we should probably push events like turn_updated onto the event queue.

I'll try to get to this soon! PR welcome as well.
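
To illustrate the ordering described above, here is a hypothetical, self-contained sketch (the names are illustrative and do not match the SDK's internals): the output item is already sitting in the event queue, but the turn check raises before the queue is drained, so the caller never sees it.

import asyncio

# Hypothetical sketch of the ordering problem; not the SDK's actual code.
async def stream_events_sketch(max_turns: int = 1):
    queue: asyncio.Queue = asyncio.Queue()
    queue.put_nowait("tool_call_item")
    queue.put_nowait("tool_call_output_item")  # enqueued during turn 1

    current_turn = 2
    if current_turn > max_turns:
        # The exception surfaces before the queue is drained, so the
        # already-enqueued tool_call_output_item is never yielded.
        raise RuntimeError("MaxTurnsExceeded (stand-in for the SDK exception)")

    while not queue.empty():
        yield queue.get_nowait()  # never reached in this scenario

Draining the queue (or pushing a dedicated event such as the suggested turn_updated) before raising would let the pending tool_call_output reach the consumer.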
