
Conversation

devin-ai-integration bot (Contributor) commented Dec 23, 2025

Adds tests for MockNodeExecution.validate_all with TryNode-adorned nodes and updates the deserialization logic to silently ignore outputs that aren't declared in the node's Outputs class.

The behavior change is scoped to the pydantic deserialization path in BaseOutputs.__get_pydantic_core_schema__ - the strict validation in BaseOutputs.__init__ is preserved for normal runtime construction.
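A minimal sketch of how this split might look. This is a hypothetical, simplified stand-in: the real BaseOutputs declares and validates its outputs differently, and the names Outputs and the annotation-based "declared outputs" check here are illustrative only.

```python
# Hypothetical sketch: strict __init__, lenient pydantic deserialization.
from typing import Any

from pydantic import TypeAdapter
from pydantic_core import core_schema


class BaseOutputs:
    def __init__(self, **kwargs: Any) -> None:
        declared = set(getattr(type(self), "__annotations__", {}))
        unexpected = set(kwargs) - declared
        if unexpected:
            # Stands in for the NodeException raised at runtime construction.
            raise ValueError(f"Unexpected outputs: {unexpected}")
        for name, value in kwargs.items():
            setattr(self, name, value)

    @classmethod
    def __get_pydantic_core_schema__(cls, source_type, handler):
        declared = set(getattr(cls, "__annotations__", {}))

        def _filter_undeclared(value: Any) -> Any:
            # Deserialization path only: silently drop undeclared keys.
            if isinstance(value, dict):
                return {k: v for k, v in value.items() if k in declared}
            return value

        return core_schema.no_info_before_validator_function(
            _filter_undeclared,
            core_schema.no_info_after_validator_function(
                lambda data: cls(**data),
                core_schema.dict_schema(),
            ),
        )


class Outputs(BaseOutputs):
    result: str


# Deserialization ignores the undeclared "error" output...
out = TypeAdapter(Outputs).validate_python(
    {"result": "mocked_result", "error": "some error value"}
)
print(out.result)  # mocked_result

# ...while direct construction still rejects it.
try:
    Outputs(result="mocked_result", error="boom")
except ValueError as exc:
    print(exc)  # Unexpected outputs: {'error'}
```

The key design point is that the filter lives in a before-validator inside the pydantic schema, so only the deserialization path is lenient; constructing the class directly keeps the strict check.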


devin-ai-integration bot (Contributor Author):

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR that start with 'DevinAI' or '@devin'
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

Review thread on the diff:

    },
    },
    "then_outputs": {
        "result": "mocked_result",
Contributor:

DevinAI what happens if we add an "error" output here and change nothing else about the test?

Contributor Author:

I tested this - adding an "error" output to then_outputs causes a WorkflowInitializationException with the message "Failed to validate mock node executions".

The root cause is a NodeException (Unexpected outputs: {'error'}) raised from BaseOutputs.__init__, because the inner wrapped node's Outputs class only declares result: str, not error.

The validation correctly rejects outputs that aren't declared on the node's Outputs class.
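The rejection described above can be illustrated with a minimal, hypothetical stand-in for the wrapped node's Outputs class (names and the annotation-based check are simplified; the real code raises a NodeException rather than ValueError):

```python
# Hypothetical stand-in: an Outputs class that only declares `result`,
# mirroring the strict check in BaseOutputs.__init__.
class Outputs:
    result: str

    def __init__(self, **kwargs):
        declared = set(type(self).__annotations__)
        unexpected = set(kwargs) - declared
        if unexpected:
            # The real code raises NodeException with this message.
            raise ValueError(f"Unexpected outputs: {unexpected}")
        for name, value in kwargs.items():
            setattr(self, name, value)


Outputs(result="mocked_result")  # fine: `result` is declared
try:
    Outputs(result="mocked_result", error="some error value")
except ValueError as exc:
    print(exc)  # Unexpected outputs: {'error'}
```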

Contributor:

Create a second test that exhibits this.

In that test, ensure that it passes. Doing so will require updating the business logic for deserializing node outputs to ignore any outputs that are not defined on the class.

Contributor:

DevinAI apply the feedback I just left in this comment

Contributor Author:

Done! I've added a second test (test_mocks__validate_all__ignores_undefined_outputs) that includes an "error" output in then_outputs and verifies it's ignored.

I also updated BaseOutputs.__get_pydantic_core_schema__ to filter out undefined outputs during deserialization. The strict validation in BaseOutputs.__init__ is preserved for normal runtime construction.

@devin-ai-integration devin-ai-integration bot changed the title Add test for validate_all with adorned node deserialization Add test for validate_all with adorned node and ignore undefined outputs Dec 23, 2025
@dvargas92495 dvargas92495 marked this pull request as ready for review December 23, 2025 19:52
Review thread on the diff:

    },
    "then_outputs": {
        "result": "mocked_result",
        "error": "some error value",
Contributor:

we should be able to support this for adorned nodes, but leaving it out of scope for now

@dvargas92495 dvargas92495 merged commit 02dcd7c into main Dec 23, 2025
9 checks passed
@dvargas92495 dvargas92495 deleted the devin/1766508253-test-validate-all-adornment branch December 23, 2025 19:53