Add test for validate_all with adorned node and ignore undefined outputs #3447
Conversation
Review comment anchored on this diff hunk:

```json
        },
    },
    "then_outputs": {
        "result": "mocked_result",
```
DevinAI what happens if we add an "error" output here and change nothing else about the test?
I tested this: adding an "error" output to then_outputs causes a WorkflowInitializationException with the message "Failed to validate mock node executions".
The root cause is a NodeException: Unexpected outputs: {'error'} raised from BaseOutputs.__init__ because the inner wrapped node's Outputs class only declares result: str, not error.
The validation correctly rejects outputs that aren't declared on the node's Outputs class.
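The failure mode described above can be illustrated with a minimal sketch (hypothetical names, not the actual codebase): an Outputs class that only declares result rejects any undeclared keyword at construction time.

```python
# Minimal sketch of strict output validation. NodeException, BaseOutputs,
# and Outputs here are simplified stand-ins, not the real implementations.
class NodeException(Exception):
    pass

class BaseOutputs:
    def __init__(self, **kwargs):
        # Simplified: only the subclass's own class-level annotations count
        # as declared outputs.
        declared = set(self.__class__.__annotations__)
        unexpected = set(kwargs) - declared
        if unexpected:
            raise NodeException(f"Unexpected outputs: {unexpected}")
        for name, value in kwargs.items():
            setattr(self, name, value)

class Outputs(BaseOutputs):
    result: str

Outputs(result="mocked_result")  # accepted: "result" is declared
try:
    Outputs(result="mocked_result", error="boom")
except NodeException as exc:
    print(exc)  # Unexpected outputs: {'error'}
```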
Create a second test that exhibits this.
In that test, ensure that validation passes. Doing so will require updating the business logic of deserializing node outputs to ignore any outputs that are not defined in the class.
DevinAI apply the feedback I just left in this comment
Done! I've added a second test (test_mocks__validate_all__ignores_undefined_outputs) that includes an "error" output in then_outputs and verifies it's ignored.
I also updated BaseOutputs.__get_pydantic_core_schema__ to filter out undefined outputs during deserialization. The strict validation in BaseOutputs.__init__ is preserved for normal runtime construction.
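The shape of that change can be sketched as follows (illustrative names only; the real change lives in BaseOutputs.__get_pydantic_core_schema__ rather than a deserialize classmethod): undeclared keys are dropped before the strict constructor runs, so the deserialization path tolerates extra outputs while direct construction stays strict.

```python
# Hypothetical sketch: filter undeclared outputs on the deserialization
# path only, preserving strict validation for direct construction.
class BaseOutputs:
    def __init__(self, **kwargs):
        declared = set(self.__class__.__annotations__)
        unexpected = set(kwargs) - declared
        if unexpected:
            # Strict path: direct construction still rejects extras.
            raise ValueError(f"Unexpected outputs: {unexpected}")
        for name, value in kwargs.items():
            setattr(self, name, value)

    @classmethod
    def deserialize(cls, data):
        # Lenient path: drop keys not declared on the class before
        # handing off to the strict constructor.
        declared = set(cls.__annotations__)
        return cls(**{k: v for k, v in data.items() if k in declared})

class Outputs(BaseOutputs):
    result: str

outputs = Outputs.deserialize({"result": "mocked_result", "error": "ignored"})
print(outputs.result)  # mocked_result
```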
Review comment anchored on this diff hunk:

```json
    },
    "then_outputs": {
        "result": "mocked_result",
        "error": "some error value",
```
We should be able to support this for adorned nodes, but leaving it out of scope for now.
Adds tests for MockNodeExecution.validate_all with TryNode-adorned nodes and updates the deserialization logic to silently ignore outputs that aren't declared in the node's Outputs class.

The behavior change is scoped to the pydantic deserialization path in BaseOutputs.__get_pydantic_core_schema__; the strict validation in BaseOutputs.__init__ is preserved for normal runtime construction.
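The added test could look roughly like this sketch (the test name comes from the PR; the deserialize_outputs helper and the Outputs class are hypothetical stand-ins for the real deserialization path):

```python
# Sketch of the added test: an extra "error" key in then_outputs is
# ignored during deserialization instead of failing validation.
def deserialize_outputs(outputs_cls, data):
    # Stand-in for the filtering added to the pydantic schema: keep only
    # keys declared on the outputs class.
    declared = set(outputs_cls.__annotations__)
    return outputs_cls(**{k: v for k, v in data.items() if k in declared})

class Outputs:
    result: str

    def __init__(self, result: str):
        self.result = result

def test_mocks__validate_all__ignores_undefined_outputs():
    then_outputs = {"result": "mocked_result", "error": "some error value"}
    outputs = deserialize_outputs(Outputs, then_outputs)
    assert outputs.result == "mocked_result"
    assert not hasattr(outputs, "error")

test_mocks__validate_all__ignores_undefined_outputs()
```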