Make the run dialogue snappier by tuning batching interval #9824
Conversation
The batching interval of 2 seconds is legacy from the time when Ert was a mixture of Python and C, with many threading issues attached. The underlying message structure and message-processing infrastructure now handles far more messages, so the GUI can appear more responsive to the incoming messages from compute nodes.
CodSpeed Performance Report: Merging #9824 will not alter performance.
@@ -62,7 +62,7 @@ def __init__(self, ensemble: Ensemble, config: EvaluatorServerConfig):
             list[tuple[EVENT_HANDLER, Event]]
         ] = asyncio.Queue()
         self._max_batch_size: int = 500
-        self._batching_interval: float = 2.0
+        self._batching_interval: float = 0.5
I thought about this, but the issue arises when max running is more than 200. Have you tried big poly with this?
I will try with bigpoly, but which issue or negative impact should I look for?
Performance-wise. That is, whether it makes the GUI more or less responsive, since this gives more work to the ensemble evaluator (ee).
When you flood the evaluator with messages (as in bigpoly), the _max_batch_size parameter will determine the behaviour (as far as I read the code), not the batching interval.
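To illustrate why _max_batch_size dominates under load, here is a minimal sketch of a size-or-interval batching loop. The helper name `batch_events` and its structure are hypothetical; only the defaults (500 and 0.5) mirror the diff, and this is not Ert's actual implementation:

```python
import asyncio
import time


async def batch_events(queue, handle_batch, max_batch_size=500, batching_interval=0.5):
    """Drain `queue` into batches, dispatching a batch as soon as either
    the size cap or the batching interval is reached (hypothetical sketch)."""
    while True:
        # Block until at least one event arrives, then start the interval clock.
        batch = [await queue.get()]
        deadline = time.monotonic() + batching_interval
        while len(batch) < max_batch_size:
            timeout = deadline - time.monotonic()
            if timeout <= 0:
                break
            try:
                batch.append(await asyncio.wait_for(queue.get(), timeout))
            except asyncio.TimeoutError:
                # Interval elapsed with a partial batch: dispatch what we have.
                break
        await handle_batch(batch)
```

When the queue is flooded, the 500-event size cap is hit long before the interval elapses, so the interval only matters during quiet periods, where a shorter value means partial batches reach the GUI sooner.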
🚀
Issue
Resolves #9491
Approach
Reduce the event batching interval in the ensemble evaluator from 2.0 to 0.5 seconds.
git rebase -i main --exec 'pytest tests/ert/unit_tests -n auto --hypothesis-profile=fast -m "not integration_test"'
When applicable