Make the run dialogue snappier by tuning batching interval #9824

Merged
1 commit merged into equinor:main from berland:snappy_run_dialog on Jan 23, 2025

Conversation

@berland (Contributor) commented on Jan 21, 2025

The batching interval of 2 seconds is a legacy from the time when Ert was a mixture of Python and C, with a lot of threading issues attached. The underlying message structure and message-processing infrastructure now handle far more messages, so the GUI can appear more responsive to the incoming messages from compute nodes.

Issue
Resolves #9491

Approach
Lower the ensemble evaluator's batching interval from 2.0 seconds to 0.5 seconds so that incoming events are flushed to the GUI more frequently.


  • PR title captures the intent of the changes, and is fitting for release notes.
  • Added appropriate release note label
  • Commit history is consistent and clean, in line with the contribution guidelines.
  • Make sure unit tests pass locally after every commit (git rebase -i main --exec 'pytest tests/ert/unit_tests -n auto --hypothesis-profile=fast -m "not integration_test"')

When applicable

  • When there are user facing changes: Updated documentation
  • New behavior or changes to existing untested code: Ensured that unit tests are added (See Ground Rules).
  • Large PR: Prepare changes in small commits for more convenient review
  • Bug fix: Add regression test for the bug
  • Bug fix: Create Backport PR to latest release

@berland requested a review from xjules on January 21, 2025 11:47
@berland added the release-notes:user-impact and release-notes:improvement labels, then removed release-notes:user-impact, on Jan 21, 2025
@berland self-assigned this on Jan 21, 2025

codspeed-hq bot commented Jan 21, 2025

CodSpeed Performance Report

Merging #9824 will not alter performance

Comparing berland:snappy_run_dialog (d77fafb) with main (6a978df)

Summary

✅ 24 untouched benchmarks

@@ -62,7 +62,7 @@ def __init__(self, ensemble: Ensemble, config: EvaluatorServerConfig):
             list[tuple[EVENT_HANDLER, Event]]
         ] = asyncio.Queue()
         self._max_batch_size: int = 500
-        self._batching_interval: float = 2.0
+        self._batching_interval: float = 0.5
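
For context, here is a minimal sketch of how an interval-plus-size batching loop of this kind typically behaves. This is an illustrative reconstruction under assumptions, not the actual Ert ensemble-evaluator code; only the two tuning values correspond to the diff above, while the function name `batch_events` and its structure are hypothetical.

```python
import asyncio
import time
from typing import Any, Awaitable, Callable


async def batch_events(
    events: "asyncio.Queue[Any]",
    handle_batch: Callable[[list[Any]], Awaitable[None]],
    batching_interval: float = 0.5,
    max_batch_size: int = 500,
) -> None:
    """Collect events from the queue and flush them in batches.

    A batch is flushed when either ``batching_interval`` seconds have
    elapsed or ``max_batch_size`` events have been collected, whichever
    comes first.
    """
    while True:
        batch: list[Any] = []
        deadline = time.monotonic() + batching_interval
        while len(batch) < max_batch_size:
            remaining = deadline - time.monotonic()
            if remaining <= 0:
                break
            try:
                batch.append(await asyncio.wait_for(events.get(), timeout=remaining))
            except asyncio.TimeoutError:
                break
        if batch:
            await handle_batch(batch)
```

Under this structure, going from 2.0 s to 0.5 s only shortens the worst-case delay before a quiet trickle of events reaches the GUI; a full batch is still dispatched as soon as it reaches the size cap.
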
xjules (Contributor) commented:

I thought about this, but the issue arises when max running is more than 200. Have you tried big poly with this?

berland (Contributor, Author) commented:

I will try with bigpoly, but which issue or negative impact would you look for?

xjules (Contributor) commented:

Performance-wise, that is: whether it makes the GUI more or less responsive, since this gives more work to the ensemble evaluator.

berland (Contributor, Author) commented:

When you flood the evaluator with messages (as in bigpoly), the parameter _max_batch_size will determine the behaviour (as far as I read the code), not the batching interval.
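
To illustrate that point, here is a toy demonstration reusing the hypothetical `batch_events` sketch above (assumed to be in scope; again, not Ert code): when the queue is flooded, each batch hits the size cap long before the interval expires, so the interval mostly matters when events trickle in slowly.

```python
import asyncio


async def demo() -> None:
    events: "asyncio.Queue[int]" = asyncio.Queue()
    for i in range(2000):              # simulate a flood of incoming messages
        events.put_nowait(i)

    batch_sizes: list[int] = []

    async def record(batch: list[int]) -> None:
        batch_sizes.append(len(batch))

    # batch_events is the hypothetical sketch shown earlier in this thread.
    task = asyncio.create_task(
        batch_events(events, record, batching_interval=0.5, max_batch_size=500)
    )
    await asyncio.sleep(0.1)           # let the batcher drain the queue
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass

    # Under flood, every batch is capped at max_batch_size well before
    # the 0.5 s interval elapses.
    print(batch_sizes)                 # expected: [500, 500, 500, 500]


asyncio.run(demo())
```
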

@xjules (Contributor) left a comment:

🚀

@berland merged commit 58fd934 into equinor:main on Jan 23, 2025
33 checks passed
Labels
release-notes:improvement (Automatically categorise as improvement in release notes)
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

Tune batching parameters in ensemble evaluator
3 participants