feat: add vllm-metrics-tui optional dependency and tmux integration #2194
- Add `tui` optional dependency group for vllm-metrics-tui
- Update `tmux.sh` to auto-discover inference server URLs from logs and launch the metrics TUI in a dedicated window

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
```shell
if [[ "$url" == *"0.0.0.0"* ]]; then
  port=$(echo "$url" | grep -oP ':\d+$')
  # Try to find the real hostname from orchestrator logs
  host=$(grep -oP "ltc-[a-z0-9-]+${port}" "${LOG_DIR}/orchestrator.log" 2>/dev/null | head -1)
```
Hostname resolution returns same host for all nodes
Medium Severity
In multi-node inference where all servers bind to `0.0.0.0` on the same default port (8000), the `grep … | head -1` on each iteration of the `for node_log` loop always returns the same first matching hostname from the orchestrator log. This means every node resolves to the same URL, so `METRICS_URLS` contains duplicates of a single server rather than distinct URLs for each inference node. The grep has no per-node correlation: it needs to either track already-matched hosts or use node-specific identifiers.
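One way to address this is the "track already-matched hosts" option: keep a set of hostnames that have already been assigned and, on each node iteration, take the first grep match not yet in the set. The sketch below is illustrative, not the PR's code; the log lines, `ltc-*` hostnames, and loop stand-in are hypothetical sample data replacing `orchestrator.log` and the `node_*.log` loop.

```shell
#!/usr/bin/env bash
# Sketch: resolve a DISTINCT host per node by remembering hosts we have
# already used, instead of always taking `head -1` of the same grep.
set -euo pipefail

# Hypothetical stand-in for "${LOG_DIR}/orchestrator.log".
log=$(mktemp)
printf '%s\n' \
  "server up at ltc-node-a:8000" \
  "server up at ltc-node-b:8000" \
  "server up at ltc-node-c:8000" > "$log"

declare -A seen      # hosts already assigned to some node
METRICS_URLS=()
port=":8000"

for _node_log in 1 2 3; do   # stand-in for the `for node_log` loop
  # Take the first matching host that has not been used yet.
  while read -r host; do
    if [[ -z "${seen[$host]:-}" ]]; then
      seen[$host]=1
      METRICS_URLS+=("http://${host}")
      break
    fi
  done < <(grep -oP "ltc-[a-z0-9-]+${port}" "$log")
done

printf '%s\n' "${METRICS_URLS[@]}"
rm -f "$log"
```

This assumes orchestrator log order corresponds to node order, which may not hold; matching on a node-specific identifier present in both logs would be more robust when one exists.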


`tui` optional dependency group for vllm-metrics-tui: https://github.com/samsja/vllm-metrics-tui
Note
Low Risk
Low risk: changes are limited to packaging metadata and a best-effort tmux helper that only runs when `vllm-metrics-tui` is installed and matching logs are present.

Overview

Adds a new optional dependency extra, `prime-rl[tui]`, sourcing `vllm-metrics-tui` from Git to enable a metrics terminal UI. Updates `scripts/tmux.sh` to optionally create a Metrics tmux window that auto-discovers vLLM inference server URLs by parsing `logs/inference/node_*.log` (and resolving `0.0.0.0` hosts via `orchestrator.log`), then launches `vllm-metrics-tui` with the discovered endpoints.

Written by Cursor Bugbot for commit 49d4aa5.
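The "best-effort" guard described above can be sketched as follows. This is not the PR's actual `tmux.sh` code: the URLs are hypothetical, and the assumption that `vllm-metrics-tui` accepts endpoint URLs as positional arguments is unverified (check the tool's own help output).

```shell
#!/usr/bin/env bash
# Sketch: only build and launch the metrics TUI command when URLs were
# actually discovered; otherwise do nothing (best-effort behavior).
set -euo pipefail

# Hypothetical discovered endpoints.
METRICS_URLS=("http://ltc-node-a:8000" "http://ltc-node-b:8000")

TUI_CMD=""
if [[ ${#METRICS_URLS[@]} -gt 0 ]] ; then
  # Assumed invocation style; vllm-metrics-tui's real CLI may differ.
  TUI_CMD="vllm-metrics-tui ${METRICS_URLS[*]}"
fi

# In tmux.sh this would run in a dedicated window, e.g.:
#   tmux new-window -n Metrics "$TUI_CMD"
echo "$TUI_CMD"
```

Gating on `command -v vllm-metrics-tui` before launching keeps the helper inert when the `tui` extra is not installed, matching the low-risk assessment above.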