[Bug Fix] Use global rank instead of local rank in AsyncSGLangRollout broadcast when TP > 1 #1449
Checklist Before Starting
What does this PR do?
[Bug Fix] Fix the broadcast issue in AsyncSGLangRollout by using the global rank instead of the local rank as the rank index when TP > 1
High-Level Design
Currently, the SGLang async rollout engine fails with the following issue when TP > 1:
When TP = 2 and DP = 4, the device mesh for 8 GPUs is structured as follows:
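The mesh layout can be sketched as follows (a minimal illustration, assuming the usual row-major layout where each TP group occupies consecutive global ranks; the exact mesh construction in verl may differ):

```python
# Hypothetical sketch of the 8-GPU device mesh with DP = 4 (rows) and TP = 2
# (columns). Global ranks are laid out row-major: each row is one TP group.
DP, TP = 4, 2
mesh = [[dp * TP + tp for tp in range(TP)] for dp in range(DP)]
# mesh == [[0, 1], [2, 3], [4, 5], [6, 7]]

# The first rank of each TP group holds the rollout output:
tp_group_leaders = [row[0] for row in mesh]
# tp_group_leaders == [0, 2, 4, 6]
```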
In this setup, the SGLang rollout output is collected at the first local rank within each tensor parallel (TP) group, i.e. ranks [0, 2, 4, 6] in the global (world) rank space.
Currently, SGLang checks whether the current rank index matches the source (src) index in broadcast_pyobj. However, when the local rank within a TP group is passed, the only possible values are [0, 1] (since TP = 2). This mismatch causes the rank == src check to fail on the global ranks that actually hold data. As a result, an empty list is returned to verl, which leads to an unpacking failure on the verl side.
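The mismatch can be reproduced in isolation (a hypothetical illustration; `holds_data` stands in for the rank == src check inside broadcast_pyobj, not the actual implementation):

```python
TP = 2

def holds_data(rank, src):
    # broadcast_pyobj-style check: only the rank matching src acts as sender
    return rank == src

src = 2                       # a TP-group leader, expressed in global-rank space
global_rank = 2               # this process really does hold the rollout data
local_rank = global_rank % TP # 0 -- local rank within its TP group

# Passing the local rank, the check fails even though this process has data:
assert not holds_data(local_rank, src)
# Passing the global rank, the check succeeds:
assert holds_data(global_rank, src)
```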
Specific Changes
Use the global rank as the rank index so that the rank holding the actual data broadcasts it to the process group.
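A pure-Python simulation of the effect of the fix (hypothetical names; the real call site uses SGLang's broadcast_pyobj over a torch.distributed process group, whose signature may differ):

```python
def broadcast_pyobj(data, rank, src):
    # Simplified stand-in: the sender returns its payload; other ranks would
    # receive over the process group (nothing arrives in this simulation).
    if rank == src:
        return data
    return []

payload = ["rollout"]
src = 4  # a TP-group leader in global-rank space
TP = 2

# Buggy: the local rank (4 % TP == 0) never equals src == 4, so the rank that
# actually holds the data returns an empty list -> unpacking failure in verl.
buggy = broadcast_pyobj(payload, rank=4 % TP, src=src)

# Fixed: the global rank matches src, so the payload is broadcast as intended.
fixed = broadcast_pyobj(payload, rank=4, src=src)
```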
API
Usage Example
Test
Tested multi-turn training; the rollout is generated successfully with the fix.
Additional Info.
Checklist Before Submitting
Add [BREAKING] to the PR title if it breaks any API.