
[rollout] chore: bump up trtllm image version to 1.3.0rc10#5841

Open
Superjomn wants to merge 3 commits into verl-project:main from Superjomn:upgrade-megatron-0-16

Conversation

Collaborator

@Superjomn Superjomn commented Apr 1, 2026

What does this PR do?

This PR bumps the trtllm docker image to v1.3.0rc10.

Checklist Before Starting

  • Search for similar PRs. Paste at least one query link here: ...
  • Format the PR title as [{modules}] {type}: {description} (this will be checked by the CI)
    • {modules} include fsdp, megatron, veomni, sglang, vllm, rollout, trainer, ci, training_utils, recipe, hardware, deployment, ray, worker, single_controller, misc, perf, model, algo, env, tool, ckpt, doc, data, cfg, reward, fully_async, one_step_off
    • If this PR involves multiple modules, separate them with commas, e.g. [megatron, fsdp, doc]
    • {type} is one of feat, fix, refactor, chore, test
    • If this PR breaks any API (CLI arguments, config, function signature, etc.), add [BREAKING] to the beginning of the title.
    • Example: [BREAKING][fsdp, megatron] feat: dynamic batching

Test

For changes that cannot be tested by CI (e.g., algorithm implementation, new model support), validate by experiments and show results such as training curve plots or evaluation results.

API and Usage Example

Demonstrate how the API changes, if any, and provide usage example(s) if possible.

# Add code snippet or script demonstrating how to use this

Design & Code Changes

Demonstrate the high-level design if this PR is complex, and list the specific changes.

Checklist Before Submitting

Important

Please check all the following items before requesting a review, otherwise the reviewer might deprioritize this PR for review.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the TRT-LLM rollout implementation and Docker environment, including upgrading Megatron-LM to v0.16.0 and transitioning to SleepConfig for server management. It also removes the single-node restriction for TRT-LLM replicas. Review feedback points out that using a branch name for the DeepEP dependency in the Dockerfile compromises build reproducibility and identifies a potential IndexError in the placement group indexing logic that requires a bounds check.
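The bounds-check concern can be illustrated with a minimal sketch. All names here are hypothetical and only illustrate the shape of the fix; the actual placement-group logic in verl's TRT-LLM rollout code may differ:

```python
# Hypothetical sketch of the bounds check the review asks for; function and
# variable names are illustrative, not taken from the verl codebase.
def pick_bundle_index(replica_rank: int, bundle_count: int) -> int:
    """Map a replica rank onto a placement-group bundle index, guarding
    against out-of-range access when replicas outnumber bundles."""
    if bundle_count <= 0:
        raise ValueError("placement group has no bundles")
    if replica_rank >= bundle_count:
        # Without this guard, bundles[replica_rank] would raise IndexError
        # once TRT-LLM replicas span more ranks than available bundles.
        raise IndexError(
            f"replica rank {replica_rank} out of range for {bundle_count} bundles"
        )
    return replica_rank
```

A caller would validate the rank before indexing the placement group's bundle list, failing fast with a clear message instead of a bare IndexError.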

@wuxibin89
Collaborator

Should we also bump the CI image?

Collaborator

@hchings hchings left a comment


Can you cherry-pick this commit as well? 4d36c21

Other than that, LGTM.

@Superjomn Superjomn changed the title [rollout,trtllm] bump up trtllm version to 1.3.0rc10 [rollout] bump up trtllm version to 1.3.0rc10 Apr 3, 2026
@Superjomn Superjomn changed the title [rollout] bump up trtllm version to 1.3.0rc10 [rollout, trtllm] bump up trtllm version to 1.3.0rc10 Apr 3, 2026
@Superjomn Superjomn changed the title [rollout, trtllm] bump up trtllm version to 1.3.0rc10 [rollout] chore: bump up trtllm version to 1.3.0rc10 Apr 3, 2026
@Superjomn Superjomn requested a review from hchings April 3, 2026 04:30
Collaborator

@hchings hchings left a comment


Please address the cherry-pick comment. Other parts LGTM.

@Superjomn Superjomn force-pushed the upgrade-megatron-0-16 branch from 440c5f3 to 029b394 Compare April 3, 2026 12:59
"model_extra",
"executor_extra",
"model",
"model_weights",
Collaborator


@Superjomn Is this section for backward compatibility (for older trtllm version before we have the fine-grained labels)? If yes, then it should be exactly the same as the old tags at https://github.com/verl-project/verl/pull/5841/changes#diff-4d19b99d5dc8054a16c391ce00301671727c4c3549ecb6d904d33c2aa1f552beL263 (aka without model_weights and draft_model_weights). Otherwise I think it'll error out.
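The backward-compatibility point can be sketched as follows. The names `_WEIGHTS_TAGS_FALLBACK` and `select_weights_tags` are hypothetical and only illustrate the reviewer's suggestion, not the actual verl code:

```python
# Hypothetical illustration of the reviewer's point: the fallback tag list
# for older trtllm versions must match the old tags exactly, i.e. without
# "model_weights" / "draft_model_weights", which older versions would reject.
_WEIGHTS_TAGS_FALLBACK = ("model_extra", "executor_extra", "model")

def select_weights_tags(has_fine_grained_labels: bool) -> tuple:
    """Return the fine-grained tag set on newer trtllm versions, and the
    exact old tag set otherwise."""
    if has_fine_grained_labels:
        return _WEIGHTS_TAGS_FALLBACK + ("model_weights", "draft_model_weights")
    return _WEIGHTS_TAGS_FALLBACK
```

Gating the extra tags on the trtllm version keeps the fallback path byte-identical to the old behavior, which is what the review asks for.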

Collaborator Author


Sure, let me update it.

@Superjomn Superjomn changed the title [rollout] chore: bump up trtllm version to 1.3.0rc10 [rollout] chore: bump up trtllm image version to 1.3.0rc10 Apr 7, 2026
@Superjomn Superjomn force-pushed the upgrade-megatron-0-16 branch 2 times, most recently from bc9205b to cc8943f Compare April 8, 2026 10:54
Superjomn and others added 3 commits April 8, 2026 21:52
….0rc10

- Upgrade Megatron-LM to core_v0.16.0, switch DeepEP branch from v1.2.1
  to hybrid-ep (removing now-unnecessary patch), add CCCL CPATH for build compat
- Bump TRT-LLM base image from 1.3.0rc4 to 1.3.0rc10
- Pin trl==0.27.0 to fix AutoModelForCausalLMWithValueHead import

Co-authored-by: Claude Sonnet 4.6 <[email protected]>
- Fix ExecutorMemoryType import path change in 1.3.0rc10
- Remove model_weights/draft_model_weights from fallback _WEIGHTS_TAGS
- Defer ServerAdapter import to avoid FlashInfer crash on CPU orchestrator
- Resolve CUTLASS SM 9.0 PTX failures on L20 (SM 8.9) GPUs via sm89 target
- Disable PDL, fix MoE backend case, guard FlashInfer import
- Update trtllm_worker.rst docs for new API

Co-authored-by: Claude Sonnet 4.6 <[email protected]>
- Bump CI image tag to 1.3.0rc10
- Restore TORCH_CUDA_ARCH_LIST as runtime fallback for Ray workers

Co-authored-by: Claude Sonnet 4.6 <[email protected]>
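The runtime fallback mentioned in the last commit can be sketched like this; treat the default value as an assumption about intent, since the commit message only names the variable:

```python
import os

# TORCH_CUDA_ARCH_LIST tells PyTorch extension builds which SM architectures
# to target (e.g. "8.9" for NVIDIA L20 GPUs). Setting it as a fallback on
# Ray workers covers the case where no build-time value was propagated.
os.environ.setdefault("TORCH_CUDA_ARCH_LIST", "8.9")
print(os.environ["TORCH_CUDA_ARCH_LIST"])
```

Using `setdefault` keeps any value already exported by the launch environment, so the fallback only kicks in when the variable is missing.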
@Superjomn Superjomn force-pushed the upgrade-megatron-0-16 branch from 3b286a0 to 687adb0 Compare April 8, 2026 13:55