
Conversation

@junya-takayama junya-takayama commented Sep 10, 2025

Upgrade the vLLM version.
I have confirmed that the vLLM-related test cases pass in our environment.

@butsugiri

FYI: v0.10.2 is out https://github.com/vllm-project/vllm/releases/tag/v0.10.2

]
 chunk_batch_outputs: list[RequestOutput] = self.llm.generate(
-    prompt_token_ids=chunk_batch_input_ids,
+    prompts=[TokensPrompt(prompt_token_ids=prompt_token_ids) for prompt_token_ids in chunk_batch_input_ids],
@junya-takayama (Collaborator, Author) commented:

The `prompt_token_ids` argument to `LLM.generate` was removed; see vllm-project/vllm#18800.
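The migration can be sketched as follows. In vLLM, `TokensPrompt` is a `TypedDict`, so each prompt is effectively a dict with a `prompt_token_ids` key; to keep this runnable without a GPU or vLLM installed, the `tokens_prompt` helper below is a hypothetical stand-in that mirrors the shape of `vllm.inputs.TokensPrompt`, and the sample token ids are made up:

```python
# Sketch of the prompts migration (assumes vLLM >= 0.10, where
# LLM.generate(prompt_token_ids=...) was removed).
# tokens_prompt is a hypothetical stand-in for vllm.inputs.TokensPrompt,
# which is a TypedDict: constructing it just yields a dict.

def tokens_prompt(prompt_token_ids: list[int]) -> dict:
    # Mirrors TokensPrompt(prompt_token_ids=...): a plain dict with one key.
    return {"prompt_token_ids": prompt_token_ids}

# Illustrative batch of pre-tokenized inputs (made-up ids).
chunk_batch_input_ids = [[1, 2, 3], [4, 5]]

# Old API (removed):
#   llm.generate(prompt_token_ids=chunk_batch_input_ids, ...)
# New API: wrap each token-id list in a TokensPrompt and pass via prompts=.
prompts = [tokens_prompt(ids) for ids in chunk_batch_input_ids]

print(prompts)
```

With real vLLM, the call becomes `self.llm.generate(prompts=[TokensPrompt(prompt_token_ids=ids) for ids in chunk_batch_input_ids], ...)`, as in the diff above.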

@junya-takayama junya-takayama changed the title [WIP] upgrade: vllm==0.10.1.1 upgrade: vllm==0.10.2 Sep 22, 2025
@junya-takayama junya-takayama marked this pull request as ready for review September 22, 2025 06:56
@junya-takayama junya-takayama requested a review from a team September 22, 2025 06:56
@ryokan0123 (Contributor) left a comment:

LGTM

@junya-takayama junya-takayama merged commit 34847d9 into main Sep 22, 2025
8 checks passed
@junya-takayama junya-takayama deleted the vllm_0.10.1.1 branch September 22, 2025 07:11