
Is it possible to do continuous batching with an openai ChatCompletion compatible interface? #1605

Closed Answered by simon-mo
msugiyama57 asked this question in Q&A

The OpenAI-compatible server already batches concurrent requests automatically; just send concurrent requests from any OpenAI-compatible client and they will be batched together.
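A minimal sketch of what "just send concurrent requests" looks like from the client side. The server does the continuous batching; the client only needs to have multiple requests in flight at once, e.g. via a thread pool. The `base_url`, model name, and prompts in the comments are illustrative assumptions, not values from this discussion:

```python
# Sketch: fan out concurrent requests to an OpenAI-compatible server.
# The server batches whatever requests are in flight at the same time,
# so plain client-side concurrency is all that is needed.
from concurrent.futures import ThreadPoolExecutor

def fan_out(request_fn, prompts, max_workers=8):
    """Send one request per prompt concurrently; return results in prompt order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(request_fn, prompts))

# In practice request_fn would wrap the OpenAI client, e.g. (assumed values):
#
#   from openai import OpenAI
#   client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
#   def ask(prompt):
#       resp = client.chat.completions.create(
#           model="meta-llama/Llama-2-7b-chat-hf",  # assumed model name
#           messages=[{"role": "user", "content": prompt}],
#       )
#       return resp.choices[0].message.content

# Stand-in request_fn so this sketch runs without a live server:
def ask(prompt):
    return f"echo: {prompt}"

results = fan_out(ask, ["hello", "world"])
print(results)
```

Because `ThreadPoolExecutor.map` preserves input order, responses line up with their prompts even though the requests complete in arbitrary order on the server.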

Answer selected by msugiyama57