Is it possible to do continuous batching with an openai ChatCompletion compatible interface? #1605
The following code shows an example of doing continuous batching. Can it be done in OpenAI ChatCompletion format?
Answered by simon-mo on Nov 9, 2023
The OpenAI-compatible server already batches concurrent requests automatically. Just send concurrent requests with any OpenAI-compatible client!
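To illustrate, here is a minimal sketch of sending concurrent requests so the server can batch them. The `fan_out` helper, the `localhost:8000` endpoint, and the model name are all assumptions for illustration, not part of the original answer; they assume a vLLM OpenAI-compatible server is already running.

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(call, prompts, max_workers=8):
    """Issue all prompts concurrently.

    Each in-flight request arrives at the server independently, so the
    server's continuous batching can group them; no client-side batching
    API is needed.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(call, prompts))

# Example usage against a vLLM server (hypothetical endpoint and model):
#
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
#
# def call(prompt):
#     resp = client.chat.completions.create(
#         model="meta-llama/Llama-2-7b-chat-hf",  # placeholder model name
#         messages=[{"role": "user", "content": prompt}],
#     )
#     return resp.choices[0].message.content
#
# answers = fan_out(call, ["Hello!", "What is continuous batching?"])
```

Each thread blocks on its own HTTP request, so all prompts are in flight at once and the server decides how to batch them step by step.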
Answer selected by msugiyama57