In some cases, I want to send a fixed string instead of the model output (e.g., when a user exceeds the chat rate limit, I would like to send a fixed message from the backend). I tried to figure out if there is an easy way to construct such a response. Here is my current route handler:

```ts
import { OpenAIStream, StreamingTextResponse } from 'ai';
// `openai` is the chat-completion client and `isRateLimited` is the app's own
// rate-limit check; both are configured elsewhere and not shown here.

export async function POST({ locals, request, params }) {
  const user = await locals.optionalUser();
  if (isRateLimited(user)) {
    // FIXME: what to return here?
  }

  const { messages } = await request.json() as {
    messages: { role: "user" | "assistant", content: string }[]
  };

  const response = await openai.createChatCompletion({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: [
      {
        role: "system",
        content: "<my-prompt>.",
      },
      ...messages,
    ]
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```
Answered by spy16 (Jul 1, 2023):
I figured out I can do this:

```ts
// Wraps a fixed string in a ReadableStream so it can be passed to
// StreamingTextResponse like a normal completion stream.
const getStream = (text: string) => {
  const encoder = new TextEncoder();
  const encodedChunk = encoder.encode(text);
  return new ReadableStream({
    start(controller) {
      controller.enqueue(encodedChunk); // emit the whole message as one chunk
      controller.close();               // then end the stream
    }
  });
};
```

Let me know if there are better ways to send errors.
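For completeness, a minimal sketch of how this could be wired into the rate-limit branch of the handler above; the message text is an assumption, not from the original thread:

```ts
if (isRateLimited(user)) {
  // Hypothetical fixed message; the client receives it as the streamed reply.
  return new StreamingTextResponse(
    getStream("You have hit the chat rate limit. Please try again later.")
  );
}
```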