Please support streaming with the [POST] method, for sending a whole PDF file with 10K+ words to an LLM API #3258
Unanswered · incomingflyingbrick asked this question in Ideas · 0 replies
Hi, I am working on an LLM project and need to use server-sent events to stream data back from the LLM. However, I need to use the POST method: a GET request can't carry more than roughly 2000 characters in the URL, so I have to put my data in the POST body. For example, I need to send a whole PDF file with more than 10K words to the LLM. I love the httpx library; could you please support streaming POST requests, in both sync and async versions? Thank you.
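
For concreteness, here is a minimal sketch of the kind of usage I'm hoping for, in both sync and async flavours. The endpoint URL and payload shape are placeholders, and I'm assuming a `client.stream("POST", ...)`-style call that carries the request body and lets me iterate over the SSE lines of the response:

```python
import asyncio
import httpx

API_URL = "https://example.com/v1/chat/stream"  # placeholder LLM endpoint
payload = {"prompt": "<contents of the 10K+ word PDF>"}  # placeholder body


def stream_sync() -> None:
    # Sync: POST a large body, then iterate the server-sent events as they arrive.
    with httpx.Client(timeout=None) as client:
        with client.stream("POST", API_URL, json=payload) as response:
            response.raise_for_status()
            for line in response.iter_lines():
                if line.startswith("data:"):
                    print(line.removeprefix("data:").strip())


async def stream_async() -> None:
    # Async: the same pattern with AsyncClient and aiter_lines().
    async with httpx.AsyncClient(timeout=None) as client:
        async with client.stream("POST", API_URL, json=payload) as response:
            response.raise_for_status()
            async for line in response.aiter_lines():
                if line.startswith("data:"):
                    print(line.removeprefix("data:").strip())


if __name__ == "__main__":
    stream_sync()
    asyncio.run(stream_async())
```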