Is anyone proxying Titan embeddings for LibreChat? It worked fine for us for months, but it broke, possibly with our update to LiteLLM v1.80.5-stable. (It is definitely broken for us today, and the timing correlates with our move to that version of LiteLLM.)
LibreChat embedding requests to LiteLLM now generate this error from Bedrock back to LiteLLM for us:
```
Malformed input request: expected type: String, found: JSONArray, please reformat your input and try again.
```
This is our LiteLLM configuration for this embedding model:
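For reference, a minimal LiteLLM `model_list` entry for this model looks roughly like the following sketch; the region and any extra parameters here are placeholders, not our production values:

```yaml
# Illustrative LiteLLM proxy config sketch (not the exact production file)
model_list:
  - model_name: amazon.titan-embed-text-v2:0
    litellm_params:
      model: bedrock/amazon.titan-embed-text-v2:0  # "bedrock/" provider prefix
      aws_region_name: us-east-1                   # placeholder: use your region
```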
Note that simple curl tests to the proxy work fine:
```shell
curl -X POST https://our.litellm.hostname/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_LITELLM_API_KEY" \
  -d '{
    "model": "amazon.titan-embed-text-v2:0",
    "input": "The quick brown fox jumps over the lazy dog"
  }'
```
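Since the Bedrock error complains about a `JSONArray` where a `String` was expected, one hypothesis (untested on our side so far) is that array-form `input` is what breaks: Titan's native request body takes a single `inputText` string, so the proxy has to fan an array out into separate Bedrock calls. A variant of the same curl test would confirm whether arrays alone reproduce the failure:

```shell
# Hypothesis check: does the proxy still handle array-form input?
curl -X POST https://our.litellm.hostname/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_LITELLM_API_KEY" \
  -d '{
    "model": "amazon.titan-embed-text-v2:0",
    "input": ["The quick brown fox", "jumps over the lazy dog"]
  }'
```

If this returns the same `Malformed input request` error while the single-string test succeeds, that would point at array handling in the proxy rather than at LibreChat.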
For completeness, here are the relevant settings from our LibreChat (v0.7.9) deployment, which formerly worked:
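For anyone comparing setups: LibreChat routes embeddings through its RAG API, configured via environment variables along these lines (the variable names are the standard RAG API ones; the values below are placeholders, not our exact settings):

```
EMBEDDINGS_PROVIDER=openai
EMBEDDINGS_MODEL=amazon.titan-embed-text-v2:0
RAG_OPENAI_BASEURL=https://our.litellm.hostname/v1
RAG_OPENAI_API_KEY=YOUR_LITELLM_API_KEY
```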
I poked around the LiteLLM issues on GitHub, since we associate the breaking change with the LiteLLM update, but I didn't find a likely candidate.
There are a number of pieces in this chain, and I am still digging, but if anyone else is experiencing this, please do share!