Can we use other models, say Llama or Gemini? #436
Unanswered · SwastikGorai asked this question in Q&A

Can we use models hosted in the cloud, or the Gemini API? Or can we use Ollama if we are hosting the model locally?

3 comments · 2 replies
-
I used Gemma 2 locally, installing it with Ollama, and also used Mixtral via the Groq API; a minimal sketch of both follows below.
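A minimal sketch of the setup described above, assuming the `ollama` and `groq` Python packages are installed, `ollama pull gemma2` has been run, and `GROQ_API_KEY` is set in the environment; the model ids are illustrative, not project configuration:

```python
import os

# Local model: Gemma 2 served by Ollama on its default localhost port
# (assumes `ollama pull gemma2` has already been run).
import ollama

local_reply = ollama.chat(
    model="gemma2",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(local_reply["message"]["content"])

# Cloud model: Mixtral via the Groq SDK (the model id is an assumption
# based on Groq's published model names around the time of this thread).
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])
cloud_reply = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(cloud_reply.choices[0].message.content)
```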
-
Have a look at #657.
-
Try LiteLLM, it worked in my case! See the sketch below.
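A minimal sketch of the LiteLLM route, which also covers the Gemini API case from the original question: LiteLLM wraps many providers behind one `completion()` call, selected by a provider-prefixed model string. The model names, localhost endpoint, and `GEMINI_API_KEY` variable are assumptions based on LiteLLM's documented conventions:

```python
from litellm import completion

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# Local model served by Ollama (assumes the default Ollama endpoint).
local = completion(
    model="ollama/gemma2",
    messages=messages,
    api_base="http://localhost:11434",
)

# Gemini via its hosted API (assumes GEMINI_API_KEY is set in the environment).
gemini = completion(model="gemini/gemini-pro", messages=messages)

print(local.choices[0].message.content)
print(gemini.choices[0].message.content)
```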