This repository has been archived by the owner on Sep 12, 2024. It is now read-only.
Which backend? #82
Answered by elijah629
DavidBDiligence asked this question in Q&A
Hi, I'm trying to get started, and the instructions call for installing one of three inference backends (llama.cpp, llm, or rwkv.cpp) without any guidance on how to choose between them. Could you please provide a quick-start recommendation for a beginner?
Answered by elijah629 on Jun 20, 2023
Replies: 1 comment
Personally, I would recommend llama.cpp; it works well with many different types of models.
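For context, getting the upstream llama.cpp backend built from source looked roughly like this at the time of this answer (mid-2023). This is a sketch based on llama.cpp's own README, not this project's documentation; the model path is a placeholder, and this project's integration steps may differ:

```shell
# Clone the upstream llama.cpp repository
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Build with make (the build method documented by llama.cpp at the time;
# newer versions use CMake instead)
make

# Run inference against a local GGML-format model file.
# The model filename below is a placeholder -- supply your own model.
./main -m ./models/your-model.ggml.bin -p "Hello" -n 64
```

The `-m` (model path), `-p` (prompt), and `-n` (tokens to generate) flags shown are llama.cpp's standard CLI options from that era.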
0 replies
Answer selected by DavidBDiligence