Using local models #203
chenxizhang started this conversation in Use cases
This is possible thanks to Ollama, which makes it easy to run models locally, even on a machine without a GPU. Visit https://ollama.ai to learn more; Windows users can get started quickly via https://ollama.com/blog/windows-preview. Hundreds of open-source models are currently available, and you can even build your own by modifying existing ones.
Here is an example.
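Once the Ollama server is running, it exposes a local REST API on port 11434. Below is a minimal Python sketch of calling it through that API; the model name `llama3` is just an example, so substitute whatever model you have pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    # stream=False asks the server for a single complete response
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Assumes you have already run `ollama pull llama3`
    # and the server is up (Ollama runs it automatically on install)
    print(generate("llama3", "Why is the sky blue?"))
```

You can also skip the API entirely and chat interactively from a terminal with `ollama run llama3`.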