Replies: 5 comments
-
Ollama's backend API defaults to http://localhost:11434. You need to install a model in Ollama and start the service first; only after that is the port open and the API usable.
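A quick way to confirm that endpoint is actually live is to request the root URL. A minimal Python sketch, assuming the default port 11434 and the "Ollama is running" banner that Ollama's root endpoint returns (verify against your installed version):

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server answers at base_url, False otherwise."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # Ollama's root endpoint replies with the plain text "Ollama is running".
            return resp.status == 200 and "Ollama is running" in resp.read().decode()
    except (urllib.error.URLError, OSError):
        # Connection refused or timeout: the service is not reachable.
        return False

print("Ollama reachable:", ollama_is_up())
```

If this prints `False`, start the server (e.g. `ollama serve`) and pull a model before pointing a client at the API.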
-
Ollama is frankly an insult to the user's intelligence. It's lazily made and constantly hogs memory.
-
The API key field has to be filled in.
-
Thanks, it's connected now. Also, you can't have a proxy enabled.
-
Entering `ollama` as the API key worked in my test, thanks.
-
I can't connect to my local Ollama. I suspect it's the LLM connection template: it requires an Ollama API key, but a locally installed Ollama doesn't have one.
