[Usage] How to use ollama with local server #445
Comments
Could you try a newer LLM supported by ollama, for example llama3:8b, to see whether it still has the problem?
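For a quick sanity check outside AIOS, you could also query ollama directly. A minimal sketch using the ollama Python client (the prompt is just a placeholder):

```python
import ollama  # pip install ollama

# Talk to the locally running ollama server directly, bypassing AIOS,
# to confirm the model itself responds.
response = ollama.chat(
    model="llama3:8b",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response["message"]["content"])
```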
OK, I will give it a try as you suggest.
When I use llama3:8b, the result is the same.
I see, that could be a connection issue between the kernel and the ollama models; let me take a look.
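In the meantime, one way to rule out a plain connectivity problem is to hit ollama's HTTP API directly. A sketch assuming ollama's default address http://localhost:11434 (adjust if your server listens elsewhere):

```python
import requests

# /api/tags lists the models the local ollama server has available;
# a connection error here means the kernel cannot reach it either.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])
```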
OK, thank you for your help.
The GPU environment on my side looks good. The CPU environment may not support LLMs exceeding 7B, as the long context could cause timeouts. Try a smaller model, e.g., the Qwen-2.5 series (1.5B and 3B), to see how it works.
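For reference, pulling and smoke-testing a smaller model could look like this (a sketch via the ollama Python client; qwen2.5:1.5b is assumed to be the published registry tag):

```python
import ollama

# Pull a small model that fits CPU-only inference, then run a short
# prompt to gauge latency before wiring it into AIOS.
ollama.pull("qwen2.5:1.5b")
reply = ollama.chat(
    model="qwen2.5:1.5b",
    messages=[{"role": "user", "content": "Reply with the word 'ready'."}],
)
print(reply["message"]["content"])
```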
Thank you, let me give it a try.
Sorry, I encountered another issue. When I execute the command as you suggested, the results have the following errors:
For the issue above, I added a breakpoint and found that when the program executes `tools = llm_syscall.query.tools` in the `address_syscall` function in the aios/llm_core/cores/local/ollama.py file, it returns null. Why does it return null, and how can I fix this?
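For context, this is roughly the spot I instrumented (only the `tools = llm_syscall.query.tools` line is from the actual file; the logging around it is a temporary addition of mine):

```python
import logging

logger = logging.getLogger(__name__)

# Temporary diagnostic inside address_syscall in
# aios/llm_core/cores/local/ollama.py: log what the syscall
# actually carries before the tools value is used.
tools = llm_syscall.query.tools
logger.warning("query.tools = %r", tools)  # logs None for this agent
```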
Let me try to reproduce this issue and see how it happens.
Check out v0.2.1 to see whether it works. If it does, could you submit a PR to change the pointer from v0.2.0.beta to the latest version? Thanks
Hello, could you please tell me how to fix the issue (the code below is in AIOS/aios/llm_core/cores/local/ollama.py)?
The language tutor agent does not import extra tools, so the query.tools param is null. Does this block the overall execution of this agent?
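If a null tools value does break the ollama path, a defensive guard along these lines would be one way around it (a sketch, not the project's actual code; `client`, `model_name`, and `messages` are hypothetical stand-ins for whatever the real call site uses):

```python
def chat_with_optional_tools(client, model_name, messages, llm_syscall):
    """Call the chat endpoint, passing tools only when the agent
    actually registered some (query.tools may legitimately be None)."""
    tools = llm_syscall.query.tools
    if tools:
        # The agent registered external tools: forward their schema.
        return client.chat(model=model_name, messages=messages, tools=tools)
    # Agents without tools (e.g., the language tutor) take the plain path.
    return client.chat(model=model_name, messages=messages)
```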
Checked other resources
Your current environment
Ubuntu 20.04
AIOS: v0.2.0.beta
How would you like to use aios
Hi,
I set up the environment following the README, and I plan to use ollama as the backend.
I run ollama via the AIOS-LSFS/scripts/run_agent.sh script, and the kernel outputs are as follows:
How can I fix these issues? Thank you