Run LLM with Ollama inside Daytona workspace #2
/bounty $100

💎 $100 bounty • Daytona

Steps to solve: If no one is assigned to the issue, feel free to tackle it without confirmation from us, after registering your attempt. In the event that multiple PRs are made by different people, we will generally accept the one with the cleanest code. Please respect others by only working on PRs that you are allowed to submit attempts to; e.g., if you have reached the limit of active attempts, please wait until you can register a new one before submitting another PR. If you cannot submit an attempt, you will not receive your payout. Thank you for contributing to daytonaio/content!

/attempt #2
💡 @Kiran1689 submitted a pull request that claims the bounty. You can visit your bounty board to reward it.

🎉🎈 @Kiran1689 has been awarded $100! 🎈🎊
Content Type
Guide
Article Description
This should be a how-to article on running Ollama within Daytona.
Base the article on the linked reference repository, and potentially make it work with an AI coding VS Code extension (propose which) that consumes the Ollama API; a sketch of one option follows.
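
One candidate extension (our suggestion, not something this issue settles on) is Continue, which can use a local Ollama server as its model backend. A sketch of the relevant entry in Continue's JSON config, assuming the phi3 model used by the reference playground:

```json
{
  "models": [
    {
      "title": "phi3 (local Ollama)",
      "provider": "ollama",
      "model": "phi3"
    }
  ]
}
```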
Target Audience
individual dev
References/Resources
https://github.com/pamelafox/ollama-python-playground/
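
The reference playground is a Python project; a minimal sketch of the kind of interaction the article could walk through, using the ollama Python client (the phi3 model name follows the playground and is otherwise an assumption):

```python
# Minimal sketch: chat with a model served by Ollama inside the workspace.
# Assumes the Ollama server is running (`ollama serve`) and the model has
# been pulled (`ollama pull phi3`). Requires: pip install ollama
import ollama

response = ollama.chat(
    model="phi3",
    messages=[{"role": "user", "content": "Why use a dev container?"}],
)
print(response["message"]["content"])
```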
Examples
No response
Special Instructions
Ensure that the devcontainer is optimal and runs well both in Daytona and in local VS Code without Daytona; see the sketch below.
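
For reference, a minimal devcontainer.json sketch along these lines; the base image, install script, and lifecycle commands are assumptions, and the reference repository may structure things differently (e.g. via Docker Compose):

```jsonc
// .devcontainer/devcontainer.json -- minimal sketch (all values are assumptions)
{
  "name": "ollama-playground",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  // Install Ollama once, when the container is first created.
  "postCreateCommand": "curl -fsSL https://ollama.com/install.sh | sh",
  // Start the Ollama server in the background on each container start.
  "postStartCommand": "nohup ollama serve > /tmp/ollama.log 2>&1 &",
  // Ollama's default API port, so extensions and Daytona can reach it.
  "forwardPorts": [11434]
}
```

Keeping the setup in lifecycle commands rather than a custom Dockerfile keeps the container equally usable in Daytona and in plain local VS Code, which is what these special instructions ask for.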