---
sidebar_position: 4
---

# Ollama

[Ollama](https://ollama.ai/) is a platform for running open-source large language models locally. It allows you to run a variety of models on your own hardware, providing privacy and control over your AI interactions.

## Setup

To use Ollama with MyCoder:

1. Install Ollama from [ollama.ai](https://ollama.ai/)
2. Start the Ollama service
3. Pull a model that supports tool calling
4. Configure MyCoder to use Ollama

### Installing Ollama

Follow the installation instructions on the [Ollama website](https://ollama.ai/) for your operating system.

For macOS and Linux:

```bash
curl -fsSL https://ollama.ai/install.sh | sh
```

For Windows, download the installer from the Ollama website.

### Pulling a Model

After installing Ollama, you need to pull a model that supports tool calling. **Important: Most Ollama models do not support tool calling**, which is required for MyCoder.

A recommended model that supports tool calling is:

```bash
ollama pull medragondot/Sky-T1-32B-Preview:latest
```

### Environment Variables

You can set the Ollama base URL as an environment variable (it defaults to `http://localhost:11434` if not set):

```bash
export OLLAMA_BASE_URL=http://localhost:11434
```

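The fallback behavior described above can be sketched in a few lines of JavaScript. Note that `resolveOllamaBaseUrl` is a hypothetical helper written for illustration, not part of MyCoder's API:

```javascript
// Hypothetical helper illustrating the documented fallback:
// use OLLAMA_BASE_URL when it is set, otherwise the default local address.
function resolveOllamaBaseUrl(env = process.env) {
  return env.OLLAMA_BASE_URL ?? 'http://localhost:11434';
}

console.log(resolveOllamaBaseUrl({})); // http://localhost:11434
console.log(resolveOllamaBaseUrl({ OLLAMA_BASE_URL: 'http://192.168.1.50:11434' })); // http://192.168.1.50:11434
```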
### Configuration

Configure MyCoder to use Ollama in your `mycoder.config.js` file:

```javascript
export default {
  // Provider selection
  provider: 'ollama',
  model: 'medragondot/Sky-T1-32B-Preview:latest',

  // Optional: Custom base URL (defaults to http://localhost:11434)
  // ollamaBaseUrl: 'http://localhost:11434',

  // Other MyCoder settings
  maxTokens: 4096,
  temperature: 0.7,
  // ...
};
```

## Tool Calling Support

**Important**: For MyCoder to function properly, the Ollama model must support tool calling (function calling). Most open-source models available through Ollama **do not** support this feature yet.

Confirmed models with tool calling support:

- `medragondot/Sky-T1-32B-Preview:latest` - Recommended for MyCoder

If using other models, verify their tool calling capabilities before attempting to use them with MyCoder.

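One way to verify a model yourself is to send it a chat request that includes a tool definition and check whether the reply contains tool calls. The sketch below builds such a request for Ollama's `/api/chat` endpoint; the request shape follows Ollama's chat API, while the `get_current_time` tool is a made-up example:

```javascript
// Build a minimal /api/chat request containing one tool definition.
// A tool-capable model should respond with message.tool_calls instead
// of (or alongside) plain text; other models ignore the tools or error.
function buildToolProbe(model) {
  return {
    model,
    stream: false,
    messages: [{ role: 'user', content: 'What time is it in UTC?' }],
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_time', // made-up example tool
          description: 'Get the current time in a given timezone',
          parameters: {
            type: 'object',
            properties: { timezone: { type: 'string' } },
            required: ['timezone'],
          },
        },
      },
    ],
  };
}

// Send the probe with fetch (Node 18+) to a running Ollama server.
async function probeToolSupport(baseUrl, model) {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildToolProbe(model)),
  });
  const data = await res.json();
  return Array.isArray(data.message?.tool_calls);
}
```

For example, `probeToolSupport('http://localhost:11434', 'medragondot/Sky-T1-32B-Preview:latest')` should resolve to `true` against a local server with that model pulled.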
## Hardware Requirements

Running large language models locally requires significant hardware resources:

- Minimum 16GB RAM (32GB+ recommended)
- GPU with at least 8GB VRAM for optimal performance
- SSD storage for model files (models can be 5-20GB each)

## Best Practices

- Start with smaller models if you have limited hardware
- Ensure your model supports tool calling before using with MyCoder
- Run on a machine with a dedicated GPU for better performance
- Consider using a cloud provider's API for resource-intensive tasks if local hardware is insufficient

## Troubleshooting

If you encounter issues with Ollama:

- Verify the Ollama service is running (`ollama serve`)
- Check that you've pulled the correct model
- Ensure the model supports tool calling
- Verify your hardware meets the minimum requirements
- Check Ollama logs for specific error messages

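The first check can be automated. This small sketch (Node 18+ `fetch`) treats the plain-text banner a healthy server returns on its root path as a liveness signal; adjust the URL if you changed `OLLAMA_BASE_URL`:

```javascript
// Reachability check for a local Ollama server. A healthy server
// answers its root path with the plain-text banner "Ollama is running".
async function checkOllama(baseUrl = 'http://localhost:11434') {
  try {
    const res = await fetch(baseUrl);
    return (await res.text()).includes('Ollama is running');
  } catch {
    // Connection refused usually means `ollama serve` is not running.
    return false;
  }
}

checkOllama().then((up) => console.log(up ? 'Ollama is reachable' : 'Ollama is not reachable'));
```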
For more information, visit the [Ollama Documentation](https://github.com/ollama/ollama/tree/main/docs).