docs: ollama: added the advanced reverse proxy configuration
enricoros committed Nov 10, 2023
1 parent e5f498c commit 648ab3e
Showing 1 changed file with 36 additions and 4 deletions: docs/config-ollama.md
@@ -23,17 +23,49 @@ For detailed instructions on setting up the Ollama API server, please refer to t
### Visual Guide

* After adding the `Ollama` model vendor, entering the IP address of an Ollama server, and refreshing models:
![config-local-ollama-1-models.png](pixels/config-ollama-1-models.png)
<img src="pixels/config-ollama-1-models.png" alt="config-local-ollama-1-models.png" style="max-width: 320px;">

* The `Ollama` admin panel, with the `Pull` button highlighted, after pulling the "Yi" model:
![config-local-ollama-2-admin-pull.png](pixels/config-ollama-2-admin-pull.png)
<img src="pixels/config-ollama-2-admin-pull.png" alt="config-local-ollama-2-admin-pull.png" style="max-width: 320px;">

* You can now switch model/persona dynamically and text/voice chat with the models:
![config-local-ollama-3-chat.png](pixels/config-ollama-3-chat.png)
<img src="pixels/config-ollama-3-chat.png" alt="config-local-ollama-3-chat.png" style="max-width: 320px;">

### Advanced Configuration
### Advanced: Model parameters

For users who want to dig deeper, `big-AGI` offers additional configuration options, such as the
model temperature, maximum tokens, and more.
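
In big-AGI these are UI settings, but they correspond to Ollama's per-request `options` object. As a rough sketch of what big-AGI sends under the hood (the model name `llama2`, the prompt, and the parameter values are placeholders; this assumes an Ollama server on the default port 11434):

```bash
# Send a one-shot, non-streaming generation request to a local Ollama server,
# setting the sampling temperature and the maximum number of tokens to predict.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": {
    "temperature": 0.7,
    "num_predict": 256
  }
}'
```

Here `temperature` and `num_predict` are the Ollama option names behind big-AGI's "temperature" and "maximum tokens" settings.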

### Advanced: Ollama under a reverse proxy

You can expose your Ollama server to the internet (making it reachable from your server-side
big-AGI deployments) by serving it at an HTTP/HTTPS URL, such as: `https://yourdomain.com/ollama`

On an Ubuntu server, install `nginx` to proxy requests to Ollama, and `certbot` to obtain a TLS certificate:

```bash
sudo apt update
sudo apt install nginx
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d yourdomain.com
```

Then, edit the nginx configuration file `/etc/nginx/sites-enabled/default` and add the following block:

```nginx
location /ollama/ {
    # Note the trailing slash: it strips the /ollama/ prefix, so Ollama
    # receives /api/... instead of /ollama/api/...
    proxy_pass http://localhost:11434/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    # Further proxy settings...
}
```
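
After editing the file, a quick way to apply and check the change (a sketch, assuming the `/ollama/` location above and that `yourdomain.com` stands in for your actual domain):

```bash
# Validate the nginx configuration, then reload it without downtime
sudo nginx -t && sudo systemctl reload nginx

# Verify the proxy end-to-end: this should return Ollama's installed
# model list as JSON, the same response as http://localhost:11434/api/tags
curl https://yourdomain.com/ollama/api/tags
```

If the `curl` returns a 404, double-check the trailing slash on `proxy_pass`.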

Reach out to our community if you need help with this.

### Community and Support

Join our community to share your experiences, get help, and discuss best practices:

1 comment on commit 648ab3e

@vercel bot commented on 648ab3e Nov 10, 2023

Successfully deployed to the following URLs:

big-agi – ./
big-agi-git-main-enricoros.vercel.app
get.big-agi.com
big-agi-enricoros.vercel.app