Connector Implementation & Inheritance Ollama #456
Merged: tae0y merged 25 commits into `aliencube:main` from `donghyeon639:feat/269-ollama-connector-implementation-clean` on Oct 25, 2025.

Commits (25):
- 9598325 Connector Implementation & Inheritance (donghyeon639)
- a1f9683 Update OllamaConnectorTests.cs with latest improvements (donghyeon639)
- 8add602 fix OllamaconnetcorTests.cs (donghyeon639)
- 22eafbd fix local container ollama (donghyeon639)
- ae1f2a5 fix: ollama bicep, ollama.md (donghyeon639)
- 0d42437 conflicts fix and README.MD ollama (donghyeon639)
- 8bb60fa fix resuorces.bicep, mainparemeter.json (donghyeon639)
- 61b73b0 comfilct fix (donghyeon639)
- 791da59 fix conflict and Ollama Tests,ollama.md (donghyeon639)
- f7c1d84 ollama test fix (donghyeon639)
- eebb2cf fix ollama.md Ollamatests (donghyeon639)
- a3a109a conflict fix (donghyeon639)
- e78460d confilct fix (donghyeon639)
- 52417c2 fix ollama test (donghyeon639)
- 226b5fc fix ollama bicep (donghyeon639)
- 68302e9 fix root README.md (donghyeon639)
- a03cb5c fix ollamatest , ollama.md (donghyeon639)
- c1abb53 Merge latest changes from upstream/main and fix OllamaConnectorTests (donghyeon639)
- bf43b02 fix ollamatest, ollama.md (donghyeon639)
- 45ff64b ollama test languagemodel unit test (donghyeon639)
- 3c4bdd5 ollama IntegrationTest fix (donghyeon639)
- bd231cd Update ollama.md (tae0y)
- adee6cf Update OllamaConnectorTests.cs (tae0y)
- ac958a0 Merge branch 'main' into feat/269-ollama-connector-implementation-clean (tae0y)
- c76f1d3 Update OllamaConnectorTests.cs (tae0y)
New file (+171 lines):
# OpenChat Playground with Ollama

This page describes how to run OpenChat Playground (OCP) with Ollama integration.

## Get the repository root

1. Get the repository root.

    ```bash
    # bash/zsh
    REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
    ```

    ```powershell
    # PowerShell
    $REPOSITORY_ROOT = git rev-parse --show-toplevel
    ```
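    You can verify the variable is set before moving on:

    ```bash
    echo $REPOSITORY_ROOT
    ```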
## Run on local machine

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Make sure Ollama is installed and running on your local machine. If not, install Ollama from [ollama.com](https://ollama.com/) and start the service.

    ```bash
    # Start Ollama service
    ollama serve
    ```

1. Pull the model you want to use. The examples below use `llama3.2`; replace the model name with your preferred one.

    ```bash
    # Example: Pull llama3.2 model
    ollama pull llama3.2

    # Or pull other models
    ollama pull mistral
    ollama pull phi3
    ollama pull qwen
    ```
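    To confirm the pull succeeded, you can list the models available locally:

    ```bash
    ollama list
    ```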
1. Run the app.

    ```bash
    dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- --connector-type Ollama --model llama3.2
    ```

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.
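If the app cannot reach the model, Ollama's REST API offers a quick way to check that the service is up and the model is registered:

```bash
# Should return the models pulled above
curl http://localhost:11434/api/tags
```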
## Run in local container

This approach runs OpenChat Playground in a container while connecting to Ollama running on the host machine.

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Configure Ollama to accept connections from containers.

    ```powershell
    # PowerShell (Windows)
    $env:OLLAMA_HOST = "0.0.0.0:11434"

    # Start Ollama service
    ollama serve
    ```

    ```bash
    # bash/zsh (Linux/macOS)
    export OLLAMA_HOST=0.0.0.0:11434
    ollama serve
    ```
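    To check that the new binding took effect, confirm the service is listening on all interfaces rather than only on loopback:

    ```bash
    ss -ltn | grep 11434    # Linux: expect 0.0.0.0:11434
    lsof -i :11434          # macOS alternative
    ```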
1. Pull the model you want to use.

    ```bash
    # Pull llama3.2 model (recommended)
    ollama pull llama3.2

    # Verify Ollama is accessible
    curl http://localhost:11434/api/version
    ```
1. Build a container.

    ```bash
    docker build -f Dockerfile -t openchat-playground:latest .
    ```
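    You can confirm the image exists before running it:

    ```bash
    docker images openchat-playground
    ```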
1. Run the app.

    ```bash
    # Using command-line arguments
    docker run -i --rm -p 8080:8080 \
        openchat-playground:latest \
        --connector-type Ollama \
        --base-url http://host.docker.internal:11434 \
        --model llama3.2
    ```

    ```bash
    # Alternative: Using environment variables
    docker run -i --rm -p 8080:8080 \
        -e ConnectorType=Ollama \
        -e Ollama__BaseUrl=http://host.docker.internal:11434 \
        -e Ollama__Model=llama3.2 \
        openchat-playground:latest
    ```

    > **NOTE**: Use `host.docker.internal:11434` to connect to Ollama running on the host machine from inside the container. Make sure `OLLAMA_HOST=0.0.0.0:11434` is set on the host.
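    If the container cannot reach Ollama, it can help to test host connectivity from a throwaway container first, for example with the public `curlimages/curl` image. Note that on Linux, `host.docker.internal` is not defined by default and needs an explicit mapping; the same `--add-host` flag would then also be required on the `docker run` command above.

    ```bash
    # macOS/Windows: host.docker.internal resolves out of the box
    docker run --rm curlimages/curl http://host.docker.internal:11434/api/version

    # Linux: map host.docker.internal to the host gateway explicitly
    docker run --rm --add-host=host.docker.internal:host-gateway \
        curlimages/curl http://host.docker.internal:11434/api/version
    ```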
1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.
## Run on Azure

1. Make sure you are at the repository root.

    ```bash
    cd $REPOSITORY_ROOT
    ```

1. Log in to Azure.

    ```bash
    # Login to Azure Dev CLI
    azd auth login
    ```

1. Check login status.

    ```bash
    # Azure Dev CLI
    azd auth login --check-status
    ```
1. Initialize the `azd` template.

    ```bash
    azd init
    ```

    > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Set the Ollama configuration as azd environment variables.

    ```bash
    # Set connector type to Ollama
    azd env set CONNECTOR_TYPE "Ollama"

    # Optionally, set a specific model (default is llama3.2)
    azd env set OLLAMA_MODEL "llama3.2"
    ```
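    You can inspect the stored values with:

    ```bash
    azd env get-values
    ```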
1. Run the following command to provision and deploy the app.

    ```bash
    azd up
    ```

    > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

1. Clean up all the resources once you are done.

    ```bash
    azd down --force --purge
    ```
New file (+53 lines):
```csharp
using Microsoft.Extensions.AI;

using OllamaSharp;

using OpenChat.PlaygroundApp.Abstractions;
using OpenChat.PlaygroundApp.Configurations;

namespace OpenChat.PlaygroundApp.Connectors;

/// <summary>
/// This represents the connector entity for Ollama.
/// </summary>
public class OllamaConnector(AppSettings settings) : LanguageModelConnector(settings.Ollama)
{
    /// <inheritdoc/>
    public override bool EnsureLanguageModelSettingsValid()
    {
        var settings = this.Settings as OllamaSettings;
        if (settings is null)
        {
            throw new InvalidOperationException("Missing configuration: Ollama.");
        }

        // IsNullOrWhiteSpace already covers null and whitespace-only values,
        // so no Trim() is needed and a null BaseUrl cannot throw
        // NullReferenceException before this guard fires.
        if (string.IsNullOrWhiteSpace(settings.BaseUrl))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:BaseUrl.");
        }

        if (string.IsNullOrWhiteSpace(settings.Model))
        {
            throw new InvalidOperationException("Missing configuration: Ollama:Model.");
        }

        return true;
    }

    /// <inheritdoc/>
    public override async Task<IChatClient> GetChatClientAsync()
    {
        var settings = this.Settings as OllamaSettings;
        var baseUrl = settings!.BaseUrl!;
        var model = settings!.Model!;

        var config = new OllamaApiClient.Configuration
        {
            Uri = new Uri(baseUrl),
            Model = model,
        };

        var chatClient = new OllamaApiClient(config);

        return await Task.FromResult(chatClient).ConfigureAwait(false);
    }
}
```
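For reference, the two settings this connector validates (`Ollama:BaseUrl` and `Ollama:Model`) map directly onto the options shown in the documentation above. A sketch of the two equivalent ways to supply them, assuming standard .NET configuration binding of `__` to `:` in environment variable names:

```bash
# Command-line arguments (as in the docker example above)
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
    --connector-type Ollama \
    --base-url http://localhost:11434 \
    --model llama3.2

# Environment variables: Ollama__BaseUrl binds to Ollama:BaseUrl
export ConnectorType=Ollama
export Ollama__BaseUrl=http://localhost:11434
export Ollama__Model=llama3.2
```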