
Fix native Ollama summarizer config by adapting execution to pi-ai's OpenAI-compatible path#88

Draft
sene1337 wants to merge 2 commits into Martian-Engineering:main from sene1337:fix/native-ollama-adapter

Conversation

@sene1337

Origin: Created 2026-03-15 by Sene after the native-Ollama adapter patch passed the full lossless-claw test suite locally. Parent processes: Project Planning, Project Containment SOP, and the project tracker at /Users/senemaro/.openclaw/workspace/docs/projects/lossless-claw-ollama-pr/project-plan.md.

PR Draft — lossless-claw native Ollama summarizer adapter

Proposed title

Fix native Ollama summarizer config by adapting execution to pi-ai's OpenAI-compatible path

Maintainer-facing summary

This patch closes the gap where lossless-claw resolves OpenClaw-native Ollama config during summarizer setup, but the bundled @mariozechner/pi-ai execution path cannot execute api: "ollama" directly.

Instead of widening execution support or changing behavior for every provider, this patch keeps the blast radius narrow:

  • When the summarizer's runtime provider config resolves to native api: "ollama", adapt execution to pi-ai's supported openai-completions lane.
  • Normalize the provider baseUrl to <ollama-base>/v1.
  • Use the runtime/config apiKey when present; otherwise fall back to ollama-local.
  • Preserve existing behavior for non-Ollama providers.
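The adaptation above can be sketched roughly as follows. This is an illustrative sketch only: `ProviderConfig` and `normalizeOllamaProvider` are hypothetical names, not the actual identifiers in src/summarize.ts.

```typescript
// Hypothetical shape of the resolved summarizer provider config.
interface ProviderConfig {
  api: string;
  baseUrl: string;
  apiKey?: string;
  model: string;
}

// Rewrite native Ollama config onto pi-ai's OpenAI-compatible lane;
// leave every other provider untouched.
function normalizeOllamaProvider(config: ProviderConfig): ProviderConfig {
  if (config.api !== "ollama") return config;

  return {
    ...config,
    // pi-ai can execute this api, unlike native "ollama".
    api: "openai-completions",
    // Normalize to <ollama-base>/v1, avoiding a double slash.
    baseUrl: config.baseUrl.replace(/\/+$/, "") + "/v1",
    // Prefer a configured key; Ollama's local server accepts a placeholder.
    apiKey: config.apiKey ?? "ollama-local",
  };
}
```

The pass-through for non-Ollama providers is what keeps the blast radius narrow: hosted provider configs flow through byte-for-byte unchanged.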

Why this approach

  • matches how bundled pi-ai already talks to Ollama in OpenAI-compatible mode
  • avoids pretending native api: "ollama" execution exists in the current bundled provider registry
  • keeps the change local to the LCM summarizer path that currently fails
  • adds focused tests for both base URL normalization and config-backed apiKey fallback

Files changed

  • src/summarize.ts
  • test/summarize.test.ts

Validation performed

npm ci
npx vitest run test/summarize.test.ts
npx vitest run test/index-complete-provider-config.test.ts
npm test

Result:

  • test/summarize.test.ts: passed
  • test/index-complete-provider-config.test.ts: passed
  • full suite: 20 test files passed, 249 tests passed

Notes / scope boundaries

  • This patch intentionally targets the summarizer path only.
  • It does not claim that every possible future lossless-claw execution path automatically supports native OpenClaw Ollama config without similar adaptation.
  • It preserves the configured provider id (ollama) while adapting execution inputs for pi-ai compatibility.

Changelog

Version  Date        Change
1.0      2026-03-15  Created PR draft after adapter patch + full test-suite validation.

@sene1337 sene1337 marked this pull request as draft March 15, 2026 19:37
@darbsllim

We couldn't get this working: it solved the LCM errors, but introduced new errors that caused Telegram and TUI instability.

Giving up on it for now.

@sene1337
Author

Follow-up from live testing on an OpenClaw gateway with lossless-claw active as the context engine:

  • Before any local patching, the failing path reported No API key for provider: ollama.
  • After setting OLLAMA_API_KEY=ollama-local and restarting, the failure changed to 401 Incorrect API key provided: ollama-local, and the diagnostic envelope showed request_provider=ollama, request_model=qwen2.5:14b, request_api=openai-responses.
  • After deploying the adapter in this PR, that old auth error disappeared, but LCM still failed in production with No API provider registered for api: ollama.

So in live use this patch changed the failure mode, but did not fully cover the production path.

My current read is that some summarizer calls still reach deps.complete() without the compat rewrite applied, so they fall back to inferring native api=ollama, while the runtime path in play still has no native ollama provider registered.

Operationally, because this was happening inside the live gateway/Lossless Context path, it caused user-visible instability during testing, so I rolled back the local deploy and paused further Ollama + lossless-claw work for now.

Hosted provider lanes remain stable; the breakage we hit was specific to the Ollama path.

If helpful, I can also post a narrower follow-up pointing to the exact runtime layer that seems to need the deterministic fix (index.ts / final completion shim) rather than only src/summarize.ts.

