Fix native Ollama summarizer config by adapting execution to pi-ai's OpenAI-compatible path #88
Conversation
We couldn't get this working ... it solved the LCM errors, but introduced new errors that caused Telegram and TUI instability. Giving up on it for now.
Follow-up from live testing on an OpenClaw gateway:

So in live use this patch changed the failure mode, but did not fully cover the production path. My current read is that some summarizer calls are still reaching the native Ollama execution path that pi-ai cannot handle. Operationally, because this was happening inside the live gateway/Lossless Context path, it caused user-visible instability during testing, so I rolled back the local deploy and paused further Ollama + lossless-claw work for now. Hosted provider lanes remain stable; the breakage we hit was specific to the Ollama path. If helpful, I can also post a narrower follow-up pointing to the exact runtime layer that seems to need the deterministic fix.
PR Draft — lossless-claw native Ollama summarizer adapter
Proposed title
Fix native Ollama summarizer config by adapting execution to pi-ai's OpenAI-compatible path
Maintainer-facing summary
This patch fixes the current gap where lossless-claw can resolve OpenClaw-native Ollama config during summarizer setup, but the bundled `@mariozechner/pi-ai` execution path cannot execute `api: "ollama"` directly. Instead of widening execution support or changing behavior for every provider, this patch keeps the blast radius narrow:
- Route `api: "ollama"` through the `openai-completions` lane
- Rewrite `baseUrl` to `<ollama-base>/v1`
- Use `apiKey` when present, otherwise fall back to `ollama-local`

Why this approach
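The three adaptation steps above can be sketched as a single config-mapping function. This is a hypothetical illustration, not the actual lossless-claw code: the `SummarizerConfig` shape and the `adaptOllamaForPiAi` name are assumptions made for the example; only the `api`/`baseUrl`/`apiKey` fields and the `ollama-local` fallback come from the PR description.

```typescript
// Illustrative shape for a summarizer provider config (assumed, not the real type).
interface SummarizerConfig {
  api: string;
  baseUrl: string;
  apiKey?: string;
}

// Route a native Ollama config onto pi-ai's OpenAI-compatible lane:
// switch the api id, append /v1 to the base URL, and fall back to a
// placeholder key when none is configured. Other providers pass through.
function adaptOllamaForPiAi(config: SummarizerConfig): SummarizerConfig {
  if (config.api !== "ollama") return config; // keep blast radius narrow
  return {
    api: "openai-completions",
    baseUrl: `${config.baseUrl.replace(/\/$/, "")}/v1`,
    apiKey: config.apiKey ?? "ollama-local",
  };
}

const adapted = adaptOllamaForPiAi({
  api: "ollama",
  baseUrl: "http://localhost:11434",
});
console.log(adapted.api, adapted.baseUrl, adapted.apiKey);
// openai-completions http://localhost:11434/v1 ollama-local
```

The pass-through branch is what keeps the change scoped: hosted provider configs reach execution unmodified, so only the Ollama lane's behavior changes.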
No `api: "ollama"` execution lane exists in the current bundled provider registry

Files changed
- `src/summarize.ts`
- `test/summarize.test.ts`

Validation performed
Result:
- `test/summarize.test.ts`: passed
- `test/index-complete-provider-config.test.ts`: passed
- 20 passed, 249 tests passed

Notes / scope boundaries
Config resolution stays native (`ollama`) while adapting execution inputs for pi-ai compatibility.

Changelog