diff --git a/README.md b/README.md
index 5c4b5cadf..d28c4a870 100644
--- a/README.md
+++ b/README.md
@@ -220,7 +220,7 @@ During onboarding, NemoClaw validates the selected provider and model before it
 
 Credentials stay on the host in `~/.nemoclaw/credentials.json`. The sandbox only sees the routed `inference.local` endpoint, not your raw provider key.
 
-Local Ollama is supported in the standard onboarding flow. Local vLLM remains experimental, and local host-routed inference on macOS still depends on OpenShell host-routing support in addition to the local service itself being reachable on the host.
+Local Ollama is supported in the standard onboarding flow. On Linux Docker hosts, Ollama must listen on `0.0.0.0:11434` so sandboxes can reach `http://host.openshell.internal:11434`; a loopback-only `127.0.0.1:11434` bind will fail validation. Local vLLM remains experimental, and local host-routed inference on macOS still depends on OpenShell host-routing support in addition to the local service itself being reachable on the host.
 
 ---
 
diff --git a/docs/reference/inference-profiles.md b/docs/reference/inference-profiles.md
index f1c1a4f49..1b6b32d9d 100644
--- a/docs/reference/inference-profiles.md
+++ b/docs/reference/inference-profiles.md
@@ -76,6 +76,16 @@ Ollama gets additional onboarding help:
 - it warms the model
 - it validates the model before continuing
 
+On Linux hosts that run NemoClaw with Docker, the sandbox reaches Ollama through `http://host.openshell.internal:11434`, not the host shell's `localhost` socket.
+If Ollama is already running, make sure it listens on `0.0.0.0:11434` instead of `127.0.0.1:11434`.
+For example:
+
+```console
+$ OLLAMA_HOST=0.0.0.0:11434 ollama serve
+```
+
+If Ollama only binds loopback, NemoClaw can detect it on the host but the sandbox-side validation step fails because containers cannot reach it.
+
 ## Experimental Local Providers
 
 The following local providers require `NEMOCLAW_EXPERIMENTAL=1`:
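
The bind requirement the patch documents can be sanity-checked on the host before onboarding. A minimal sketch, assuming only Ollama's documented `OLLAMA_HOST` env var and its default loopback bind of `127.0.0.1:11434`; the script and its messages are hypothetical, not part of NemoClaw:

```shell
#!/bin/sh
# Hypothetical pre-flight check: does the configured OLLAMA_HOST request an
# all-interfaces bind that Docker sandboxes can reach?
host="${OLLAMA_HOST:-127.0.0.1:11434}"   # Ollama's default is loopback-only
case "$host" in
  0.0.0.0:*)
    echo "bind ok: $host (reachable from sandboxes)" ;;
  *)
    echo "loopback-only bind: $host (sandbox-side validation will fail)" ;;
esac
```

Running this with `OLLAMA_HOST` unset reports the loopback default, matching the failure mode described in the added docs; exporting `OLLAMA_HOST=0.0.0.0:11434` before `ollama serve` clears it.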