don't check ollama connection on linux
Signed-off-by: Paul S. Schweigert <[email protected]>

Configuring docker on linux to connect to ollama requires making edits
in the bee-api container's /etc/hosts file once all pods are up and
running. As a result, for linux users the `configure_ollama` command
will always fail.

This PR skips the ollama connection check for linux users.
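For reference, the manual workaround on linux looks roughly like the sketch below. It is only an illustration: the container name (bee-api) and the gateway address (172.18.0.1) are assumptions that depend on the local compose setup; the troubleshooting doc linked in the diff below describes the supported steps.

    # assumed container name; check `docker ps` for the actual one, and adjust the gateway IP
    docker exec -u root bee-api sh -c 'echo "172.18.0.1 host.docker.internal" >> /etc/hosts'
    # optional sanity check, assuming curl is available in the image
    docker exec bee-api curl -s http://host.docker.internal:11434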
psschwei committed Jan 17, 2025
1 parent 0f3abc6 commit b2f76f2
Showing 1 changed file with 21 additions and 13 deletions.
bee-stack.sh: 34 changes (21 additions & 13 deletions)
@@ -68,8 +68,8 @@ check_docker() {
  local runtime compose_version major minor req_major req_minor
  req_major=$(cut -d'.' -f1 <<< ${REQUIRED_COMPOSE_VERSION})
  req_minor=$(cut -d'.' -f2 <<< ${REQUIRED_COMPOSE_VERSION})

  local existing_runtimes=()
  for runtime in docker podman; do
    command -v "$runtime" &>/dev/null && existing_runtimes+=("$runtime")
@@ -80,18 +80,18 @@ check_docker() {
    printf "\n${MISSING_COMPOSE_MSG}"
    exit 1
  fi

  local runtimes_with_compose=()
  for runtime in "${existing_runtimes[@]}"; do
    "$runtime" compose version --short &>/dev/null && runtimes_with_compose+=("$runtime")
  done

  if [ ${#runtimes_with_compose[@]} -eq 0 ]; then
    print_error "Compose extension is not installed for any of the existing runtimes: ${existing_runtimes[*]}"
    printf "\n${MISSING_COMPOSE_MSG}"
    exit 2
  fi

  local compose_versions=()
  local compose_version_ok=0
  for runtime in "${runtimes_with_compose[@]}"; do
@@ -158,14 +158,22 @@ configure_watsonx() {
 configure_ollama() {
   write_backend ollama
   write_env OLLAMA_URL "http://host.docker.internal:11434"
-  print_header "Checking Ollama connection"
-  if ! ${RUNTIME} run --rm -it curlimages/curl "$OLLAMA_URL"; then
-    print_error "Ollama is not running or accessible from containers."
-    printf " Make sure you configured OLLAMA_HOST=0.0.0.0\n"
-    printf " see https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server\n"
-    printf " or run ollama from command line ${BLUE}OLLAMA_HOST=0.0.0.0 ollama serve${NC}\n"
-    printf " Do not forget to pull the required LLMs ${BLUE}ollama pull llama3.1${NC}\n"
-    exit 2
+  # configuring docker internal host on linux needs to be done after pods deploy
+  # so this check will always fail for linux users
+  if [[ "$OSTYPE" != "linux-gnu" ]]; then
+    print_header "Checking Ollama connection"
+    if ! ${RUNTIME} run --rm -it curlimages/curl "$OLLAMA_URL"; then
+      print_error "Ollama is not running or accessible from containers."
+      printf " Make sure you configured OLLAMA_HOST=0.0.0.0\n"
+      printf " see https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server\n"
+      printf " or run ollama from command line ${BLUE}OLLAMA_HOST=0.0.0.0 ollama serve${NC}\n"
+      printf " Do not forget to pull the required LLMs ${BLUE}ollama pull llama3.1${NC}\n"
+      exit 2
+    fi
+  else
+    print_header "Configure Ollama on Linux after launch"
+    printf " https://github.com/i-am-bee/bee-stack/blob/main/docs/troubleshooting.md#connecting-to-ollama-on-linux\n"
+    printf " Do not forget to pull the required LLMs ${BLUE}ollama pull llama3.1${NC}\n"
   fi
 }

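Note for linux users: the connectivity probe that the script now skips can still be run by hand once the host mapping is in place. One option, not used by bee-stack.sh itself, is to let docker inject the mapping via --add-host=host.docker.internal:host-gateway (requires a reasonably recent docker):

    # manual equivalent of the check in configure_ollama, assuming docker as the runtime
    docker run --rm -it --add-host=host.docker.internal:host-gateway curlimages/curl http://host.docker.internal:11434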
