
Commit 1068bc3

docker: fix ai dockerfile
Instead of trying to package ollama ourselves, just use the upstream docker image. I've tested that this works and uses the local GPU on my machine.
1 parent dc9ac0a · commit 1068bc3

File tree

1 file changed: +4 -7 lines changed

resources/docker/Dockerfile.ai (+4 -7)
@@ -25,13 +25,11 @@ RUN make TAGS="timetzdata" redpanda-connect-ai
 
 RUN touch /tmp/keep
 
-FROM ubuntu AS download
-
-RUN apt update && apt install -y curl
-RUN curl -fsSL https://ollama.com/install.sh | sh
-
 # Pack
-FROM nvidia/cuda:12.6.0-runtime-ubuntu24.04 AS package
+FROM ollama/ollama AS package
+
+# Override the HOST from the ollama dockerfile
+ENV OLLAMA_HOST=127.0.0.1
 
 LABEL maintainer="Tyler Rockwood <[email protected]>"
 LABEL org.opencontainers.image.source="https://github.com/redpanda-data/connect"
@@ -45,7 +43,6 @@ COPY ./config/docker.yaml /connect.yaml
 
 USER connect
 
-COPY --chown=connect:connect --from=download /usr/local/bin/ollama /usr/local/
 COPY --chown=connect:connect --from=build /tmp/keep /home/connect/.ollama/keep
 
 EXPOSE 4195
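
As a rough sketch of how the image could be built and exercised locally with the GPU (the redpanda-connect-ai tag below is an assumption for illustration, not something defined by this commit or the repo's CI):

# Build the AI image from the repo root; the tag name here is hypothetical.
docker build -f resources/docker/Dockerfile.ai -t redpanda-connect-ai .

# Expose the host GPU to the container (requires the NVIDIA container toolkit).
# OLLAMA_HOST=127.0.0.1 keeps ollama reachable only from inside the container;
# 4195 is the port EXPOSEd by the Dockerfile.
docker run --rm --gpus all -p 4195:4195 redpanda-connect-ai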

0 commit comments
