From ff1b3c344c7890f56e583947044c62a1e08b28bd Mon Sep 17 00:00:00 2001
From: ori346
Date: Sun, 8 Jun 2025 10:08:29 +0300
Subject: [PATCH] docs: update README with separate instructions for macOS and
 Fedora users

---
 docs/local_setup_guide.md | 15 ++++++++++++---
 1 file changed, 12 insertions(+), 3 deletions(-)

diff --git a/docs/local_setup_guide.md b/docs/local_setup_guide.md
index 89fbd3a..fa24c52 100644
--- a/docs/local_setup_guide.md
+++ b/docs/local_setup_guide.md
@@ -63,10 +63,10 @@ Create a local directory for persistent data:
 mkdir -p ~/.llama
 ```
 
-Run the container:
-
+### Running the Container
+For macOS users:
 ```bash
-podman run -it \
+podman run --rm -it \
   -p $LLAMA_STACK_PORT:$LLAMA_STACK_PORT \
   -v ~/.llama:/root/.llama \
   --env INFERENCE_MODEL=$INFERENCE_MODEL \
@@ -84,6 +84,15 @@ podman run --privileged --network llama-net -it \
   llamastack/distribution-ollama \
   --port $LLAMA_STACK_PORT
 ```
+For Fedora users:
+```bash
+podman run --rm --privileged --network host -it \
+  -p $LLAMA_STACK_PORT:$LLAMA_STACK_PORT \
+  --env INFERENCE_MODEL=$INFERENCE_MODEL \
+  --env OLLAMA_URL=http://localhost:11434 \
+  llamastack/distribution-ollama \
+  --port $LLAMA_STACK_PORT
+```
 
 Verify the container is running:
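
Note: the `podman run` commands in this patch rely on `LLAMA_STACK_PORT`, `INFERENCE_MODEL`, and `OLLAMA_URL` being exported beforehand, presumably earlier in `docs/local_setup_guide.md`. A minimal sketch of that setup, with placeholder values (the port `8321` and model tag `llama3.2:3b` are illustrative assumptions, not values taken from the guide):

```shell
# Placeholder values -- substitute whatever the setup guide actually uses.
export LLAMA_STACK_PORT=8321
export INFERENCE_MODEL="llama3.2:3b"
export OLLAMA_URL="http://localhost:11434"

# The container commands in the patch expand these variables:
echo "port=${LLAMA_STACK_PORT} model=${INFERENCE_MODEL} ollama=${OLLAMA_URL}"
```

Without these exports, `$LLAMA_STACK_PORT` expands to an empty string and the `-p` and `--port` options in the patched commands fail.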