diff --git a/docs/local_setup_guide.md b/docs/local_setup_guide.md
index 89fbd3a..fa24c52 100644
--- a/docs/local_setup_guide.md
+++ b/docs/local_setup_guide.md
@@ -63,10 +63,10 @@ Create a local directory for persistent data:
 mkdir -p ~/.llama
 ```
-Run the container:
-
+### Running the Container
+For macOS users:
 ```bash
-podman run -it \
+podman run --rm -it \
   -p $LLAMA_STACK_PORT:$LLAMA_STACK_PORT \
   -v ~/.llama:/root/.llama \
   --env INFERENCE_MODEL=$INFERENCE_MODEL \
@@ -84,6 +84,14 @@ podman run --privileged --network llama-net -it \
   llamastack/distribution-ollama \
   --port $LLAMA_STACK_PORT
 ```
+For Fedora users:
+```bash
+podman run --rm --privileged --network host -it \
+  --env INFERENCE_MODEL=$INFERENCE_MODEL \
+  --env OLLAMA_URL=http://localhost:11434 \
+  llamastack/distribution-ollama \
+  --port $LLAMA_STACK_PORT
+```
 Verify the container is running: