Inspired by nulzo - Ollama WebUI
BS-LLM-WebUI is a web application with a frontend built using React and Vite, and a backend powered by Django with a Poetry-managed virtual environment.
- Node.js: Make sure you have Node.js installed to manage frontend dependencies.
- Python: Ensure that Python 3.11+ is installed for running the Django backend.
- Poetry: Make sure you have Poetry installed to manage Python dependencies.
- Ollama: Ensure you have Ollama installed on your machine or on a machine on your local network.
- Check that each folder (frontend, backend, nginx) has its .env file created. Look at the sample.env files for examples.
- `VITE_APP_BACKEND_API_URL`: The URL the frontend will use to access the backend.
  - If running locally: `http://127.0.0.1:8000/api/v1/`
  - If running with Docker Compose: `/api/v1/`
- `ALLOWED_HOSTS`: The allowed hosts list for Django.
- `OLLAMA_ENDPOINT`: The endpoint to access Ollama. If you are using Docker on the same system that is running Ollama, you may need to use `http://host.docker.internal:11434`. Otherwise, it can be the IP address of the device hosting Ollama on your local network.
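Putting the variables above together, a minimal pair of `.env` files for local development might look like the following sketch. All values are illustrative placeholders (including the comma-separated `ALLOWED_HOSTS` format); defer to the `sample.env` files in each folder for the authoritative template.

```env
# frontend/.env — local development (illustrative values)
VITE_APP_BACKEND_API_URL=http://127.0.0.1:8000/api/v1/

# backend/.env — local development (illustrative values)
ALLOWED_HOSTS=localhost,127.0.0.1
OLLAMA_ENDPOINT=http://127.0.0.1:11434
```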
- `OPENAI_API_KEY`: API key to interact with the OpenAI APIs.
- `ANTHROPIC_API_KEY`: API key to interact with the Anthropic APIs.
- `GEMINI_API_KEY`: API key to interact with the Google AI APIs.
- `AZURE_OPENAI_ENDPOINT`: Endpoint for the Azure OpenAI resource.
- `AZURE_OPENAI_API_KEY`: API key for the Azure OpenAI resource.
- `AZURE_OPENAI_VERSION`: API version of the Azure OpenAI resource.

Note: Azure OpenAI is not fully supported yet.
- `DJANGO_SUPERUSER_USERNAME`: Username of the Django superuser.
- `DJANGO_SUPERUSER_PASSWORD`: Password of the Django superuser.
- `DJANGO_SUPERUSER_EMAIL`: Email of the Django superuser.
- `OLLAMA_ENDPOINT`: The endpoint to access Ollama. If you are using Docker on the same system that is running Ollama, you may need to use `http://host.docker.internal:11434`. Otherwise, it can be the IP address of the device hosting Ollama on your local network.
- `VITE_APP_BACKEND_API_URL`: The URL the frontend will use to access the backend.
  - If running locally: `http://127.0.0.1:8000/api/v1/`
  - If running with Docker Compose: `/api/v1/`
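For a Docker Compose deployment, the backend `.env` would also carry the superuser and provider-key variables. The sketch below uses placeholder values only (the real keys and credentials are yours to supply); check `backend/sample.env` for the authoritative set.

```env
# backend/.env — Docker Compose deployment (placeholder values)
DJANGO_SUPERUSER_USERNAME=admin
DJANGO_SUPERUSER_PASSWORD=change-me
DJANGO_SUPERUSER_EMAIL=admin@example.com
OLLAMA_ENDPOINT=http://host.docker.internal:11434

# frontend/.env — Docker Compose deployment
VITE_APP_BACKEND_API_URL=/api/v1/
```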
```shell
docker compose up --build -d
```

The app will then be accessible at http://127.0.0.1:8008 (or the IP of the device running the app on your local network).
```shell
cd frontend
npm install
make run
make tailw
```

This will start the Vite development server, and you should be able to access the frontend at http://localhost:5173 by default.
```shell
cd backend
poetry install
poetry run python manage.py migrate
poetry run python manage.py createsuperuser
poetry run python manage.py populate_models
poetry run python manage.py runserver
```

This will start the Django development server, which will be accessible at http://localhost:8000 by default.
To run the application, start both the frontend and backend servers as described above:
- Start the Vite development server in the frontend directory.
- Start the Django development server in the backend directory.
The frontend React application will communicate with the Django backend via API calls.
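The trailing slash on `VITE_APP_BACKEND_API_URL` matters when relative endpoint paths are resolved against it. As an illustration using standard URL resolution (not code from this repo; `models/` is a hypothetical endpoint path):

```python
from urllib.parse import urljoin

# With a trailing slash on the base, relative paths append under /api/v1/.
print(urljoin("http://127.0.0.1:8000/api/v1/", "models/"))
# http://127.0.0.1:8000/api/v1/models/

# Without the trailing slash, the last path segment ("v1") is replaced.
print(urljoin("http://127.0.0.1:8000/api/v1", "models/"))
# http://127.0.0.1:8000/api/models/
```

This is why the sample values above end in a slash; dropping it can silently route requests to the wrong path.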
You can then access the site at http://localhost:5173 (or the IP of the device on the local network).
This project is licensed under the MIT License. See the LICENSE file for details.