dockersamples/chatbot-model-runner

 
 


A simple chatbot web application, implemented in Go, Python, Node.js, and Rust, that connects to a local LLM service (llama.cpp) to provide AI-powered responses.

Environment Variables

The application uses the following environment variables defined in the .env file:

  • LLM_BASE_URL: The base URL of the LLM API
  • LLM_MODEL_NAME: The model name to use

To change these settings, simply edit the .env file in the root directory of the project.
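For example, a `.env` file pointing the apps at a local OpenAI-compatible endpoint might look like the fragment below. Both the URL and the model name are illustrative placeholders; use whatever matches your LLM server setup.

```shell
LLM_BASE_URL=http://localhost:12434/engines/v1
LLM_MODEL_NAME=ai/llama3.2
```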

Quick Start

  1. Clone the repository:

    git clone https://github.com/dockersamples/chatbot-model-runner
    cd chatbot-model-runner
  2. Start the application using Docker Compose:

    docker compose up
  3. Open your browser and visit the following links:

    http://localhost:8080 for the GenAI Application in Go

    http://localhost:8081 for the GenAI Application in Python

    http://localhost:8082 for the GenAI Application in Node

    http://localhost:8083 for the GenAI Application in Rust

If you're using a different LLM server configuration, you may need to modify the .env file accordingly.
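Each of the four apps works the same way: it reads the two environment variables above and posts a chat request to the LLM server. A minimal sketch in Python, assuming the server exposes an OpenAI-compatible `/chat/completions` route under `LLM_BASE_URL` (as llama.cpp's server does; the exact route depends on your configuration):

```python
import json
import os
import urllib.request


def build_chat_request(base_url: str, model: str, user_message: str):
    """Build the URL and JSON payload for an OpenAI-style chat completion."""
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, payload


def ask_llm(user_message: str) -> str:
    """Send one message to the configured LLM and return its reply text."""
    base_url = os.environ["LLM_BASE_URL"]   # set in the project's .env file
    model = os.environ["LLM_MODEL_NAME"]    # set in the project's .env file
    url, payload = build_chat_request(base_url, model, user_message)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```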

About

Very simple GenAI application to try the Docker Model Runner

Languages

  • HTML 35.5%
  • JavaScript 19.9%
  • CSS 14.7%
  • Go 11.2%
  • Rust 8.1%
  • Python 6.8%
  • Other 3.8%