enso-labs/llm-server

🚨 DEPRECATION NOTICE

⚠️ This repository is no longer maintained.
As of June 10, 2025, this project has been officially deprecated and archived. It will no longer receive updates, security patches, or support from the maintainers.

👉 Recommended Action:

We recommend transitioning to Ensō Orchestra, which serves as the actively maintained and improved successor to this project.


🤖 Prompt Engineers AI - LLM Server

Full LLM REST API with prompts, LLMs, Vector Databases, and Agents

📖 Table of Contents

πŸ› οΈ Setup Services

### Setup Docker Services
docker-compose up --build
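The repository's own docker-compose.yml is the source of truth for these services. Purely as an illustration, the backing services implied by the Environment Variables table below (MySQL on 3306, Redis on 6379, MinIO on 9000) could be declared roughly like this; every image tag, service name, and credential here is an assumption, not the project's actual file:

```yaml
# Hypothetical sketch only -- the real docker-compose.yml ships with the repo.
# Ports and credentials mirror the examples in the Environment Variables table.
services:
  mysql:
    image: mysql:8
    environment:
      MYSQL_DATABASE: llm_server
      MYSQL_USER: admin
      MYSQL_PASSWORD: password
      MYSQL_ROOT_PASSWORD: password
    ports:
      - "3306:3306"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
  minio:
    image: minio/minio
    command: server /data
    ports:
      - "9000:9000"
```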

πŸ› οΈ Setup Server

Before running the server, copy the example environment file with cp .example.env .env and fill in the values; see Environment Variables below.

### Change into Backend directory
cd backend

### Setup Virtual Env
python3 -m venv .venv

### Activate Virtual Env
source .venv/bin/activate

### Install Runtime & Dev Dependencies
pip install -r requirements.txt -r requirements-dev.txt -c constraints.txt

### Install Runtime Dependencies
pip install -r requirements.txt -c constraints.txt

### Migrate Database Schema
alembic upgrade head

### Seed Database Users
python3 -m src.seeds.users 3

### Run Application on local machine
bash scripts/dev.sh

πŸ› οΈ Setup Client

### Change into Frontend directory
cd frontend

### Install node_modules
npm install

### Start Development Server
npm run dev

Environment Variables

| Variable Name | Example | Description |
| --- | --- | --- |
| APP_ENV | development | Environment where the application is running |
| APP_VERSION | 0.0.1 | Version of the application |
| APP_SECRET | this-is-top-secret | Secret key for the application |
| APP_WORKERS | 1 | Number of application workers |
| APP_ADMIN_EMAIL | [email protected] | Admin email for the application |
| APP_ADMIN_PASS | test1234 | Admin password for the application |
| TEST_USER_ID | 0000000000000000000000000 | Test user ID |
| DATABASE_URL | mysql+aiomysql://admin:password@localhost:3306/llm_server | URL for the database |
| PINECONE_API_KEY | | API key for Pinecone services |
| PINECONE_ENV | us-east1-gcp | Pinecone environment configuration |
| PINECONE_INDEX | default | Default Pinecone index used |
| REDIS_URL | redis://localhost:6379 | URL for the Redis service |
| OPENAI_API_KEY | sk-abc123... | Default LLM OpenAI key |
| GROQ_API_KEY | | API key for accessing Groq services |
| ANTHROPIC_API_KEY | | API key for accessing Anthropic services |
| OLLAMA_BASE_URL | http://localhost:11434 | Base URL for the Ollama service |
| SEARX_SEARCH_HOST_URL | http://localhost:8080 | URL for the Searx search service |
| MINIO_HOST | localhost:9000 | URL of the object storage service |
| BUCKET | my-documents | Name of the MinIO or S3 bucket |
| S3_REGION | us-east-1 | Region where the S3 bucket exists |
| ACCESS_KEY_ID | AKIAIOSFODNN7EXAMPLE | IAM user access key ID (optional) |
| ACCESS_SECRET_KEY | wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY | IAM secret access key (optional) |
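Using the example values from the table above, a filled-in .env might look like the following. These are the table's sample values, not working credentials; fill in your own keys for the providers you use:

```shell
APP_ENV=development
APP_VERSION=0.0.1
APP_SECRET=this-is-top-secret
APP_WORKERS=1
[email protected]
APP_ADMIN_PASS=test1234
DATABASE_URL=mysql+aiomysql://admin:password@localhost:3306/llm_server
REDIS_URL=redis://localhost:6379
OPENAI_API_KEY=sk-abc123...
OLLAMA_BASE_URL=http://localhost:11434
MINIO_HOST=localhost:9000
BUCKET=my-documents
S3_REGION=us-east-1
```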

🚀 Roadmap

Here are the upcoming features I ([email protected]) am excited to bring to Prompt Engineers AI - LLM Server (more to come):

  • 🤖 Foundation Model Providers Supported (OpenAI, Anthropic, Ollama, Groq; Google coming soon)
  • 📸 Multi-Modal Model Generation
  • 📑 Retrieval Augmented Generation (RAG)
  • 🛠 UI-Based Tool Configuration
  • 🖥 Code Interpreter
  • 🌑 Dark Mode
  • 🎨 Configure Custom Themes and Logos
  • 🤖 Assistant Creation Capability

Create an issue and let's start a discussion if you'd like to see a feature added to the roadmap.

🤝 How to Contribute

We welcome contributions from the community, from beginners to seasoned developers. Here's how you can contribute:

  1. Fork the repository: Click on the 'Fork' button at the top right corner of the repository page on GitHub.

  2. Clone the forked repository to your local machine: git clone <forked_repo_link>.

  3. Navigate to the project folder: cd llm-server.

  4. Create a new branch for your changes: git checkout -b <branch_name>.

  5. Make your changes in the new branch.

  6. Commit your changes: git commit -am 'Add some feature'.

  7. Push to the branch: git push origin <branch_name>.

  8. Open a Pull Request: Go back to your forked repository on GitHub and click on 'Compare & pull request' to create a new pull request.

Please ensure that your code passes all the tests and, if possible, add tests for new features. Always write clear and concise commit messages and pull request descriptions.

💡 Issues

Feel free to submit issues and enhancement requests. We're always looking for feedback and suggestions.

🤓 Maintainers

📜 License

This project is open-source, under the MIT License. Feel free to use, modify, and distribute the code as you please.

Happy Prompting! 🎉🎉

About

🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
