This project is a chatbot built using the Llama 2 model via the mlc_chat library. It fetches messages from a server, generates responses, and posts the responses back to the server.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
- Python 3.6 or higher
- requests library
- psutil library
- difflib library
- mlc_chat library
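Assuming a standard layout, the list above would translate into a `requirements.txt` roughly like the following. Note that `difflib` ships with Python's standard library and needs no entry, and the pip package names here are assumed to match the import names, which may not hold for mlc_chat (it has historically been distributed as prebuilt wheels from MLC's own package index rather than plain PyPI):

```
requests
psutil
mlc_chat
```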
- Clone the repository
- Install the required libraries using pip:
```
pip install -r requirements.txt
```

To run the chatbot, execute the `sheepGPT.py` script:
```
python sheepGPT.py
```

The chatbot will start fetching messages from the server, generating responses, and posting them back to the server.
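The fetch/generate/post cycle that `sheepGPT.py` runs can be sketched as below. This is an illustrative guess at the structure, not the script's actual code: the endpoint paths (`/messages`, `/responses`), payload shapes, and helper names are all assumptions.

```python
# Sketch of the chatbot's main loop: fetch messages, generate a reply
# for each, post the replies back. The handlers are injected so the loop
# itself stays independent of the HTTP and model details.

def respond_to_messages(fetch, generate, post):
    """Handle one batch of messages; returns how many were processed.

    fetch    -- callable returning a list of {"id": ..., "text": ...} dicts
    generate -- callable mapping a prompt string to a response string
                (in the real script this would wrap mlc_chat's chat module)
    post     -- callable taking (message_id, response_text)
    """
    messages = fetch()
    for msg in messages:
        reply = generate(msg["text"])
        post(msg["id"], reply)
    return len(messages)


def make_http_handlers(base_url):
    """Wire the loop to a server over HTTP (hypothetical endpoints)."""
    import requests  # third-party; listed in requirements.txt

    def fetch():
        return requests.get(f"{base_url}/messages", timeout=10).json()

    def post(message_id, text):
        resp = requests.post(
            f"{base_url}/responses",
            json={"id": message_id, "text": text},
            timeout=10,
        )
        resp.raise_for_status()

    return fetch, post
```

Because the loop takes plain callables, it can be exercised without a server or a loaded model, e.g. by passing stub functions for `fetch`, `generate`, and `post`.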
- Your Name
This project is licensed under the MIT License - see the LICENSE.md file for details
- The mlc_chat library for providing the chat module
- The Llama 2 model