This repository contains the files and instructions needed to run the Falcon 7B LLM with LangChain and interact with it through a Chainlit chat user interface. Follow the steps below to set up and run the chat UI.
Prerequisites:

- Python 3.10 or higher
- Operating system: macOS or Linux
Setup:

1. Fork this repository or create a codespace in GitHub.
2. Install the required Python packages by running `pip install -r requirements.txt` in your terminal.
3. Create a `.env` file in the project directory. You can use the `example.env` file as a reference. Add your Hugging Face API token to the `.env` file in the following format: `HUGGINGFACEHUB_API_TOKEN=your_huggingface_token`
4. Run `chainlit run app.py -w` in your terminal to start the chat UI (the `-w` flag enables auto-reload while you edit the code). This will launch the chat UI, allowing you to interact with the Falcon LLM model through LangChain.
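For orientation, here is a minimal sketch of what `app.py` typically wires together in this kind of setup: a LangChain `LLMChain` backed by the hosted Falcon 7B Instruct model, exposed through Chainlit's message hooks. This is an illustration under assumptions (the classic pre-0.1 LangChain imports, a recent Chainlit release, and the `tiiuae/falcon-7b-instruct` repo id), not necessarily the exact code in this repository.

```python
# app.py: a minimal sketch, not necessarily this repository's exact code.
import os

import chainlit as cl
from dotenv import load_dotenv
from langchain import HuggingFaceHub, LLMChain, PromptTemplate

load_dotenv()  # read HUGGINGFACEHUB_API_TOKEN from the .env file

template = """Question: {question}

Answer: Let's think step by step."""


@cl.on_chat_start
async def start():
    # Hosted inference via the Hugging Face Hub; the repo id is an assumption.
    llm = HuggingFaceHub(
        repo_id="tiiuae/falcon-7b-instruct",
        huggingfacehub_api_token=os.environ["HUGGINGFACEHUB_API_TOKEN"],
        model_kwargs={"temperature": 0.6, "max_new_tokens": 500},
    )
    prompt = PromptTemplate(template=template, input_variables=["question"])
    chain = LLMChain(llm=llm, prompt=prompt)
    # Keep one chain per browser session.
    cl.user_session.set("chain", chain)


@cl.on_message
async def main(message: cl.Message):
    chain = cl.user_session.get("chain")
    # chain.run is synchronous; cl.make_async keeps the event loop responsive.
    answer = await cl.make_async(chain.run)(message.content)
    await cl.Message(content=answer).send()
```

Once the token is in place, `chainlit run app.py -w` serves the handlers in this file in your browser (on port 8000 by default).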
Note: Ensure that you have provided a valid Hugging Face API token in the `.env` file, as mentioned in step 3. Without a valid token, the chat UI will not function properly.
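If you want to confirm the token is valid before launching the UI, a quick check with the `huggingface_hub` client can help. This is an optional, hypothetical helper (not part of the repository); it assumes `huggingface_hub` and `python-dotenv` are available, which they typically are as dependencies of LangChain and Chainlit.

```python
# check_token.py: optional, hypothetical helper to verify the token in .env.
import os

from dotenv import load_dotenv
from huggingface_hub import HfApi

load_dotenv()  # pick up HUGGINGFACEHUB_API_TOKEN from the .env file

# whoami() raises an error if the token is missing or invalid.
info = HfApi().whoami(token=os.environ["HUGGINGFACEHUB_API_TOKEN"])
print(f"Token OK, authenticated as: {info['name']}")
```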
If you encounter any issues or have questions, please reach out to me on Twitter.
Enjoy using Falcon LLM with LangChain!