
Commit b5c503a

add config LLamaSharp tutorial in docs

1 parent 1d5a0a3 commit b5c503a

11 files changed: +100 −1 lines changed

.gitignore (+2 −1)

```diff
@@ -292,4 +292,5 @@ XMLs
 logs
 wwwroot
 appsettings.Production.json
-*.csproj.user
+*.csproj.user
+env/
```

docs/index.rst (+9)

```diff
@@ -96,6 +96,15 @@ The main documentation for the site is organized into the following sections:
    llm/few-shot-learning
    llm/provider
 
+.. _llamasharp:
+
+.. toctree::
+   :maxdepth: 2
+   :caption: Use Local LLM Models
+
+   llama-sharp/config-llamasharp
+   llama-sharp/use-llamasharp-in-ui
+
 .. _architecture-docs:
 
 .. toctree::
```
Binary image files changed (446 KB and 451 KB); previews not shown.

docs/llama-sharp/config-llamasharp.md (new file, +60 lines)

# Config LLamaSharp

BotSharp includes a LLamaSharp plugin that lets you run local LLM models. To use LLamaSharp, configure the BotSharp project in a few steps.
## Install LLamaSharp Backend

Before using the LLamaSharp plugin, install the LLamaSharp backend package that suits your environment:

- [`LLamaSharp.Backend.Cpu`](https://www.nuget.org/packages/LLamaSharp.Backend.Cpu): pure CPU for Windows & Linux; Metal for Mac.
- [`LLamaSharp.Backend.Cuda11`](https://www.nuget.org/packages/LLamaSharp.Backend.Cuda11): CUDA 11 for Windows and Linux.
- [`LLamaSharp.Backend.Cuda12`](https://www.nuget.org/packages/LLamaSharp.Backend.Cuda12): CUDA 12 for Windows and Linux.

**Please install the same version of the LLamaSharp backend as the LLamaSharp package referenced in `BotSharp.Plugin.LLamaSharp.csproj`.**

![Check LLamaSharp Version](assets/check-llamasharp-version.png)
```shell
# Move to the LLamaSharp plugin project
$ cd src/Plugins/BotSharp.Plugin.LLamaSharp
# Install the LLamaSharp backend
$ dotnet add package LLamaSharp.Backend.Cpu --version 0.9.1
```
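The version-match requirement above can be checked mechanically by comparing the two `PackageReference` versions in the project file. The sketch below is an illustration only: it runs against a minimal stand-in `.csproj` it creates itself; in the real repository you would point `PROJ` at `src/Plugins/BotSharp.Plugin.LLamaSharp/BotSharp.Plugin.LLamaSharp.csproj` instead.

```shell
# Stand-in for BotSharp.Plugin.LLamaSharp.csproj; in the real repo, set PROJ
# to the actual plugin project file instead of generating a temp file.
PROJ=$(mktemp)
cat > "$PROJ" <<'EOF'
<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="LLamaSharp" Version="0.9.1" />
    <PackageReference Include="LLamaSharp.Backend.Cpu" Version="0.9.1" />
  </ItemGroup>
</Project>
EOF

# Extract the pinned LLamaSharp version and the backend version, then compare.
core=$(sed -n 's/.*Include="LLamaSharp" Version="\([^"]*\)".*/\1/p' "$PROJ")
backend=$(sed -n 's/.*Include="LLamaSharp\.Backend[^"]*" Version="\([^"]*\)".*/\1/p' "$PROJ")
echo "LLamaSharp: $core, backend: $backend"
if [ "$core" = "$backend" ]; then
  echo "versions match"
else
  echo "version mismatch" >&2
fi
```

If the two versions differ, reinstall the backend with `--version` set to the LLamaSharp version pinned in the project file.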
## Download and Config Local LLM Models

LLamaSharp supports many LLM models such as LLaMA and Alpaca. Download models in `gguf` format and save them on your machine.

We will use a [Llama 2](https://huggingface.co/TheBloke/llama-2-7B-Guanaco-QLoRA-GGUF) model in this tutorial.

After downloading the model, open the `src/WebStarter/appsettings.json` file to configure the LLamaSharp models. Set the `LlmProviders` and `LlamaSharp` fields to values appropriate for your machine. For example:
```json
{
  ...,
  "LlmProviders": [
    ...,
    {
      "Provider": "llama-sharp",
      "Models": [
        {
          "Name": "llama-2-7b.Q2_K.gguf",
          "Type": "chat"
        }
      ]
    },
    ...
  ],
  ...,
  "LlamaSharp": {
    "Interactive": true,
    "ModelDir": "/Users/wenwei/Desktop/LLM",
    "DefaultModel": "llama-2-7b.Q2_K.gguf",
    "MaxContextLength": 1024,
    "NumberOfGpuLayer": 20
  },
  ...
}
```
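Model loading fails at startup if the file named by `DefaultModel` cannot be found under `ModelDir`, so it is worth verifying the paths before launching WebStarter. A minimal shell sketch follows; the directory and file it creates are stand-ins mirroring the example settings above, so the snippet is self-contained — substitute your real `ModelDir` and `DefaultModel` values and skip the `mkdir`/`touch` step when checking an actual download.

```shell
# Values mirroring the example appsettings.json above; replace with your own.
MODEL_DIR="$HOME/llm-models-demo"      # stands in for "ModelDir"
DEFAULT_MODEL="llama-2-7b.Q2_K.gguf"   # stands in for "DefaultModel"

# Create a stand-in model file so this sketch runs anywhere;
# omit these two lines when checking a real downloaded model.
mkdir -p "$MODEL_DIR"
touch "$MODEL_DIR/$DEFAULT_MODEL"

# The actual check: the configured default model must exist in ModelDir.
if [ -f "$MODEL_DIR/$DEFAULT_MODEL" ]; then
  echo "model found: $MODEL_DIR/$DEFAULT_MODEL"
else
  echo "model missing: $MODEL_DIR/$DEFAULT_MODEL" >&2
fi
```

`MaxContextLength` and `NumberOfGpuLayer` should also be tuned to your hardware; offloading fewer layers to the GPU reduces VRAM usage at the cost of speed.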
For more details about LLamaSharp, visit [LLamaSharp - GitHub](https://github.com/SciSharp/LLamaSharp).
docs/llama-sharp/use-llamasharp-in-ui.md (new file, +29 lines)

# Use LLamaSharp in BotSharp

Start the BotSharp backend and frontend services, then follow this tutorial.

## Install the LLamaSharp Plugin in the UI

Go to the Plugin page and install the LLamaSharp plugin.

![Install LlamaSharp Plugin](assets/install-llamasharp-plugin.png)

## Config LLamaSharp as the LLM Provider for Agents

Edit or create an agent on the Agents page.

![Edit Agent](assets/edit-agent.png)

On the edit page, set the provider to `llama-sharp`.

![Choose LLamaSharp as Provider](assets/choose-llamasharp-as-provider.png)

Then test the agent.

![Click Test Agent Button](assets/click-test-button.png)

![Test Agent Example](assets/converstaion-examples.png)

If it runs successfully, you will see logs like this in the BotSharp service's console.

![Console Output](assets/console-output-in-botsharp.png)
