Run local LLMs on iGPU, APU and CPU (AMD, Intel, and Qualcomm (Coming Soon)). Easiest way to launch an OpenAI API Compatible Server on Windows, Linux and MacOS.
| Support matrix | Supported now | Under Development | On the roadmap |
3. Example code to connect to the API server can be found in `scripts/python`. **Note:** To find out more about the supported arguments, run `ellm_server --help`.
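As a minimal sketch of what such a client script looks like, the snippet below calls the server's OpenAI-compatible `/v1/chat/completions` route using only the standard library. The base URL, port, and model name are placeholders, not project defaults; substitute whatever you passed when launching `ellm_server`.

```python
# Minimal sketch of querying an ellm_server OpenAI-compatible endpoint.
# The URL, port, and model name are illustrative placeholders.
import json
import urllib.request

def build_chat_request(model, messages, max_tokens=128):
    """Assemble the JSON body for a /v1/chat/completions call."""
    return {"model": model, "messages": messages, "max_tokens": max_tokens}

def chat(base_url, model, prompt):
    """Send one user message and return the assistant's reply text."""
    body = build_chat_request(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Usage (assumes a server is already running on the given port):
# print(chat("http://localhost:5555", "<model-name>", "Hello!"))
```

The official client libraries for the OpenAI API can also be pointed at the server by overriding their base URL; the raw-HTTP version above just makes the request shape explicit.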
### Launch Chatbot Web UI
1. `ellm_chatbot --port 7788 --host localhost --server_port <ellm_server_port> --server_host localhost`. **Note:** To find out more about the supported arguments, run `ellm_chatbot --help`.
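If you launch the web UI from a script rather than a shell, the command line above can be assembled programmatically, which keeps the UI port and the server port in one place. The port numbers below are illustrative, not defaults.

```python
# Sketch: build the ellm_chatbot argv list mirroring the flags shown above,
# so the UI port and the ellm_server port are defined in one place.
# Port values are illustrative.
import subprocess

def chatbot_cmd(ui_port, server_port, host="localhost"):
    """Return the argv list for launching the chatbot web UI."""
    return [
        "ellm_chatbot",
        "--port", str(ui_port),
        "--host", host,
        "--server_port", str(server_port),
        "--server_host", host,
    ]

# Usage (requires ellm_chatbot on PATH and a running ellm_server):
# subprocess.Popen(chatbot_cmd(7788, 5555))
```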
### Launch Model Management UI
It is an interface that allows you to download and deploy an OpenAI API compatible server. The UI also shows the disk space required to download each model.
1. `ellm_modelui --port 6678`. **Note:** To find out more about the supported arguments, run `ellm_modelui --help`.
## Compile OpenAI-API Compatible Server into Windows Executable
3. Compile Windows Executable: `pyinstaller .\ellm_api_server.spec`.
4. You can find the executable in the `dist\ellm_api_server` directory.
## Acknowledgements
- Excellent open-source projects: [vLLM](https://github.com/vllm-project/vllm.git), [onnxruntime-genai](https://github.com/microsoft/onnxruntime-genai.git), [Ipex-LLM](https://github.com/intel-analytics/ipex-llm/tree/main) and many others.