- Add comprehensive CLI chat interface section with features and examples
- Include quick start guide and interactive commands overview
- Add example chat session demonstrating the interface
- Update documentation links to include CLI chat documentation
- Position CLI chat as primary interactive method alongside Python client
LocalLab includes a powerful terminal-based chat interface that lets you interact with your AI models directly from the command line. Perfect for quick conversations, testing, and interactive AI sessions.

### Quick Start

```bash
# Connect to local server
locallab chat

# Connect to remote server
locallab chat --url https://your-ngrok-url.app

# Use different generation modes
locallab chat --generate chat  # Conversational mode with context
```

### Features

- **🚀 Multiple Generation Modes**: Stream, Simple, Chat, and Batch processing
- **💬 Rich Terminal UI**: Enhanced markdown rendering and syntax highlighting
- **🔄 Real-time Streaming**: Live response streaming with Server-Sent Events
- **📚 Conversation Management**: History tracking, persistence, and context retention
- **🛠️ Error Handling**: Automatic reconnection and graceful error recovery
- **⚡ Batch Processing**: Process multiple prompts efficiently

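The Quick Start above only shows the conversational `chat` mode. As a rough sketch, assuming the other modes named in the feature list reuse the same `--generate` flag with lowercase mode names (an assumption, not confirmed by this excerpt), they would be selected the same way:

```bash
# Hypothetical mode names inferred from the feature list above;
# check your installation's CLI help for the names it actually accepts.
locallab chat --generate stream   # stream tokens as they are produced
locallab chat --generate simple   # one-shot responses without chat context
locallab chat --generate batch    # process several prompts in one run
```
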
### Interactive Commands

```bash
/help - Show available commands
/history - Display conversation history
/batch - Enter batch processing mode
/save - Save conversation to file
/clear - Clear the screen
/exit - Exit gracefully
```

### Example Session

````bash
$ locallab chat
🚀 LocalLab Chat Interface
✅ Connected to: http://localhost:8000
📊 Server: LocalLab v0.9.0 | Model: qwen-0.5b

You: Hello! Can you help me with Python?

AI: Hello! I'd be happy to help you with Python programming.
What specific topic would you like to explore?

You: Show me how to create a class

AI: Here's how to create a simple class in Python:

```python
class Person:
    def __init__(self, name, age):
        self.name = name
        self.age = age

    def introduce(self):
        return f"Hi, I'm {self.name} and I'm {self.age} years old."

# Usage
person = Person("Alice", 25)
print(person.introduce())
```

You: /exit
👋 Goodbye!
````

> 📖 **Learn More**: See the [CLI Chat Documentation](./docs/cli/chat.md) for complete usage guide and examples.
## 💡 Client Connection & Usage
After starting your LocalLab server (either locally or on Google Colab), you can connect to it in two ways:

1. **CLI Chat Interface** (above) - For interactive terminal conversations
2. **Python Client Package** (below) - For programmatic access in your code

### Synchronous Client Usage (Easier for Beginners)