v0.1.0
Release
Llama Stack launched a stable release (v0.1.0). We have updated the Kotlin client SDK to be compatible with it, supporting local and remote inference in Android apps. This enables developers to build RAG applications and agents that use tools and safety shields, perform image reasoning, monitor those agents with telemetry, and evaluate them with scoring functions.
Key Features of this release
- API support for the v0.1.0 Llama Stack server
- Remote inference (see the sketch after this list)
  - Agentic model inference
  - Agentic tool calling
  - Image reasoning with vision models
- Local inference
  - Local inference with streaming capabilities
- Sample Llama Stack applications
  - Android
- Bugfixes
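For context, below is a minimal sketch of remote inference from an Android app through the Kotlin client. The package path (`com.llama.llamastack`), class and builder names (`LlamaStackClientOkHttpClient`, `InferenceChatCompletionParams`, `UserMessage`), server URL, and model id are illustrative assumptions; refer to the Kotlin SDK README and the sample Android app for the exact API.

```kotlin
// Sketch only: names below are assumptions based on the SDK's typical shape;
// verify against the llama-stack-client-kotlin documentation.
import com.llama.llamastack.client.okhttp.LlamaStackClientOkHttpClient
import com.llama.llamastack.models.InferenceChatCompletionParams
import com.llama.llamastack.models.UserMessage

fun main() {
    // Point the client at a running Llama Stack server (remote inference).
    val client = LlamaStackClientOkHttpClient.builder()
        .baseUrl("http://localhost:5050") // assumed server address
        .build()

    // Single-turn chat completion against a model served by the stack.
    val result = client.inference().chatCompletion(
        InferenceChatCompletionParams.builder()
            .modelId("meta-llama/Llama-3.2-3B-Instruct") // assumed model id
            .messages(
                listOf(
                    InferenceChatCompletionParams.Message.ofUserMessage(
                        UserMessage.builder()
                            .content(UserMessage.Content.ofString("What is Llama Stack?"))
                            .role(UserMessage.Role.USER)
                            .build()
                    )
                )
            )
            .build()
    )

    println(result)
}
```

Local, on-device inference is expected to go through the same `inference()` interface via the SDK's local client; the sample Android application bundled with this release demonstrates both paths, including streaming.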
Stay tuned for future releases and updates!
Contributors
In alphabetical order: @ashwinb, @cmodi-meta, @dineshyv, @Riandy, @WuhanMonkey, @yanxi0830.