llama-stack-client-swift brings the inference and agents APIs of [Llama Stack](https://github.com/meta-llama/llama-stack) to iOS.
**Update: January 27, 2025** The llama-stack-client-swift SDK version has been updated to 0.1.0, working with Llama Stack 0.1.0 ([release note](https://github.com/meta-llama/llama-stack/releases/tag/v0.1.0)).
## Features
- **Inference & Agents:** Leverage remote Llama Stack distributions for inference, code execution, and safety.
- **Custom Tool Calling:** Provide Swift tools that Llama agents can understand and use.
## iOS Demos
See [here](https://github.com/meta-llama/llama-stack-apps/tree/main/examples/ios_quick_demo) for a quick iOS demo ([video](https://drive.google.com/file/d/1HnME3VmsYlyeFgsIOMlxZy5c8S2xP4r4/view?usp=sharing)) that uses a remote Llama Stack server for inference.
For a more advanced demo using the Llama Stack Agent API and custom tool calling feature, see the [iOS Calendar Assistant demo](https://github.com/meta-llama/llama-stack-apps/tree/main/examples/ios_calendar_assistant).
## Installation

1. In Xcode, open your project and select File > Add Package Dependencies....
2. Add this repo URL at the top right: `https://github.com/meta-llama/llama-stack-client-swift` and 0.1.0 in the Dependency Rule, then click Add Package.
3. Select and add `llama-stack-client-swift` to your app target.
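
If you manage dependencies in a `Package.swift` manifest instead of the Xcode UI, the entry might look like the sketch below. The package and target names (`MyApp`), the platform version, and the product name `LlamaStackClient` (inferred from the `import LlamaStackClient` statement later in this README) are assumptions, not confirmed by this document:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",  // hypothetical app/package name
    platforms: [.iOS(.v15)],  // illustrative minimum platform
    dependencies: [
        // Pin to the 0.1.0 release line of the Swift SDK
        .package(url: "https://github.com/meta-llama/llama-stack-client-swift", from: "0.1.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                .product(name: "LlamaStackClient", package: "llama-stack-client-swift")
            ]
        )
    ]
)
```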
5. Build and run a remote Llama Stack distribution, for example with the Together template:

```
PYPI_VERSION=0.1.0 llama stack build --template together --image-type conda
export TOGETHER_API_KEY="<your_together_api_key>"
llama stack run together
```
The default port for `llama stack run` is 5000; you can specify a different port by adding `--port <your_port>` to the end of `llama stack run fireworks|together`.
6. Replace the `RemoteInference` URL string below with the host IP and port of the remote Llama Stack distro from Step 5:
```swift
import LlamaStackClient
let inference = RemoteInference(url: URL(string: "http://127.0.0.1:5000")!)
```
Below is an example code snippet to use the Llama Stack inference API. See the iOS Demos above for complete code.
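
A hedged sketch of a streaming chat completion follows. Because the SDK's request and response types are generated from the OpenAPI spec, the exact generated names used here (`Components.Schemas.ChatCompletionRequest`, the `.case1` plain-string content case, the `.text` delta case) are assumptions based on the 0.1.0 spec and may differ in other SDK versions; the model id is only an example. The call must run in an async context:

```swift
import Foundation
import LlamaStackClient

func runInference() async throws {
  let inference = RemoteInference(url: URL(string: "http://127.0.0.1:5000")!)

  // Stream a chat completion and print partial text as it arrives.
  for await chunk in try await inference.chatCompletion(
    request: Components.Schemas.ChatCompletionRequest(
      messages: [
        .user(Components.Schemas.UserMessage(
          content: .case1("Hello Llama!"),  // assumed: plain-string content case
          role: .user))
      ],
      model_id: "meta-llama/Llama-3.1-8B-Instruct",  // example model id
      stream: true)
  ) {
    switch chunk.event.delta {
    case .text(let delta):
      print(delta.text, terminator: "")
    default:
      break  // image and tool-call deltas are ignored in this sketch
    }
  }
}
```

For the full, working version of this flow, see the iOS Quick Demo linked above.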

The Llama Stack `Types.swift` file is generated from the Llama Stack [API spec](https://github.com/meta-llama/llama-stack/blob/main/docs/resources/llama-stack-spec.yaml) in the main [Llama Stack repo](https://github.com/meta-llama/llama-stack). To regenerate it, run:

```
scripts/generate_swift_types.sh
```

By default, this script will download the latest API spec from the main branch of the Llama Stack repo. You can set `LLAMA_STACK_DIR` to a local Llama Stack repo to use a local copy of the API spec instead.
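
For example, to regenerate against a local checkout (the path below is illustrative, not a real location):

```
LLAMA_STACK_DIR=/path/to/llama-stack scripts/generate_swift_types.sh
```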
This will update the `openapi.yaml` file in the Llama Stack Swift SDK source folder `Sources/LlamaStackClient`.