Commit 09b0feb

Update readme
1 parent 053b8e3 commit 09b0feb

2 files changed: +16 -20 lines

README.md

Lines changed: 12 additions & 14 deletions
@@ -8,9 +8,9 @@ Features:
 - Remote Inferencing: Perform inferencing tasks remotely with Llama models hosted on a remote connection (or serverless localhost).
 - Simple Integration: With easy-to-use APIs, a developer can quickly integrate Llama Stack in their Android app. The difference with local vs remote inferencing is also minimal.

-Latest Release Notes: [v0.2.2](https://github.com/meta-llama/llama-stack-client-kotlin/releases/tag/v0.2.2)
+Latest Release Notes: [v0.2.14](https://github.com/meta-llama/llama-stack-client-kotlin/releases/tag/v0.2.14)

-Note: The current recommended version is 0.2.2 Llama Stack server with 0.2.2 Kotlin client SDK.
+Note: The current recommended version is 0.2.14 Llama Stack server with 0.2.14 Kotlin client SDK.

 *Tagged releases are stable versions of the project. While we strive to maintain a stable main branch, it's not guaranteed to be free of bugs or issues.*

@@ -26,7 +26,7 @@ The key files in the app are `ExampleLlamaStackLocalInference.kt`, `ExampleLlama
 Add the following dependency in your `build.gradle.kts` file:
 ```
 dependencies {
-    implementation("com.llama.llamastack:llama-stack-client-kotlin:0.2.2")
+    implementation("com.llama.llamastack:llama-stack-client-kotlin:0.2.14")
 }
 ```
 This will download jar files in your gradle cache in a directory like `~/.gradle/caches/modules-2/files-2.1/com.llama.llamastack/`
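
For orientation (not part of this diff), here is a minimal module-level `build.gradle.kts` sketch showing where that dependency line sits. It assumes the artifact is resolved from Maven Central, where the SDK is published; depending on your project setup, repositories may instead be declared in `settings.gradle.kts`:

```kotlin
// Illustrative build.gradle.kts fragment, not taken from this commit.
repositories {
    mavenCentral() // llama-stack-client-kotlin is published to Maven Central
}

dependencies {
    implementation("com.llama.llamastack:llama-stack-client-kotlin:0.2.14")
}
```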
@@ -62,9 +62,9 @@ Breaking down the demo app, this section will show the core pieces that are used
 ### Setup Remote Inferencing
 Start a Llama Stack server on localhost. Here is an example of how you can do this using the firework.ai distribution:
 ```
-conda create -n stack-fireworks python=3.10
+conda create -n stack-fireworks python=3.12
 conda activate stack-fireworks
-pip install --no-cache llama-stack==0.2.2
+pip install --no-cache llama-stack==0.2.14
 llama stack build --template fireworks --image-type conda
 export FIREWORKS_API_KEY=<SOME_KEY>
 llama stack run fireworks --port 5050
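
For orientation (not part of this diff), the `remoteURL` passed to the remote client below only needs to reach this server. A minimal Kotlin sketch, assuming the app runs in the Android emulator, where `10.0.2.2` maps to the host machine's localhost:

```kotlin
// Illustrative only: base URL for the Llama Stack server started above on port 5050.
// 10.0.2.2 reaches the host's localhost from the Android emulator; use the host's
// LAN IP instead when running on a physical device.
val remoteURL = "http://10.0.2.2:5050"
```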
@@ -103,7 +103,7 @@ client = LlamaStackClientLocalClient
 client = LlamaStackClientOkHttpClient
     .builder()
     .baseUrl(remoteURL)
-    .headers(mapOf("x-llamastack-client-version" to listOf("0.1.4.1")))
+    .headers(mapOf("x-llamastack-client-version" to listOf("0.2.14")))
     .build()
 ```
 </td>
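
Once built, the client is used for inference calls. The sketch below is not part of this commit, and the names (`InferenceChatCompletionParams`, `modelId`, `messages`) are assumptions following the SDK's builder pattern; check them against the 0.2.14 release before use:

```kotlin
// Hedged sketch: class and method names are assumptions, not taken from this commit.
val result = client.inference().chatCompletion(
    InferenceChatCompletionParams.builder()
        .modelId("meta-llama/Llama-3.1-8B-Instruct") // a model served by the stack
        .messages(listOfMessages) // messages built with the SDK's message types (construction elided)
        .build()
)
// The model's reply is carried on the returned chat completion message.
```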
@@ -125,9 +125,7 @@ val agentConfig =
         .model("meta-llama/Llama-3.1-8B-Instruct")
         .samplingParams(
             SamplingParams.builder()
-                .strategy(
-                    SamplingParams.Strategy.ofGreedySampling()
-                )
+                .strategyGreedy()
                 .build()
         )
         .toolChoice(AgentConfig.ToolChoice.AUTO)
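
The `agentConfig` above feeds into agent, session, and turn creation, which is what produces the `agentTurnCreateResponseStream` handled in the next hunk. A hedged sketch of that glue code follows; it is not part of this commit, and `AgentCreateParams`, `AgentSessionCreateParams`, `AgentTurnCreateParams`, and the service accessors are assumptions that may differ from the actual 0.2.14 API:

```kotlin
// Hedged sketch: service and parameter names are assumptions, not taken from this commit.
val agentService = client.agents()

// Create an agent from the config built above.
val agentId = agentService.create(
    AgentCreateParams.builder()
        .agentConfig(agentConfig)
        .build()
).agentId()

// A session groups related turns for that agent.
val sessionId = agentService.session().create(
    AgentSessionCreateParams.builder()
        .agentId(agentId)
        .sessionName("demo-session")
        .build()
).sessionId()

// A turn carries the user's messages and streams back the agent's response.
val agentTurnCreateResponseStream = agentService.turn().createStreaming(
    AgentTurnCreateParams.builder()
        .agentId(agentId)
        .sessionId(sessionId)
        .messages(listOfMessages) // user messages built with the SDK's message types (construction elided)
        .build()
)
```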
@@ -190,19 +188,19 @@ agentTurnCreateResponseStream.use {
     val agentResponsePayload = it.responseStreamChunk()?.event()?.payload()
     if (agentResponsePayload != null) {
         when {
-            agentResponsePayload.isAgentTurnResponseTurnStart() -> {
+            agentResponsePayload.isStart() -> {
                 // Handle Turn Start Payload
             }
-            agentResponsePayload.isAgentTurnResponseStepStart() -> {
+            agentResponsePayload.isStepStart() -> {
                 // Handle Step Start Payload
             }
-            agentResponsePayload.isAgentTurnResponseStepProgress() -> {
+            agentResponsePayload.isStepProgress() -> {
                 // Handle Step Progress Payload
             }
-            agentResponsePayload.isAgentTurnResponseStepComplete() -> {
+            agentResponsePayload.isStepComplete() -> {
                 // Handle Step Complete Payload
             }
-            agentResponsePayload.isAgentTurnResponseTurnComplete() -> {
+            agentResponsePayload.isComplete() -> {
                 // Handle Turn Complete Payload
             }
         }

examples/android_app/README.md

Lines changed: 4 additions & 6 deletions
@@ -1,17 +1,15 @@
 # Llama Stack Android Demo App

-[![Maven Central Version](https://img.shields.io/badge/maven%20central-v0.2.2-8A2BE2)](https://central.sonatype.com/artifact/com.llama.llamastack/llama-stack-client-kotlin/0.2.2)
+[![Maven Central Version](https://img.shields.io/badge/maven%20central-v0.2.14-8A2BE2)](https://central.sonatype.com/artifact/com.llama.llamastack/llama-stack-client-kotlin/0.2.14)

 We’re excited to share this Android demo app using both remote and local Llama Stack features! The primary goal of this app is to showcase how to easily build Android apps with Llama models using Llama Stack SDKs in a chat app setup.

 This app serves as a valuable resource to inspire your creativity and provide foundational code that you can customize and adapt for your particular use case.

 Please dive in and start exploring our demo app today! We look forward to any feedback and are excited to see your innovative ideas to build agentic apps with Llama models. The current demo app is built using both Java and Kotlin. The majority of the activities are built with Java but the interfacing with Llama Stack APIs are in Kotlin.

-**Latest Update (04/14/2025)**:
-- Updated the demo app to be compatible with Llama Stack Kotlin SDK [v0.2.2](https://github.com/meta-llama/llama-stack-client-kotlin/releases/tag/v0.2.2) and Llama Stack version [v0.2.2](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.2).
-- Implemented a remote RAG use-case
-- Implemented the newly created local RAG support by the SDK with a use-case.
+**Latest Update (07/21/2025)**:
+- Updated the demo app to be compatible with Llama Stack Kotlin SDK [v0.2.14](https://github.com/meta-llama/llama-stack-client-kotlin/releases/tag/v0.2.14) and Llama Stack version [v0.2.14](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.14).

 ## Key Concepts
 From this demo app, you will learn many key concepts of building a GenAI Andrioid app with Llama Stack libraries:
@@ -70,7 +68,7 @@ For local, here is the list of models we support currently and growing:
 Include the latest Llama Stack Kotlin SDK in your `build.gradle.kts`. The demo app automatically includes this.

 ```
-implementation("com.llama.llamastack:llama-stack-client-kotlin:0.2.2")
+implementation("com.llama.llamastack:llama-stack-client-kotlin:0.2.14")
 ```

 # App UI/UX
