This repository was archived by the owner on Aug 14, 2025. It is now read-only.

Commit 28ea07e

release: 0.3.0

1 parent: 3d5f7a3

File tree (3 files changed, +65 -2 lines changed):

.release-please-manifest.json
CHANGELOG.md
pyproject.toml

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion

@@ -1,3 +1,3 @@
 {
-  ".": "0.2.14"
+  ".": "0.3.0"
 }

CHANGELOG.md

Lines changed: 63 additions & 0 deletions

@@ -1,5 +1,68 @@
 # Changelog
 
+## 0.3.0 (2025-07-25)
+
+Full Changelog: [v0.2.13...v0.3.0](https://github.com/llamastack/llama-stack-client-python/compare/v0.2.13...v0.3.0)
+
+### Features
+
+* **api:** update via SDK Studio ([f568f65](https://github.com/llamastack/llama-stack-client-python/commit/f568f6508002eff7eae4a6b0a1cc13aba6fab98e))
+* **client:** add follow_redirects request option ([689725a](https://github.com/llamastack/llama-stack-client-python/commit/689725a1a563e3504fadcd23250d77b681f81334))
+* **client:** add support for aiohttp ([dd047f4](https://github.com/llamastack/llama-stack-client-python/commit/dd047f405953918ebe782736e5bf246c87128628))
+* make custom code changes ([#3](https://github.com/llamastack/llama-stack-client-python/issues/3)) ([83fa371](https://github.com/llamastack/llama-stack-client-python/commit/83fa37124133aab73bf2bbbdcd39338b9a192475))
+
+
+### Bug Fixes
+
+* **ci:** correct conditional ([48024ee](https://github.com/llamastack/llama-stack-client-python/commit/48024eebbce61d662410af525eda73627fcf0082))
+* **ci:** release-doctor — report correct token name ([ef5375f](https://github.com/llamastack/llama-stack-client-python/commit/ef5375f88c980cd2ba46682bce1217663f9560bd))
+* **ci:** update pyproject.toml to use uv and remove broken CI ([#5](https://github.com/llamastack/llama-stack-client-python/issues/5)) ([7bc925c](https://github.com/llamastack/llama-stack-client-python/commit/7bc925c00401799d8f3345a4873f1b0028cb45ea))
+* **client:** correctly parse binary response | stream ([55795dc](https://github.com/llamastack/llama-stack-client-python/commit/55795dc5c63e17661f0c293fe00bf147f57e3046))
+* helptext for 'inspect version' and 'providers inspect' ([#8](https://github.com/llamastack/llama-stack-client-python/issues/8)) ([d79345e](https://github.com/llamastack/llama-stack-client-python/commit/d79345e42d6a3f3b828396b1ac00e2ecf196c0eb))
+* model register missing model-type and not accepting metadata ([#11](https://github.com/llamastack/llama-stack-client-python/issues/11)) ([f3f4515](https://github.com/llamastack/llama-stack-client-python/commit/f3f45155864379f227824d00f6febb1b46ed4839))
+* **package:** support direct resource imports ([b212c67](https://github.com/llamastack/llama-stack-client-python/commit/b212c6722c1125ced8d04572541fa6730251443b))
+* **tests:** fix: tests which call HTTP endpoints directly with the example parameters ([90ef24b](https://github.com/llamastack/llama-stack-client-python/commit/90ef24bf7d49e25fcc8df715ca254e7b74dc24d5))
+* update agent event logger ([#10](https://github.com/llamastack/llama-stack-client-python/issues/10)) ([0a10b70](https://github.com/llamastack/llama-stack-client-python/commit/0a10b70f91f28f533710433ae860789f2cb0f70f))
+
+
+### Chores
+
+* change publish docs url ([42705cc](https://github.com/llamastack/llama-stack-client-python/commit/42705ccf1c4be9adaf7b4dd57a78671dfd37bf4b))
+* **ci:** change upload type ([d649417](https://github.com/llamastack/llama-stack-client-python/commit/d649417cfda0052f1611ab5d62135b5d7e19cfc5))
+* **ci:** enable for pull requests ([1511a4c](https://github.com/llamastack/llama-stack-client-python/commit/1511a4cadbd794a4a77c51823f1916c34d362c9a))
+* **ci:** fix installation instructions ([3cc4f6f](https://github.com/llamastack/llama-stack-client-python/commit/3cc4f6fbde864b3db974be1fd1710e884a203d1c))
+* **ci:** only run for pushes and fork pull requests ([ad42de8](https://github.com/llamastack/llama-stack-client-python/commit/ad42de8b4f72f70a5e8615173508d21eff25646d))
+* **ci:** upload sdks to package manager ([7284da9](https://github.com/llamastack/llama-stack-client-python/commit/7284da9a9f7ec7219ef528435bb6854f332aa6e8))
+* **docs:** grammar improvements ([8993bfc](https://github.com/llamastack/llama-stack-client-python/commit/8993bfc88d326d6f710a80adbd5cc01e9c3d4f54))
+* **docs:** remove reference to rye shell ([8a4fd68](https://github.com/llamastack/llama-stack-client-python/commit/8a4fd689560d9aa6fca7d99213e7d34554b7e2b2))
+* **docs:** remove unnecessary param examples ([32056b7](https://github.com/llamastack/llama-stack-client-python/commit/32056b73446c96467301fb63e7cd30a4843d6454))
+* **internal:** avoid errors for isinstance checks on proxies ([afb14f3](https://github.com/llamastack/llama-stack-client-python/commit/afb14f316224989114a24978f2cc440367c1e322))
+* **internal:** codegen related update ([41646f3](https://github.com/llamastack/llama-stack-client-python/commit/41646f395b5c50fb029f588299fb08ec7a2b9d66))
+* **internal:** codegen related update ([c50a78e](https://github.com/llamastack/llama-stack-client-python/commit/c50a78e8085b1dd332e9d0d21a1c7c2d993ae962))
+* **internal:** update conftest.py ([621de41](https://github.com/llamastack/llama-stack-client-python/commit/621de41ad84b0473b283ef83e7ec5deb0124e7e6))
+* **internal:** version bump ([a4db919](https://github.com/llamastack/llama-stack-client-python/commit/a4db9197db3c26ef1ef5686f9f4e7c8ca8508c8f))
+* **internal:** version bump ([cbf1129](https://github.com/llamastack/llama-stack-client-python/commit/cbf1129a31a39d17861bb4f4187bf80fa18827d6))
+* **internal:** version bump ([dba539b](https://github.com/llamastack/llama-stack-client-python/commit/dba539b4c6f8260ea9d47fcdf3d4d9d123de78b4))
+* **internal:** version bump ([02b79a5](https://github.com/llamastack/llama-stack-client-python/commit/02b79a5c91cd40dc397c527e7895b64a65ab0117))
+* **readme:** update badges ([9d5b683](https://github.com/llamastack/llama-stack-client-python/commit/9d5b683d93de62574dff6dd2c30c4aa5981e3d2c))
+* **tests:** add tests for httpx client instantiation & proxies ([b0c5e33](https://github.com/llamastack/llama-stack-client-python/commit/b0c5e332de5875b6f97b08e1d6828d0883ff26fe))
+* **tests:** run tests in parallel ([efe33aa](https://github.com/llamastack/llama-stack-client-python/commit/efe33aa152677f8e1009674ac9016991516176c0))
+* **tests:** skip some failing tests on the latest python versions ([d50097f](https://github.com/llamastack/llama-stack-client-python/commit/d50097f7b7582fa40c068c463e2970e3c81e40d6))
+* update SDK settings ([424c330](https://github.com/llamastack/llama-stack-client-python/commit/424c33013cf29e6fef16e571fe67533b90b3f9cf))
+* update version ([10ef53e](https://github.com/llamastack/llama-stack-client-python/commit/10ef53e74dbdd72a8dd829957820e61522fbe6ad))
+
+
+### Documentation
+
+* **client:** fix httpx.Timeout documentation reference ([41b4f1a](https://github.com/llamastack/llama-stack-client-python/commit/41b4f1a7b2d0f5606255fbd5217ad01d8f063dd8))
+
+
+### Build System
+
+* Bump version to 0.2.14 ([745a94e](https://github.com/llamastack/llama-stack-client-python/commit/745a94e1d2875c8e7b4fac5b1676b890aebf4915))
+* Bump version to 0.2.15 ([8700dc6](https://github.com/llamastack/llama-stack-client-python/commit/8700dc6ed9411d436422ee94af2702f10a96b49e))
+* Bump version to 0.2.15 ([4692024](https://github.com/llamastack/llama-stack-client-python/commit/46920241be5f8b921bbba367e65a7afa3aefd612))
+
 ## 0.1.0-alpha.4 (2025-06-27)
 
 Full Changelog: [v0.1.0-alpha.3...v0.1.0-alpha.4](https://github.com/llamastack/llama-stack-client-python/compare/v0.1.0-alpha.3...v0.1.0-alpha.4)
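As context for the two client Features above ("add follow_redirects request option" and "add support for aiohttp"), here is a minimal sketch of how an aiohttp-backed async client is typically wired up in Stainless-generated SDKs such as this one. The import name DefaultAioHttpClient, the [aiohttp] install extra, and the local base_url are assumptions drawn from that convention, not confirmed by this diff.

# Sketch only: assumes the 0.3.0 client exposes an aiohttp transport as
# DefaultAioHttpClient behind an optional extra, e.g.
#   pip install "llama-stack-client[aiohttp]"
# Verify the exact names against the released package before relying on this.
import asyncio

from llama_stack_client import AsyncLlamaStackClient, DefaultAioHttpClient


async def main() -> None:
    # base_url points at a hypothetical locally running Llama Stack server.
    async with AsyncLlamaStackClient(
        base_url="http://localhost:8321",
        http_client=DefaultAioHttpClient(),
    ) as client:
        models = await client.models.list()
        print(models)


asyncio.run(main())

If this SDK follows the usual Stainless pattern, httpx remains the default transport and the aiohttp client is strictly opt-in.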

pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.2.15"
+version = "0.3.0"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "Apache-2.0"
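A quick way to confirm that the version bump above is what actually ends up installed, using only the standard library; the distribution name is taken from the name field in the pyproject.toml diff.

# Sketch: check the installed distribution against the 0.3.0 bump above.
from importlib.metadata import version

print(version("llama_stack_client"))  # expected to print "0.3.0" after upgrading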

0 commit comments