Releases · meta-llama/llama-api-python
v0.5.0
v0.4.0
0.4.0 (2025-09-16)
Full Changelog: v0.3.0...v0.4.0
Features
- improve future compat with pydantic v3 (648fe7b)
- types: replace List[str] with SequenceNotStr in params (565a26d)
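For context on the `SequenceNotStr` change: the name suggests a typing helper that lets string-list parameters accept any sequence of strings (lists, tuples, and so on) while still rejecting a bare `str`, which would otherwise satisfy `Sequence[str]`. The sketch below only illustrates that intent; the function name is hypothetical and this is not the SDK's actual definition.

```python
from typing import List, Optional, Sequence


def normalize_stop_sequences(values: Optional[Sequence[str]]) -> Optional[List[str]]:
    """Illustrative helper: accept any sequence of strings, but not a plain str."""
    if values is None:
        return None
    if isinstance(values, str):
        # A bare str is itself a Sequence[str]; a SequenceNotStr-style annotation
        # exists to rule this case out at type-check time.
        raise TypeError("expected a sequence of strings, not a single str")
    return list(values)


print(normalize_stop_sequences(("###", "\n\n")))  # tuples are fine
print(normalize_stop_sequences(["###"]))          # lists still work
```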
Bug Fixes
- avoid newer type syntax (b9bfeb3)
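"Avoid newer type syntax" generally means spelling annotations with `typing.Optional`/`typing.Union` rather than PEP 604 `X | Y`, so modules still import on the older Python versions the package supports. A minimal, hypothetical illustration (not the SDK's code):

```python
from typing import List, Optional, Union


# On Python versions that predate PEP 604, an annotation like `str | None` raises a
# TypeError when the function is defined; the typing-module spellings work everywhere.
def pick_model(name: Optional[str] = None, fallbacks: Optional[List[str]] = None) -> Union[str, None]:
    if name is not None:
        return name
    return fallbacks[0] if fallbacks else None
```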
v0.3.0
0.3.0 (2025-08-26)
Full Changelog: v0.2.0...v0.3.0
Features
- custom patch to handle exception during stream chunk (7549f0b)
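This patch concerns errors raised while a streamed response is being read: they now surface from the stream iterator instead of being lost mid-iteration. A minimal consumer-side sketch, assuming the OpenAI-style streaming interface shown in the SDK's README (`stream=True`); the import names, exception type, model id, and message content are assumptions, not verified against this release.

```python
from llama_api_client import APIError, LlamaAPIClient  # assumed import paths

client = LlamaAPIClient()  # API key typically read from an environment variable

try:
    stream = client.chat.completions.create(
        model="Llama-3.3-70B-Instruct",                        # illustrative model id
        messages=[{"role": "user", "content": "Say hello"}],
        stream=True,
    )
    for chunk in stream:
        # With the custom patch, an exception raised while a chunk is being read or
        # parsed propagates out of this loop rather than silently ending the stream.
        print(chunk)
except APIError as err:  # assumed exception type exported by the package
    print(f"stream failed: {err}")
```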
v0.2.0
0.2.0 (2025-08-07)
Full Changelog: v0.1.2...v0.2.0
Bug Fixes
- api: remove chat completion request model (94c4e9f)
- client: don't send Content-Type header on GET requests (efec88a)
- parsing: correctly handle nested discriminated unions (b627686); a short illustration of the pattern follows this list
- parsing: ignore empty metadata (d6ee851)
- parsing: parse extra field types (f03ca22)
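On the nested discriminated union fix: the pattern is a tagged union whose members themselves contain another tagged union, which response parsing previously mishandled. The sketch below only illustrates the pattern with plain Pydantic v2 models; the type names are made up and do not correspond to the SDK's response types.

```python
from typing import Literal, Union

from pydantic import BaseModel, Field
from typing_extensions import Annotated


class TextBlock(BaseModel):
    type: Literal["text"]
    text: str


class ImageBlock(BaseModel):
    type: Literal["image"]
    url: str


# Inner discriminated union, selected by the "type" field.
ContentBlock = Annotated[Union[TextBlock, ImageBlock], Field(discriminator="type")]


class MessageEvent(BaseModel):
    kind: Literal["message"]
    content: ContentBlock  # nested discriminated union inside a union member


class ErrorEvent(BaseModel):
    kind: Literal["error"]
    message: str


# Outer discriminated union, selected by the "kind" field.
Event = Annotated[Union[MessageEvent, ErrorEvent], Field(discriminator="kind")]


class Envelope(BaseModel):
    event: Event


parsed = Envelope.model_validate(
    {"event": {"kind": "message", "content": {"type": "image", "url": "https://example.com/x.png"}}}
)
print(type(parsed.event.content).__name__)  # ImageBlock
```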
Chores
- add examples (abfa065)
- internal: bump pinned h11 dep (d40e1b1)
- internal: fix ruff target version (c900ebc)
- package: mark python 3.13 as supported (ef5bc36)
- project: add settings file for vscode (e310380)
- readme: fix version rendering on pypi (786f9fb)
- sync repo (7e697f6)
- update SDK settings (de22c0e)
v0.1.2
v0.1.1
What's Changed
- feat: v0.1.1 api update by @yanxi0830 in #32
Full Changelog: https://github.com/meta-llama/llama-api-python/commits/v0.1.1