Releases: UiPath/uipath-llm-client-python
UiPath LangChain Client [langchain-v1.8.1]
[1.8.1] - 2026-04-09
Changed
- Renamed `aws` optional dependency to `bedrock` for clarity
UiPath LLM Client [v1.8.0]
[1.8.0] - 2026-04-08
Added
- `UiPathLiteLLM` — provider-agnostic LLM client powered by LiteLLM
  - `completion`/`acompletion` for chat completions across all providers
  - `embedding`/`aembedding` for embeddings
- Automatic model discovery from the UiPath backend — detects vendor, `api_flavor`, and model family
- Optional `vendor_type` and `api_flavor` overrides (same pattern as the LangChain factory)
- Supports OpenAI (chat-completions + responses API), Gemini, Bedrock (invoke + converse), and Vertex AI Claude
- All HTTP routed through UiPath httpx transport (auth, retry, headers) — no direct calls to Google/AWS/OpenAI
- Explicit completion parameters with full IDE autocomplete
- `litellm` as an optional dependency (`uv add uipath-llm-client[litellm]`)
- `_strict_response_validation` parameter to all Anthropic client classes
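The optional `vendor_type`/`api_flavor` overrides follow an "explicit value wins, otherwise auto-detect" pattern. A minimal, library-independent sketch of that resolution logic, with a hypothetical prefix table standing in for the backend's model-discovery data (none of these names are taken from the package itself):

```python
from enum import Enum
from typing import Optional


class VendorType(Enum):
    OPENAI = "openai"
    GEMINI = "gemini"
    BEDROCK = "bedrock"


# Hypothetical prefix table standing in for the backend's model registry.
_MODEL_PREFIXES = {
    "gpt-": VendorType.OPENAI,
    "gemini-": VendorType.GEMINI,
    "anthropic.claude-": VendorType.BEDROCK,
}


def resolve_vendor(model: str, vendor_type: Optional[VendorType] = None) -> VendorType:
    """Use the explicit override when given, else auto-detect from the model name."""
    if vendor_type is not None:
        return vendor_type
    for prefix, vendor in _MODEL_PREFIXES.items():
        if model.startswith(prefix):
            return vendor
    raise ValueError(f"Cannot auto-detect vendor for model {model!r}")
```

The same two-step resolution would apply to `api_flavor` (e.g. Bedrock invoke vs. converse): an explicit argument short-circuits discovery.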
Changed
- Updated dependency versions: `uipath-platform>=0.1.21`, `anthropic>=0.91.0`, `litellm>=1.83.4`
UiPath LangChain Client [langchain-v1.8.0]
[1.8.0] - 2026-04-08
Added
- `UiPathChatLiteLLM` — LangChain chat model powered by LiteLLM, supporting all UiPath gateway providers
- `langchain-litellm` as an optional dependency for LiteLLM integration
Changed
- Updated dependency versions: `anthropic[bedrock,vertex]>=0.91.0`
- Version bump to match core package 1.8.0
UiPath LangChain Client [langchain-v1.7.1]
[1.7.1] - 2026-04-04
Added
- `custom_class` parameter in `get_chat_model()` and `get_embedding_model()` factory functions to allow instantiating a user-provided class instead of the auto-detected one
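The `custom_class` hook can be pictured as a factory that normally picks an implementation class itself but defers to a caller-supplied one. A self-contained sketch with stub classes (the real factory signatures in the package may differ):

```python
from typing import Optional, Type


class BaseChatModel:
    def __init__(self, model: str):
        self.model = model


class AutoDetectedChatModel(BaseChatModel):
    """Stands in for the class the factory would normally pick."""


class MyTracedChatModel(BaseChatModel):
    """A user-provided subclass, e.g. adding tracing hooks."""


def get_chat_model(
    model: str, custom_class: Optional[Type[BaseChatModel]] = None
) -> BaseChatModel:
    # Instantiate the user's class when given, otherwise the auto-detected one.
    cls = custom_class or AutoDetectedChatModel
    return cls(model)
```

Passing `custom_class=MyTracedChatModel` yields the caller's type while keeping the factory's detection and configuration path for the default case.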
UiPath LLM Client [v1.7.0]
[1.7.0] - 2026-04-03
Added
- `UiPathNormalizedClient` — provider-agnostic LLM client with no optional dependencies
  - `client.completions.create`/`acreate`/`stream`/`astream` for chat completions
  - `client.embeddings.create`/`acreate` for embeddings
- Structured output via `response_format` (Pydantic, TypedDict, dict, json_object)
- Tool calling with dicts, Pydantic models, or callables
- Streaming with SSE parsing
- Full vendor parameter coverage: OpenAI (reasoning, logprobs, logit_bias), Anthropic (thinking, top_k), Google (thinking_level/budget, safety_settings, cached_content)
- Typed response models: `ChatCompletion`, `ChatCompletionChunk`, `EmbeddingResponse`
- Accepts both dict and Pydantic model messages
UiPath LLM Client [v1.6.0]
[1.6.0] - 2026-04-03
Fixed
- Set `api_flavor` to `None` for ANTHROPIC and AZURE vendor types
- Add ANTHROPIC/AZURE cases to validator and remove unused `original_message` parameter
- Fix VertexAI `default_headers` consistency and demo import path
- Fix LLMGateway singleton cache key to include `base_url`
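The cache-key fix addresses a classic singleton pitfall: if the gateway client is cached without `base_url` in the key, two callers pointing at different endpoints silently share one client. A hedged, pure-Python sketch of keying the cache on the full identity (names here are illustrative, not the package's actual internals):

```python
from functools import lru_cache


class LLMGateway:
    def __init__(self, vendor: str, base_url: str):
        self.vendor = vendor
        self.base_url = base_url


@lru_cache(maxsize=None)
def get_gateway(vendor: str, base_url: str) -> LLMGateway:
    # base_url is part of the cache key, so distinct endpoints get
    # distinct singletons instead of sharing the first one created.
    return LLMGateway(vendor, base_url)
```

Before such a fix, the second caller with a different `base_url` would receive the first caller's client and send requests to the wrong endpoint.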
UiPath LangChain Client [langchain-v1.6.0]
[1.6.0] - 2026-04-03
Fixed
- Version bump to match core package changes
UiPath LLM Client [v1.5.10]
[1.5.10] - 2026-03-26
Changed
- Removed `X-UiPath-LLMGateway-AllowFull4xxResponse` from default request headers to avoid PII leakage in logs
UiPath LLM Client [v1.5.9]
[1.5.9] - 2026-03-26
Fixed
- Use the `availableOperationCodes` field (instead of `operationCodes`) when validating BYOM operation codes
UiPath LLM Client [v1.5.8]
[1.5.8] - 2026-03-26
Fixed
- Pass `base_url` to the `OpenAI` and `AsyncOpenAI` constructors in `UiPathOpenAI` and `UiPathAsyncOpenAI` to ensure the correct endpoint is forwarded to the underlying SDK clients
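The pattern behind this fix is simply forwarding `base_url` through to the wrapped SDK constructor rather than letting it fall back to the SDK's default endpoint. A stub-based sketch (stand-in classes; the real `UiPathOpenAI` wraps the official `openai` SDK client):

```python
class OpenAI:
    """Stand-in for the openai SDK client, which defaults to its own endpoint."""

    def __init__(self, base_url: str = "https://api.openai.com/v1"):
        self.base_url = base_url


class UiPathOpenAI(OpenAI):
    def __init__(self, base_url: str):
        # Forward base_url to the underlying SDK client so requests hit the
        # UiPath gateway endpoint, not the SDK's hard-coded default.
        super().__init__(base_url=base_url)
```

Omitting the `super().__init__(base_url=...)` forwarding is exactly the bug described: the wrapper would hold the right URL while the inner SDK client kept sending requests to its default.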