Releases: UiPath/uipath-llm-client-python

UiPath LangChain Client [langchain-v1.8.1]

09 Apr 08:51

[1.8.1] - 2026-04-09

Changed

  • Renamed aws optional dependency to bedrock for clarity

UiPath LLM Client [v1.8.0]

08 Apr 14:22
833bf6d

[1.8.0] - 2026-04-08

Added

  • UiPathLiteLLM — provider-agnostic LLM client powered by LiteLLM
    • completion / acompletion for chat completions across all providers
    • embedding / aembedding for embeddings
    • Automatic model discovery from the UiPath backend — detects vendor, api_flavor, and model family
    • Optional vendor_type and api_flavor overrides (same pattern as LangChain factory)
    • Supports OpenAI (chat-completions + responses API), Gemini, Bedrock (invoke + converse), and Vertex AI Claude
    • All HTTP routed through UiPath httpx transport (auth, retry, headers) — no direct calls to Google/AWS/OpenAI
    • Explicit completion parameters with full IDE autocomplete
  • litellm as an optional dependency (uv add uipath-llm-client[litellm])
  • _strict_response_validation parameter to all Anthropic client classes
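
The vendor_type / api_flavor override behavior described above can be pictured as a small resolution step: explicit overrides win, otherwise the values discovered from the backend are used. The dataclass and discovery function below are hypothetical stand-ins, not the library's actual internals:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelInfo:
    vendor_type: str
    api_flavor: Optional[str]

# Hypothetical stand-in for the backend model-discovery call
def discover_model(model: str) -> ModelInfo:
    catalog = {
        "gpt-4o": ModelInfo("openai", "chat-completions"),
        "claude-sonnet": ModelInfo("anthropic", None),
    }
    return catalog[model]

def resolve_model(
    model: str,
    vendor_type: Optional[str] = None,
    api_flavor: Optional[str] = None,
) -> ModelInfo:
    """Explicit overrides take precedence; otherwise fall back to discovery."""
    discovered = discover_model(model)
    return ModelInfo(
        vendor_type=vendor_type or discovered.vendor_type,
        api_flavor=api_flavor if api_flavor is not None else discovered.api_flavor,
    )
```

For example, `resolve_model("gpt-4o", api_flavor="responses")` keeps the discovered vendor but swaps the API flavor, which mirrors how the LangChain factory overrides are described.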

Changed

  • Updated dependency versions: uipath-platform>=0.1.21, anthropic>=0.91.0, litellm>=1.83.4

UiPath LangChain Client [langchain-v1.8.0]

08 Apr 14:23
833bf6d

[1.8.0] - 2026-04-08

Added

  • UiPathChatLiteLLM — LangChain chat model powered by LiteLLM, supporting all UiPath gateway providers
  • langchain-litellm as an optional dependency for LiteLLM integration
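
Because langchain-litellm is an optional dependency, code that reaches for the LiteLLM integration typically guards the import and points the user at the missing extra. A generic sketch of that pattern (the error message and helper name are illustrative, not the library's actual code):

```python
import importlib

def optional_import(module_name: str, extra: str):
    """Import an optional integration module, raising a clear error that
    names the extra to install when the module is missing."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"{module_name} is not installed; add the optional dependency "
            f"with: uv add 'uipath-llm-client[{extra}]'"
        ) from exc

# e.g. the LangChain client could guard its LiteLLM integration as:
# langchain_litellm = optional_import("langchain_litellm", "litellm")
```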

Changed

  • Updated dependency versions: anthropic[bedrock,vertex]>=0.91.0
  • Version bump to match core package 1.8.0

UiPath LangChain Client [langchain-v1.7.1]

04 Apr 00:06
926227a

[1.7.1] - 2026-04-04

Added

  • custom_class parameter in get_chat_model() and get_embedding_model() factory functions to allow instantiating a user-provided class instead of the auto-detected one
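
The custom_class override amounts to a small factory pattern: use the caller's class when one is given, otherwise fall back to auto-detection. The classes and detection logic below are hypothetical stand-ins for the real get_chat_model() machinery:

```python
from typing import Optional, Type

class BaseChatModel:  # stand-in base class
    def __init__(self, model: str):
        self.model = model

class AutoDetectedChatModel(BaseChatModel):
    """What the factory would pick on its own."""

def get_chat_model(
    model: str,
    custom_class: Optional[Type[BaseChatModel]] = None,
) -> BaseChatModel:
    """Instantiate custom_class when provided, else the auto-detected class."""
    cls = custom_class if custom_class is not None else AutoDetectedChatModel
    return cls(model)

class MyChatModel(BaseChatModel):
    """A user-provided subclass, e.g. with extra logging."""
```

Calling `get_chat_model("gpt-4o", custom_class=MyChatModel)` then returns a MyChatModel instance instead of the auto-detected one.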

UiPath LLM Client [v1.7.0]

03 Apr 23:43
a5115b0

[1.7.0] - 2026-04-03

Added

  • UiPathNormalizedClient — provider-agnostic LLM client with no optional dependencies
    • client.completions.create/acreate/stream/astream for chat completions
    • client.embeddings.create/acreate for embeddings
    • Structured output via response_format (Pydantic, TypedDict, dict, json_object)
    • Tool calling with dicts, Pydantic models, or callables
    • Streaming with SSE parsing
    • Full vendor parameter coverage: OpenAI (reasoning, logprobs, logit_bias), Anthropic (thinking, top_k), Google (thinking_level/budget, safety_settings, cached_content)
    • Typed response models: ChatCompletion, ChatCompletionChunk, EmbeddingResponse
    • Accepts both dict and Pydantic model messages
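
Tool calling with plain callables generally works by introspecting the function signature into a JSON-schema tool definition. A rough sketch of that conversion (the exact schema UiPathNormalizedClient emits is an assumption; this shows the common OpenAI-style shape):

```python
import inspect

_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def callable_to_tool(fn) -> dict:
    """Turn a Python callable into an OpenAI-style tool definition by
    reading its signature: annotated types become JSON-schema types, and
    parameters without defaults become required."""
    properties, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }

def get_weather(city: str, units: str = "metric") -> str:
    """Look up the current weather for a city."""
    return f"weather for {city} in {units}"
```

Here `callable_to_tool(get_weather)` yields a spec where city is required and units is optional, which is the kind of translation that lets a client accept dicts, Pydantic models, or callables interchangeably.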

UiPath LLM Client [v1.6.0]

03 Apr 13:31
b72c997

[1.6.0] - 2026-04-03

Fixed

  • Set api_flavor to None for ANTHROPIC and AZURE vendor types
  • Add ANTHROPIC/AZURE cases to validator and remove unused original_message parameter
  • Fix VertexAI default_headers consistency and demo import path
  • Fix LLMGateway singleton cache key to include base_url
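
The singleton cache-key fix addresses a common bug: if base_url is left out of the key, two gateways pointed at different endpoints collide on one cached instance. A minimal sketch of the corrected keying (the LLMGateway class here is a stand-in, not the real implementation):

```python
_gateway_cache: dict = {}

class LLMGateway:  # stand-in for the real gateway client
    def __init__(self, org_id: str, base_url: str):
        self.org_id = org_id
        self.base_url = base_url

def get_gateway(org_id: str, base_url: str) -> LLMGateway:
    """Cache one instance per (org_id, base_url) so clients with different
    base URLs no longer share a stale singleton."""
    key = (org_id, base_url)  # base_url included in the key: the fix
    if key not in _gateway_cache:
        _gateway_cache[key] = LLMGateway(org_id, base_url)
    return _gateway_cache[key]
```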

UiPath LangChain Client [langchain-v1.6.0]

03 Apr 13:32
b72c997

[1.6.0] - 2026-04-03

Fixed

  • Version bump to match core package changes

UiPath LLM Client [v1.5.10]

26 Mar 17:01
297e4ae

[1.5.10] - 2026-03-26

Changed

  • Removed X-UiPath-LLMGateway-AllowFull4xxResponse from default request headers to avoid PII leakage in logs

UiPath LLM Client [v1.5.9]

26 Mar 16:34
4dcdccf

[1.5.9] - 2026-03-26

Fixed

  • Use availableOperationCodes field (instead of operationCodes) when validating BYOM operation codes

UiPath LLM Client [v1.5.8]

26 Mar 14:20
2934fe4

[1.5.8] - 2026-03-26

Fixed

  • Pass base_url to OpenAI and AsyncOpenAI constructors in UiPathOpenAI and UiPathAsyncOpenAI to ensure the correct endpoint is forwarded to the underlying SDK clients