Releases: jamesrochabrun/PolyAI
PolyAI v2.3.1
What's Changed
- Fix build errors, set minimum target to macOS 13, specify the dependencies' versions by @laike9m in #5
Full Changelog: v2.3.0...v2.3.1
PolyAI v2.3.0
Currently, the package supports OpenAI, Azure, Anthropic, Gemini, Groq, DeepSeek, and OpenRouter. Additionally, you can use Ollama to run local models like Llama 3 or Mistral through OpenAI-compatible endpoints.
// OpenAI
let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
// Gemini
let geminiConfiguration: LLMConfiguration = .openAI(.gemini(apiKey: "your_gemini_api_key_here"))
// Groq
let groqConfiguration: LLMConfiguration = .openAI(.groq(apiKey: "your_groq_api_key_here"))
// Ollama
let ollamaConfiguration: LLMConfiguration = .openAI(.ollama(url: "http://localhost:11434"))
// OpenRouter
let openRouterConfiguration: LLMConfiguration = .openAI(.openRouter(apiKey: "your_open-router_api_key_here"))
// DeepSeek
let deepSeekConfiguration: LLMConfiguration = .openAI(.deepSeek(apiKey: "your_deepseek_api_key_here"))
// Anthropic
let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")
let configurations = [openAIConfiguration, anthropicConfiguration, geminiConfiguration, ollamaConfiguration]
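As a sketch, the configurations above can back a single service via the `PolyAIServiceFactory.serviceWith` factory shown in the earlier release notes:

```swift
import PolyAI

// Build one service that can route requests to any configured provider.
let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")
let service = PolyAIServiceFactory.serviceWith([openAIConfiguration, anthropicConfiguration])
```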
PolyAI v2.2.0
Consumers can now use multiple services in the same application. Previously this was not supported because a bug caused one service's URL to override the others'.
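For example, two independently configured services can now coexist in one app (using the `PolyAIServiceFactory.serviceWith` factory shown in the earlier release notes):

```swift
import PolyAI

// Two services side by side; before this fix, the second configuration's
// URL would clobber the first's.
let openAIService = PolyAIServiceFactory.serviceWith([.openAI(.api(key: "your_openai_api_key_here"))])
let anthropicService = PolyAIServiceFactory.serviceWith([.anthropic(apiKey: "your_anthropic_api_key_here")])
```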
Full Changelog: v2.0.0...v2.2.0
PolyAI v2.1.1
Fixing Chat history for Gemini.
Full Changelog: v2.1.0...v2.1.1
PolyAI v2.0.0
Removing Gemini library dependency.
Full Changelog: v2.1.1...v2.0.0
PolyAI v2.1.0
Full Changelog: 2.0...v2.1.0
Support for Local Models with Ollama
This release adds support for SwiftOpenAI v3.3 https://github.com/jamesrochabrun/SwiftOpenAI/releases/tag/v3.3
Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.
Ollama
To interact with local models using Ollama OpenAI compatibility endpoints, use the following configuration setup.
1 - Download Ollama if you don't have it installed already.
2 - Download the model you need, e.g. for Llama 3, type in your terminal:
ollama pull llama3
Once you have the model installed locally, you are ready to use PolyAI!
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
More information can be found here.
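Putting the pieces together, the local Ollama service is created with the same factory used for the hosted providers (11434 is Ollama's default port):

```swift
import PolyAI

// Point PolyAI at the local Ollama server started by `ollama serve`.
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
let service = PolyAIServiceFactory.serviceWith([ollamaConfiguration])
```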
Gemini support
Adding Gemini support.

public enum LLMConfiguration {

   case openAI(OpenAI)

   public enum OpenAI {

      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI through Azure.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI through AIProxy.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token that is provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }

   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)

   /// Configuration for accessing Gemini's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Gemini.
   case gemini(apiKey: String) /// ✨ New
}
Gemini Configuration:
let geminiConfiguration: LLMConfiguration = .gemini(apiKey: "your_gemini_api_key_here")
let service = PolyAIServiceFactory.serviceWith([geminiConfiguration])
Message Stream With Gemini
let prompt = "How are you today?"
let parameters: LLMParameter = .gemini(model: "gemini-pro", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
let stream = try await service.streamMessage(parameters)
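The stream can then be consumed with `for try await`. The chunk's exact shape is an assumption here; the `content` property name below is illustrative, so check the library's stream type for the actual field:

```swift
// Iterate the async stream and print each partial response as it arrives.
// `chunk.content` is a hypothetical accessor for the delta text.
for try await chunk in stream {
   if let content = chunk.content {
      print(content, terminator: "")
   }
}
```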
OpenAI support for Azure and AIProxy
Adding the LLMConfiguration definition to allow usage of Azure and AIProxy within PolyAI.
public enum LLMConfiguration {

   case openAI(OpenAI)

   public enum OpenAI {

      /// Configuration for accessing OpenAI's API.
      /// - Parameters:
      ///   - apiKey: The API key for authenticating requests to OpenAI.
      ///   - organizationID: Optional organization ID for OpenAI usage.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case api(key: String, organizationID: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI through Azure.
      /// - Parameters:
      ///   - configuration: The AzureOpenAIConfiguration.
      ///   - urlSessionConfiguration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case azure(configuration: AzureOpenAIConfiguration, urlSessionConfiguration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())

      /// Configuration for accessing OpenAI through AIProxy.
      /// - Parameters:
      ///   - aiproxyPartialKey: The partial key provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - aiproxyDeviceCheckBypass: The bypass token that is provided in the 'API Keys' section of the AIProxy dashboard.
      ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
      ///   - decoder: The JSON decoder used for decoding responses. Defaults to a new instance of `JSONDecoder`.
      case aiProxy(aiproxyPartialKey: String, aiproxyDeviceCheckBypass: String? = nil, configuration: URLSessionConfiguration = .default, decoder: JSONDecoder = .init())
   }

   /// Configuration for accessing Anthropic's API.
   /// - Parameters:
   ///   - apiKey: The API key for authenticating requests to Anthropic.
   ///   - configuration: The URLSession configuration to use for network requests. Defaults to `.default`.
   case anthropic(apiKey: String, configuration: URLSessionConfiguration = .default)
}
Usage
Azure:
let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(resourceName: "YOUR_RESOURCE_NAME", openAIAPIKey: .apiKey("YOUR_API_KEY"), apiVersion: "THE_API_VERSION")))
AIProxy
let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey: "hardcode_partial_key_here", aiproxyDeviceCheckBypass: "hardcode_device_check_bypass_here"))
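Either configuration (or both) can then be handed to the same factory used throughout these notes:

```swift
import PolyAI

let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(resourceName: "YOUR_RESOURCE_NAME", openAIAPIKey: .apiKey("YOUR_API_KEY"), apiVersion: "THE_API_VERSION")))
let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey: "hardcode_partial_key_here"))

// Both configurations can back a single service instance.
let service = PolyAIServiceFactory.serviceWith([azureConfiguration, aiProxyConfiguration])
```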
PolyAI 🚀
- Support for Message with the OpenAI and Anthropic APIs
- Stream Message.
- README