[Buy me a coffee](https://buymeacoffee.com/jamesrochabrun)
An open-source Swift package that simplifies LLM message completions, designed for multi-model applications. It supports multiple providers while adhering to OpenAI-compatible APIs and Anthropic APIs, enabling Swift developers to integrate different AI models seamlessly.
## Description
### OpenAI Compatibility
Easily call various LLM APIs using the OpenAI format, with built-in support for multiple models and providers through the [SwiftOpenAI](https://github.com/jamesrochabrun/SwiftOpenAI) package.
Supported Providers:

- OpenAI
- Azure
- Groq
- DeepSeek
- Google Gemini
- OpenRouter
- Ollama
  - [llama3](https://ollama.com/library/llama3)
  - [mistral](https://ollama.com/library/mistral)

Note: When using OpenAI-compatible configurations, you can identify them by the `.openAI` enum prefix in the configuration structure.

Additionally, the Anthropic API is supported through the [SwiftAnthropic](https://github.com/jamesrochabrun/SwiftAnthropic) package.
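
To make the OpenAI-compatibility idea concrete, here is a minimal sketch of calling Ollama's OpenAI-style chat endpoint directly with `URLSession`. This bypasses PolyAI entirely and assumes a local Ollama server is running with the `llama3` model pulled; it is an illustration of the wire format, not part of the package's API.

```swift
import Foundation

// Hypothetical direct request to Ollama's OpenAI-compatible chat endpoint.
// Assumes `ollama serve` is running locally and `llama3` has been pulled.
func askOllama(_ prompt: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "http://localhost:11434/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "llama3",
        "messages": [["role": "user", "content": prompt]]
    ])
    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

Because the request body follows the OpenAI chat-completions schema, the same shape works against any of the OpenAI-compatible providers listed above, which is what lets PolyAI route them through a single configuration style.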
## Table of Contents
- [Message](#message)
- [Collaboration](#collaboration)
- [OpenAI Azure](#openai-azure)
- [Groq](#groq)
- [OpenRouter](#open-router)
- [DeepSeek](#deepseek)
- [OpenAI AIProxy](#openai-aiproxy)
- [Ollama](#ollama)

```swift
import PolyAI
```

Then, define the LLM configurations.
Currently, the package supports OpenAI, Azure, Anthropic, Gemini, Groq, DeepSeek, and OpenRouter. Additionally, you can use Ollama to run local models like Llama 3 or Mistral through OpenAI-compatible endpoints.
```swift
// OpenAI
let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))

// Gemini
let geminiConfiguration: LLMConfiguration = .openAI(.gemini(apiKey: "your_gemini_api_key_here"))

// Groq
let groqConfiguration: LLMConfiguration = .openAI(.groq(apiKey: "your_groq_api_key_here"))

// Ollama
let ollamaConfiguration: LLMConfiguration = .openAI(.ollama(url: "http://localhost:11434"))

// OpenRouter
let openRouterConfiguration: LLMConfiguration = .openAI(.openRouter(apiKey: "your_open-router_api_key_here"))

// DeepSeek
let deepSeekConfiguration: LLMConfiguration = .openAI(.deepSeek(apiKey: "your_deepseek_api_key_here"))

// Anthropic
let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")

let configurations = [openAIConfiguration, anthropicConfiguration, geminiConfiguration, ollamaConfiguration]
```

With the configurations set, initialize the service:

```swift
let service = PolyAIServiceFactory.serviceWith(configurations)
```

Now, you have access to all the models offered by these providers in a single package. 🚀
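
As a rough illustration, a single prompt can then be routed to different providers through the one service. The `LLMParameter` cases, model names, and `createMessage` call below are assumptions drawn from the package's conventions, not a definitive API; check the repository for the exact signatures.

```swift
// Sketch: sending the same prompt through two different providers.
// Parameter shapes here are assumed, not guaranteed by this README.
let prompt = "How are you today?"

let openAIMessage = try await service.createMessage(
   .openAI(model: .gpt4turbo, messages: [.init(role: .user, content: prompt)]))

let anthropicMessage = try await service.createMessage(
   .anthropic(model: .claude3Sonnet, messages: [.init(role: .user, content: prompt)], maxTokens: 1024))
```

The design choice is that the provider is selected per request by the parameter case, so switching models is a one-line change rather than a different client.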