
Commit a11e77c

Merge pull request #2 from jamesrochabrun/jroch-adding-more-models-support
Updating Read.me
2 parents deb12e2 + 2325ded commit a11e77c

File tree

1 file changed: +84 −10 lines


README.md

Lines changed: 84 additions & 10 deletions
@@ -7,14 +7,39 @@
 [![swiftui-version](https://img.shields.io/badge/swiftui-brightgreen)](https://developer.apple.com/documentation/swiftui)
 [![xcode-version](https://img.shields.io/badge/xcode-15%20-brightgreen)](https://developer.apple.com/xcode/)
 [![swift-package-manager](https://img.shields.io/badge/package%20manager-compatible-brightgreen.svg?logo=data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPHN2ZyB3aWR0aD0iNjJweCIgaGVpZ2h0PSI0OXB4IiB2aWV3Qm94PSIwIDAgNjIgNDkiIHZlcnNpb249IjEuMSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIiB4bWxuczp4bGluaz0iaHR0cDovL3d3dy53My5vcmcvMTk5OS94bGluayI+CiAgICA8IS0tIEdlbmVyYXRvcjogU2tldGNoIDYzLjEgKDkyNDUyKSAtIGh0dHBzOi8vc2tldGNoLmNvbSAtLT4KICAgIDx0aXRsZT5Hcm91cDwvdGl0bGU+CiAgICA8ZGVzYz5DcmVhdGVkIHdpdGggU2tldGNoLjwvZGVzYz4KICAgIDxnIGlkPSJQYWdlLTEiIHN0cm9rZT0ibm9uZSIgc3Ryb2tlLXdpZHRoPSIxIiBmaWxsPSJub25lIiBmaWxsLXJ1bGU9ImV2ZW5vZGQiPgogICAgICAgIDxnIGlkPSJHcm91cCIgZmlsbC1ydWxlPSJub256ZXJvIj4KICAgICAgICAgICAgPHBvbHlnb24gaWQ9IlBhdGgiIGZpbGw9IiNEQkI1NTEiIHBvaW50cz0iNTEuMzEwMzQ0OCAwIDEwLjY4OTY1NTIgMCAwIDEzLjUxNzI0MTQgMCA0OSA2MiA0OSA2MiAxMy41MTcyNDE0Ij48L3BvbHlnb24+CiAgICAgICAgICAgIDxwb2x5Z29uIGlkPSJQYXRoIiBmaWxsPSIjRjdFM0FGIiBwb2ludHM9IjI3IDI1IDMxIDI1IDM1IDI1IDM3IDI1IDM3IDE0IDI1IDE0IDI1IDI1Ij48L3BvbHlnb24+CiAgICAgICAgICAgIDxwb2x5Z29uIGlkPSJQYXRoIiBmaWxsPSIjRUZDNzVFIiBwb2ludHM9IjEwLjY4OTY1NTIgMCAwIDE0IDYyIDE0IDUxLjMxMDM0NDggMCI+PC9wb2x5Z29uPgogICAgICAgICAgICA8cG9seWdvbiBpZD0iUmVjdGFuZ2xlIiBmaWxsPSIjRjdFM0FGIiBwb2ludHM9IjI3IDAgMzUgMCAzNyAxNCAyNSAxNCI+PC9wb2x5Z29uPgogICAgICAgIDwvZz4KICAgIDwvZz4KPC9zdmc+)](https://github.com/apple/swift-package-manager)
+[![Buy me a coffee](https://img.shields.io/badge/Buy%20me%20a%20coffee-048754?logo=buymeacoffee)](https://buymeacoffee.com/jamesrochabrun)
 
-An open-source Swift package that simplifies LLM message completions, inspired by [liteLLM](https://litellm.ai/) and adapted for Swift developers, following Swift conventions.
+An open-source Swift package that simplifies LLM message completions, designed for multi-model applications. It supports multiple providers through OpenAI-compatible APIs and the Anthropic API, enabling Swift developers to integrate different AI models seamlessly.
 
 ## Description
 
-Call different LLM APIs using the OpenAI format; currently supporting [OpenAI](https://github.com/jamesrochabrun/SwiftOpenAI), [Anthropic](https://github.com/jamesrochabrun/SwiftAnthropic), [Gemini](https://github.com/google-gemini/generative-ai-swift).
+### OpenAI Compatibility
 
-Also call any local model using [Ollama OpenAI compatibility endopints](https://ollama.com/blog/openai-compatibility). You can use models like [llama3](https://ollama.com/library/llama3) or [mistral](https://ollama.com/library/mistral).
+Easily call various LLM APIs using the OpenAI format, with built-in support for multiple models and providers through the [SwiftOpenAI](https://github.com/jamesrochabrun/SwiftOpenAI) package.
+
+Supported providers:
+
+- OpenAI
+- Azure
+- Groq
+- DeepSeek
+- Google Gemini
+- OpenRouter
+- Ollama
+  - [llama3](https://ollama.com/library/llama3)
+  - [mistral](https://ollama.com/library/mistral)
+
+Note: When using OpenAI-compatible configurations, you can identify them by the `.openAI` enum prefix in the configuration structure.
+
+Example:
+
+```swift
+.openAI(.gemini(apiKey: "your_gemini_api_key_here"))
+```
+
+### Anthropic
+
+Additionally, the Anthropic API is supported through the [SwiftAnthropic](https://github.com/jamesrochabrun/SwiftAnthropic) package.
 
 ## Table of Contents
 

@@ -23,6 +48,9 @@ Also call any local model using [Ollama OpenAI compatibility endopints](https://
 - [Message](#message)
 - [Collaboration](#collaboration)
 - [OpenAI Azure](#openAI-azure)
+- [Groq](#groq)
+- [OpenRouter](#open-router)
+- [DeepSeek](#deepseek)
 - [OpenAI AIProxy](#openai-aiproxy)
 - [Ollama](#ollama)
 

@@ -63,13 +91,31 @@ import PolyAI
 ```
 
 Then, define the LLM configurations.
-Currently, OpenAI, Anthropic and Gemini are supported, you can also use Ollama or any provider that provides local models with OpenAI endpoint compatibilities to use local models such llama3 or Mistral.
+
+Currently, the package supports OpenAI, Azure, Anthropic, Gemini, Groq, DeepSeek, and OpenRouter. Additionally, you can use Ollama to run local models like Llama 3 or Mistral through OpenAI-compatible endpoints.
 
 ```swift
-let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
+
+// OpenAI
+let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
+
+// Gemini
+let geminiConfiguration: LLMConfiguration = .openAI(.gemini(apiKey: "your_gemini_api_key_here"))
+
+// Groq
+let groqConfiguration: LLMConfiguration = .openAI(.groq(apiKey: "your_groq_api_key_here"))
+
+// Ollama
+let ollamaConfiguration: LLMConfiguration = .openAI(.ollama(url: "http://localhost:11434"))
+
+// OpenRouter
+let openRouterConfiguration: LLMConfiguration = .openAI(.openRouter(apiKey: "your_open-router_api_key_here"))
+
+// DeepSeek
+let deepSeekConfiguration: LLMConfiguration = .openAI(.deepSeek(apiKey: "your_deepseek_api_key_here"))
+
+// Anthropic
 let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")
-let geminiConfiguration: LLMConfiguration = .gemini(apiKey: "your_gemini_api_key_here")
-let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
 
 let configurations = [openAIConfiguration, anthropicConfiguration, geminiConfiguration, ollamaConfiguration]
 ```
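The nested `.openAI(...)` configuration pattern in the diff above can be sketched with a self-contained toy enum. The type and case names below are illustrative stand-ins, not the actual PolyAI API:

```swift
// Toy sketch of the nested configuration pattern: OpenAI-compatible
// providers share the `.openAI` prefix, while Anthropic stands alone.
// Stand-in types for illustration only; NOT the real PolyAI API.

enum OpenAICompatibleSource {
    case api(key: String)          // OpenAI itself
    case gemini(apiKey: String)    // Gemini via an OpenAI-compatible endpoint
    case ollama(url: String)       // local models served by Ollama
}

enum ToyLLMConfiguration {
    case openAI(OpenAICompatibleSource)
    case anthropic(apiKey: String)

    /// True when the request would go through an OpenAI-format endpoint.
    var isOpenAICompatible: Bool {
        if case .openAI = self { return true }
        return false
    }
}

let gemini = ToyLLMConfiguration.openAI(.gemini(apiKey: "your_gemini_api_key_here"))
let claude = ToyLLMConfiguration.anthropic(apiKey: "your_anthropic_api_key_here")
print(gemini.isOpenAICompatible) // true
print(claude.isOpenAICompatible) // false
```

Grouping all OpenAI-format providers under one associated-value case is what lets a single client handle Gemini, Groq, or Ollama without provider-specific branching.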
@@ -80,18 +126,17 @@ With the configurations set, initialize the service:
 let service = PolyAIServiceFactory.serviceWith(configurations)
 ```
 
-Now, you have access to OpenAI, Anthropic, Gemini, llama3, Mistral models in a single package. 🚀
+Now, you have access to all the models offered by these providers in a single package. 🚀
 
 ## Message
 
 To send a message using OpenAI:
 
 ```swift
 let prompt = "How are you today?"
-let parameters: LLMParameter = .openAI(model: .gpt4turbo, messages: [.init(role: .user, content: prompt)])
+let parameters: LLMParameter = .openAI(model: .o1Preview, messages: [.init(role: .user, content: prompt)])
 let stream = try await service.streamMessage(parameters)
 ```
-
 To interact with Anthropic instead, all you need to do is change just one line of code! 🔥
 
 ```swift
@@ -116,6 +161,8 @@ let parameters: LLMParameter = .ollama(model: "llama3", messages: [.init(role: .
 let stream = try await service.streamMessage(parameters)
 ```
 
+As demonstrated, simply switch the `LLMParameter` to the desired provider.
+
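The one-line provider switch described above, with one streaming entry point taking a parameter enum, can be sketched with a minimal, self-contained stand-in (toy types; the real PolyAI signatures may differ):

```swift
// Toy sketch: one streaming entry point, provider selected by a parameter enum.
// Stand-in types for illustration; the real PolyAI API may differ.

enum ToyLLMParameter {
    case openAI(model: String, prompt: String)
    case anthropic(model: String, prompt: String)
    case ollama(model: String, prompt: String)

    var providerName: String {
        switch self {
        case .openAI: return "OpenAI"
        case .anthropic: return "Anthropic"
        case .ollama: return "Ollama"
        }
    }
}

// A fake service that streams a canned reply in chunks.
func streamMessage(_ parameters: ToyLLMParameter) -> AsyncStream<String> {
    AsyncStream { continuation in
        continuation.yield("[\(parameters.providerName)] ")
        continuation.yield("hello")
        continuation.finish()
    }
}

// Switching providers is a one-line change to the parameter:
let parameters: ToyLLMParameter = .ollama(model: "llama3", prompt: "How are you today?")

var reply = ""
for await chunk in streamMessage(parameters) {
    reply += chunk
}
print(reply) // "[Ollama] hello"
```

Because every provider funnels through the same call, swapping models is a data change rather than a code change.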
 ## OpenAI Azure
 
 To access the OpenAI API via Azure, you can use the following configuration setup.
@@ -126,6 +173,33 @@ let azureConfiguration: LLMConfiguration = .openAI(.azure(configuration: .init(r
 
 More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#azure-openai).
 
+## Groq
+
+To access Groq, use the following configuration setup.
+
+```swift
+let groqConfiguration: LLMConfiguration = .openAI(.groq(apiKey: "your_groq_api_key_here"))
+```
+More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#groq).
+
+## OpenRouter
+
+To access OpenRouter, use the following configuration setup.
+
+```swift
+let openRouterConfiguration: LLMConfiguration = .openAI(.openRouter(apiKey: "your_open-router_api_key_here"))
+```
+More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#openrouter).
+
+## DeepSeek
+
+To access DeepSeek, use the following configuration setup.
+
+```swift
+let deepSeekConfiguration: LLMConfiguration = .openAI(.deepSeek(apiKey: "your_deepseek_api_key_here"))
+```
+More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#deepseek).
+
 ## OpenAI AIProxy
 
 To access the OpenAI API via AIProxy, use the following configuration setup.
