
Commit 08648d7

updating readme
1 parent 0fffcbf commit 08648d7

File tree

1 file changed: +40 −11 lines


README.md

Lines changed: 40 additions & 11 deletions
@@ -1,5 +1,5 @@
 # PolyAI
-<img width="1202" alt="dragon" src="https://github.com/jamesrochabrun/PolyAI/assets/5378604/68caf665-6ac4-4623-b15a-93f3a4072dc1">
+<img width="1202" alt="dragon" src="https://github.com/jamesrochabrun/PolyAI/assets/5378604/2d8c47f7-eec0-4d15-9d53-55ff21b6775e">
 
 ![iOS 15+](https://img.shields.io/badge/iOS-15%2B-blue.svg)
 [![MIT license](https://img.shields.io/badge/License-MIT-blue.svg)](https://lbesson.mit-license.org/)
@@ -8,22 +8,23 @@
 [![xcode-version](https://img.shields.io/badge/xcode-15%20-brightgreen)](https://developer.apple.com/xcode/)
 [![swift-package-manager](https://img.shields.io/badge/package%20manager-compatible-brightgreen.svg?logo=data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiPz4KPHN2ZyB3aWR0aD0iNjJweCIgaGVpZ2h0PSI0OXB4IiB2aWV3Qm94PSIwIDAgNjIgNDkiIHZlcnNpb249IjEuMSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIiB4bWxuczp4bGluaz0iaHR0cDovL3d3dy53My5vcmcvMTk5OS94bGluayI+CiAgICA8IS0tIEdlbmVyYXRvcjogU2tldGNoIDYzLjEgKDkyNDUyKSAtIGh0dHBzOi8vc2tldGNoLmNvbSAtLT4KICAgIDx0aXRsZT5Hcm91cDwvdGl0bGU+CiAgICA8ZGVzYz5DcmVhdGVkIHdpdGggU2tldGNoLjwvZGVzYz4KICAgIDxnIGlkPSJQYWdlLTEiIHN0cm9rZT0ibm9uZSIgc3Ryb2tlLXdpZHRoPSIxIiBmaWxsPSJub25lIiBmaWxsLXJ1bGU9ImV2ZW5vZGQiPgogICAgICAgIDxnIGlkPSJHcm91cCIgZmlsbC1ydWxlPSJub256ZXJvIj4KICAgICAgICAgICAgPHBvbHlnb24gaWQ9IlBhdGgiIGZpbGw9IiNEQkI1NTEiIHBvaW50cz0iNTEuMzEwMzQ0OCAwIDEwLjY4OTY1NTIgMCAwIDEzLjUxNzI0MTQgMCA0OSA2MiA0OSA2MiAxMy41MTcyNDE0Ij48L3BvbHlnb24+CiAgICAgICAgICAgIDxwb2x5Z29uIGlkPSJQYXRoIiBmaWxsPSIjRjdFM0FGIiBwb2ludHM9IjI3IDI1IDMxIDI1IDM1IDI1IDM3IDI1IDM3IDE0IDI1IDE0IDI1IDI1Ij48L3BvbHlnb24+CiAgICAgICAgICAgIDxwb2x5Z29uIGlkPSJQYXRoIiBmaWxsPSIjRUZDNzVFIiBwb2ludHM9IjEwLjY4OTY1NTIgMCAwIDE0IDYyIDE0IDUxLjMxMDM0NDggMCI+PC9wb2x5Z29uPgogICAgICAgICAgICA8cG9seWdvbiBpZD0iUmVjdGFuZ2xlIiBmaWxsPSIjRjdFM0FGIiBwb2ludHM9IjI3IDAgMzUgMCAzNyAxNCAyNSAxNCI+PC9wb2x5Z29uPgogICAgICAgIDwvZz4KICAgIDwvZz4KPC9zdmc+)](https://github.com/apple/swift-package-manager)
 
-
 An open-source Swift package that simplifies LLM message completions, inspired by [liteLLM](https://litellm.ai/) and adapted for Swift developers, following Swift conventions.
 
+## Description
+
+Call different LLM APIs using the OpenAI format; currently supporting [OpenAI](https://github.com/jamesrochabrun/SwiftOpenAI), [Anthropic](https://github.com/jamesrochabrun/SwiftAnthropic), and [Gemini](https://github.com/google-gemini/generative-ai-swift).
+
+Also call any local model using the [Ollama OpenAI compatibility endpoints](https://ollama.com/blog/openai-compatibility). You can use models like [llama3](https://ollama.com/library/llama3) or [mistral](https://ollama.com/library/mistral).
+
 ## Table of Contents
 
-- [Description](#description)
 - [Installation](#installation)
 - [Usage](#usage)
 - [Message](#message)
 - [Collaboration](#collaboration)
 - [OpenAI Azure](#openAI-azure)
 - [OpenAI AIProxy](#openai-aiproxy)
-
-## Description
-
-Call different LLM APIs using the OpenAI format; currently supporting [OpenAI](https://github.com/jamesrochabrun/SwiftOpenAI), [Anthropic](https://github.com/jamesrochabrun/SwiftAnthropic), and [Gemini](https://github.com/google-gemini/generative-ai-swift).
+- [Ollama](#ollama)
 
 ## Installation
 
@@ -54,13 +55,16 @@ First, import the PolyAI package:
 import PolyAI
 ```
 
-Then, define the LLM configurations. Currently, OpenAI, Anthropic and Gemini are supported:
+Then, define the LLM configurations.
+Currently, OpenAI, Anthropic, and Gemini are supported. You can also use Ollama, or any provider that serves local models through OpenAI-compatible endpoints, to run models such as llama3 or Mistral.
 
 ```swift
 let openAIConfiguration: LLMConfiguration = .openAI(.api(key: "your_openai_api_key_here"))
 let anthropicConfiguration: LLMConfiguration = .anthropic(apiKey: "your_anthropic_api_key_here")
 let geminiConfiguration: LLMConfiguration = .gemini(apiKey: "your_gemini_api_key_here")
-let configurations = [openAIConfiguration, anthropicConfiguration, geminiConfiguration]
+let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
+
+let configurations = [openAIConfiguration, anthropicConfiguration, geminiConfiguration, ollamaConfiguration]
 ```
 
 With the configurations set, initialize the service:
@@ -69,7 +73,7 @@ With the configurations set, initialize the service:
 let service = PolyAIServiceFactory.serviceWith(configurations)
 ```
 
-Now, you have access to OpenAI, Anthropic and Gemini APIs in a single package. 🚀
+Now, you have access to OpenAI, Anthropic, Gemini, llama3, and Mistral models in a single package. 🚀
 
 ## Message
 
@@ -93,7 +97,15 @@ To interact with Gemini instead, all you need to do (again) is change just one l
 
 ```swift
 let prompt = "How are you today?"
-let parameters: LLMParameter = .gemini(model: ""gemini-1.5-pro-latest"", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
+let parameters: LLMParameter = .gemini(model: "gemini-1.5-pro-latest", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
+let stream = try await service.streamMessage(parameters)
+```
+
+To interact with local models using Ollama, all you need to do (again) is change just one line of code! 🔥
+
+```swift
+let prompt = "How are you today?"
+let parameters: LLMParameter = .ollama(model: "llama3", messages: [.init(role: .user, content: prompt)], maxTokens: 2000)
 let stream = try await service.streamMessage(parameters)
 ```
 
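Outside the commit itself: the snippets above end at `streamMessage`, so here is a minimal, hedged sketch of consuming the resulting stream. It assumes `service` and `parameters` are set up as in the README, that `streamMessage` returns an async throwing sequence, and that each chunk exposes an optional `content` text delta — the chunk property name is an assumption for illustration, not confirmed by this diff.

```swift
import PolyAI

// Sketch only (run inside an async context): property names are hypothetical.
let stream = try await service.streamMessage(parameters)
for try await chunk in stream {
    // Print each partial text delta as it arrives, without newlines.
    if let delta = chunk.content {
        print(delta, terminator: "")
    }
}
```

The same loop would work unchanged for the OpenAI, Anthropic, Gemini, and Ollama parameter variants, since the point of the package is a single streaming interface across providers.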
@@ -117,6 +129,23 @@ let aiProxyConfiguration: LLMConfiguration = .openAI(.aiProxy(aiproxyPartialKey:
 
 More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#aiproxy).
 
+### Ollama
+
+To interact with local models using [Ollama OpenAI compatibility endpoints](https://ollama.com/blog/openai-compatibility), use the following configuration setup.
+
+1 - Download [Ollama](https://ollama.com/) if you don't have it installed already.
+2 - Download the model you need, e.g., for `llama3` type in the terminal:
+```
+ollama pull llama3
+```
+
+Once you have the model installed locally, you are ready to use PolyAI!
+
+```swift
+let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
+```
+More information can be found [here](https://github.com/jamesrochabrun/SwiftOpenAI?tab=readme-ov-file#ollama).
+
 ## Collaboration
 
 Open a PR for any proposed change pointing it to `main` branch.
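To tie the Ollama pieces of this commit together, here is a hedged end-to-end sketch using only the API names that appear in the diff above. It assumes Ollama is running locally on its default port (11434) and that `ollama pull llama3` has completed; it is an illustration, not part of the commit.

```swift
import PolyAI

// End-to-end sketch (run inside an async context): local llama3 via
// Ollama's OpenAI-compatible server on the default port.
let ollamaConfiguration: LLMConfiguration = .ollama(url: "http://localhost:11434")
let service = PolyAIServiceFactory.serviceWith([ollamaConfiguration])

let parameters: LLMParameter = .ollama(
   model: "llama3",
   messages: [.init(role: .user, content: "How are you today?")],
   maxTokens: 2000)

let stream = try await service.streamMessage(parameters)
for try await chunk in stream {
    // Handle each partial response chunk here.
}
```

Swapping `.ollama(...)` for `.openAI(...)`, `.anthropic(...)`, or `.gemini(...)` in both the configuration and the parameters is the only change needed to target a hosted provider instead.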
