Commit 4f8b7c2

added custom model instructions
1 parent 3a483fb commit 4f8b7c2

File tree

1 file changed: +6 -1 lines changed

README.md (+6 -1)
@@ -2,7 +2,7 @@
 
 Open-source AI-powered search engine. (Perplexity Clone)
 
-Run your local LLM (**llama3**, **gemma**, **mistral**, **phi3**) or use cloud models (**Groq/Llama3**, **OpenAI/gpt4-o**)
+Run local LLMs (**llama3**, **gemma**, **mistral**, **phi3**), custom LLMs through **LiteLLM**, or use cloud models (**Groq/Llama3**, **OpenAI/gpt4-o**)
 
 Demo answering questions with phi3 on my M1 Macbook Pro:
 
@@ -27,6 +27,7 @@ Please feel free to contact me on [Twitter](https://twitter.com/rashadphz) or [c
 - [x] Docker deployment setup
 - [x] Add support for [searxng](https://github.com/searxng/searxng). Eliminates the need for external dependencies.
 - [x] Create a pre-built Docker Image
+- [x] Add support for custom LLMs through LiteLLM
 - [ ] Chat History
 - [ ] Chat with local files

@@ -46,6 +47,7 @@ Please feel free to contact me on [Twitter](https://twitter.com/rashadphz) or [c
 - Search with multiple search providers (Tavily, Searxng, Serper, Bing)
 - Answer questions with cloud models (OpenAI/gpt4-o, OpenAI/gpt3.5-turbo, Groq/Llama3)
 - Answer questions with local models (llama3, mistral, gemma, phi3)
+- Answer questions with any custom LLMs through [LiteLLM](https://litellm.vercel.app/docs/providers)
 
 ## 🏃🏿‍♂️ Getting Started Locally

@@ -144,6 +146,9 @@ SEARCH_PROVIDER=bing
 # Cloud Models
 OPENAI_API_KEY=...
 GROQ_API_KEY=...
+
+# See https://litellm.vercel.app/docs/providers for the full list of supported models
+CUSTOM_MODEL=...
 ```
 
 ### 3. Run Containers
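
For reference, a minimal sketch of how the new `CUSTOM_MODEL` variable might be filled in alongside the existing keys. The `ollama/llama3` value is only a hypothetical example of LiteLLM's `<provider>/<model>` naming and is not part of this commit; check the LiteLLM providers page for the identifier your provider actually expects.

```shell
# .env (example values only)
OPENAI_API_KEY=...
GROQ_API_KEY=...

# Hypothetical example: LiteLLM addresses models as <provider>/<model>,
# e.g. a local Ollama model. Full list: https://litellm.vercel.app/docs/providers
CUSTOM_MODEL=ollama/llama3
```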
