| **OPENAI_API_KEY** | API key for OpenAI models | Optional |
| **CLAUDE_API_KEY** | API key for Anthropic models | Optional |
| **GROQ_API_KEY** | API key for speech-to-text features | Optional |
| **CUSTOM_LLM_ENABLED** | Set to `true` to enable a custom LLM. Supports OpenAI-compatible endpoints and AWS Bedrock. | Optional |
| **CUSTOM_LLM_PROVIDER** | Backend provider for the custom model. Accepted values: `openai` (default), `bedrock`. | Optional |
| **CUSTOM_LLM_MODEL_KEY** | Identifier key for the custom model (e.g., a model ID or name). | Optional |
| **CUSTOM_LLM_BASE_URL** | Base URL of the custom model's OpenAI-compatible endpoint. Required when `CUSTOM_LLM_PROVIDER=openai`. | Optional |
| **CUSTOM_LLM_API_KEY** | API key for authenticating with the custom endpoint. Required for the `openai` provider; used as the AWS access key ID when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
| **CUSTOM_LLM_AWS_REGION** | AWS region for the Bedrock model (e.g., `us-east-1`). Required when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
| **CUSTOM_LLM_NAME** | Display name for the custom model shown in the UI. Defaults to `Custom LLM`. | Optional |
| **CUSTOM_LLM_MAX_TOKENS** | Maximum token limit for the custom model. Defaults to `128000`. | Optional |
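
As a sketch, the variables above could be combined to enable a custom model served over an OpenAI-compatible API (for example, one hosted by Ollama). The endpoint URL, model key, and API key below are illustrative placeholders, not values from the documentation:

```shell
# Enable a custom model behind an OpenAI-compatible endpoint.
# All values below are placeholders -- substitute your own.
CUSTOM_LLM_ENABLED=true
CUSTOM_LLM_PROVIDER=openai
CUSTOM_LLM_BASE_URL=http://localhost:11434/v1   # e.g., a local Ollama server
CUSTOM_LLM_MODEL_KEY=llama3.1:70b               # model ID exposed by the runtime
CUSTOM_LLM_API_KEY=sk-placeholder
CUSTOM_LLM_NAME="My local model"
CUSTOM_LLM_MAX_TOKENS=128000
```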
docs/self-hosting/govern/plane-ai.md
#### Custom models (self-hosted or third-party)

Plane AI supports custom models through two backends:

- **OpenAI-compatible endpoint** — any model exposed via an OpenAI-compatible API, including models served by Ollama, Groq, Cerebras, and similar runtimes.
- **AWS Bedrock** — models accessed directly through Amazon Bedrock using your AWS credentials.

One custom model can be configured alongside your public provider keys.
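
For the Bedrock backend, a hypothetical configuration might look like the following; the model ID, region, and access key are placeholder values, and `CUSTOM_LLM_API_KEY` doubles as the AWS access key ID for this provider:

```shell
# Hypothetical AWS Bedrock setup -- all values are placeholders.
CUSTOM_LLM_ENABLED=true
CUSTOM_LLM_PROVIDER=bedrock
CUSTOM_LLM_MODEL_KEY=anthropic.claude-3-5-sonnet-20240620-v1:0
CUSTOM_LLM_AWS_REGION=us-east-1
CUSTOM_LLM_API_KEY=AKIA-EXAMPLE   # used as the AWS access key ID for Bedrock
```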

::: warning
The custom model should have at least 100 billion parameters for all Plane AI features to work reliably. Larger, more capable models yield better results.