feat: add multi-provider support (OpenRouter, fal.ai, OpenAI direct) #50
Open · prak-gup wants to merge 1 commit into
Adds a provider abstraction layer that lets users choose their backend via environment variables, without requiring a Vercel AI Gateway key.

New file: `src/lib/provider.ts`
- Auto-detects the active backend from env vars (priority order):
  - `AI_GATEWAY_API_KEY` → Vercel (default, unchanged behaviour)
  - `OPENROUTER_API_KEY` → OpenRouter (300+ text models)
  - `OPENROUTER_API_KEY` + `FAL_KEY` → OpenRouter for text, fal.ai for image/video
  - `FAL_KEY` → fal.ai image + video only
  - `OPENAI_API_KEY` → direct OpenAI text + DALL-E image
  - `AI_PROVIDER` → explicit override

Updated: `src/lib/models.ts`
- `fetchGatewayModels()` now accepts a backend param
- OpenRouter model list fetched from `openrouter.ai/api/v1/models`
- fal.ai models: static catalogue (Flux Pro/Dev/Schnell, Kling, Wan, HunyuanVideo, Runway Gen3), since fal.ai has no unified model-list endpoint
- OpenAI direct: static catalogue (GPT-4o, o3, o4-mini, DALL-E 2/3)
- `resolveModels()` accepts a backend param to select correct defaults

Updated: `commands/text.ts`, `image.ts`, `video.ts`, `models.ts`
- All commands create a provider via `createProvider()` and pass `provider.text/image/video(modelId)` to the AI SDK generate functions
- `video.ts` exits with a clear message if the chosen backend has no video support
- The `image.ts` language-image path (GPT-4o with image output) remains Vercel-gateway-only, guarded by a backend check

New deps: `@openrouter/ai-sdk-provider`, `@ai-sdk/openai`, `@ai-sdk/fal`

Note: Vercel AI Gateway (`AI_GATEWAY_API_KEY`) remains the default and is the only backend with end-to-end test coverage. The new provider paths type-check and wire to verified package APIs, but have not been validated against live endpoints with real keys.
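Since fal.ai has no unified model-list endpoint, the static catalogue described above could be shaped roughly like this. This is an illustrative sketch: the entry shape, helper name, and exact model slugs are assumptions, not taken from the PR.

```typescript
// Illustrative static fal.ai catalogue; the real IDs live in
// src/lib/models.ts and may differ from these example slugs.
interface CatalogueEntry {
  id: string;              // provider model slug
  type: "image" | "video"; // modality the model serves
}

const FAL_CATALOGUE: CatalogueEntry[] = [
  { id: "fal-ai/flux-pro", type: "image" },
  { id: "fal-ai/flux/dev", type: "image" },
  { id: "fal-ai/flux/schnell", type: "image" },
  { id: "fal-ai/kling-video", type: "video" },
  { id: "fal-ai/hunyuan-video", type: "video" },
];

// Filter the catalogue for one modality, e.g. to offer defaults
// when the user has not picked a model explicitly.
function falModels(kind: "image" | "video"): string[] {
  return FAL_CATALOGUE.filter((m) => m.type === kind).map((m) => m.id);
}
```

A static list trades freshness for reliability: new fal.ai models require a code change, but the CLI never breaks on a missing discovery endpoint.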
@prak-gup is attempting to deploy a commit to the Vercel Labs Team on Vercel. A member of the Team first needs to authorize it.
Excerpt from the diff:

```typescript
switch (m.type) {
  case "language":
    if (backend === "openrouter") {
      const modality = m.architecture?.modality ?? "";
```
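The excerpt suggests language models are classified from OpenRouter's `architecture.modality` string. A hedged sketch of that check follows; the `"inputs->outputs"` modality format matches OpenRouter's public catalogue, but the helper name and exact rule are mine, not the PR's.

```typescript
// OpenRouter reports modality as "<inputs>-><outputs>", e.g. "text->text"
// or "text+image->text". Treat anything whose output side is plain text
// as a language model.
interface RouterModel {
  id: string;
  architecture?: { modality?: string };
}

function isLanguageModel(m: RouterModel): boolean {
  const modality = m.architecture?.modality ?? "";
  return modality.split("->").pop() === "text";
}
```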
## What this adds

A provider abstraction layer that lets `ai-cli` work without a Vercel AI Gateway key, by supporting OpenRouter, fal.ai, and direct OpenAI as backends.

## Motivation

Getting a Vercel AI Gateway key requires a Vercel account. Many users already have OpenRouter or OpenAI API keys. This change lets them use `ai-cli` immediately with what they have.

## How it works

The active backend is auto-detected from environment variables. Vercel AI Gateway remains the default, and existing behaviour is unchanged. Priority order:

- `AI_GATEWAY_API_KEY` → Vercel AI Gateway (default)
- `OPENROUTER_API_KEY` → OpenRouter
- `OPENROUTER_API_KEY` + `FAL_KEY` → OpenRouter for text, fal.ai for image/video
- `FAL_KEY` → fal.ai (image and video only)
- `OPENAI_API_KEY` → direct OpenAI
- `AI_PROVIDER=<backend>` → explicit override
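The detection order above could be sketched as a pure function over the environment. This is a sketch under assumptions: the backend names and the final fall-through are mine; the real logic lives in `src/lib/provider.ts`.

```typescript
type Backend = "vercel" | "openrouter" | "openrouter+fal" | "fal" | "openai";

// Sketch of detectBackend(): walk the priority order described above.
// Taking env as a parameter (rather than reading process.env directly)
// keeps the function easy to test.
function detectBackend(env: Record<string, string | undefined>): Backend {
  if (env.AI_PROVIDER) return env.AI_PROVIDER as Backend; // explicit override
  if (env.AI_GATEWAY_API_KEY) return "vercel";            // default backend
  if (env.OPENROUTER_API_KEY && env.FAL_KEY) return "openrouter+fal";
  if (env.OPENROUTER_API_KEY) return "openrouter";
  if (env.FAL_KEY) return "fal";
  if (env.OPENAI_API_KEY) return "openai";
  return "vercel"; // no keys set: fall back to the Vercel gateway default
}
```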
## Changes

New file: `src/lib/provider.ts`

- `detectBackend()`: reads env vars and returns the active backend
- `createProvider()`: returns `{ text(), image(), video() }` factory functions pointing to the right AI SDK provider

Updated: `src/lib/models.ts`

- `fetchGatewayModels(backend)`: fetches models from the right source (the OpenRouter list comes from `openrouter.ai/api/v1/models`)
- `resolveModels(modality, model, known, backend)`: selects correct defaults per backend

Updated: `src/commands/text.ts`, `image.ts`, `video.ts`, `models.ts`

- All commands create a provider via `createProvider()` and use `provider.text/image/video(modelId)`
- `video.ts` exits with a clear message if the backend has no video support
- The `image.ts` language-image path (GPT-4o with image output) is guarded to the Vercel gateway only
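A minimal sketch of the `createProvider()` shape described above, with string stubs standing in for the real AI SDK model objects (`openrouter(id)`, `fal.image(id)`, and so on). The per-backend modality support shown here is an assumption for illustration; an absent `video` factory is what lets `video.ts` bail out cleanly.

```typescript
type ModelFactory = (modelId: string) => string; // stub for an AI SDK model

interface Provider {
  text?: ModelFactory;
  image?: ModelFactory;
  video?: ModelFactory; // undefined when the backend has no video support
}

// Sketch: wire each backend to its factories. Real code would return
// AI SDK model objects instead of tagged strings.
function createProvider(backend: string): Provider {
  switch (backend) {
    case "openrouter":
      return { text: (id) => `openrouter:${id}` };
    case "fal":
      return { image: (id) => `fal:${id}`, video: (id) => `fal:${id}` };
    case "openai":
      return { text: (id) => `openai:${id}`, image: (id) => `openai:${id}` };
    default: // Vercel gateway supports all three modalities
      return {
        text: (id) => `gateway:${id}`,
        image: (id) => `gateway:${id}`,
        video: (id) => `gateway:${id}`,
      };
  }
}

// video.ts-style guard: fail with a clear message when unsupported.
const p = createProvider("openrouter");
if (!p.video) {
  console.error("The active backend does not support video generation.");
}
```

Making the factories optional pushes the "does this backend support modality X?" question into the type, so each command only has to check for `undefined` before calling.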
## New dependencies

- `@openrouter/ai-sdk-provider` v2.9.0
- `@ai-sdk/openai` v3.0.58
- `@ai-sdk/fal` v2.0.33

## Testing status

- Type-checks cleanly (`tsc --noEmit`)
- `--help` pages render correctly
- Package APIs verified (`fal.image`, `fal.video`, `openrouter()` all confirmed callable with correct signatures)

Happy to add end-to-end tests or adjust the approach based on feedback. If the project prefers to keep the tool Vercel-specific, I understand; just thought this was worth proposing given how often it comes up for first-time users.