
feat: add multi-provider support (OpenRouter, fal.ai, OpenAI direct)#50

Open
prak-gup wants to merge 1 commit into vercel-labs:main from prak-gup:feat/multi-provider-support

Conversation

@prak-gup

@prak-gup prak-gup commented May 4, 2026

What this adds

A provider abstraction layer that lets ai-cli work without a Vercel AI Gateway key, by supporting OpenRouter, fal.ai, and direct OpenAI as backends.

Motivation

Getting a Vercel AI Gateway key requires a Vercel account. Many users already have OpenRouter or OpenAI API keys. This change lets them use ai-cli immediately with what they have.

How it works

Auto-detects the active backend from environment variables. Vercel AI Gateway remains the default — existing behaviour is unchanged.

| Env vars set | Text | Image | Video |
| --- | --- | --- | --- |
| `AI_GATEWAY_API_KEY` | ✓ Vercel (100+ models) | | |
| `OPENROUTER_API_KEY` | ✓ OpenRouter (300+ models) | | |
| `OPENROUTER_API_KEY` + `FAL_KEY` | ✓ OpenRouter | ✓ fal.ai (Flux, SDXL) | ✓ fal.ai (Kling, Wan, Runway) |
| `FAL_KEY` | | ✓ fal.ai | ✓ fal.ai |
| `OPENAI_API_KEY` | ✓ GPT models | ✓ DALL-E | |
| `AI_PROVIDER=<backend>` | explicit override | | |
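The detection order above can be sketched roughly as follows — `detectBackend` and the backend labels are illustrative names, not necessarily the PR's exact code:

```typescript
// Hypothetical sketch of the env-var detection order described above.
type Backend = "vercel" | "openrouter" | "openrouter+fal" | "fal" | "openai";

function detectBackend(env: Record<string, string | undefined>): Backend {
  // Explicit override wins over auto-detection.
  if (env.AI_PROVIDER) return env.AI_PROVIDER as Backend;
  // Vercel AI Gateway remains the default when its key is present.
  if (env.AI_GATEWAY_API_KEY) return "vercel";
  if (env.OPENROUTER_API_KEY && env.FAL_KEY) return "openrouter+fal";
  if (env.OPENROUTER_API_KEY) return "openrouter";
  if (env.FAL_KEY) return "fal";
  if (env.OPENAI_API_KEY) return "openai";
  // No key set: fall back to the gateway default.
  return "vercel";
}
```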

Changes

New file: src/lib/provider.ts

  • detectBackend() — reads env vars, returns active backend
  • createProvider() — returns { text(), image(), video() } factory functions pointing to the right AI SDK provider
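A minimal sketch of the factory shape this describes, with the concrete providers injected so the wiring is visible. The real `provider.ts` presumably calls `createOpenRouter`, `createOpenAI`, and `createFal` from the newly added packages; all names here are illustrative:

```typescript
// A "model factory" in the AI SDK convention: provider(modelId) -> model handle.
type ModelFactory = (modelId: string) => unknown;

interface Provider {
  text?: ModelFactory;
  image?: ModelFactory;
  video?: ModelFactory;
}

// Concrete per-backend factories, injected for testability; in the real
// file these would come from the AI SDK provider packages.
interface Factories {
  vercel: ModelFactory;
  openrouter: ModelFactory;
  openai: ModelFactory;
  openaiImage: ModelFactory;
  falImage: ModelFactory;
  falVideo: ModelFactory;
}

function createProvider(backend: string, f: Factories): Provider {
  switch (backend) {
    case "vercel":
      // The gateway serves all three modalities through one provider.
      return { text: f.vercel, image: f.vercel, video: f.vercel };
    case "openrouter":
      return { text: f.openrouter };
    case "openrouter+fal":
      return { text: f.openrouter, image: f.falImage, video: f.falVideo };
    case "fal":
      return { image: f.falImage, video: f.falVideo };
    case "openai":
      return { text: f.openai, image: f.openaiImage };
    default:
      throw new Error(`Unknown backend: ${backend}`);
  }
}
```

Leaving a modality `undefined` (rather than throwing here) lets each command decide how to report missing support.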

src/lib/models.ts

  • fetchGatewayModels(backend) — fetches models from the right source
  • OpenRouter: live from openrouter.ai/api/v1/models
  • fal.ai: static catalogue (Flux, Kling, Wan, HunyuanVideo, Runway Gen3) — fal.ai has no public model list endpoint
  • OpenAI: static catalogue (GPT-4o, o3, o4-mini, DALL-E 2/3, GPT Image 1)
  • resolveModels(modality, model, known, backend) — selects correct defaults per backend
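The OpenRouter branch above can be sketched like this; the endpoint is the public one named in the list, while `parseModelIds` is a hypothetical helper split out so the response handling can be tested without a network call:

```typescript
// Shape of the OpenRouter /api/v1/models response we rely on here.
interface OpenRouterCatalogue {
  data: { id: string; architecture?: { modality?: string } }[];
}

// Pure helper: pull the model ids out of a catalogue payload.
function parseModelIds(body: OpenRouterCatalogue): string[] {
  return body.data.map((m) => m.id);
}

// Live fetch, minimal error handling on purpose.
async function fetchOpenRouterModels(): Promise<string[]> {
  const res = await fetch("https://openrouter.ai/api/v1/models");
  if (!res.ok) {
    throw new Error(`OpenRouter model list failed: HTTP ${res.status}`);
  }
  return parseModelIds((await res.json()) as OpenRouterCatalogue);
}
```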

src/commands/text.ts, image.ts, video.ts, models.ts

  • All commands call createProvider() and use provider.text/image/video(modelId)
  • video.ts exits with a clear message if backend has no video support
  • image.ts language-image path (GPT-4o with image output) is guarded to Vercel gateway only
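The video guard could look roughly like the sketch below. The real `video.ts` exits the process; a thrown error is used here so the sketch stays testable, and the message wording is illustrative:

```typescript
// Provider shape as seen by the video command; `video` is absent on
// backends without video support.
interface VideoCapable {
  video?: (modelId: string) => unknown;
}

// Guard: return the video factory or fail with a clear, actionable message.
function requireVideoSupport(provider: VideoCapable, backend: string) {
  if (!provider.video) {
    throw new Error(
      `Backend "${backend}" has no video support; ` +
        `set FAL_KEY or AI_GATEWAY_API_KEY to generate video.`
    );
  }
  return provider.video;
}
```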

New dependencies

  • @openrouter/ai-sdk-provider v2.9.0
  • @ai-sdk/openai v3.0.58
  • @ai-sdk/fal v2.0.33

Testing status

  • TypeScript type-checks clean (tsc --noEmit)
  • CLI starts and all --help pages render correctly
  • Provider package APIs verified at runtime (fal.image, fal.video, openrouter() all confirmed callable with correct signatures)
  • End-to-end generation via OpenRouter: not run (no live key) — wiring verified against the package APIs only
  • End-to-end generation via fal.ai: not run (no live key) — wiring verified against the package APIs only
  • Vercel AI Gateway path: unchanged, all existing tests pass

Happy to add end-to-end tests or adjust the approach based on feedback. If the project prefers to keep the tool Vercel-specific, I understand — just thought this was worth proposing given how often it comes up for first-time users.

Adds a provider abstraction layer that lets users choose their backend
via environment variables, without requiring a Vercel AI Gateway key.

New file: src/lib/provider.ts
- Auto-detects active backend from env vars (priority order):
  AI_GATEWAY_API_KEY → Vercel (default, unchanged behaviour)
  OPENROUTER_API_KEY → OpenRouter (300+ text models)
  OPENROUTER_API_KEY + FAL_KEY → OpenRouter for text, fal.ai for image/video
  FAL_KEY → fal.ai image + video only
  OPENAI_API_KEY → direct OpenAI text + DALL-E image
  AI_PROVIDER → explicit override

Updated: src/lib/models.ts
- fetchGatewayModels() now accepts a backend param
- OpenRouter model list fetched from openrouter.ai/api/v1/models
- fal.ai models: static catalogue (Flux Pro/Dev/Schnell, Kling, Wan,
  HunyuanVideo, Runway Gen3) — fal.ai has no unified model list endpoint
- OpenAI direct: static catalogue (GPT-4o, o3, o4-mini, DALL-E 2/3)
- resolveModels() accepts backend param to select correct defaults

Updated: commands/text.ts, image.ts, video.ts, models.ts
- All commands create provider via createProvider() and pass
  provider.text/image/video(modelId) to the AI SDK generate functions
- video.ts exits with clear message if chosen backend has no video support
- image.ts language-image path (GPT-4o with image output) remains
  Vercel-gateway-only, guarded by backend check

New deps: @openrouter/ai-sdk-provider, @ai-sdk/openai, @ai-sdk/fal

Note: Vercel AI Gateway (AI_GATEWAY_API_KEY) remains the default and
is the only backend with end-to-end test coverage. The new provider
paths type-check and wire to verified package APIs, but have not been
validated against live endpoints with real keys.
@vercel

vercel Bot commented May 4, 2026

@prak-gup is attempting to deploy a commit to the Vercel Labs Team on Vercel.

A member of the Team first needs to authorize it.

@socket-security

Review the following changes in direct dependencies.

| Diff | Package | Supply Chain Security | Vulnerability | Quality | Maintenance | License |
| --- | --- | --- | --- | --- | --- | --- |
| Added | npm/@openrouter/ai-sdk-provider@2.9.0 | 72 | 100 | 100 | 99 | 100 |
| Added | npm/@ai-sdk/openai@3.0.58 | 73 | 100 | 88 | 98 | 100 |
| Added | npm/@ai-sdk/fal@2.0.33 | 74 | 100 | 92 | 98 | 100 |


```ts
switch (m.type) {
  case "language":
    if (backend === "openrouter") {
      const modality = m.architecture?.modality ?? "";
```

OpenRouter model classification is dead code — both branches of the if/else push only "text", so image-capable models are never classified with "image" capability.
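One way this could be addressed: derive the capability list from the OpenRouter modality string, which takes an `input->output` form, so that image-output models actually get an `"image"` entry. A hedged sketch (`capabilitiesFor` is an illustrative name, and the modality format is an assumption based on OpenRouter's published catalogue):

```typescript
// Derive capabilities from an OpenRouter modality string such as
// "text->text", "text+image->text", or "text->image". Only the output
// side determines whether the model can generate images.
function capabilitiesFor(modality: string): string[] {
  const caps = ["text"];
  const output = modality.split("->")[1] ?? "";
  if (output.includes("image")) caps.push("image");
  return caps;
}
```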
