This project provides a relay server that allows you to use Raycast AI models through an OpenAI-compatible API interface, deployed as a Cloudflare Worker.
- Get your Raycast AI Token (see step 1 below)
- Deploy
- Add `RAYCAST_BEARER_TOKEN` in Settings -> Domain & Routes -> Variables and Secrets. You can optionally add `API_KEY`, `ADVANCED`, and `INCLUDE_DEPRECATED`. Follow the guide below.
- Node.js
- Wrangler CLI
- Cloudflare account
- Raycast API credential (Bearer token)
- An HTTP debugging tool such as Proxyman or Charles on macOS
- Get your Raycast AI Token
Proxyman: add Raycast to your SSL proxying list, then resend a request to Raycast AI. You will see a request like the one in the photo below; copy the API key after `Authorization: Bearer`.
- Clone this repository:

```sh
git clone https://github.com/szcharlesji/raycast-relay
cd raycast-relay
```
- Install dependencies:

```sh
npm install
```
- Configure your environment variables:

```sh
# Install wrangler
npm install -g wrangler

# Set your Raycast credentials as secrets
wrangler secret put RAYCAST_BEARER_TOKEN

# Optionally set an API key for authentication; if you want to use it with Cursor, follow the Cursor setup below
wrangler secret put API_KEY

# Optionally set to filter out the advanced AI models if you don't have the subscription
wrangler secret put ADVANCED

# Optionally set to include deprecated models
wrangler secret put INCLUDE_DEPRECATED
```
- Deploy to Cloudflare Workers:

```sh
npm run deploy
```
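For reference, a minimal Cloudflare Worker configuration looks roughly like the sketch below. The `name`, entry point, and date are illustrative assumptions, not this repository's actual values; the repo's own `wrangler` configuration should already provide these.

```toml
# Hypothetical wrangler.toml sketch; all values here are assumptions
name = "raycast-relay"
main = "src/index.ts"
compatibility_date = "2024-01-01"
```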
Once deployed, you can use the worker as an OpenAI-compatible API endpoint:

```
https://your-worker-name.your-account.workers.dev/v1
```
- `GET /v1/models` - List available models
- `POST /v1/chat/completions` - Create a chat completion
- `GET /health` - Health check endpoint
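Since the relay is OpenAI-compatible, `GET /v1/models` should return a model list in the standard OpenAI format. A minimal sketch of extracting model IDs from such a response (the sample payload below is illustrative, not actual relay output):

```python
import json

# Illustrative /v1/models response in the OpenAI list format;
# the real relay response will contain the model IDs listed in this README.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "openai-gpt-4o", "object": "model", "owned_by": "openai"},
    {"id": "anthropic-claude-sonnet", "object": "model", "owned_by": "anthropic"}
  ]
}
""")

# Collect just the IDs, which is what you need for the "model" field in requests
model_ids = [m["id"] for m in sample["data"]]
print(model_ids)  # ['openai-gpt-4o', 'anthropic-claude-sonnet']
```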
If you've set an `API_KEY`, include it in your requests:

```
Authorization: Bearer your-api-key
```
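Putting this together, a chat completion request against the relay follows the OpenAI chat completions schema. A minimal sketch using only the Python standard library; the URL, key, and model name are placeholders you must replace (the network call itself is left commented out):

```python
import json
import urllib.request

BASE_URL = "https://your-worker-name.your-account.workers.dev/v1"  # placeholder: your worker URL
API_KEY = "your-api-key"  # placeholder: the API_KEY secret, if you set one

# Standard OpenAI chat completions request body
payload = {
    "model": "openai-gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# Uncomment to actually send the request to your deployed worker:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```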
Raycast-relay supports Cursor, but a workaround is needed since Cursor has a known issue with custom AI endpoints other than OpenAI's. Thanks to Vincent for the suggestions.
To use your relayed API endpoint in Cursor:
- Generate an API key on the OpenAI Platform; you only need it to pass Cursor's verification
- Verify this key in Cursor by entering it under `Cursor Settings > Models > OpenAI API Key` with the default OpenAI endpoint
- Upload your wrangler secret `API_KEY` with `wrangler secret put API_KEY`; this must be the same key as the OpenAI key
- Override your OpenAI Base URL with your wrangler endpoint
- Save it
- Add a custom model from the `/v1/models` endpoint, or check out Available Models. Done!
Here's a list of all the model IDs:
- raycast-ray1
- raycast-ray1-mini
- openai-gpt-4.1
- openai-gpt-4.1-mini
- openai-gpt-4.1-nano
- openai-gpt-4
- openai-gpt-4-turbo
- openai-gpt-4o
- openai-gpt-4o-mini
- openai_o1-o3
- openai_o1-o4-mini
- openai_o1-o1-mini
- openai_o1-o1
- openai_o1-o3-mini
- anthropic-claude-haiku
- anthropic-claude-sonnet
- anthropic-claude-3-7-sonnet-latest
- anthropic-claude-3-7-sonnet-latest-reasoning
- anthropic-claude-opus
- anthropic-claude-sonnet-4
- anthropic-claude-opus-4
- anthropic-claude-sonnet-4-reasoning
- perplexity-sonar
- perplexity-sonar-pro
- perplexity-sonar-reasoning
- perplexity-sonar-reasoning-pro
- groq-meta-llama/llama-4-scout-17b-16e-instruct
- groq-llama-3.3-70b-versatile
- groq-llama-3.1-8b-instant
- groq-llama3-70b-8192
- together-meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo
- mistral-open-mistral-nemo
- mistral-mistral-large-latest
- mistral-mistral-small-latest
- mistral-codestral-latest
- groq-deepseek-r1-distill-llama-70b
- google-gemini-2.5-pro
- google-gemini-1.5-flash
- google-gemini-2.5-flash
- google-gemini-2.0-flash
- google-gemini-2.0-flash-thinking
- together-deepseek-ai/DeepSeek-R1
- together-deepseek-ai/DeepSeek-V3
- xai-grok-3
- xai-grok-3-mini
- xai-grok-2-latest