While working through #699, I noticed that many API providers now support a reasoning_effort parameter, though with some variation in naming and accepted values.
Here's a list of currently supported providers and their API docs. I'll update this issue with preliminary research into the overall availability and use of a reasoning_effort-style parameter.
- chat_anthropic() – Anthropic – https://docs.anthropic.com/en/api/messages
- chat_aws_bedrock() – AWS Bedrock – https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html
- chat_azure_openai() – Azure OpenAI – https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions
- chat_cloudflare() – Cloudflare – https://developers.cloudflare.com/workers-ai/configuration/open-ai-compatibility/
- chat_cortex_analyst() – Snowflake Cortex Analyst – https://docs.snowflake.com/en/developer-guide/snowflake-rest-api/reference/cortex-analyst
- chat_databricks() – Databricks – https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis/api-reference#chat-request
- chat_deepseek() – DeepSeek – https://platform.deepseek.com/api-docs/api/create-chat-completion
- chat_github() – GitHub Models – https://docs.github.com/en/rest/models/inference?apiVersion=2022-11-28
- chat_google_gemini() – Google Gemini – https://ai.google.dev/api/generate-content#method:-models.generatecontent
- chat_groq() – Groq – https://console.groq.com/docs/api-reference#chat-create
- chat_huggingface() – Hugging Face – https://huggingface.co/docs/inference-providers/en/index?python-clients=requests#http--curl
- chat_mistral() – Mistral – https://api.mistral.ai/v1/
- chat_ollama() – Ollama – https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion
- chat_openai() – OpenAI – https://platform.openai.com/docs/api-reference/chat/create
- chat_openrouter() – OpenRouter – https://openrouter.ai/docs/api-reference/parameters
- chat_perplexity() – Perplexity – https://docs.perplexity.ai/api-reference/chat-completions-post
- chat_portkey() – PortkeyAI – https://portkey.ai/docs/product/ai-gateway/universal-api
- chat_snowflake() – Snowflake Cortex – https://docs.snowflake.com/en/user-guide/snowflake-cortex/cortex-llm-rest-api#api-reference
- chat_vllm() – vLLM – https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html
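
For providers that accept extra request-body fields, it looks like a reasoning_effort value can already be passed today through the `api_args` escape hatch. A minimal sketch (the model name and the accepted `"low"`/`"medium"`/`"high"` values are assumptions; each provider's docs above should be checked for the exact field name and values):

```r
library(ellmer)

# Sketch: forward reasoning_effort as an extra body parameter.
# Field name and accepted values vary by provider; this assumes
# the OpenAI-style spelling.
chat <- chat_openai(
  model = "o3-mini",
  api_args = list(reasoning_effort = "low")
)
chat$chat("Summarize the trade-offs of higher reasoning effort.")
```

A first-class argument would mainly add validation and per-provider translation on top of this.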
Sidenote: should we store this list in the ellmer source somewhere? Or maybe in the first line of each R/provider-*.R file?