
Unable to Pass Custom Parameters to LLM provider Models #8585

@kuyu12

Description


I am unable to find a way to pass custom parameters into a model when using DSPy.

For example, with AWS Bedrock I want to pass a performanceConfig parameter, but I couldn't find a supported way to do so. With boto3, the equivalent call works as follows:

```python
client.invoke_model(
    modelId=modelId,
    body=body,
    performanceConfigLatency="optimized",
)
```
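For context, here is a fuller sketch of that boto3 call. The request body follows the Bedrock Claude messages format (the `anthropic_version` string and model ID below are illustrative); note that `performanceConfigLatency` is a top-level argument to `invoke_model`, not part of the body:

```python
import json

# Request body in the Bedrock Claude messages format (version string assumed).
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
})

# Requires AWS credentials; uncomment to actually invoke the model.
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(
#     modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model ID
#     body=body,
#     performanceConfigLatency="optimized",  # the parameter I want to pass via DSPy
# )
```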

Similarly, I want to pass tool configurations (e.g., web search) for Claude or OpenAI models. For example, the Claude API allows:

```python
tools=[{
    "type": "web_search_20250305",
    "name": "web_search",
    "max_uses": 5,
}]
```
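In raw Anthropic SDK terms, the desired call would look roughly like this (the model name is illustrative, and the client call is commented out since it needs an API key):

```python
# Server-side web search tool configuration, as accepted by the Claude API.
web_search_tool = {
    "type": "web_search_20250305",
    "name": "web_search",
    "max_uses": 5,  # cap the number of searches per request
}

# Requires an Anthropic API key; uncomment to run.
# import anthropic
# client = anthropic.Anthropic()
# response = client.messages.create(
#     model="claude-3-7-sonnet-latest",  # example model name
#     max_tokens=512,
#     messages=[{"role": "user", "content": "What's new in DSPy?"}],
#     tools=[web_search_tool],
# )
```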

I tried several different approaches in DSPy, but none worked.
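One of the attempts, as a sketch: passing the provider-specific field as an extra keyword argument to `dspy.LM`, hoping it would be forwarded through LiteLLM to the provider. The parameter name mirrors the boto3 argument; whether DSPy/LiteLLM actually forward it is exactly the open question here:

```python
# Hypothetical pass-through of a Bedrock-specific parameter (name from boto3).
extra_params = {"performanceConfigLatency": "optimized"}

# Requires dspy and AWS credentials; uncomment to try.
# import dspy
# lm = dspy.LM(
#     "bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",  # example model string
#     **extra_params,
# )
# dspy.configure(lm=lm)
```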

Question:
Is it currently possible to pass such provider-specific parameters or tool configurations through DSPy? If so, what is the recommended way to do it?

Environment:
DSPy version: 3.0.0b2 or 2.6.27
Python version: 3.12
Model provider: AWS Bedrock / Claude / OpenAI
