feat: Support multiple models per BYOK platform #1500
dev-miro26 wants to merge 8 commits into eigent-ai:main from
Conversation
Hi, @4pmtong
Thanks @dev-miro26 for the contribution! Could @fengju0213 @a7m-1st help review it?
@fengju0213 |
The changes were not persisted to the provider, and even when the model was modified, the old model was still being used.
description: 'OpenAI model configuration.',
is_valid: false,
model_type: '',
suggestedModels: [
I will fix all issues soon!
bytecii left a comment
IMO we shouldn't store these model selection candidates in the database
I’ve updated the code. Example: if we want to get the models from OpenAI, we need an OpenAI API key. So, to retrieve the models for each BYOK provider and populate the suggested model list, the corresponding API key is required. Could you please review it again and kindly leave any feedback?
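A minimal sketch of the model-retrieval step described above. It assumes the JSON shape returned by OpenAI's `GET /v1/models` endpoint (which requires the API key in an `Authorization: Bearer <key>` header); the helper name `extract_model_ids` is hypothetical and not part of this PR:

```python
import json

# Hypothetical helper: given the parsed JSON body from OpenAI's
# GET /v1/models endpoint, collect the model IDs so they can be
# offered as BYOK model candidates in the UI.
def extract_model_ids(models_response: dict) -> list[str]:
    # The endpoint returns {"object": "list", "data": [{"id": ...}, ...]}
    return sorted(item["id"] for item in models_response.get("data", []))

if __name__ == "__main__":
    sample = {
        "object": "list",
        "data": [
            {"id": "gpt-4o", "object": "model"},
            {"id": "gpt-4o-mini", "object": "model"},
        ],
    }
    print(extract_model_ids(sample))  # ['gpt-4o', 'gpt-4o-mini']
```

Each BYOK provider would need its own variant of this, since the model-listing endpoints and response shapes differ between platforms.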
Hi, @fengju0213


Related Issue:
Closes #1399
Description
Previously, BYOK required users to specify only one model name per platform. Switching between models (e.g., Claude Opus vs Sonnet vs Haiku on Anthropic) required manually editing the model name each time, which is tedious given that most LLM platforms offer several models with different prices and capabilities.
This PR adds support for saving and managing multiple model names per BYOK platform:
- New `model_types` JSON column on the `provider` table (with migration and backfill of existing `model_type` values).
- `Provider`, `ProviderIn`, and `ProviderOut` models updated. Create/update endpoints ensure `model_types` stays in sync with the active `model_type`.
- `model_type` remains the "active" model used by `chatStore` and the backend agent system. No changes to the task execution flow.

Testing Evidence (REQUIRED)
What is the purpose of this pull request?
Contribution Guidelines Acknowledgement
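The migration and backfill described in the PR can be sketched as follows. This is a minimal, hypothetical example using an in-memory SQLite `provider` table; the actual project uses its own schema and migration tooling, so column and function names here are assumptions:

```python
import json
import sqlite3

# Sketch of the migration: add a model_types JSON column and
# backfill it from each row's existing single model_type value.
def migrate(conn: sqlite3.Connection) -> None:
    conn.execute("ALTER TABLE provider ADD COLUMN model_types TEXT")
    for row_id, model_type in conn.execute(
        "SELECT id, model_type FROM provider"
    ).fetchall():
        # Backfill: the single active model becomes the first (and
        # only) entry of the new list-valued column.
        conn.execute(
            "UPDATE provider SET model_types = ? WHERE id = ?",
            (json.dumps([model_type] if model_type else []), row_id),
        )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE provider (id INTEGER PRIMARY KEY, model_type TEXT)"
    )
    conn.execute("INSERT INTO provider (model_type) VALUES ('claude-3-5-sonnet')")
    migrate(conn)
    print(conn.execute("SELECT model_types FROM provider").fetchone()[0])
    # -> ["claude-3-5-sonnet"]
```

Keeping `model_type` as the active model while `model_types` holds the candidates means existing consumers (such as `chatStore` and the agent system) keep working unchanged.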