AzureAIInference: support response format schema #40793

Open
@mathrb

Description

Is your feature request related to a problem? Please describe.
I'm using Semantic Kernel for my prompts.
I've mostly been using Azure OpenAI models, which work fine for JSON structured output.
The issue is that for a model like Mistral, I can't get structured output working.
The reason is that AzureAIInference currently only supports the "old way" (json_object), not a specific schema.
The issue was first opened on the semantic-kernel repo: microsoft/semantic-kernel#11025 (comment)

Describe the solution you'd like
AzureAIInference should implement a schema-based response format so that support can be ported to Semantic Kernel.

Describe alternatives you've considered
Not using Azure models?
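
To make the request concrete, here is a minimal sketch of the schema-based `response_format` payload being asked for, contrasted with the current `json_object` mode. The field names (`json_schema`, `name`, `schema`, `strict`) and the helper `build_response_format` follow the OpenAI chat-completions convention; whether the Azure AI Model Inference service accepts this exact shape for models like Mistral is an assumption, not confirmed by the SDK.

```python
import json

# The "old way" the SDK supports today: the model is only told to emit
# some valid JSON object, with no schema to constrain its shape.
json_object_format = {"type": "json_object"}


def build_response_format(name: str, schema: dict) -> dict:
    """Wrap a JSON Schema in an OpenAI-style json_schema response_format.

    Hypothetical helper for illustration; the wire format is assumed to
    match the OpenAI chat-completions convention.
    """
    return {
        "type": "json_schema",
        "json_schema": {
            "name": name,
            "schema": schema,
            "strict": True,  # reject outputs that do not match the schema
        },
    }


# Example schema the caller (e.g. Semantic Kernel) would supply.
person_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
    "additionalProperties": False,
}

response_format = build_response_format("person", person_schema)
print(json.dumps(response_format, indent=2))
```

With `json_object` the model can return any JSON at all; with a `json_schema` response format the service can validate and constrain the output, which is what Semantic Kernel's structured-output feature needs from the underlying connector.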

Metadata

Labels

- AI Model Inference: Issues related to the client library for Azure AI Model Inference (\sdk\ai\azure-ai-inference)
- Client: This issue points to a problem in the data-plane of the library.
- Service Attention: Workflow: This issue is the responsibility of the Azure service team.
- customer-reported: Issues that are reported by GitHub users external to the Azure organization.
- feature-request: This issue requires a new behavior in the product in order to be resolved.
- needs-team-attention: Workflow: This issue needs attention from the Azure service team or SDK team.
