
Groq Model Was Not Accepted in createDataStreamResponse Function #5282


Closed
hamzaadil56 opened this issue Mar 20, 2025 · 3 comments

Comments

@hamzaadil56

Description

I was building a simple RAG chatbot with the AI SDK and used the createDataStreamResponse function for streaming. It worked fine with an OpenAI model, but when I switched to a Groq model, TypeScript first reported the following error:

The expected type comes from property 'model' which is declared here on type 'CallSettings & Prompt & { model: LanguageModelV1; tools?: { addResource: Tool<ZodObject<{ content: ZodString; }, "strip", ZodTypeAny, { ...; }, { ...; }>, string> & { ...; }; getInformation: Tool<...> & { ...; }; } | undefined; ... 17 more ...; _internal?: { ...; } | undefined; }'

At runtime, the following error occurred when using the Groq model:

{
  message: "Failed to call a function. Please adjust your prompt. See 'failed_generation' for more details.",
  type: 'invalid_request_error'
}

The above error did not occur when I used the OpenAI model.
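For anyone debugging this, the useful detail is the `failed_generation` field the error message points at. A minimal sketch of pulling it out of a Groq-style error body — the `extractFailedGeneration` helper and the payload shape are assumptions for illustration, not AI SDK or Groq API:

```typescript
// Hypothetical shape of the Groq error payload referenced above.
interface GroqErrorPayload {
	message: string;
	type: string;
	failed_generation?: string;
}

// Hypothetical helper: parse an error body and return the
// "failed_generation" detail, or undefined if it is absent/not JSON.
function extractFailedGeneration(body: string): string | undefined {
	try {
		const payload = JSON.parse(body) as GroqErrorPayload;
		return payload.failed_generation;
	} catch {
		return undefined; // body was not valid JSON
	}
}

// Example payload mirroring the error in this issue (the
// failed_generation content here is made up).
const example = JSON.stringify({
	message:
		"Failed to call a function. Please adjust your prompt. See 'failed_generation' for more details.",
	type: "invalid_request_error",
	failed_generation: '{"name":"getInformation","arguments":"{}"}',
});

const detail = extractFailedGeneration(example);
```

Logging this detail (for example from an error handler) shows the raw tool call the model attempted, which is usually enough to see why Groq rejected it.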

Code example

import { createResource } from "@/lib/actions/resources";
import { openai } from "@ai-sdk/openai";
import { streamText, tool, createDataStreamResponse, smoothStream } from "ai";
import { z } from "zod";
import { findRelevantContent } from "@/lib/ai/embedding";
import { groq } from "@ai-sdk/groq";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
	const { messages } = await req.json();

	return createDataStreamResponse({
		execute: (dataStream) => {
			const result = streamText({
				model: groq("llama-3.3-70b-versatile"),
				system: `You are an ai assistant of Hamza who knows everything about Muhammad Hamza. You gives responses to the user like as if you are Hamza talking to the people using the information you have about Hamza. If the user asks a question about irrelevant to Hamza, then say that you cannot tell anything else other than about Hamza. You are specifically designed to share the information about Hamza."`,
				messages,
				maxSteps: 5,

				experimental_transform: smoothStream({ chunking: "word" }),
				tools: {
					addResource: tool({
						description: `add a resource to your knowledge base.
				  If the user provides a random piece of knowledge unprompted, use this tool without asking for confirmation.`,
						parameters: z.object({
							content: z
								.string()
								.describe(
									"the content or resource to add to the knowledge base"
								),
						}),
						execute: async ({ content }) =>
							createResource({ content }),
					}),
					getInformation: tool({
						description: `get information from your knowledge base to answer questions.`,
						parameters: z.object({
							question: z.string().describe("the users question"),
						}),
						execute: async ({ question }) =>
							findRelevantContent(question),
					}),
				},
			});

			result.consumeStream();

			result.mergeIntoDataStream(dataStream, {
				sendReasoning: true,
			});
		},
		onError: () => {
			return "Oops, an error occurred!";
		},
	});
}

AI provider

No response

Additional context

No response

@hamzaadil56 hamzaadil56 added the bug Something isn't working label Mar 20, 2025
@hamzaadil56 hamzaadil56 changed the title Groq Model Not Accepting in createDataStreamResponse Function Groq Model Was Accepted in createDataStreamResponse Function Mar 20, 2025
@hamzaadil56 hamzaadil56 changed the title Groq Model Was Accepted in createDataStreamResponse Function Groq Model Was Not Accepted in createDataStreamResponse Function Mar 21, 2025
@lgrammel
Collaborator

@hamzaadil56 this seems to be a groq prompting issue - can you access failed_generation?

@KristampsWong

Same problem here. If I keep the default prompts and only change the chat model to groq('llama3-70b-8192'), it runs without errors or warnings.

The Vercel docs say most Groq models support tool usage, but the Vercel AI examples still fail during generation with some of them (for example, errors mentioning "brave_search" and "prompts"), so I can't switch to every model I'd prefer even when it supposedly supports tool usage.

This is just my personal impression, but it may be related to the prompts and the Zod object. FYI

@hamzaadil56
Author

hamzaadil56 commented Apr 8, 2025

I fixed this issue by deleting the node_modules folder and the package-lock.json file, then reinstalling the packages with npm.
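For reference, the reinstall described above amounts to the following (assuming npm and a standard project layout), which forces all @ai-sdk/* packages to re-resolve to consistent versions:

```shell
# Remove installed packages and the lockfile, then reinstall from scratch.
rm -rf node_modules package-lock.json
npm install
```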

@lgrammel lgrammel removed bug Something isn't working ai/provider labels Apr 9, 2025