Groq Model Was Not Accepted in createDataStreamResponse Function #5282
Comments
@hamzaadil56 this seems to be a Groq prompting issue - can you access
Same problem here, but I keep the prompts at their defaults and only change the chat model. I believe the Vercel site says most Groq models support tool usage, yet the Vercel AI example still has problems when generating, like the error I encountered. It's just my personal feeling that this may be related to the prompts and the Zod object. FYI
I fixed this issue by deleting the node_modules folder and the package-lock file, then reinstalling the packages with npm.
Description
I was building a simple RAG chatbot with the AI SDK and used the createDataStreamResponse function for streaming. With an OpenAI model it worked fine, but when I switched to a Groq model, the following TypeScript error occurred:
The expected type comes from property 'model' which is declared here on type 'CallSettings & Prompt & { model: LanguageModelV1; tools?: { addResource: Tool<ZodObject<{ content: ZodString; }, "strip", ZodTypeAny, { ...; }, { ...; }>, string> & { ...; }; getInformation: Tool<...> & { ...; }; } | undefined; ... 17 more ...; _internal?: { ...; } | undefined; }
At runtime it gives the following error, which occurred when I used the Groq model:
The above error did not occur when I used the OpenAI model.
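For context, here is a minimal sketch of the kind of setup described above, assuming AI SDK v4-style APIs (`createDataStreamResponse`, `streamText`, `tool`) and the `@ai-sdk/groq` provider. The model id, tool descriptions, and tool bodies are illustrative assumptions, not the original code from the issue:

```ts
import { createDataStreamResponse, streamText, tool } from 'ai';
import { createGroq } from '@ai-sdk/groq';
import { z } from 'zod';

// Provider client; the model id used below is an assumption for illustration.
const groq = createGroq({ apiKey: process.env.GROQ_API_KEY });

export async function POST(req: Request) {
  const { messages } = await req.json();

  return createDataStreamResponse({
    execute: (dataStream) => {
      const result = streamText({
        model: groq('llama-3.3-70b-versatile'), // illustrative model id
        messages,
        tools: {
          // Tool names mirror those in the reported type error; bodies are placeholders.
          addResource: tool({
            description: 'Add a resource to the knowledge base.',
            parameters: z.object({ content: z.string() }),
            execute: async ({ content }) => `Stored: ${content}`,
          }),
          getInformation: tool({
            description: 'Look up information relevant to the question.',
            parameters: z.object({ question: z.string() }),
            execute: async ({ question }) => `No results found for: ${question}`,
          }),
        },
      });

      // Pipe the streamed text into the data stream response.
      result.mergeIntoDataStream(dataStream);
    },
  });
}
```

Given that the problem was reportedly resolved by deleting node_modules and the lockfile and reinstalling (see the comment above), the TypeScript error on the `model` property was likely caused by mismatched or duplicated versions of the `LanguageModelV1` interface between the `ai` package and the provider package, rather than by the Groq provider itself.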
Code example
AI provider
No response
Additional context
No response