Commit 44265c3

docs: add handling errors for OpenAI provider (vercel#500)
1 parent 9b65ceb commit 44265c3

1 file changed: docs/pages/docs/guides/providers/openai.mdx (+41, −7)
@@ -152,13 +152,6 @@ export async function POST(req: Request) {
     prompt
   })
 
-  // Check for errors
-  if (!response.ok) {
-    return new Response(await response.text(), {
-      status: response.status
-    })
-  }
-
   // Convert the response into a friendly text-stream
   const stream = OpenAIStream(response)
 
@@ -214,6 +214,47 @@ export default function Completion() {
 
 </Steps>
 
+## Guide: Handling Errors
+
+The OpenAI API throws an `OpenAI.APIError` when an error occurs during a request. It is recommended to wrap your API calls in a `try/catch` block to handle these errors. For more information about `OpenAI.APIError`, see [OpenAI SDK Handling Errors](https://github.com/openai/openai-node?tab=readme-ov-file#handling-errors).
+
+```tsx filename="app/api/chat/route.ts"
+import OpenAI from 'openai'
+import { OpenAIStream, StreamingTextResponse } from 'ai'
+import { NextResponse } from 'next/server'
+
+const openai = new OpenAI({
+  apiKey: process.env.OPENAI_API_KEY
+})
+
+export const runtime = 'edge'
+
+export async function POST(req: Request) {
+  // Wrap with a try/catch to handle API errors
+  try {
+    const { messages } = await req.json()
+
+    const response = await openai.chat.completions.create({
+      model: 'gpt-3.5-turbo',
+      stream: true,
+      messages
+    })
+
+    const stream = OpenAIStream(response)
+
+    return new StreamingTextResponse(stream)
+  } catch (error) {
+    // Check if the error is an APIError
+    if (error instanceof OpenAI.APIError) {
+      const { name, status, headers, message } = error
+      return NextResponse.json({ name, status, headers, message }, { status })
+    } else {
+      throw error
+    }
+  }
+}
+```
+
 ## Guide: Save to Database After Completion
 
 It’s common to want to save the result of a completion to a database after streaming it back to the user. The `OpenAIStream` adapter accepts a couple of optional callbacks that can be used to do this.
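As a hypothetical illustration (not part of this commit), a client calling this route could use a small type guard to narrow the JSON error body that the new `catch` branch returns via `NextResponse.json({ name, status, headers, message }, { status })`. The `APIErrorBody` type and `isAPIErrorBody` helper below are assumptions made for this sketch; `headers` is omitted from the guard for simplicity.

```typescript
// Hypothetical helper: narrow an unknown parsed response body to the
// shape the route above returns when it catches an OpenAI.APIError.
interface APIErrorBody {
  name: string
  status: number
  message: string
}

function isAPIErrorBody(body: unknown): body is APIErrorBody {
  return (
    typeof body === 'object' &&
    body !== null &&
    typeof (body as APIErrorBody).name === 'string' &&
    typeof (body as APIErrorBody).status === 'number' &&
    typeof (body as APIErrorBody).message === 'string'
  )
}

// Example: a client would typically obtain `body` from `await res.json()`.
const body: unknown = { name: 'APIError', status: 429, message: 'Rate limit exceeded' }
if (isAPIErrorBody(body)) {
  console.log(`${body.name} (${body.status}): ${body.message}`)
  // logs "APIError (429): Rate limit exceeded"
}
```

A runtime guard like this is useful because `res.json()` is typed as returning `any`/`unknown`, and the route returns different body shapes for success and error paths.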

0 commit comments
