refactor: migrate to Nextjs Route Handler #625
Conversation
🦋 Changeset detected. Latest commit: 9dd72be. The changes in this PR will be included in the next version bump. This PR includes changesets to release 2 packages.
Walkthrough
This update migrates the server codebase to use Next.js Route Handlers, replacing custom HTTP handlers for API endpoints with Next.js-compliant route files. It introduces several new API route handlers and utilities under the Next.js directory, removes legacy handler files, and updates build scripts to accommodate the new structure. Supporting TypeScript configurations and development instructions are also adjusted.
Sequence Diagram(s)

    sequenceDiagram
        participant Client
        participant NextAPI as Next.js API Route
        participant Workflow
        participant StreamWriter
        participant LlamaCloud
        Client->>NextAPI: POST /api/chat (messages)
        NextAPI->>Workflow: Create workflow instance
        NextAPI->>Workflow: Run workflow with user input and history
        Workflow-->>NextAPI: Stream events (agent/tool/source)
        NextAPI->>StreamWriter: Convert events to data stream
        StreamWriter-->>Client: Streamed response (text, annotations)
        StreamWriter->>NextAPI: On final, append assistant response
        NextAPI->>LlamaCloud: (If needed) Download files in background
        NextAPI-->>Client: Complete response with suggested questions
next.config.ts is useful only for building frontend static files for Python (with the output="export" config). It's not needed in TS dev mode.
Renamed to next-build.config.ts because we cannot use route handlers for TS when output="export" is set in the Next.js config.
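A build-only config along these lines might look like the following. This is a sketch; the actual contents of next-build.config.ts in the PR may differ.

```typescript
// next-build.config.ts (renamed to next.config.ts inside the temp build folder)
// "output: 'export'" enables static export, which is why the app/api
// route handlers must be removed from the build copy first.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  output: "export", // emit static HTML/JS into ./out
  images: { unoptimized: true }, // static export cannot use the image optimizer
};

export default nextConfig;
```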
For more information: Next.js currently doesn't support a custom file path option for the next.config.ts file. See: vercel/next.js#31433
It would be great if we can do something like:
build:static: next build --config next-build.config.ts
    @@ -33,8 +33,9 @@
    "postbuild": "pnpm copy:next-src && pnpm build:static && pnpm copy:static",
    "copy:next-src": "cp -r ./next ./server && pnpm build:css && rm -rf ./server/postcss.config.js",
    "build:css": "postcss server/app/globals.css -o server/app/globals.css",
    "build:static": "cd ./next && next build",
    "copy:static": "cp -r ./next/out ./dist/static",
    "prepare:temp": "rm -rf ./temp && cp -r ./next ./temp && rm -rf ./temp/app/api && mv ./temp/next-build.config.ts ./temp/next.config.ts",
Use a temp folder to build the frontend static files (it doesn't contain the api folder, and uses output="export" in the Next.js config).
packages/server/src/server.ts
Outdated
    if (pathname === "/api/chat" && req.method === "POST") {
      // TODO: serialize workflowFactory and append it to req object, then using it in Route Handler
      return handleChat(req, res, this.workflowFactory);
    }
It would be great if we could find a way to serialize the workflow and then recreate it in the route handler. For now we cannot do that, so I'm still keeping the chat handler here.
Maybe store the factory in the this.app object? It should be possible to access it there.
Best to store the whole options object (containing componentsDir and factory). Worst case, you can store the options object in a global WeakMap that uses the LlamaIndexServer instance as the key.
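A minimal sketch of that WeakMap idea: per-instance options live in a module-level WeakMap keyed by the server instance, so any code holding the instance can look them up without polluting globals. The class and function names below are illustrative stand-ins, not the real LlamaIndexServer API.

```typescript
// Options shared between the server and its route handlers.
interface ServerOptions {
  componentsDir: string;
}

// Entries are garbage-collected together with their server instance.
const serverOptionsMap = new WeakMap<object, ServerOptions>();

class FakeLlamaIndexServer {
  constructor(options: ServerOptions) {
    // Register this instance's options at construction time.
    serverOptionsMap.set(this, options);
  }
}

function getServerOptions(server: object): ServerOptions | undefined {
  return serverOptionsMap.get(server);
}

const server = new FakeLlamaIndexServer({ componentsDir: "./components" });
console.log(getServerOptions(server)?.componentsDir); // "./components"
```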
Yes, it would be great if we could share the global object with route handlers. As I remember, it didn't work before.
packages/server/src/server.ts
Outdated
    if (pathname === "/api/dev/files/workflow" && req.method === "PUT") {
      return updateWorkflowFile(req, res);
    }
    query.componentsDir = this.componentsDir;
For /api/components, we only need to pass componentsDir from the server config to the Route Handler. We can easily achieve this via search query params.
Actionable comments posted: 10
🧹 Nitpick comments (8)
.changeset/lucky-deers-kick.md (1)

5-5: Fix the spelling of "Next.js". The official spelling of the framework is "Next.js" with a period, not "Nextjs".

    -refactor: migrate to Nextjs Route Handler
    +refactor: migrate to Next.js Route Handler

🧰 Tools
🪛 LanguageTool
[uncategorized] ~5-~5: The official spelling of this programming framework is "Next.js".
Context: ...erver": patch --- refactor: migrate to Nextjs Route Handler (NODE_JS)
packages/server/src/server.ts (2)

77-78: Consider providing more context in the TODO comment. The TODO comment would be more helpful if it included more context about the serialization challenge and potential approaches.

    -// TODO: serialize workflowFactory and append it to req object, then using it in Route Handler
    +// TODO: Find a way to serialize workflowFactory for use in Route Handler
    +// Challenge: Workflow objects contain complex state that's difficult to serialize/deserialize
    +// Potential approaches: Use a shared memory approach or implement a custom serialization strategy

90-91: Ensure the query object is properly typed. While this approach works, it would be good to ensure the query object has the correct TypeScript type to avoid potential type errors.

    - handle(req, res, { ...parsedUrl, query });
    + handle(req, res, { ...parsedUrl, query: { ...query } });

packages/server/next/app/api/components/route.ts (1)

45-49: Consider including the error message in the response. For better debugging, consider including the actual error message in the response when in development mode.

    console.error("Error reading components:", error);
    return NextResponse.json(
      { error: "Failed to read components" },
      { status: 500 },
    );

You could enhance this with:

    console.error("Error reading components:", error);
    return NextResponse.json(
    -  { error: "Failed to read components" },
    +  {
    +    error: "Failed to read components",
    +    ...(process.env.NODE_ENV !== "production" && {
    +      details: error instanceof Error ? error.message : String(error)
    +    })
    +  },
      { status: 500 },
    );

packages/server/next/app/api/dev/files/workflow/route.ts (4)
9-29: Use consistent variable references. The function declares filePath on line 10 but then uses DEFAULT_WORKFLOW_FILE_PATH directly on line 12. This creates inconsistency that could lead to bugs if the path is changed in one place but not the other.

    export async function GET(request: NextRequest) {
      const filePath = DEFAULT_WORKFLOW_FILE_PATH;
    - const fileExists = await promisify(fs.exists)(DEFAULT_WORKFLOW_FILE_PATH);
    + const fileExists = await promisify(fs.exists)(filePath);
      if (!fileExists) {
        return NextResponse.json(
          {
            detail: `Dev mode is currently in beta. It only supports updating workflow file at ${filePath}`,
          },
          { status: 404 },
        );
      }
      const content = await promisify(fs.readFile)(filePath, "utf-8");
      const last_modified = fs.statSync(filePath).mtime.getTime();
      return NextResponse.json(
        { content, file_path: filePath, last_modified },
        { status: 200 },
      );
    }

35-43: Inconsistent path variable usage in error message. Similar to the GET handler, this function uses both filePath and DEFAULT_WORKFLOW_FILE_PATH inconsistently. Use the local variable consistently.

    const fileExists = await promisify(fs.exists)(filePath);
    if (!fileExists) {
      return NextResponse.json(
        {
    -     detail: `Dev mode is currently in beta. It only supports updating workflow file at ${DEFAULT_WORKFLOW_FILE_PATH}`,
    +     detail: `Dev mode is currently in beta. It only supports updating workflow file at ${filePath}`,
        },
        { status: 404 },
      );
    }

74-78: Ensure temporary file names are unique across concurrent validations. While using a timestamp in the filename reduces the chance of collisions, it's still possible for two validations to happen in the same millisecond. Consider adding a random component to ensure uniqueness.

    const tempFilePath = path.join(
      path.dirname(filePath),
    - `workflow_${Date.now()}.ts`,
    + `workflow_${Date.now()}_${Math.random().toString(36).substr(2, 9)}.ts`,
    );

78-90: Use asynchronous file operations consistently. The validation function mixes synchronous (writeFileSync, existsSync, unlinkSync) and asynchronous operations. For consistency with the rest of the codebase, consider using promisified versions throughout.

    - fs.writeFileSync(tempFilePath, content);
    + await promisify(fs.writeFile)(tempFilePath, content);
      const errors = [];
      try {
        const tscCommand = `npx tsc ${tempFilePath} --noEmit --skipLibCheck true`;
        await promisify(exec)(tscCommand);
      } catch (error) {
        const errorMessage = (error as { stdout: string })?.stdout;
        errors.push(errorMessage);
      } finally {
        // Clean up temporary file
    -   if (fs.existsSync(tempFilePath)) fs.unlinkSync(tempFilePath);
    +   const exists = await promisify(fs.exists)(tempFilePath);
    +   if (exists) await promisify(fs.unlink)(tempFilePath);
      }
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge Base: Disabled due to data retention organization setting
📒 Files selected for processing (11)

- .changeset/lucky-deers-kick.md (1 hunks)
- packages/server/.gitignore (1 hunks)
- packages/server/examples/devmode/README.md (1 hunks)
- packages/server/next/app/api/chat/config/llamacloud/route.ts (1 hunks)
- packages/server/next/app/api/components/route.ts (2 hunks)
- packages/server/next/app/api/dev/files/workflow/route.ts (1 hunks)
- packages/server/next/app/api/files/[...slug]/route.ts (1 hunks)
- packages/server/package.json (1 hunks)
- packages/server/src/handlers/cloud.ts (0 hunks)
- packages/server/src/handlers/files.ts (0 hunks)
- packages/server/src/server.ts (1 hunks)
💤 Files with no reviewable changes (2)
- packages/server/src/handlers/cloud.ts
- packages/server/src/handlers/files.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)

packages/server/next/app/api/dev/files/workflow/route.ts (2)
- packages/server/next/app/api/files/[...slug]/route.ts (1): GET (5-24)
- packages/server/next/app/api/chat/config/llamacloud/route.ts (1): GET (5-32)
🪛 LanguageTool
.changeset/lucky-deers-kick.md
[uncategorized] ~5-~5: The official spelling of this programming framework is “Next.js”.
Context: ...erver": patch --- refactor: migrate to Nextjs Route Handler
(NODE_JS)
⏰ Context from checks skipped due to timeout of 90000ms (12)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
🔇 Additional comments (8)
packages/server/examples/devmode/README.md (1)

20-20: Good update to the nodemon ignore pattern. The addition of --ignore src/app/workflow_*.ts to the nodemon command is a smart optimization that prevents unnecessary server restarts when workflow files are modified. This aligns well with the migration to Next.js Route Handlers, where these files are now managed via API routes.

packages/server/.gitignore (1)

1-5: Good documentation in the .gitignore file. The added comments clearly explain the purpose of the ignored directories, making it easier for developers to understand why these paths are excluded from version control. This is especially helpful for understanding the build process with the new Next.js Route Handler architecture.
packages/server/package.json (1)

36-38: Build script refactor looks good! The new build approach with a temporary directory effectively separates the frontend static build from the API routes, which aligns well with the Next.js Route Handler migration. Removing the API folder from the temp build ensures these routes are handled server-side rather than being included in the static bundle.
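The temp-folder flow can be sketched as a small shell script. Directory names follow the package.json scripts; the stand-in files created at the top are fabricated for the demo, and the actual `next build` step is omitted since it needs a full Next.js app.

```shell
#!/bin/sh
set -e

# Fabricated stand-in for the real ./next app directory
mkdir -p next/app/api
touch next/app/page.tsx next/next-build.config.ts

# prepare:temp equivalent: build from a copy so ./next keeps its API routes
rm -rf temp
cp -r next temp
rm -rf temp/app/api                               # route handlers can't be statically exported
mv temp/next-build.config.ts temp/next.config.ts  # config with output: "export"

# The real script would now run: (cd temp && next build)
ls temp/app
```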
packages/server/next/app/api/chat/config/llamacloud/route.ts (1)

1-32: Clean implementation of the Next.js Route Handler. The migration from a custom Node.js handler to a Next.js Route Handler is well implemented, with proper error handling and response formatting.

A minor suggestion for improved error reporting:

    } catch (error) {
      return NextResponse.json(
        {
    -     error: "Failed to fetch LlamaCloud configuration",
    +     error: `Failed to fetch LlamaCloud configuration: ${error instanceof Error ? error.message : String(error)}`,
        },
        { status: 500 },
      );
    }

packages/server/src/server.ts (1)
86-87: Passing componentsDir as a query parameter is a good approach. This is a clean way to pass server configuration to the Next.js Route Handler without needing to modify the route handler implementation.
packages/server/next/app/api/components/route.ts (2)

6-9: Good approach for extracting query parameters. The implementation correctly extracts the componentsDir parameter from the URL search params with a sensible default.

13-17: Consistent use of NextResponse.json for error handling. The error response pattern is consistent with Next.js best practices.
packages/server/next/app/api/dev/files/workflow/route.ts (1)

7-7: Plan to parameterize the workflow file path. The TODO comment correctly identifies that hardcoding the file path is not ideal. Consider prioritizing this enhancement to make the API more flexible.
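One possible parameterization is sketched below; the WORKFLOW_FILE_PATH env var and the default path are assumptions for illustration, not the PR's actual design.

```typescript
import path from "node:path";

// Hard-coded default, matching the beta behaviour described above (assumed path).
const DEFAULT_WORKFLOW_FILE_PATH = path.join("src", "app", "workflow.ts");

// Let an environment variable override the default workflow file location.
function workflowFilePath(): string {
  return process.env.WORKFLOW_FILE_PATH ?? DEFAULT_WORKFLOW_FILE_PATH;
}

console.log(workflowFilePath());
```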
    @@ -30,8 +30,10 @@
    "clean": "rm -rf ./dist ./server next/.next next/out",
    "prebuild": "pnpm clean",
    "build": "bunchee",
    "postbuild": "pnpm copy:next-src && pnpm build:static && pnpm copy:static",
    "copy:next-src": "cp -r ./next ./server && pnpm build:css && rm -rf ./server/postcss.config.js",
    "build:api": "tsc --skipLibCheck --project tsconfig.api.json",
Actionable comments posted: 1
🧹 Nitpick comments (1)
packages/server/src/server.ts (1)

29-30: Consider using a more structured approach for sharing server options. While using globalThis provides a quick solution for sharing server options with Next.js route handlers, it introduces global state that might lead to maintainability issues as the application grows. Consider a more structured approach, such as:

- The WeakMap approach mentioned in past comments: https://github.com/run-llama/LlamaIndexTS/blob/3e66ddc10db0b84040df3c4138f0f2c8ab229102/packages/env/src/utils/shared.ts#L18
- A singleton service that provides access to server options
- Environment variables for configuration that needs to be shared

    - // eslint-disable-next-line @typescript-eslint/no-explicit-any
    - (globalThis as any).serverOptions = options;
    + // Use a WeakMap to store server options
    + import { getSharedObject } from "@llamaindex/env";
    + const serverOptionsMap = getSharedObject<WeakMap<LlamaIndexServer, LlamaIndexServerOptions>>(
    +   "SERVER_OPTIONS_MAP",
    +   () => new WeakMap()
    + );
    + serverOptionsMap.set(this, options);
📒 Files selected for processing (2)

- packages/server/next/app/api/components/route.ts (2 hunks)
- packages/server/src/server.ts (2 hunks)

🚧 Files skipped from review as they are similar to previous changes (1)

- packages/server/next/app/api/components/route.ts
⏰ Context from checks skipped due to timeout of 90000ms (36)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: lint
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
packages/server/src/server.ts
Outdated
    @@ -80,37 +76,10 @@ export class LlamaIndexServer {
    const pathname = parsedUrl.pathname;

    if (pathname === "/api/chat" && req.method === "POST") {
      // TODO: serialize workflowFactory and append it to req object, then using it in Route Handler
💡 Verification agent
🧩 Analysis chain
Prioritize completing the migration to Next.js Route Handlers.
This TODO indicates that the /api/chat endpoint hasn't been fully migrated to a Next.js Route Handler yet, unlike the other endpoints mentioned in the AI summary.
Could you add more details about the planned approach for serializing the workflowFactory? This would help ensure that the migration path is clear. For complex objects like workflows, consider:
- What specific properties need to be serialized
- How will you recreate the workflow in the route handler
- Whether there are any circular references that might cause issues during serialization
🏁 Script executed:

    #!/bin/bash
    # Check if there's a route handler file already started for chat endpoint
    echo "Checking for existing chat route handler file..."
    find packages/server/next/app/api -type f -name "route.ts" | grep chat

Length of output: 238
🏁 Script executed:

    #!/bin/bash
    set -e
    echo "⎯⎯⎯ Showing content of chat route handler ⎯⎯⎯"
    sed -n '1,200p' packages/server/next/app/api/chat/config/llamacloud/route.ts
    echo
    echo "⎯⎯⎯ Searching for workflowFactory references ⎯⎯⎯"
    rg "workflowFactory" -n

Length of output: 3365
Implement a Next.js Route Handler for /api/chat instead of "serializing" the factory.

The /api/chat endpoint hasn't been migrated yet; only /api/chat/config/llamacloud exists under packages/server/next/app/api/chat/.... You don't actually serialize a function over HTTP; you should import and invoke workflowFactory directly in your Next.js route handler.

Please:
- Remove the TODO in packages/server/src/server.ts.
- Create a Next.js route file at packages/server/next/app/api/chat/route.ts that imports both your chat handler and the workflow factory from the server package.
- Sketch out the handler so it:
  - Parses the request body (e.g. const body = await req.json())
  - Calls handleChat(body, NextResponse, workflowFactory)
  - Returns the resulting NextResponse
- Double-check the request payload shape (body) and any required env/config values that workflowFactory consumes.
Example diff snippet:

    + // packages/server/next/app/api/chat/route.ts
    + import { NextRequest, NextResponse } from "next/server";
    + import { handleChat } from "../../../src/handlers/chat";
    + import { workflowFactory } from "../../../src/app/workflow";
    +
    + export async function POST(req: NextRequest) {
    +   const body = await req.json();
    +   return handleChat(body, NextResponse, workflowFactory);
    + }
🤖 Prompt for AI Agents
In packages/server/src/server.ts at line 79, remove the TODO comment about
serializing workflowFactory since serializing functions is not feasible.
Instead, create a Next.js route handler file at
packages/server/next/app/api/chat/route.ts that imports the chat handler and
workflowFactory directly from the server package. In this route handler, parse
the incoming request body with await req.json(), invoke handleChat with the
parsed body, NextResponse, and workflowFactory, then return the resulting
NextResponse. Verify the request payload structure and any environment or
configuration dependencies required by workflowFactory to ensure proper
operation.
Actionable comments posted: 9
🧹 Nitpick comments (8)
packages/server/next/app/api/chat/route.ts (1)

45-47: AbortController is not wired, causing a cleanup leak. You create an AbortController but never abort it because the res.on("close", …) line is commented out. Either remove the controller or re-enable the listener so long-running workflows are cancelled when the client disconnects.

packages/server/next/app/api/chat/utils/stream.ts (2)
43-45: Unbounded completionText may grow large. Accumulating every token into completionText can bloat memory for long conversations. Consider:
- Capping the stored length (e.g., last N kB), or
- Building the final string only if an onFinal callback is provided.
:onError
does not surface to caller callbacks
onError
converts any error to a string and swallows it; caller-supplied
callbacks never see the failure.
Expose the original error or forward it tocallbacks?.onFinal
/onError
so API routes can react (e.g., log or transform the response).packages/server/next/app/api/chat/utils/suggestion.ts (1)
31-38: Minor: an early exit for empty history saves an LLM call. When chatHistory is empty, we can skip the completion request entirely.

    - const conversationText = conversation
    + if (conversation.length === 0) return [];
    +
    + const conversationText = conversation
        .map((message) => `${message.role}: ${message.content}`)
        .join("\n");

packages/server/next/app/api/chat/utils/workflow.ts (3)
55-63: Avoid lossy or circular JSON.stringify on tool kwargs. event.data.toolKwargs may contain circular references or large binary payloads, causing JSON.stringify to throw or flood the client. Consider serialising more defensively (e.g. use a custom replacer, pick only primitive fields, or truncate long strings).
78-81: Surface download errors to avoid silent failures. Because downloadLlamaCloudFilesFromNodes is fired and forgotten, any exception it throws becomes an unhandled rejection. Wrap the call and log or swallow errors explicitly:

    - downloadLlamaCloudFilesFromNodes(sourceNodesForDownload);
    + downloadLlamaCloudFilesFromNodes(sourceNodesForDownload).catch((e) =>
    +   console.error("Background download failed:", e),
    + );
90-108: Persist the downloaded-file registry across invocations. downloadedFiles is re-initialised every time the helper runs, so duplicate downloads are only avoided within a single batch. Move the registry outside the function (or rely solely on fs.existsSync) to spare redundant network traffic.

    -async function downloadLlamaCloudFilesFromNodes(nodes: SourceEventNode[]) {
    -  const downloadedFiles: string[] = [];
    +const downloadedFiles = new Set<string>();
    +
    +async function downloadLlamaCloudFilesFromNodes(nodes: SourceEventNode[]) {
       ...
    -  if (downloadedFiles.includes(node.filePath)) continue;
    +  if (downloadedFiles.has(node.filePath)) continue;
       ...
    -  downloadedFiles.push(node.filePath);
    +  downloadedFiles.add(node.filePath);
     }

packages/server/next/app/api/chat/events.ts (1)
38-52: Guard against missing file_name/pipeline_id when building paths. If either metadata field is undefined, the resulting path will contain "undefined", generating broken URLs. Consider a safer construction (and replace the $ delimiter, which is unconventional in file paths):

    - const { file_name, pipeline_id } = node.node.metadata;
    + const { file_name, pipeline_id } = node.node.metadata ?? {};
    + if (!file_name) {
    +   throw new Error("Missing file_name in node metadata");
    + }
    - const filePath = pipeline_id
    -   ? `output/llamacloud/${pipeline_id}$${file_name}`
    -   : `data/${file_name}`;
    + const filePath = pipeline_id
    +   ? `output/llamacloud/${pipeline_id}/${file_name}`
    +   : `data/${file_name}`;
📒 Files selected for processing (6)

- packages/server/next/app/api/chat/events.ts (1 hunks)
- packages/server/next/app/api/chat/route.ts (1 hunks)
- packages/server/next/app/api/chat/utils/file.ts (1 hunks)
- packages/server/next/app/api/chat/utils/stream.ts (1 hunks)
- packages/server/next/app/api/chat/utils/suggestion.ts (1 hunks)
- packages/server/next/app/api/chat/utils/workflow.ts (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (3)

packages/server/next/app/api/chat/route.ts (4)
- packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1): Message (6-10)
- packages/server/next/app/api/chat/utils/workflow.ts (1): runWorkflow (25-44)
- packages/server/next/app/api/chat/utils/stream.ts (1): toDataStreamResponse (35-80)
- packages/server/next/app/api/chat/utils/suggestion.ts (1): sendSuggestedQuestionsEvent (18-29)

packages/server/next/app/api/chat/utils/workflow.ts (2)
- packages/server/next/app/api/chat/events.ts (4): toAgentRunEvent (65-86), toSourceEvent (55-63), sourceEvent (24), SourceEventNode (8-16)
- packages/server/next/app/api/chat/utils/file.ts (1): downloadFile (4-31)

packages/server/next/app/api/chat/events.ts (1)
- packages/create-llama/templates/components/ui/html/chat/chat-messages.tsx (1): Message (6-10)
⏰ Context from checks skipped due to timeout of 90000ms (49)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, ubuntu-22.04, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: typescript (22, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: typescript (22, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, ubuntu-22.04, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, ubuntu-22.04, fastapi, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, windows-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --example-file, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --llamacloud, streaming)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, windows-latest, fastapi, --no-files, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --llamacloud, streaming)
- GitHub Check: typescript (20, 3.11, macos-latest, nextjs, --no-files, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --example-file, streaming)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, llamaindexserver)
- GitHub Check: python (20, 3.11, macos-latest, fastapi, --no-files, streaming)
- GitHub Check: lint
export async function downloadFile(
  urlToDownload: string,
  downloadedPath: string,
) {
  try {
    // Check if file already exists
    if (fs.existsSync(downloadedPath)) return;

    const file = fs.createWriteStream(downloadedPath);
🛠️ Refactor suggestion

Promise resolves before the download finishes

`downloadFile` is declared `async`, yet no `await` is used. The function returns immediately after scheduling the HTTPS request, so callers that `await downloadFile()` will proceed before the file is fully written. This can cause race conditions when subsequent logic assumes the file already exists on disk.

-export async function downloadFile(
+export function downloadFile(

Alternatively, wrap the stream logic in a `new Promise<void>((resolve, reject) => { ... })` and `await` it so the promise settles only after `finish | error`.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/file.ts around lines 4 to 12, the
async function downloadFile does not await the completion of the file download,
causing it to resolve before the file is fully written. To fix this, wrap the
HTTPS request and file stream logic inside a new Promise that resolves on the
'finish' event and rejects on 'error', then await this Promise so the function
only resolves after the download completes successfully or fails.
    const file = fs.createWriteStream(downloadedPath);
    https
      .get(urlToDownload, (response) => {
        response.pipe(file);
        file.on("finish", () => {
          file.close(() => {
            console.log("File downloaded successfully");
          });
        });
      })
Stream error handling is incomplete – use `pipeline`

The manual `response.pipe(file)` pattern misses back-pressure handling and error propagation – errors on either stream can be swallowed. A safer approach is `stream.pipeline`, which correctly forwards errors and closes the destination stream.

- https
-   .get(urlToDownload, (response) => {
-     response.pipe(file);
-     file.on("finish", () => {
-       file.close(() => {
-         console.log("File downloaded successfully");
-       });
-     });
-   })
+ https
+   .get(urlToDownload, (response) => {
+     pipeline(response, file, (err) => {
+       if (err) return reject(err);
+       resolve();
+     });
+   })

Remember to `import { pipeline } from "node:stream";` and wrap the call in a Promise per the previous comment.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/file.ts around lines 12 to 21,
replace the manual response.pipe(file) pattern with the use of stream.pipeline
to handle back-pressure and error propagation correctly. Import pipeline from
"node:stream", then wrap the pipeline call in a Promise to handle asynchronous
completion and errors properly. This ensures that errors on either the response
or file stream are not swallowed and the file stream is closed correctly.
      .on("error", (err) => {
        fs.unlink(downloadedPath, () => {
          console.error("Error downloading file:", err);
          throw err;
        });
      });
🛠️ Refactor suggestion

Throwing inside an async callback is lost

`throw err` inside the `.on("error")` callback is not caught by the outer `try/catch`; it will be emitted as an unhandled exception. Reject the surrounding promise instead, or re-emit via an `EventEmitter`.

-.on("error", (err) => {
-  fs.unlink(downloadedPath, () => {
-    console.error("Error downloading file:", err);
-    throw err;
-  });
-});
+.on("error", (err) => {
+  fs.unlink(downloadedPath, () => {
+    console.error("Error downloading file:", err);
+    reject(err);
+  });
+});
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original:
.on("error", (err) => {
  fs.unlink(downloadedPath, () => {
    console.error("Error downloading file:", err);
    throw err;
  });
});

Suggested:
.on("error", (err) => {
  fs.unlink(downloadedPath, () => {
    console.error("Error downloading file:", err);
    reject(err);
  });
});
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/file.ts around lines 22 to 27, the
error handler inside the .on("error") callback throws an error which is not
caught by the outer try/catch, causing an unhandled exception. Instead of
throwing the error inside the callback, modify the code to reject the
surrounding promise with the error or re-emit the error via an EventEmitter to
properly propagate the error to the caller.
  const reqBody = await req.json();
  const { messages } = reqBody as { messages: Message[] };
  const chatHistory = messages.map((message) => ({
    role: message.role as MessageType,
    content: message.content,
  }));

  const lastMessage = messages[messages.length - 1];
🛠️ Refactor suggestion

Validate request body structure before use

`messages.map(...)` will crash if `messages` is `undefined` or not an array. Defensive parsing prevents 500s caused by malformed clients.

-const { messages } = reqBody as { messages: Message[] };
+const { messages } = reqBody as { messages?: Message[] };
+if (!Array.isArray(messages) || messages.length === 0) {
+  return NextResponse.json(
+    { detail: "`messages` must be a non-empty array" },
+    { status: 400 },
+  );
+}
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/route.ts around lines 24 to 31, the code
assumes that messages is always defined and an array, which can cause a crash if
the request body is malformed. Add validation to check if messages exists and is
an array before calling map on it. If validation fails, return an appropriate
error response to prevent server errors from malformed input.
  // eslint-disable-next-line @typescript-eslint/no-explicit-any
  const serverOptions = (globalThis as any).serverOptions;
  const workflowFactory = serverOptions.workflowFactory;

  if (!workflowFactory) {
Guard against missing serverOptions

If `globalThis.serverOptions` is `undefined`, accessing `serverOptions.workflowFactory` throws before your 500-error guard. Add a nullish check first:

-const serverOptions = (globalThis as any).serverOptions;
-const workflowFactory = serverOptions.workflowFactory;
+const workflowFactory = (globalThis as any)?.serverOptions?.workflowFactory;

Or bail out early with an explicit 500.
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/route.ts around lines 10 to 14, the code
accesses serverOptions.workflowFactory without checking if serverOptions is
defined, which can cause a runtime error. Add a nullish check for serverOptions
before accessing workflowFactory, and if serverOptions is undefined, return or
throw an explicit 500 error early to prevent the code from crashing.
function extractQuestions(text: string): string[] {
  // Extract the text inside the triple backticks
  const contentMatch = text.match(/```(.*?)```/s);
  const content = contentMatch?.[1] ?? "";

  // Split the content by newlines to get each question
  const questions = content
    .split("\n")
    .map((question) => question.trim())
    .filter((question) => question !== "");
🛠️ Refactor suggestion

Handle language-tagged code fences & numbering when extracting questions

LLMs frequently prepend a language tag (```text / ```markdown) or numeric prefixes (1.) inside the fenced block. With the current regex the prefix becomes the first "question", producing unusable suggestions. A more defensive parser allows the tag and strips a leading enumeration.

-const contentMatch = text.match(/```(.*?)```/s);
+// Allow an optional language tag after the opening back-ticks
+const contentMatch = text.match(/```(?:\w+\n)?([\s\S]*?)```/);
...
-.map((question) => question.trim())
+.map((q) => q.replace(/^\d+\.\s*/, "").trim())
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/suggestion.ts around lines 50 to 59,
the current regex for extracting text inside triple backticks does not handle
optional language tags or numbered prefixes in fenced code blocks, causing
incorrect question extraction. Update the regex to allow an optional language
tag after the opening backticks and modify the question processing to strip any
leading numeric enumeration (e.g., "1.") before trimming. This will ensure only
the actual questions are extracted cleanly.
  const questions = await generateNextQuestions(chatHistory);
  if (questions.length > 0) {
    streamWriter.writeMessageAnnotation({
      type: "suggested_questions",
      data: questions,
    });
  }
};
🛠️ Refactor suggestion

Await the write operation to guarantee delivery

`streamWriter.writeMessageAnnotation` is (in most transports) an async operation that returns a `Promise`. Not awaiting it means the enclosing `async` function can resolve before the data is flushed, which may cause the "suggested_questions" annotation to be dropped if the caller does not keep the stream open long enough.

  if (questions.length > 0) {
-   streamWriter.writeMessageAnnotation({
+   await streamWriter.writeMessageAnnotation({
      type: "suggested_questions",
      data: questions,
    });
  }
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original:
const questions = await generateNextQuestions(chatHistory);
if (questions.length > 0) {
  streamWriter.writeMessageAnnotation({
    type: "suggested_questions",
    data: questions,
  });
}
};

Suggested:
const questions = await generateNextQuestions(chatHistory);
if (questions.length > 0) {
  await streamWriter.writeMessageAnnotation({
    type: "suggested_questions",
    data: questions,
  });
}
};
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/suggestion.ts around lines 22 to 29,
the call to streamWriter.writeMessageAnnotation is asynchronous and returns a
Promise, but it is not awaited. To ensure the "suggested_questions" annotation
is properly delivered before the function resolves, add an await before
streamWriter.writeMessageAnnotation so the function waits for the write
operation to complete.
else if (agentToolCallResultEvent.include(event)) {
  const rawOutput = event.data.raw;
  if (
    rawOutput &&
    typeof rawOutput === "object" &&
    "sourceNodes" in rawOutput // TODO: better use Zod to validate and extract sourceNodes from toolCallResult
  ) {
    const sourceNodes =
      rawOutput.sourceNodes as unknown as NodeWithScore<Metadata>[];
    transformedEvent = toSourceEvent(sourceNodes);
  }
🛠️ Refactor suggestion

Validate `sourceNodes` instead of an unchecked cast

The `TODO` comment is spot-on: casting `rawOutput.sourceNodes` without structural validation risks runtime errors if the tool returns unexpected data. Leverage `zod` (already in the project) or a type guard before the cast.

-const sourceNodes =
-  rawOutput.sourceNodes as unknown as NodeWithScore<Metadata>[];
+const sourceNodesSchema = z.array(z.object({ node: z.any(), score: z.number().nullable() }));
+if (sourceNodesSchema.safeParse(rawOutput.sourceNodes).success) {
+  const sourceNodes = rawOutput.sourceNodes as NodeWithScore<Metadata>[];
+  transformedEvent = toSourceEvent(sourceNodes);
+}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Original:
else if (agentToolCallResultEvent.include(event)) {
  const rawOutput = event.data.raw;
  if (
    rawOutput &&
    typeof rawOutput === "object" &&
    "sourceNodes" in rawOutput // TODO: better use Zod to validate and extract sourceNodes from toolCallResult
  ) {
    const sourceNodes =
      rawOutput.sourceNodes as unknown as NodeWithScore<Metadata>[];
    transformedEvent = toSourceEvent(sourceNodes);
  }

Suggested:
else if (agentToolCallResultEvent.include(event)) {
  const rawOutput = event.data.raw;
  if (
    rawOutput &&
    typeof rawOutput === "object" &&
    "sourceNodes" in rawOutput // TODO: better use Zod to validate and extract sourceNodes from toolCallResult
  ) {
    const sourceNodesSchema = z.array(
      z.object({
        node: z.any(),
        score: z.number().nullable(),
      })
    );
    if (sourceNodesSchema.safeParse(rawOutput.sourceNodes).success) {
      const sourceNodes = rawOutput.sourceNodes as NodeWithScore<Metadata>[];
      transformedEvent = toSourceEvent(sourceNodes);
    }
  }
}
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/utils/workflow.ts around lines 65 to 75,
the code casts rawOutput.sourceNodes without validating its structure, risking
runtime errors. Replace the unchecked cast with a zod schema or a custom type
guard to validate that sourceNodes is an array of NodeWithScore<Metadata> before
using it. This ensures safe extraction and prevents errors from unexpected data
shapes.
export function extractAllArtifacts(messages: Message[]): Artifact[] {
  const allArtifacts: Artifact[] = [];

  for (const message of messages) {
    const artifacts =
      message.annotations
        ?.filter(
          (annotation) =>
            artifactAnnotationSchema.safeParse(annotation).success,
        )
        .map((artifact) => artifact as Artifact) ?? [];

    allArtifacts.push(...artifacts);
  }
`Message` from `ai` does not declare `annotations` – leads to type errors

`extractAllArtifacts` assumes `message.annotations` exists, but the imported `Message` interface (see template snippet) only has `id`, `content`, and `role`. This will fail strict TypeScript compilation unless `// @ts-ignore` is used elsewhere. Add an interface augmentation or a local union that includes the optional `annotations` field:

type AnnotatedMessage = Message & {
  annotations?: unknown[];
};
...
for (const message of messages as AnnotatedMessage[]) {
🤖 Prompt for AI Agents
In packages/server/next/app/api/chat/events.ts around lines 151 to 164, the
function extractAllArtifacts accesses message.annotations which is not declared
in the imported Message interface, causing TypeScript type errors. To fix this,
create a new type that extends Message with an optional annotations field, for
example AnnotatedMessage = Message & { annotations?: unknown[] }, and then cast
the messages parameter to AnnotatedMessage[] in the function to ensure type
safety without using ts-ignore.
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/server/src/server.ts (1)
Lines 28-30: Good solution for sharing server options with Next.js route handlers

Storing server options in `globalThis` is a pragmatic solution to make configuration accessible to the Next.js route handlers. This approach addresses the previous challenge discussed in the comments about sharing the workflow factory and components directory with route handlers.

However, be aware that modifying the global scope with the `any` type bypasses TypeScript's type safety. Consider a more type-safe approach:

- // eslint-disable-next-line @typescript-eslint/no-explicit-any
- (globalThis as any).serverOptions = options;
+ interface CustomGlobalThis extends GlobalThis {
+   serverOptions?: LlamaIndexServerOptions;
+ }
+ (globalThis as CustomGlobalThis).serverOptions = options;
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge Base: Disabled due to data retention organization setting
📒 Files selected for processing (2)
- packages/server/next/tsconfig.json (1 hunks)
- packages/server/src/server.ts (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/server/next/tsconfig.json
@@ -33,6 +25,9 @@ export class LlamaIndexServer {
     this.workflowFactory = workflow;
     this.componentsDir = options.uiConfig?.componentsDir;

+    // eslint-disable-next-line @typescript-eslint/no-explicit-any
TODO: this means we can't run multiple servers per node instance - this is ok, but we should document it.
why suddenly so much more code? was the old code not deleted?
also please look at the code rabbit hints
Yes, I will remove them, just keep for testing for now
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/create-llama/helpers/typescript.ts (1)
Line 516: Intentional dependency version pinning for stability

Changed from a caret range (`^3.1.3`) to an exact version (`3.1.3`) for `@llamaindex/readers`. Pinning the dependency is a good practice during architectural migrations to ensure consistent behavior across environments.

Consider adding a comment explaining why this specific version is pinned (if there are known compatibility issues with newer versions) and possibly creating a future task to periodically review pinned dependencies for security updates.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge Base: Disabled due to data retention organization setting
📒 Files selected for processing (2)
- packages/create-llama/helpers/typescript.ts (1 hunks)
- packages/create-llama/templates/types/llamaindexserver/nextjs/package.json (1 hunks)
✅ Files skipped from review due to trivial changes (1)
- packages/create-llama/templates/types/llamaindexserver/nextjs/package.json
Summary by CodeRabbit
New Features
Refactor
Documentation
Chores