
feat: enforce max queue deliveries in handlers with graceful failure#1344

Draft

pranaygp wants to merge 1 commit into main from pgp/handler-max-deliveries

Conversation

pranaygp (Collaborator) commented Mar 12, 2026

Summary

Replaces the VQS maxDeliveries: 64 cap with handler-level enforcement. Handlers now gracefully fail runs/steps after excessive queue redeliveries, preventing "phantom stuck" runs.

Stacked on #1342, #1340

Problem

When infrastructure is down (OOMs, network outages), VQS retries messages up to maxDeliveries: 64 times at 5s intervals. After exhausting retries, VQS drops the message — the run stays in running status forever with no error, no failure event.

Solution

  1. Remove maxDeliveries from VQS config — allow infinite retries at queue level
  2. Keep retryAfterSeconds: 5 — VQS owns retry timing (works even after SIGKILL/OOM)
  3. Handlers check metadata.attempt — when > MAX_QUEUE_DELIVERIES (64), fail gracefully with MAX_DELIVERIES_EXCEEDED error code
  4. If even the failure-event creation fails — log a detailed error and consume the message (no point in retrying further)
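
The handler-side check in step 3 can be sketched as follows. This is a minimal sketch: MAX_QUEUE_DELIVERIES (64) comes from the PR, but the metadata shape and the 'fail'/'proceed' return convention are illustrative assumptions, not the actual handler API.

```typescript
// Minimal sketch of the handler-level delivery check described above.
// MAX_QUEUE_DELIVERIES (64) matches the PR; the metadata shape and the
// 'fail'/'proceed' convention are illustrative assumptions.
const MAX_QUEUE_DELIVERIES = 64;

interface QueueMetadata {
  attempt: number; // delivery count supplied by the queue
}

// Returns 'fail' when the run/step should be failed gracefully with
// MAX_DELIVERIES_EXCEEDED, and 'proceed' otherwise.
function checkMaxDeliveries(metadata: QueueMetadata): 'fail' | 'proceed' {
  return metadata.attempt > MAX_QUEUE_DELIVERIES ? 'fail' : 'proceed';
}
```

Note the strict `>`: attempt 64 still proceeds, attempt 65 fails, which matches the unit tests in this PR.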

Queue error log examples (before → after)

Before (dumped full body, no run context):

[local world] Failed to queue message {
  queueName: '__wkf_step_...',
  text: '"WorkflowAPIError: Injected 5xx"',
  status: 500,
  headers: { ... },
  body: '{"workflowName":"...","workflowRunId":"wrun_01KKF...",
    "stepId":"step_01KKF...","traceCarrier":{...}}'
}

After (structured, includes run/step IDs, separates HTTP status from handler error):

[world-local] Queue message failed (attempt 3, HTTP 500) {
  queueName: '__wkf_step_...',
  messageId: 'msg_01KKF...',
  runId: 'wrun_01KKF...',
  stepId: 'step_01KKF...',
  handlerError: '"WorkflowAPIError: Injected 5xx"'
}
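
Assembling the structured "after" entry might look like this. The field names mirror the example output above; the helper itself is an illustrative assumption, not the actual world-local implementation.

```typescript
// Sketch of building the structured log line shown above. Field names
// mirror the example; this helper is illustrative, not the real code.
interface QueueFailureContext {
  queueName: string;
  messageId: string;
  runId: string;
  stepId: string;
  handlerError: string;
}

function formatQueueFailure(
  attempt: number,
  httpStatus: number,
  ctx: QueueFailureContext
): string {
  // Keep the HTTP status separate from the handler error, and include
  // run/step IDs so the log is traceable to a specific run.
  return `[world-local] Queue message failed (attempt ${attempt}, HTTP ${httpStatus}) ${JSON.stringify(ctx)}`;
}
```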

Local world queue

  • Removed hardcoded 3-retry cap → 1000 safety limit (handler enforces the real limit at 64)
  • Matches production VQS behavior
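
The safety-limit loop described above can be sketched like this. The cap only guards against infinite loops in the local queue; the real 64-delivery limit is enforced by the handler via the attempt count. All names here are illustrative.

```typescript
// Sketch of the local queue's safety-capped retry loop. The cap only
// prevents infinite loops; the real 64-delivery limit is enforced by the
// handler via the attempt count. Names are illustrative.
const MAX_LOCAL_SAFETY_LIMIT = 1000;

async function deliverWithSafetyCap(
  handle: (attempt: number) => Promise<boolean> // true = message consumed
): Promise<number> {
  for (let attempt = 0; attempt < MAX_LOCAL_SAFETY_LIMIT; attempt++) {
    if (await handle(attempt)) return attempt; // consumed on this attempt
  }
  throw new Error('local queue safety limit reached');
}
```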

Test plan

  • 3 new unit tests for step handler max delivery enforcement
  • All core tests pass
  • All world-local tests pass
  • E2E: persistent failure → failed with MAX_DELIVERIES_EXCEEDED
  • E2E: transient failure → normal completion

🤖 Generated with Claude Code


changeset-bot bot commented Mar 12, 2026

🦋 Changeset detected

Latest commit: d5b4132

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 20 packages
Name Type
@workflow/errors Patch
@workflow/core Patch
@workflow/world-local Patch
@workflow/builders Patch
@workflow/sveltekit Patch
@workflow/cli Patch
workflow Patch
@workflow/world-postgres Patch
@workflow/world-vercel Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/vitest Patch
@workflow/web-shared Patch
@workflow/world-testing Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/vite Patch
@workflow/ai Patch
@workflow/nuxt Patch



vercel bot commented Mar 12, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
example-nextjs-workflow-turbopack Ready Ready Preview, Comment Mar 20, 2026 11:33pm
example-nextjs-workflow-webpack Ready Ready Preview, Comment Mar 20, 2026 11:33pm
example-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-astro-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-express-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-fastify-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-hono-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-nestjs-workflow Error Error Mar 20, 2026 11:33pm
workbench-nitro-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-nuxt-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-sveltekit-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workbench-vite-workflow Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workflow-docs Ready Ready Preview, Comment, Open in v0 Mar 20, 2026 11:33pm
workflow-nest Ready Ready Preview, Comment Mar 20, 2026 11:33pm
workflow-swc-playground Ready Ready Preview, Comment Mar 20, 2026 11:33pm


github-actions bot commented Mar 12, 2026

🧪 E2E Test Results

Some tests failed

Summary

Passed Failed Skipped Total
❌ ▲ Vercel Production 729 1 60 790
✅ 💻 Local Development 613 0 98 711
✅ 📦 Local Production 613 0 98 711
✅ 🐘 Local Postgres 613 0 98 711
✅ 🪟 Windows 74 0 5 79
❌ 🌍 Community Worlds 122 58 21 201
✅ 📋 Other 204 0 33 237
Total 2968 59 413 3440

❌ Failed Tests

▲ Vercel Production (1 failed)

nextjs-turbopack (1 failed):

  • sleepInLoopWorkflow - sleep inside loop with steps actually delays each iteration | wrun_01KM6SXFEX0MTJ2N4Q1QN624D7 | 🔍 observability
🌍 Community Worlds (58 failed)

mongodb (3 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM6SH338486Q0PNZKGNMCHCK
  • webhookWorkflow | wrun_01KM6SHARJZC2J1YJV4CW5EH8B
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM6SQEYXBKXG3PBM5YG7K6RG

redis (2 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM6SH338486Q0PNZKGNMCHCK
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM6SQEYXBKXG3PBM5YG7K6RG

turso (53 failed):

  • addTenWorkflow | wrun_01KM6SFZPG9WQP93VX2KGPVF6A
  • addTenWorkflow | wrun_01KM6SFZPG9WQP93VX2KGPVF6A
  • wellKnownAgentWorkflow (.well-known/agent) | wrun_01KM6SJ53R0V199AMHRV4G2VMX
  • should work with react rendering in step
  • promiseAllWorkflow | wrun_01KM6SG65KP0EZ0FBT7ZMX5210
  • promiseRaceWorkflow | wrun_01KM6SGAZJ2RD8Y1HNSSN2TTHC
  • promiseAnyWorkflow | wrun_01KM6SGD1KY3MCCC6H1R20ETXC
  • importedStepOnlyWorkflow | wrun_01KM6SJGYN5NJXDZB4SSP9P85S
  • hookWorkflow | wrun_01KM6SGSJ3V0RPB3E9MVQK72DM
  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM6SH338486Q0PNZKGNMCHCK
  • webhookWorkflow | wrun_01KM6SHARJZC2J1YJV4CW5EH8B
  • sleepingWorkflow | wrun_01KM6SHG9HT19SSKWHPCX10H57
  • parallelSleepWorkflow | wrun_01KM6SHWX3Z5HRSYTZ59YGDDTE
  • nullByteWorkflow | wrun_01KM6SJ0C4AXRY9GX6YQWX7YH2
  • workflowAndStepMetadataWorkflow | wrun_01KM6SJ2X0NCJ1B0AVPP5DQB0D
  • fetchWorkflow | wrun_01KM6SKXSP6050RNV27W64PMM8
  • promiseRaceStressTestWorkflow | wrun_01KM6SM16JNSFADG4X11CW1QWX
  • error handling error propagation workflow errors nested function calls preserve message and stack trace
  • error handling error propagation workflow errors cross-file imports preserve message and stack trace
  • error handling error propagation step errors basic step error preserves message and stack trace
  • error handling error propagation step errors cross-file step error preserves message and function names in stack
  • error handling retry behavior regular Error retries until success
  • error handling retry behavior FatalError fails immediately without retries
  • error handling retry behavior RetryableError respects custom retryAfter delay
  • error handling retry behavior maxRetries=0 disables retries
  • error handling catchability FatalError can be caught and detected with FatalError.is()
  • hookCleanupTestWorkflow - hook token reuse after workflow completion | wrun_01KM6SPT8EB1K7AZWKMT8F5X04
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM6SQEYXBKXG3PBM5YG7K6RG
  • hookDisposeTestWorkflow - hook token reuse after explicit disposal while workflow still running | wrun_01KM6SR3C8315GDTQW2DEXNDR9
  • stepFunctionPassingWorkflow - step function references can be passed as arguments (without closure vars) | wrun_01KM6SRPD1S26ACDG2VXJ0WH5S
  • stepFunctionWithClosureWorkflow - step function with closure variables passed as argument | wrun_01KM6SRYQ7ERNPHQ5938DVN366
  • closureVariableWorkflow - nested step functions with closure variables | wrun_01KM6SS48V5B1B6ZWWQ3JGM3X6
  • spawnWorkflowFromStepWorkflow - spawning a child workflow using start() inside a step | wrun_01KM6SS6EYWHE9FPD45F637H9G
  • startFromWorkflow - calling start() directly inside a workflow function with hook communication | wrun_01KM6SSGXJCS9YYFYP0QJKTKVG
  • fibonacciWorkflow - recursive workflow composition via start() | wrun_01KM6SSKP1V45WHGS5N7NPJN2E
  • health check (queue-based) - workflow and step endpoints respond to health check messages
  • pathsAliasWorkflow - TypeScript path aliases resolve correctly | wrun_01KM6ST6NASJT4ENABV6JWCDYF
  • Calculator.calculate - static workflow method using static step methods from another class | wrun_01KM6STD2V193JFJBCT5C1WQJF
  • AllInOneService.processNumber - static workflow method using sibling static step methods | wrun_01KM6STJ7KWBEC1RCHVYP7W3MT
  • ChainableService.processWithThis - static step methods using this to reference the class | wrun_01KM6STSP85G6JBBAVS97MDR3M
  • thisSerializationWorkflow - step function invoked with .call() and .apply() | wrun_01KM6SV002QH7C03E61KHS6E8M
  • customSerializationWorkflow - custom class serialization with WORKFLOW_SERIALIZE/WORKFLOW_DESERIALIZE | wrun_01KM6SV6B8TMS74CDTGAAQK0VV
  • instanceMethodStepWorkflow - instance methods with "use step" directive | wrun_01KM6SVE54FNKN0B9P9PF43Q7F
  • crossContextSerdeWorkflow - classes defined in step code are deserializable in workflow context | wrun_01KM6SVR9QBB8CPXT1KEQ7GBBN
  • stepFunctionAsStartArgWorkflow - step function reference passed as start() argument | wrun_01KM6SVZW1JNE1HTGTBB5VNRDT
  • cancelRun - cancelling a running workflow | wrun_01KM6SW65WE0JB5QVY6VVG1EY4
  • cancelRun via CLI - cancelling a running workflow | wrun_01KM6SWF59K5YB3WTWXMR2PTDA
  • pages router addTenWorkflow via pages router
  • pages router promiseAllWorkflow via pages router
  • pages router sleepingWorkflow via pages router
  • hookWithSleepWorkflow - hook payloads delivered correctly with concurrent sleep | wrun_01KM6SWTQMPM0QSVA1G9BB0FFA
  • sleepInLoopWorkflow - sleep inside loop with steps actually delays each iteration | wrun_01KM6SXFEX0MTJ2N4Q1QN624D7
  • sleepWithSequentialStepsWorkflow - sequential steps work with concurrent sleep (control) | wrun_01KM6SXSM6R1AQC8Y5APVXKNVX

Details by Category

❌ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 72 0 7
✅ example 72 0 7
✅ express 72 0 7
✅ fastify 72 0 7
✅ hono 72 0 7
❌ nextjs-turbopack 76 1 2
✅ nextjs-webpack 77 0 2
✅ nitro 72 0 7
✅ nuxt 72 0 7
✅ vite 72 0 7
✅ 💻 Local Development
App Passed Failed Skipped
✅ express-stable 68 0 11
✅ fastify-stable 68 0 11
✅ hono-stable 68 0 11
✅ nextjs-turbopack-stable 74 0 5
✅ nextjs-webpack-canary 57 0 22
✅ nextjs-webpack-stable 74 0 5
✅ nitro-stable 68 0 11
✅ nuxt-stable 68 0 11
✅ vite-stable 68 0 11
✅ 📦 Local Production
App Passed Failed Skipped
✅ express-stable 68 0 11
✅ fastify-stable 68 0 11
✅ hono-stable 68 0 11
✅ nextjs-turbopack-stable 74 0 5
✅ nextjs-webpack-canary 57 0 22
✅ nextjs-webpack-stable 74 0 5
✅ nitro-stable 68 0 11
✅ nuxt-stable 68 0 11
✅ vite-stable 68 0 11
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ express-stable 68 0 11
✅ fastify-stable 68 0 11
✅ hono-stable 68 0 11
✅ nextjs-turbopack-stable 74 0 5
✅ nextjs-webpack-canary 57 0 22
✅ nextjs-webpack-stable 74 0 5
✅ nitro-stable 68 0 11
✅ nuxt-stable 68 0 11
✅ vite-stable 68 0 11
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 74 0 5
❌ 🌍 Community Worlds
App Passed Failed Skipped
✅ mongodb-dev 3 0 2
❌ mongodb 54 3 5
✅ redis-dev 3 0 2
❌ redis 55 2 5
✅ turso-dev 3 0 2
❌ turso 4 53 5
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 68 0 11
✅ e2e-local-postgres-nest-stable 68 0 11
✅ e2e-local-prod-nest-stable 68 0 11

📋 View full workflow run


Some E2E test jobs failed:

  • Vercel Prod: failure
  • Local Dev: failure
  • Local Prod: failure
  • Local Postgres: failure
  • Windows: success

Check the workflow run for details.


pranaygp commented Mar 12, 2026

Warning

This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.

This stack of pull requests is managed by Graphite.

vercel bot left a comment

Additional Suggestion:

SvelteKit package has hardcoded maxDeliveries: 64 on queue triggers, causing VQS to silently drop messages before the handler can gracefully fail runs/steps.
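
The suggested fix amounts to dropping the cap from the trigger config. A hypothetical fragment (the exact option names are assumptions based on this PR's description, not the actual SvelteKit package API):

```typescript
// Hypothetical trigger config fragment illustrating the suggested fix:
// retryAfterSeconds stays, and maxDeliveries is omitted so the handler's
// 64-delivery check (MAX_DELIVERIES_EXCEEDED) is the only cap.
const queueTriggerConfig: { retryAfterSeconds: number; maxDeliveries?: number } = {
  retryAfterSeconds: 5,
  // maxDeliveries intentionally omitted: VQS retries indefinitely and the
  // handler fails the run gracefully after 64 deliveries.
};
```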


pranaygp force-pushed the pgp/handler-max-deliveries branch from c2ed3e7 to 085a05a on March 17, 2026 at 22:37
Copilot AI left a comment

Pull request overview

This PR moves enforcement of the queue delivery cap from the Vercel Queue trigger configuration into the workflow/step runtime handlers, and updates local-queue behavior/logging to support the new approach.

Changes:

  • Add a shared MAX_QUEUE_DELIVERIES constant and enforce it in both workflow and step handlers with graceful failure (run_failed / step_failed + requeue workflow for step).
  • Remove maxDeliveries from queue trigger definitions in @workflow/builders.
  • Improve world-local queue logging with runId/stepId context and add a local retry safety limit.

Reviewed changes

Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.

Show a summary per file
File Description
packages/world-local/src/queue.ts Adds structured identifiers to logs and replaces the old retry counter with a fixed safety-loop cap.
packages/errors/src/error-codes.ts Introduces MAX_DELIVERIES_EXCEEDED run error code.
packages/core/src/runtime/step-handler.ts Enforces max deliveries for steps and adjusts event creation/logging behavior.
packages/core/src/runtime/step-handler.test.ts Adds test coverage for step max-deliveries behavior.
packages/core/src/runtime/constants.ts Defines MAX_QUEUE_DELIVERIES.
packages/core/src/runtime.ts Enforces max deliveries for workflow handler and records run_failed with a specific error code.
packages/builders/src/constants.ts Removes VQS maxDeliveries from trigger constants.
.changeset/handler-max-deliveries.md Changeset describing the behavior shift from VQS config to handler enforcement.


Comment on lines +172 to +176

const startResult = await world.events.create(workflowRunId, {
  eventType: 'step_started',
  specVersion: SPEC_VERSION_CURRENT,
  correlationId: stepId,
});
Comment on lines +120 to +131

try {
  const world = getWorld();
  await world.events.create(runId, {
    eventType: 'run_failed',
    specVersion: SPEC_VERSION_CURRENT,
    eventData: {
      error: {
        message: `Workflow exceeded maximum queue deliveries (${metadata.attempt}/${MAX_QUEUE_DELIVERIES})`,
      },
      errorCode: RUN_ERROR_CODES.MAX_DELIVERIES_EXCEEDED,
    },
  });
Comment on lines +117 to +121

  // Safety limit to prevent infinite loops in the local queue.
  // The actual max delivery enforcement happens in the workflow/step handlers.
  const MAX_LOCAL_SAFETY_LIMIT = 1000;
  try {
-   let defaultRetriesLeft = 3;
-   for (let attempt = 0; defaultRetriesLeft > 0; attempt++) {
-     defaultRetriesLeft--;
+   for (let attempt = 0; attempt < MAX_LOCAL_SAFETY_LIMIT; attempt++) {
pranaygp (Collaborator, Author) replied:

Yeah, there's no reason this has to be 1000. It just needs to be higher than the max queue attempts. Let's go with 256.

Comment on lines +529 to +568

it('should post step_failed and re-queue workflow when delivery count exceeds max', async () => {
  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 65 }
  );

  expect(result).toBeUndefined();
  expect(mockEventsCreate).toHaveBeenCalledWith(
    'wrun_test123',
    expect.objectContaining({
      eventType: 'step_failed',
      correlationId: 'step_abc',
    })
  );
  expect(mockQueueMessage).toHaveBeenCalled();
  expect(mockRuntimeLogger.error).toHaveBeenCalledWith(
    expect.stringContaining('exceeded max deliveries'),
    expect.objectContaining({ workflowRunId: 'wrun_test123' })
  );
});

it('should consume message silently when step_failed fails with EntityConflictError', async () => {
  mockEventsCreate.mockRejectedValue(
    new EntityConflictError('Step already completed')
  );

  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 65 }
  );

  expect(result).toBeUndefined();
  expect(mockStepFn).not.toHaveBeenCalled();
});

it('should not trigger max deliveries check when under limit', async () => {
  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 64 }
  );
Replace VQS maxDeliveries cap with handler-level enforcement. Handlers
now gracefully fail runs/steps after excessive queue redeliveries,
preventing "phantom stuck" runs.

- Add MAX_QUEUE_DELIVERIES constant (64) and enforce in both workflow
  and step handlers with run_failed/step_failed events
- Remove maxDeliveries from VQS trigger configs (builders + sveltekit)
- Improve world-local queue: safety limit loop, structured logging
  with runId/stepId, backoff delay on failures
- Add MAX_DELIVERIES_EXCEEDED error code

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

github-actions bot commented Mar 20, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 0.040s (-15.4% 🟢) 1.005s (~) 0.964s 10 1.00x
💻 Local Express 0.043s (-3.3%) 1.005s (~) 0.961s 10 1.08x
💻 Local Next.js (Turbopack) 0.049s 1.005s 0.955s 10 1.23x
🐘 Postgres Express 0.056s (-21.1% 🟢) 1.012s (-0.5%) 0.956s 10 1.40x
🌐 Redis Next.js (Turbopack) 0.057s 1.006s 0.949s 10 1.42x
🐘 Postgres Next.js (Turbopack) 0.059s 1.011s 0.952s 10 1.47x
🐘 Postgres Nitro 0.064s (-7.4% 🟢) 1.011s (~) 0.947s 10 1.60x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.506s (+1.9%) 1.969s (-22.6% 🟢) 1.464s 10 1.00x
▲ Vercel Next.js (Turbopack) 0.548s (-24.1% 🟢) 2.490s (-3.8%) 1.943s 10 1.08x
▲ Vercel Express 0.774s (+34.4% 🔺) 2.618s (+2.4%) 1.844s 10 1.53x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 1.098s (-2.8%) 2.005s (~) 0.907s 10 1.00x
💻 Local Next.js (Turbopack) 1.121s 2.006s 0.885s 10 1.02x
💻 Local Express 1.125s (~) 2.005s (~) 0.880s 10 1.02x
🐘 Postgres Next.js (Turbopack) 1.137s 2.012s 0.875s 10 1.04x
🌐 Redis Next.js (Turbopack) 1.139s 2.006s 0.867s 10 1.04x
🐘 Postgres Express 1.147s (+0.7%) 2.012s (~) 0.864s 10 1.04x
🐘 Postgres Nitro 1.147s (~) 2.011s (~) 0.864s 10 1.05x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.998s (-14.5% 🟢) 3.205s (-17.3% 🟢) 1.207s 10 1.00x
▲ Vercel Next.js (Turbopack) 2.073s (-10.8% 🟢) 3.969s (+4.2%) 1.897s 10 1.04x
▲ Vercel Express 2.220s (+0.6%) 3.830s (-1.6%) 1.610s 10 1.11x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 10.644s (-2.3%) 11.022s (~) 0.379s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.764s 11.024s 0.260s 3 1.01x
💻 Local Next.js (Turbopack) 10.794s 11.022s 0.228s 3 1.01x
🐘 Postgres Nitro 10.908s (~) 11.038s (~) 0.130s 3 1.02x
💻 Local Express 10.919s (~) 11.022s (~) 0.103s 3 1.03x
🐘 Postgres Express 10.925s (~) 11.040s (~) 0.115s 3 1.03x
🐘 Postgres Next.js (Turbopack) 10.935s 11.039s 0.104s 3 1.03x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 16.543s (-4.8%) 18.676s (-3.6%) 2.133s 2 1.00x
▲ Vercel Next.js (Turbopack) 17.488s (-1.5%) 19.378s (-2.1%) 1.890s 2 1.06x
▲ Vercel Nitro 17.694s (+1.7%) 18.809s (~) 1.115s 2 1.07x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 14.197s (-4.8%) 15.027s (~) 0.830s 4 1.00x
🌐 Redis Next.js (Turbopack) 14.281s 15.029s 0.748s 4 1.01x
🐘 Postgres Next.js (Turbopack) 14.494s 15.043s 0.549s 4 1.02x
💻 Local Next.js (Turbopack) 14.629s 15.029s 0.400s 4 1.03x
🐘 Postgres Nitro 14.688s (-0.9%) 15.042s (~) 0.354s 4 1.03x
🐘 Postgres Express 14.847s (+1.9%) 15.047s (~) 0.200s 4 1.05x
💻 Local Express 14.935s (~) 15.028s (-3.2%) 0.093s 4 1.05x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 32.896s (+1.3%) 34.447s (-0.7%) 1.551s 2 1.00x
▲ Vercel Express 34.233s (+1.4%) 35.909s (+0.9%) 1.675s 2 1.04x
▲ Vercel Nitro 34.365s (+6.3% 🔺) 35.819s (+3.1%) 1.454s 2 1.04x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 13.509s 14.026s 0.517s 7 1.00x
🐘 Postgres Next.js (Turbopack) 13.987s 14.469s 0.482s 7 1.04x
🐘 Postgres Nitro 14.191s (-1.3%) 15.040s (~) 0.849s 6 1.05x
🐘 Postgres Express 14.329s (+2.3%) 15.041s (+4.0%) 0.712s 6 1.06x
💻 Local Nitro 14.953s (-9.1% 🟢) 15.027s (-11.8% 🟢) 0.075s 6 1.11x
💻 Local Next.js (Turbopack) 16.090s 17.031s 0.942s 6 1.19x
💻 Local Express 16.527s (-0.9%) 17.030s (~) 0.502s 6 1.22x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 56.636s (-9.8% 🟢) 59.087s (-8.3% 🟢) 2.451s 2 1.00x
▲ Vercel Nitro 61.007s (+8.5% 🔺) 62.250s (+8.7% 🔺) 1.243s 2 1.08x
▲ Vercel Express 63.384s (+9.9% 🔺) 65.268s (+9.8% 🔺) 1.884s 2 1.12x

🔍 Observability: Next.js (Turbopack) | Nitro | Express

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 1.266s 2.011s 0.745s 15 1.00x
🐘 Postgres Nitro 1.273s (-1.6%) 2.011s (~) 0.738s 15 1.01x
🐘 Postgres Express 1.278s (+0.9%) 2.011s (~) 0.734s 15 1.01x
🌐 Redis Next.js (Turbopack) 1.308s 2.007s 0.699s 15 1.03x
💻 Local Nitro 1.486s (~) 2.006s (~) 0.520s 15 1.17x
💻 Local Next.js (Turbopack) 1.511s 2.006s 0.495s 15 1.19x
💻 Local Express 1.539s (+2.1%) 2.006s (~) 0.467s 15 1.22x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.349s (-10.3% 🟢) 3.560s (-15.2% 🟢) 1.211s 9 1.00x
▲ Vercel Next.js (Turbopack) 2.475s (-0.7%) 4.052s (+3.9%) 1.577s 8 1.05x
▲ Vercel Express 2.625s (-4.9%) 4.150s (-10.1% 🟢) 1.525s 8 1.12x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 2.433s (-1.5%) 3.012s (~) 0.579s 10 1.00x
🐘 Postgres Express 2.445s (-0.6%) 3.012s (~) 0.567s 10 1.01x
🐘 Postgres Next.js (Turbopack) 2.477s 3.011s 0.534s 10 1.02x
🌐 Redis Next.js (Turbopack) 2.557s 3.008s 0.451s 10 1.05x
💻 Local Nitro 2.836s (-1.2%) 3.108s (~) 0.272s 10 1.17x
💻 Local Express 2.941s (-1.9%) 3.341s (-6.3% 🟢) 0.400s 9 1.21x
💻 Local Next.js (Turbopack) 3.036s 3.675s 0.639s 9 1.25x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.688s (+2.2%) 4.113s (+2.2%) 1.426s 8 1.00x
▲ Vercel Express 3.209s (+12.4% 🔺) 4.575s (+2.5%) 1.366s 7 1.19x
▲ Vercel Nitro 3.212s (+26.6% 🔺) 4.377s (+14.9% 🔺) 1.165s 7 1.20x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 3.593s (-1.1%) 4.012s (~) 0.419s 8 1.00x
🐘 Postgres Express 3.602s (~) 4.014s (~) 0.412s 8 1.00x
🐘 Postgres Next.js (Turbopack) 3.711s 4.015s 0.304s 8 1.03x
🌐 Redis Next.js (Turbopack) 4.220s 5.012s 0.793s 6 1.17x
💻 Local Nitro 6.743s (-17.0% 🟢) 7.515s (-14.3% 🟢) 0.772s 4 1.88x
💻 Local Next.js (Turbopack) 7.543s 8.015s 0.472s 4 2.10x
💻 Local Express 8.219s (-3.0%) 8.771s (-2.8%) 0.552s 4 2.29x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 3.114s (-5.2% 🟢) 4.447s (-4.7%) 1.333s 7 1.00x
▲ Vercel Nitro 3.285s (+7.9% 🔺) 4.607s (+3.1%) 1.322s 7 1.05x
▲ Vercel Next.js (Turbopack) 3.626s (+8.1% 🔺) 5.351s (+6.8% 🔺) 1.725s 6 1.16x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 1.252s 2.011s 0.758s 15 1.00x
🐘 Postgres Express 1.262s (~) 2.011s (~) 0.750s 15 1.01x
🐘 Postgres Nitro 1.266s (-0.8%) 2.010s (~) 0.744s 15 1.01x
🌐 Redis Next.js (Turbopack) 1.308s 2.006s 0.699s 15 1.04x
💻 Local Nitro 1.512s (-3.1%) 2.005s (~) 0.493s 15 1.21x
💻 Local Next.js (Turbopack) 1.554s 2.006s 0.452s 15 1.24x
💻 Local Express 1.579s (+2.7%) 2.006s (~) 0.427s 15 1.26x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.132s (-3.5%) 3.538s (-8.5% 🟢) 1.405s 9 1.00x
▲ Vercel Nitro 2.583s (+21.5% 🔺) 4.005s (+8.5% 🔺) 1.423s 8 1.21x
▲ Vercel Next.js (Turbopack) 2.624s (+0.9%) 4.281s (+6.7% 🔺) 1.657s 8 1.23x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 2.447s (~) 3.013s (~) 0.566s 10 1.00x
🐘 Postgres Express 2.450s (~) 3.012s (~) 0.562s 10 1.00x
🐘 Postgres Next.js (Turbopack) 2.452s 3.011s 0.559s 10 1.00x
🌐 Redis Next.js (Turbopack) 2.594s 3.008s 0.415s 10 1.06x
💻 Local Nitro 2.655s (-12.3% 🟢) 3.007s (-18.2% 🟢) 0.352s 10 1.09x
💻 Local Next.js (Turbopack) 3.006s 3.675s 0.669s 9 1.23x
💻 Local Express 3.044s (~) 4.011s (+6.7% 🔺) 0.967s 8 1.24x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.643s (-1.3%) 4.099s (-11.1% 🟢) 1.457s 8 1.00x
▲ Vercel Next.js (Turbopack) 2.702s (+13.6% 🔺) 4.115s (+12.4% 🔺) 1.413s 8 1.02x
▲ Vercel Nitro 3.304s (+27.7% 🔺) 4.720s (+22.2% 🔺) 1.416s 7 1.25x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 3.597s (~) 4.015s (~) 0.418s 8 1.00x
🐘 Postgres Nitro 3.611s (~) 4.014s (~) 0.403s 8 1.00x
🐘 Postgres Next.js (Turbopack) 3.732s 4.015s 0.282s 8 1.04x
🌐 Redis Next.js (Turbopack) 4.175s 5.012s 0.837s 6 1.16x
💻 Local Nitro 7.446s (-13.9% 🟢) 8.017s (-11.2% 🟢) 0.571s 4 2.07x
💻 Local Next.js (Turbopack) 8.700s 9.268s 0.568s 4 2.42x
💻 Local Express 8.912s (-1.6%) 9.523s (-2.6%) 0.610s 4 2.48x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.094s (+9.4% 🔺) 4.387s (-1.0%) 1.293s 7 1.00x
▲ Vercel Express 3.257s (+6.2% 🔺) 5.089s (+12.0% 🔺) 1.832s 6 1.05x
▲ Vercel Next.js (Turbopack) 3.894s (-0.6%) 5.524s (+0.6%) 1.630s 6 1.26x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 10 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 0.689s (-27.9% 🟢) 1.004s (-1.7%) 0.315s 60 1.00x
🌐 Redis Next.js (Turbopack) 0.732s 1.005s 0.273s 60 1.06x
🐘 Postgres Next.js (Turbopack) 0.804s 1.026s 0.222s 59 1.17x
🐘 Postgres Nitro 0.866s (-3.5%) 1.009s (-3.3%) 0.143s 60 1.26x
🐘 Postgres Express 0.882s (+1.9%) 1.009s (-1.6%) 0.127s 60 1.28x
💻 Local Next.js (Turbopack) 0.900s 1.057s 0.158s 57 1.31x
💻 Local Express 0.978s (-0.5%) 1.057s (-3.5%) 0.079s 57 1.42x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 9.254s (-6.9% 🟢) 10.700s (-8.5% 🟢) 1.446s 6 1.00x
▲ Vercel Nitro 9.438s (-1.1%) 10.848s (-0.8%) 1.409s 6 1.02x
▲ Vercel Next.js (Turbopack) 10.176s (-3.1%) 12.573s (+2.0%) 2.397s 5 1.10x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

workflow with 25 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.729s 2.006s 0.277s 45 1.00x
🐘 Postgres Next.js (Turbopack) 1.972s 2.260s 0.288s 40 1.14x
🐘 Postgres Nitro 2.079s (-3.8%) 2.979s (~) 0.900s 31 1.20x
🐘 Postgres Express 2.142s (+1.2%) 3.012s (~) 0.870s 30 1.24x
💻 Local Nitro 2.231s (-24.5% 🟢) 3.007s (-3.3%) 0.776s 30 1.29x
💻 Local Next.js (Turbopack) 2.803s 3.041s 0.238s 30 1.62x
💻 Local Express 3.014s (~) 3.508s (-1.1%) 0.494s 26 1.74x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 29.829s (-3.9%) 31.129s (-4.2%) 1.301s 3 1.00x
▲ Vercel Express 30.346s (-8.7% 🟢) 31.926s (-9.2% 🟢) 1.580s 3 1.02x
▲ Vercel Next.js (Turbopack) 31.450s (-3.7%) 32.944s (-3.6%) 1.495s 3 1.05x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 50 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 3.360s 4.009s 0.648s 30 1.00x
🐘 Postgres Next.js (Turbopack) 4.012s 4.370s 0.358s 28 1.19x
🐘 Postgres Nitro 4.282s (-1.2%) 5.056s (+0.8%) 0.774s 24 1.27x
🐘 Postgres Express 4.343s (+1.9%) 5.014s (+0.8%) 0.671s 24 1.29x
💻 Local Nitro 7.304s (-19.1% 🟢) 8.015s (-15.4% 🟢) 0.710s 15 2.17x
💻 Local Next.js (Turbopack) 8.505s 9.017s 0.512s 14 2.53x
💻 Local Express 9.113s (~) 9.479s (-3.2%) 0.366s 13 2.71x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 79.103s (-10.2% 🟢) 80.676s (-10.1% 🟢) 1.573s 2 1.00x
▲ Vercel Nitro 81.883s (-11.3% 🟢) 83.062s (-11.9% 🟢) 1.179s 2 1.04x
▲ Vercel Next.js (Turbopack) 82.255s (-15.6% 🟢) 84.305s (-15.1% 🟢) 2.050s 2 1.04x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

workflow with 10 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.264s 1.009s 0.744s 60 1.00x
🐘 Postgres Nitro 0.294s (~) 1.009s (~) 0.715s 60 1.11x
🐘 Postgres Express 0.302s (~) 1.009s (~) 0.707s 60 1.14x
🌐 Redis Next.js (Turbopack) 0.369s 1.005s 0.635s 60 1.40x
💻 Local Next.js (Turbopack) 0.556s 1.004s 0.449s 60 2.10x
💻 Local Nitro 0.592s (-1.0%) 1.004s (~) 0.413s 60 2.24x
💻 Local Express 0.596s (-1.1%) 1.005s (~) 0.408s 60 2.25x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.592s (-31.3% 🟢) 3.016s (-22.0% 🟢) 1.424s 21 1.00x
▲ Vercel Express 1.739s (-30.8% 🟢) 3.040s (-24.5% 🟢) 1.302s 20 1.09x
▲ Vercel Next.js (Turbopack) 2.618s (+37.8% 🔺) 4.505s (+25.9% 🔺) 1.887s 14 1.64x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 25 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 0.530s (-1.7%) 1.009s (~) 0.479s 90 1.00x
🐘 Postgres Next.js (Turbopack) 0.544s 1.009s 0.465s 90 1.03x
🐘 Postgres Express 0.551s (+4.4%) 1.010s (~) 0.459s 90 1.04x
🌐 Redis Next.js (Turbopack) 1.195s 2.006s 0.811s 45 2.25x
💻 Local Nitro 2.406s (-2.8%) 3.007s (~) 0.601s 30 4.54x
💻 Local Next.js (Turbopack) 2.516s 3.008s 0.493s 30 4.75x
💻 Local Express 2.606s (+1.0%) 3.043s (+1.1%) 0.437s 30 4.92x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.732s (-5.7% 🟢) 3.939s (-6.6% 🟢) 1.207s 23 1.00x
▲ Vercel Express 2.981s (-3.2%) 4.443s (-1.2%) 1.462s 21 1.09x
▲ Vercel Next.js (Turbopack) 3.270s (-4.9%) 4.785s (~) 1.516s 19 1.20x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 50 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.903s 1.104s 0.202s 109 1.00x
🐘 Postgres Nitro 0.909s (-3.2%) 1.078s (-9.6% 🟢) 0.169s 112 1.01x
🐘 Postgres Express 0.930s (+3.5%) 1.205s (+15.8% 🔺) 0.275s 100 1.03x
🌐 Redis Next.js (Turbopack) 2.774s 3.033s 0.259s 40 3.07x
💻 Local Nitro 10.148s (-8.7% 🟢) 10.856s (-6.9% 🟢) 0.708s 12 11.24x
💻 Local Next.js (Turbopack) 10.579s 11.026s 0.446s 11 11.72x
💻 Local Express 11.037s (-2.2%) 11.667s (-3.0%) 0.630s 11 12.23x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 6.874s (-5.5% 🟢) 8.621s (-3.7%) 1.747s 14 1.00x
▲ Vercel Next.js (Turbopack) 12.637s (+61.2% 🔺) 14.065s (+50.1% 🔺) 1.428s 9 1.84x
▲ Vercel Nitro 158.077s (+1790.5% 🔺) 159.247s (+1533.5% 🔺) 1.170s 2 23.00x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Nitro 0.145s (-26.1% 🟢) 1.003s (~) 0.010s (-16.9% 🟢) 1.016s (~) 0.871s 10 1.00x
💻 Local Next.js (Turbopack) 0.180s 1.002s 0.012s 1.018s 0.839s 10 1.24x
🌐 Redis Next.js (Turbopack) 0.182s 1.000s 0.002s 1.008s 0.826s 10 1.25x
💻 Local Express 0.196s (-4.1%) 1.003s (~) 0.011s (-8.4% 🟢) 1.017s (~) 0.821s 10 1.35x
🐘 Postgres Next.js (Turbopack) 0.198s 1.001s 0.001s 1.011s 0.814s 10 1.36x
🐘 Postgres Nitro 0.222s (-7.2% 🟢) 0.996s (~) 0.002s (+28.6% 🔺) 1.013s (~) 0.791s 10 1.53x
🐘 Postgres Express 0.223s (+3.0%) 0.993s (~) 0.002s (+50.0% 🔺) 1.013s (~) 0.790s 10 1.54x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.532s (-7.2% 🟢) 2.536s (-5.4% 🟢) 0.619s (+10211.7% 🔺) 3.564s (+8.9% 🔺) 2.032s 10 1.00x
▲ Vercel Express 1.535s (-9.0% 🟢) 2.657s (+0.6%) 0.590s (+10618.2% 🔺) 3.797s (+17.3% 🔺) 2.262s 10 1.00x
▲ Vercel Next.js (Turbopack) 1.708s (-1.2%) 2.512s (-15.9% 🟢) 0.825s (+13419.7% 🔺) 3.968s (+10.4% 🔺) 2.260s 10 1.11x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

stream pipeline with 5 transform steps (1MB)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 0.488s 1.000s 0.003s 1.011s 0.523s 60 1.00x
💻 Local Nitro 0.579s (-20.4% 🟢) 1.009s (~) 0.009s (-7.1% 🟢) 1.021s (~) 0.443s 59 1.19x
💻 Local Next.js (Turbopack) 0.653s 1.008s 0.009s 1.022s 0.369s 59 1.34x
🐘 Postgres Nitro 0.662s (-8.3% 🟢) 1.005s (~) 0.004s (+2.8%) 1.025s (~) 0.363s 59 1.36x
🐘 Postgres Next.js (Turbopack) 0.691s 1.025s 0.004s 1.042s 0.351s 58 1.42x
🐘 Postgres Express 0.693s (~) 1.004s (~) 0.013s (+240.7% 🔺) 1.039s (+1.2%) 0.346s 59 1.42x
💻 Local Express 0.720s (-7.3% 🟢) 1.009s (-1.7%) 0.008s (-17.6% 🟢) 1.022s (-1.8%) 0.302s 59 1.47x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 4.518s (+3.6%) 5.599s (~) 0.195s (-26.8% 🟢) 6.373s (-0.7%) 1.855s 10 1.00x
▲ Vercel Nitro 4.570s (+5.6% 🔺) 5.610s (+1.9%) 0.257s (-1.6%) 6.279s (~) 1.710s 10 1.01x
▲ Vercel Next.js (Turbopack) 4.619s (+1.5%) 6.021s (+4.0%) 0.405s (+102.1% 🔺) 7.029s (+7.7% 🔺) 2.410s 9 1.02x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

10 parallel streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 0.905s 1.016s 0.000s 1.021s 0.117s 59 1.00x
🐘 Postgres Nitro 1.094s (+18.8% 🔺) 1.758s (+65.4% 🔺) 0.000s (-17.6% 🟢) 1.781s (+64.8% 🔺) 0.687s 34 1.21x
🐘 Postgres Express 1.115s (+24.8% 🔺) 1.931s (+78.3% 🔺) 0.000s (-100.0% 🟢) 1.954s (+76.9% 🔺) 0.839s 31 1.23x
🐘 Postgres Next.js (Turbopack) 1.157s 1.938s 0.000s 1.962s 0.805s 31 1.28x
💻 Local Nitro 1.185s (-3.2%) 2.018s (~) 0.000s (-30.8% 🟢) 2.021s (~) 0.836s 30 1.31x
💻 Local Express 1.206s (-4.2%) 2.019s (~) 0.000s (-47.1% 🟢) 2.022s (~) 0.816s 30 1.33x
💻 Local Next.js (Turbopack) 1.233s 2.018s 0.000s 2.022s 0.789s 30 1.36x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.820s (+17.9% 🔺) 3.577s (+8.8% 🔺) 0.000s (NaN%) 4.124s (+4.6%) 1.303s 15 1.00x
▲ Vercel Nitro 2.830s (+11.5% 🔺) 3.747s (+16.6% 🔺) 0.000s (+113.3% 🔺) 4.211s (+12.0% 🔺) 1.381s 15 1.00x
▲ Vercel Next.js (Turbopack) 3.479s (+18.8% 🔺) 4.797s (+20.3% 🔺) 0.000s (-68.2% 🟢) 5.480s (+19.4% 🔺) 2.001s 11 1.23x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

fan-out fan-in 10 streams (1MB each)

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.663s 2.001s 0.000s 2.006s 0.344s 30 1.00x
🐘 Postgres Express 2.080s (+15.3% 🔺) 2.695s (+28.4% 🔺) 0.000s (NaN%) 2.710s (+27.7% 🔺) 0.630s 23 1.25x
🐘 Postgres Nitro 2.083s (+10.1% 🔺) 2.603s (+17.3% 🔺) 0.000s (+134.8% 🔺) 2.622s (+17.0% 🔺) 0.539s 23 1.25x
🐘 Postgres Next.js (Turbopack) 2.215s 3.055s 0.000s 3.066s 0.851s 20 1.33x
💻 Local Express 3.428s (~) 4.031s (-1.6%) 0.001s (+12.5% 🔺) 4.035s (-1.6%) 0.606s 15 2.06x
💻 Local Nitro 3.444s (+3.5%) 3.968s (~) 0.000s (+40.0% 🔺) 3.972s (~) 0.527s 16 2.07x
💻 Local Next.js (Turbopack) 3.803s 4.462s 0.001s 4.467s 0.664s 14 2.29x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 4.127s (+25.6% 🔺) 4.872s (+6.6% 🔺) 0.000s (-100.0% 🟢) 5.559s (+8.1% 🔺) 1.433s 11 1.00x
▲ Vercel Nitro 4.202s (+24.5% 🔺) 4.758s (+4.9%) 0.000s (-100.0% 🟢) 5.323s (+4.7%) 1.121s 12 1.02x
▲ Vercel Next.js (Turbopack) 4.421s (+1.2%) 5.479s (+1.5%) 0.000s (-72.5% 🟢) 6.078s (+2.0%) 1.657s 10 1.07x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

World 🥇 Fastest Framework Wins
💻 Local Nitro 19/21
🐘 Postgres Next.js (Turbopack) 11/21
▲ Vercel Express 10/21

Fastest World by Framework

Winner determined by most benchmark wins

Framework 🥇 Fastest World Wins
Express 🐘 Postgres 15/21
Next.js (Turbopack) 🌐 Redis 9/21
Nitro 🐘 Postgres 12/21

Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)



// --- Max delivery check ---
// Enforce max delivery limit before any infrastructure calls.
// This prevents runaway steps from consuming infinite queue deliveries.
Suggested change
  // This prevents runaway steps from consuming infinite queue deliveries.
+ // At this point, we want to do the minimal amount of work (no fetching
+ // of the step details, etc.). We simply attempt to mark the step as failed
+ // and enqueue the workflow once; if either of those fails, the message
+ // is still consumed, but with adequate logging that an error occurred.

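The guard this suggestion describes could be sketched as below. This is a minimal sketch, not the PR's actual implementation: `MAX_QUEUE_DELIVERIES` (64), `metadata.attempt`, and the `MAX_DELIVERIES_EXCEEDED` error code come from the PR description, while `failStep` and the handler signature are hypothetical stand-ins.

```typescript
// Hypothetical sketch of the step handler's max-delivery guard.
const MAX_QUEUE_DELIVERIES = 64;

interface QueueMetadata {
  attempt: number; // delivery count reported by the queue
}

// Stand-in for posting a step_failed event to the workflow store;
// a real implementation would also re-enqueue the workflow run once.
async function failStep(
  runId: string,
  stepId: string,
  error: { code: string; message: string },
): Promise<void> {}

async function handleStepMessage(
  metadata: QueueMetadata,
  runId: string,
  stepId: string,
): Promise<'failed-max-deliveries' | 'processed'> {
  // --- Max delivery check ---
  // Do the minimal amount of work here: no fetching of step details.
  // Just attempt to mark the step as failed once.
  if (metadata.attempt > MAX_QUEUE_DELIVERIES) {
    try {
      await failStep(runId, stepId, {
        code: 'MAX_DELIVERIES_EXCEEDED',
        message: `Step exceeded ${MAX_QUEUE_DELIVERIES} queue deliveries`,
      });
    } catch (err) {
      // Even failing the step failed: log loudly and consume the message
      // anyway, since retrying further cannot help.
      console.error('Failed to fail step after max deliveries', {
        runId,
        stepId,
        err,
      });
    }
    return 'failed-max-deliveries'; // message is consumed either way
  }
  // ...normal step execution would happen here...
  return 'processed';
}
```

Note the strict `>` comparison: delivery 64 still executes normally, and only delivery 65 trips the guard, matching "when > MAX_QUEUE_DELIVERIES (64)" in the PR summary.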
Comment on lines +145 to +147
'Failed to post run_failed for max deliveries exceeded, consuming message anyway',
{
workflowRunId: runId,
Like the step error message, this should also be more verbose and explain that a persistent outage is preventing us from failing the run normally, and so on.

EntityConflictError.is(err) ||
RunExpiredError.is(err)
) {
// Run already finished, consume the message
Suggested change
- // Run already finished, consume the message
+ // Run already finished, consume the message silently

return;
}
runtimeLogger.error(
'Failed to post run_failed for max deliveries exceeded, consuming message anyway',
Like the step error message, this should also be more verbose and explain that a persistent outage is preventing us from failing the run normally, and so on.


// --- Max delivery check ---
// Enforce max delivery limit before any infrastructure calls.
// This prevents runaway workflows from consuming infinite queue deliveries.
Suggested change
  // This prevents runaway workflows from consuming infinite queue deliveries.
+ // At this point, we want to do the minimal amount of work (no fetching
+ // of the workflow events, etc.). We simply attempt to mark the run as failed,
+ // and if that fails, the message is still consumed, but with adequate logging
+ // that an error occurred preventing us from failing the run.

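The graceful-failure path in the run handler (see the `EntityConflictError` / `RunExpiredError` context quoted earlier in this review) could be sketched as follows. This is an illustrative sketch only: the error class names and the log message mirror the diff context, but `failRunForMaxDeliveries` and the injected `postRunFailed` store call are hypothetical.

```typescript
// Hypothetical sketch of the run handler's graceful-failure path.
class EntityConflictError extends Error {
  static is(err: unknown): err is EntityConflictError {
    return err instanceof EntityConflictError;
  }
}
class RunExpiredError extends Error {
  static is(err: unknown): err is RunExpiredError {
    return err instanceof RunExpiredError;
  }
}

type Outcome = 'run-failed' | 'consumed-silently' | 'consumed-with-error';

async function failRunForMaxDeliveries(
  runId: string,
  postRunFailed: (runId: string) => Promise<void>, // stand-in store call
): Promise<Outcome> {
  try {
    await postRunFailed(runId);
    return 'run-failed';
  } catch (err) {
    if (EntityConflictError.is(err) || RunExpiredError.is(err)) {
      // Run already finished, consume the message silently
      return 'consumed-silently';
    }
    // A persistent outage is preventing us from failing the run normally;
    // log in detail, then consume the message, since retrying cannot help.
    console.error(
      'Failed to post run_failed for max deliveries exceeded, consuming message anyway',
      { workflowRunId: runId, error: err },
    );
    return 'consumed-with-error';
  }
}
```

All three branches consume the message; the distinction is only whether the run was failed, was already finished, or could not be failed at all.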