{"id":405,"date":"2026-03-09T15:01:41","date_gmt":"2026-03-09T15:01:41","guid":{"rendered":"https:\/\/blog.rebalai.com\/en\/2026\/03\/09\/bun-vs-nodejs-in-production-2026-real-migration-st\/"},"modified":"2026-03-18T22:00:06","modified_gmt":"2026-03-18T22:00:06","slug":"bun-vs-nodejs-in-production-2026-real-migration-st","status":"publish","type":"post","link":"https:\/\/blog.rebalai.com\/en\/2026\/03\/09\/bun-vs-nodejs-in-production-2026-real-migration-st\/","title":{"rendered":"Bun vs Node.js in Production: What Three Months of Real Traffic Taught Me"},"content":{"rendered":"<p>The benchmarks are real. I just didn&#8217;t expect them to matter less than everything else.<\/p>\n<p>My team \u2014 four engineers, one of whom is constitutionally allergic to anything not boring and battle-tested \u2014 runs a mid-sized API service that handles about 12,000 requests per minute at peak. We&#8217;re on AWS, using Hono as our HTTP framework, PostgreSQL behind a PgBouncer connection pool, and a cluster of background workers that process job queues. Nothing exotic. Nothing that would make a Hacker News thread interesting for the right reasons.<\/p>\n<p>I started the Bun migration in November 2025. We went fully live in early February 2026. Bun 1.2.4, to be specific; we were on Node.js 22.14 before the switch. This is what actually happened.<\/p>\n<h2>My Setup and Why I Actually Pulled the Trigger on This<\/h2>\n<p>So the honest reason I wanted to migrate wasn&#8217;t startup time or raw throughput \u2014 it was Lambda cold starts. 
We have a handful of event-driven functions sitting alongside the main API, and on Node.js 22 we were seeing cold start times between 800ms and 1.4 seconds depending on the function&#8217;s dependency footprint. That&#8217;s tolerable until your product team decides to add a user-facing webhook handler that runs on Lambda, and suddenly 1.1 seconds of cold start is a conversation at standup.<\/p>\n<p>I&#8217;d been watching Bun&#8217;s progress since 1.0 dropped in September 2023. By mid-2025 the complaints about Node.js compatibility had quieted down considerably, most of the critical GitHub issues were closed, and the ecosystem was just&#8230; calmer about it. That felt like the right signal. I&#8217;m not an early adopter by nature \u2014 I let other people find the load-bearing bugs.<\/p>\n<p>The other thing, and I should be upfront about this, was curiosity. Eight years of Node.js means I can set it up in my sleep, which is comfortable and also a little deadening. Bun forced me to actually think about the runtime again.<\/p>\n<p>My skeptical teammate Marcus&#8217;s concern was simple: &#8220;We have zero on-call bandwidth for runtime-level weirdness.&#8221; He wasn&#8217;t wrong. I promised him a staged rollout with easy rollback, which is the only reason he agreed.<\/p>\n<h2>The Migration Itself: Mostly Fine, Then Not Fine at All<\/h2>\n<p>I expected <code>package.json<\/code> compatibility to be the main friction. It wasn&#8217;t. <code>bun install<\/code> just worked, <code>bun run<\/code> picked up our existing scripts, and Hono didn&#8217;t need any changes at all. 
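<\/p>\n<p>One piece of the staged rollout worth showing: a few code paths needed to know which runtime they were running under, so we could tag logs and metrics per runtime while both were live. The guard is just the usual <code>process.versions.bun<\/code> check \u2014 a minimal sketch from memory, not lifted from our repo:<\/p>\n<pre><code class=\"language-ts\">\/\/ Bun sets process.versions.bun; Node.js does not.\n\/\/ This is a common detection idiom, not a formal API contract.\nconst isBun = typeof process.versions.bun === &quot;string&quot;;\n\nexport function runtimeName(): string {\n  return isBun ? `bun ${process.versions.bun}` : `node ${process.versions.node}`;\n}\n<\/code><\/pre>\n<p>Tagging dashboards with this made the rollback decision observable rather than a guess.<\/p>\n<p>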
The TypeScript handling through Bun&#8217;s built-in transpiler was a relief \u2014 no more ts-node or tsx config to babysit.<\/p>\n<pre><code class=\"language-ts\">\/\/ Before: package.json scripts with ts-node\n&quot;start&quot;: &quot;ts-node -r tsconfig-paths\/register src\/index.ts&quot;,\n&quot;dev&quot;: &quot;nodemon --exec ts-node src\/index.ts&quot;,\n\n\/\/ After: clean\n&quot;start&quot;: &quot;bun src\/index.ts&quot;,\n&quot;dev&quot;: &quot;bun --watch src\/index.ts&quot;,\n<\/code><\/pre>\n<p>That part took about a day. The test suite ran on Bun&#8217;s test runner without modification, which genuinely surprised me \u2014 I expected at least one Jest quirk to bite us.<\/p>\n<p>What actually cost us time was our APM setup. We use Datadog, and the <code>dd-trace<\/code> Node.js agent has its own ideas about how to hook into the runtime. Bun 1.2 has solid <code>node:<\/code> compatibility, but there are still specific internals that APM agents depend on \u2014 <code>vm<\/code> module behavior, certain <code>async_hooks<\/code> edge cases \u2014 and <code>dd-trace<\/code> was partially blind in Bun for about two weeks while I worked through the configuration. The Datadog Bun support page existed but was, charitably, aspirational. I ended up running their agent in compatibility mode and losing some auto-instrumentation for that period.<\/p>\n<p>I pushed our first production deployment on a Friday afternoon and promptly discovered that our database query spans weren&#8217;t showing up in Datadog. Not a crash, not a data issue \u2014 just invisible infrastructure. Which is a specific kind of Friday stress. 
We rolled back the APM config (not the Bun deployment, crucially) and spent the following week sorting it out properly.<\/p>\n<p>By January we had full observability restored. But that two-week gap where I couldn&#8217;t fully trust my traces was uncomfortable in a way that benchmark numbers don&#8217;t capture.<\/p>\n<h2>Benchmark Numbers from Three Months of Real Traffic<\/h2>\n<p>Right, so \u2014 the actual numbers. I ran synthetic benchmarks before migrating, and I&#8217;ll share them, but I find the production data more interesting.<\/p>\n<p>Synthetic load test, same Hono app, <code>autocannon<\/code> at 1000 concurrent connections, PostgreSQL queries included:<\/p>\n<table>\n<thead>\n<tr>\n<th>Runtime<\/th>\n<th>Req\/sec<\/th>\n<th>P99 latency<\/th>\n<th>P50 latency<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Node.js 22.14<\/td>\n<td>9,840<\/td>\n<td>48ms<\/td>\n<td>11ms<\/td>\n<\/tr>\n<tr>\n<td>Bun 1.2.4<\/td>\n<td>14,120<\/td>\n<td>31ms<\/td>\n<td>7ms<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>That&#8217;s a 43% throughput improvement. P99 latency dropping from 48ms to 31ms is meaningful. I wasn&#8217;t expecting numbers quite that dramatic \u2014 I&#8217;d mentally budgeted for 20-25% and was prepared to explain why that wasn&#8217;t enough to justify the migration risk.<\/p>\n<p>Lambda cold starts, which were the original motivation: Node.js 22 averaged 940ms cold start for our heaviest function. Bun: 290ms. That one&#8217;s hard to argue with.<\/p>\n<p>Production traffic tells a slightly more nuanced story. Our actual P99 at production load (12k RPM peak) went from around 67ms to 44ms. 
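<\/p>\n<p>For anyone who wants to reproduce the synthetic numbers above: they came from plain <code>autocannon<\/code> runs wired into <code>package.json<\/code> like our other scripts. The endpoint and duration here are illustrative, not our exact invocation:<\/p>\n<pre><code class=\"language-ts\">\/\/ package.json script \u2014 illustrative; point it at your own service\n\/\/ -c 1000 connections, -d 30 seconds, --latency prints the percentile table\n&quot;bench&quot;: &quot;autocannon -c 1000 -d 30 --latency http:\/\/localhost:3000\/api\/health&quot;,\n<\/code><\/pre>\n<p>The <code>--latency<\/code> flag is what produces the P50\/P99 columns reported in the table.<\/p>\n<p>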
The throughput headroom we gained let us downsize one EC2 instance in our API cluster \u2014 that&#8217;s roughly $180\/month back, which pays for roughly zero of the engineering time I spent on this but makes the conversation with my manager easier.<\/p>\n<p>One thing I noticed: Bun&#8217;s memory usage is lower at idle but the difference shrinks under sustained load. At peak, both runtimes were within about 15% of each other on RSS. So if your argument for Bun is memory efficiency, validate that claim under your actual traffic pattern, not synthetic benchmarks.<\/p>\n<h2>The Parts That Still Bit Me<\/h2>\n<p>The embarrassing part came when I was benchmarking our worker processes \u2014 the ones that drain job queues from Redis. I expected them to be faster in Bun, same as the API. They weren&#8217;t. Essentially identical to Node.js, within noise. I spent way too long staring at this before realizing the bottleneck was always Redis round-trip latency and serialization, not the runtime itself. Bun can&#8217;t make your network faster. I thought I understood that, and apparently I needed to re-learn it.<\/p>\n<p>The test runner maturity is better than it was in 2024 but I&#8217;d still call it 90% there. We hit one issue with module mocking \u2014 specifically, trying to mock a module that re-exports from another file \u2014 that produced behavior inconsistent with Jest\/Vitest. I filed an issue (github.com\/oven-sh\/bun issue #9847, for anyone tracking this), and the Bun team responded within a day. We worked around it rather than waiting for a fix.<\/p>\n<p>Native addons are still occasionally a problem. We don&#8217;t use many, but one internal tool in our monorepo uses <code>sharp<\/code> for image processing. 
<code>sharp<\/code> works in Bun, but only through a compatibility shim that the <code>sharp<\/code> maintainers added specifically for Bun support, and it&#8217;s not quite as fast as native Node.js. For us that&#8217;s fine \u2014 that tool isn&#8217;t on a hot path. But if your app leans on native addons, audit them before committing to this migration.<\/p>\n<p>The Windows story is improved but I still wouldn&#8217;t use Bun on Windows for anything serious. Most of our team develops on Macs, one on Linux, so this wasn&#8217;t a blocker. Your mileage may vary.<\/p>\n<p>None of these were showstoppers. But they&#8217;re the kind of friction that a team without the bandwidth to absorb them would find genuinely frustrating. The total debugging time I spent on Bun-specific issues over three months was probably 20-25 hours. 
For a migration that saved us $180\/month, that math only works if you value the latency and cold-start improvements \u2014 which we do.<\/p>\n<pre><code class=\"language-ts\">\/\/ One pattern I actually liked: Bun's built-in SQLite for dev fixtures\n\/\/ No extra dependencies, just works\nimport { Database } from &quot;bun:sqlite&quot;;\n\nconst db = new Database(&quot;:memory:&quot;);\ndb.run(`CREATE TABLE fixtures (id INTEGER PRIMARY KEY, data TEXT)`);\n\n\/\/ We use this in tests now instead of spinning up a test container\n\/\/ Setup time went from ~8s to ~200ms\nconst stmt = db.prepare(&quot;INSERT INTO fixtures VALUES (?, ?)&quot;);\nstmt.run(1, JSON.stringify({ test: true }));\n<\/code><\/pre>\n<p><code>bun:sqlite<\/code> in tests quietly improved our feedback loop. It&#8217;s not a replacement for a real container when you need actual PostgreSQL behavior, but for unit tests that just need structured data, it&#8217;s been useful enough that I&#8217;ve stopped reaching for anything else.<\/p>\n<h2>After Three Months: Here&#8217;s What I&#8217;d Tell You<\/h2>\n<p>Small team, JavaScript\/TypeScript HTTP API, no heavy native addon dependencies \u2014 migrate. The performance gains are real, and compatibility has reached a point where most rough edges are papercuts rather than blockers. 
The cold-start improvements alone make it worth it for Lambda-heavy architectures.<\/p>\n<p>If your observability stack depends on Node.js-specific APM agent internals, wait six months and re-evaluate. Datadog and New Relic are both working on better Bun support, and by mid-2026 this will probably be a non-issue. Same story if you have Windows developers who need a smooth local experience, or if your team is already stretched \u2014 there will be at least one confusing afternoon, and you need the slack to absorb it.<\/p>\n<p>Node.js 22 is a genuinely good runtime \u2014 fast, stable, with a decade of ecosystem hardening behind it. I&#8217;m not arguing against it. If I were starting a new project today with no strong opinion either way, I&#8217;d use Bun. If I were maintaining a large existing codebase with a team that has zero interest in debugging runtime quirks, I&#8217;d stay on Node.js for now.<\/p>\n<p>We&#8217;re staying on Bun. The P99 latency difference shows up in our frontend performance metrics in ways that feel good to point at. 
Marcus, my skeptical teammate, has stopped complaining about it \u2014 which, from him, is basically a glowing endorsement.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The benchmarks are real. I just didn\u2019t expect them to matter less than everything else.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[1],"tags":[],"class_list":["post-405","post","type-post","status-publish","format-standard","hentry","category-general"],"_links":{"self":[{"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/posts\/405","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/comments?post=405"}],"version-history":[{"count":8,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/posts\/405\/revisions"}],"predecessor-version":[{"id":561,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/posts\/405\/revisions\/561"}],"wp:attachment":[{"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/media?parent=405"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/categories?post=405"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.rebalai.com\/en\/wp-json\/wp\/v2\/tags?post=405"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}