
Node.js vs Bun in 2026: Why We Picked Bun for New Projects

We used Node.js for years. It's been good. Still is. But earlier this year, we started a new project and didn't reach for Node. We used Bun instead. Haven't gone back.

Not a "Node is dead" post. Node 22 is solid. But we're two people. Time spent configuring tools is time not spent building. Bun removes most of that.

Also, their mascot is a little steamed bun with a face. Like a cartoon bao. We're not saying a cute mascot influenced a serious engineering decision. We're not not saying it either.

Five tools vs one

A new Node project still needs a runtime, package manager, bundler, test runner, and a TypeScript setup. Five tools. Five config files. Five things that can fight each other.

Bun does all of it. One binary. No config files. Write code, run it.

What the world is reporting

These aren't our numbers. These are from dev blogs and benchmarks published through 2025-2026.

HTTP speed (requests per second)

Setup               Bun                Node.js
Simple server       ~52,000-106,000    ~14,000-44,000
With Express/Hono   ~92,000-99,000     ~38,000-41,000

2x to 4x faster depending on the test.

Startup time

Bun: 8-18 ms. Node: 55-120 ms. On AWS Lambda, that's a 35% difference in cold start costs.

Package install

bun install: ~2 seconds. npm install: ~18 seconds. Not a typo. 6-9x faster.

TypeScript

Bun runs .ts files directly. No ts-node, no config. 100 files in 50 ms vs 1,200 ms with ts-node.
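To make that concrete, here's a sketch of the kind of plain TypeScript file Bun executes as-is with `bun greet.ts` — no ts-node, no tsconfig, no build step. The file name and types are just for illustration.

```typescript
// greet.ts — ordinary typed TypeScript; Bun strips types and runs it directly.

interface User {
  name: string;
  signupYear: number;
}

// Returns a greeting that includes how long the user has been around.
export function greet(user: User, currentYear: number = 2026): string {
  const years = currentYear - user.signupYear;
  return `Hello ${user.name}, member for ${years} year${years === 1 ? "" : "s"}`;
}

console.log(greet({ name: "Amit", signupYear: 2024 }));
// prints: Hello Amit, member for 2 years
```

With Node you'd front this with ts-node or a compile step; with Bun, `bun greet.ts` is the whole workflow.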

Memory

Bun HTTP server: ~25-30 MB. Node HTTP server: ~50-60 MB. Half.

The honest part

Add a database, validation, auth, real business logic? The speed gap mostly disappears. A proper app benchmark showed both runtimes at roughly 12,000 req/sec. The database becomes the bottleneck, not the runtime.
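The arithmetic behind this is worth spelling out. Here's a back-of-envelope sketch — every number in it is an illustrative assumption, not a measurement:

```typescript
// Why the runtime gap shrinks once a database is involved.
// All timings below are assumed for illustration, not measured.

const dbMs = 8.0;          // assumed time per request waiting on Postgres
const nodeRuntimeMs = 0.8; // assumed per-request runtime overhead in Node
const bunRuntimeMs = 0.2;  // assumed per-request runtime overhead in Bun

// Throughput per connection is just 1000 / total ms per request.
const nodeRps = 1000 / (dbMs + nodeRuntimeMs); // ≈ 113.6
const bunRps = 1000 / (dbMs + bunRuntimeMs);   // ≈ 122.0

// A 4x runtime difference becomes a ~7% end-to-end difference.
console.log((bunRps / nodeRps).toFixed(2)); // prints: 1.07
```

Once the fixed database cost dominates the denominator, even a big runtime speedup moves the total very little.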

Benchmarks are useful but they're not the whole story.

What we saw

We've shipped three projects on Bun + Hono since January. A client API, an ecommerce backend, and a full-stack app with SvelteKit. All deployed on our own servers.

We ran these tests on our production server: 4 vCPU, 8 GB RAM, Ubuntu 24.04. Bun 1.2.4, Node 22.14. All tests ran 5 times; the numbers below are averages.

Test 1: Package install

Project with 847 dependencies.

Scenario npm Bun Difference
Cold install (no cache) 52.4s 8.1s 6.5x faster
Warm install (with cache) 38.7s 4.3s 9x faster
CI install (lockfile only) 41.2s 3.9s 10.5x faster

On our deploy pipeline, this alone saves about 40 seconds per deploy. We deploy 6-7 times a day during active work.

Test 2: HTTP throughput (real app)

Not a hello world. Actual routes with auth middleware, input validation, Postgres queries via Drizzle, and JSON response serialization. Tested with wrk, 100 concurrent connections, 30 seconds.

Metric Express + Node Hono + Bun Difference
Requests/sec 4,847 11,234 2.3x faster
Avg latency 20.6 ms 8.9 ms 2.3x lower
P99 latency 89 ms 34 ms 2.6x lower
Transfer/sec 1.82 MB 4.21 MB 2.3x more
Memory (RSS) 168 MB 94 MB 44% less

Same database, same queries, same response shape.

Test 3: Health check (no DB)

Raw runtime speed without the database being the bottleneck.

Setup Requests/sec
Node + Express 12,480
Bun + Hono 38,200
Bun.serve() (no framework) 61,400

3x with a framework. 5x without. These numbers look great, but they don't mean much for real apps, where the DB is usually the bottleneck.

Test 4: File I/O

Reading and parsing 5,000 JSON files from disk.

Runtime Time Difference
Node 842 ms -
Bun 287 ms 2.9x faster

We noticed this when building a static site generator for a client. The build went from "go get chai" to "already done."
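The workload in that table looks roughly like this sketch — read a pile of JSON files and parse them, with all reads fired concurrently. It uses `node:fs/promises` so the same code runs on both runtimes; the file count and paths here are illustrative.

```typescript
// Read and parse many JSON files concurrently — the shape of the workload
// benchmarked above. Runs on both Node and Bun via node:fs/promises.

import { mkdtemp, writeFile, readFile, rm } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

export async function readJsonFiles(paths: string[]): Promise<unknown[]> {
  // Fire all reads at once; the OS overlaps the I/O.
  return Promise.all(
    paths.map(async (p) => JSON.parse(await readFile(p, "utf8"))),
  );
}

// Small self-contained demo: write 50 files, read them back, clean up.
const dir = await mkdtemp(join(tmpdir(), "json-bench-"));
const paths = Array.from({ length: 50 }, (_, i) => join(dir, `${i}.json`));
await Promise.all(paths.map((p, i) => writeFile(p, JSON.stringify({ i }))));
const docs = await readJsonFiles(paths);
console.log(docs.length); // prints: 50
await rm(dir, { recursive: true, force: true });
```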

Test 5: JSON serialization

2 MB JSON payload, 1,000 iterations.

Operation Node Bun Difference
JSON.stringify 3,240 ms 1,180 ms 2.7x faster
JSON.parse 2,870 ms 1,040 ms 2.8x faster

Matters a lot for APIs that move large payloads around.
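If you want to reproduce this on your own hardware, the benchmark is only a few lines. The payload shape and iteration counts below are our choices for the sketch — run the same file under `node` and `bun` and compare:

```typescript
// Minimal stringify/parse micro-benchmark. Payload shape and sizes are
// illustrative; the point is to run the identical file under each runtime.

function makePayload(items: number): { items: { id: number; name: string }[] } {
  return {
    items: Array.from({ length: items }, (_, id) => ({ id, name: `item-${id}` })),
  };
}

export function benchJson(iterations: number): { stringifyMs: number; parseMs: number } {
  const payload = makePayload(10_000);
  const text = JSON.stringify(payload);

  let t = performance.now();
  for (let i = 0; i < iterations; i++) JSON.stringify(payload);
  const stringifyMs = performance.now() - t;

  t = performance.now();
  for (let i = 0; i < iterations; i++) JSON.parse(text);
  const parseMs = performance.now() - t;

  return { stringifyMs, parseMs };
}

console.log(benchJson(100));
```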

Test 6: Test runner

Same test suite. 68 files, 412 tests. API routes and utility functions.

Runner Time Difference
Jest (via npx) 14.8s (includes 3.2s startup) -
bun test 1.9s 7.8x faster

Same test syntax. We didn't change a single test file.

Test 7: SvelteKit dev server

Metric Node Bun Difference
Cold start 1.82s 0.41s 4.4x faster
Hot reload (.svelte file save) 320-580 ms 80-120 ms ~4x faster

The hot reload difference is what you feel all day. Not the cold start.

Test 8: Docker image size

Same app, multi-stage build.

Setup Image size
Node 22 alpine + npm 287 MB
Bun alpine 134 MB
Bun static binary 98 MB

Smaller images, faster pulls, faster deploys. On a self-hosted server with limited bandwidth, this matters.
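For the curious, the static-binary row comes from Bun's `--compile` flag. This is a sketch of that kind of multi-stage build, not our exact production Dockerfile — image tags, paths, and the entry point are illustrative:

```dockerfile
# Sketch of a multi-stage Bun build along the lines of the table above.

FROM oven/bun:alpine AS build
WORKDIR /app
COPY package.json bun.lock ./
RUN bun install --frozen-lockfile
COPY . .
# Compile app + runtime into one self-contained binary (the 98 MB row).
RUN bun build --compile src/index.ts --outfile server

FROM alpine:3.20
WORKDIR /app
COPY --from=build /app/server ./server
EXPOSE 3000
CMD ["./server"]
```

The final stage ships no runtime, no node_modules — just the binary.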

Test 9: WebSocket connections

Simple echo server. Measured max concurrent connections before performance degraded.

Metric Node (ws library) Bun (built-in) Difference
Stable connections 8,400 14,600 1.7x more
Memory at peak 312 MB 184 MB 41% less
Avg echo time 4.2 ms 1.8 ms 2.3x faster

Bun's WebSocket is built-in. No extra dependency needed.

Test 10: SQLite (built-in)

Bun has SQLite built into the runtime. No npm package needed.

Operation Node + better-sqlite3 Bun (bun:sqlite) Difference
Insert 100,000 rows 1,840 ms 680 ms 2.7x faster
Read 100,000 rows 920 ms 310 ms 3x faster
Mixed read/write (10,000 ops) 2,100 ms 740 ms 2.8x faster

We use SQLite for local caching and session storage on smaller projects. Not having to install a separate package for it is nice.

What broke

Two things across three projects.

sharp for image resizing didn't work with Bun's native addon support. We switched to sharp's WebAssembly build (installed with npm install --cpu=wasm32 sharp), which worked fine. Took about 45 minutes.

One project used crypto.createCipheriv with a specific padding mode that behaved differently under Bun's implementation. Took a couple hours to debug and find the workaround.

That's it. Hono, Prisma, Drizzle, SvelteKit, Zod, everything else worked without changing a single line.

Why we care

We switched because it made our work faster and our servers cheaper. Also we're a little nuts about performance. If you've seen our about page, you already know that.

Bun is backed by Anthropic now. Stripe and Midjourney use it in production. Claude Code runs on it. It's not going anywhere.

When we still use Node

If a client already has a big Node codebase, we're not rewriting it for no reason. If some specific C++ addon doesn't work with Bun, we use Node.

But for anything new, it's Bun.

Try it

Developer? Run bun init on your next project. You'll get it in five minutes.

Business owner? You don't need to care about runtime names. Just know we use whatever ships your project faster and cheaper. Right now, that's Bun.

Written by

Amit G