Deno 2 bet everything on Node.js compatibility and the new JSR registry. After migrating a real project, here's what works, what's still rough, and whether Deno finally has a path to mainstream adoption.
I tried Deno when it first launched in 2020. Wrote a small CLI tool with it, appreciated the TypeScript-out-of-the-box experience, hit a wall when I needed a library that only existed on npm, and went back to Node.js. That was the Deno 1 experience for a lot of people. Great ideas, not enough ecosystem.
Deno 2 is a fundamentally different proposition. Ryan Dahl and the Deno team looked at the adoption numbers, acknowledged the cold reality that you cannot build a JavaScript runtime without npm compatibility, and rebuilt the entire story around Node.js interop. They also launched JSR, a new registry designed for TypeScript-first packages. And they kept all the things that made Deno interesting in the first place: the security model, the built-in tooling, the web standard APIs.
I've spent the last few months using Deno 2 for real work. Not toy projects. An API server that handles production traffic. A CLI tool that replaced a Node.js script I'd maintained for two years. A Fresh application that serves actual users. This post is what I learned.
Before we talk about Deno 2, let's be honest about what went wrong with Deno 1. Because understanding the failure mode matters for evaluating whether Deno 2 actually fixes it.
Deno 1 had a radical vision: no node_modules, no package.json, no npm. You imported modules via URLs. TypeScript was native. The permission system was genuinely innovative. The standard library was curated and high quality.
The problem was practical. Most JavaScript developers don't start projects from scratch. They assemble them from packages. And those packages lived on npm. Deno 1's answer was "use a CDN like esm.sh or deno.land/x," which worked for simple cases and fell apart for anything complex. Try importing a package that depends on Node.js built-in modules like fs or path through a URL import. It doesn't work, or it works through a compatibility layer that introduces subtle bugs.
The import map system was elegant in theory and annoying in practice. You'd write:

```jsonc
// deno.json
{
  "imports": {
    "lodash": "https://esm.sh/lodash@4.17.21"
  }
}
```

Then in your code:

```ts
import _ from "lodash";
```

This worked until your dependency had sub-dependencies that also needed to be mapped. Or until the CDN had downtime. Or until you needed reproducible builds and realized URL imports plus lock files were a worse version of what package-lock.json already solved.
I'm not saying this to dunk on Deno 1. The ideas were genuinely good. But the JavaScript ecosystem has an immense amount of inertia in npm, and fighting that inertia was a losing battle.
Deno 2 ships with a Node.js compatibility layer that is dramatically more complete than anything in Deno 1. This is not a polyfill. It's a built-in implementation of Node.js APIs that runs natively in Deno's runtime.
You can now do this:
```ts
// This works in Deno 2. No flags, no configuration.
import express from "npm:express";
import { readFile } from "node:fs/promises";

const app = express();

app.get("/", async (req, res) => {
  const content = await readFile("./data.json", "utf-8");
  res.json(JSON.parse(content));
});

app.listen(3000);
```

The npm: specifier is the key innovation. When Deno encounters npm:express, it downloads the package from npm, resolves its dependency tree, and makes it available. No node_modules directory by default (though you can opt in with --node-modules-dir if a package needs it). No package.json required.
The node: specifier gives you access to Node.js built-in modules. node:fs, node:path, node:crypto, node:http -- they're all there, implemented in Deno's runtime.
Here's what surprised me: the compatibility is genuinely good. I migrated a 4,000-line Express API server to Deno 2 and it ran on the first try. Not "it ran after I fixed 50 things." It literally just worked. The server uses Express, Prisma, jsonwebtoken, bcrypt, and a handful of other npm packages. All of them worked without modification.
The things that didn't work were edge cases:

- Native addons (.node files) are not supported. This affects packages like sharp for image processing, some database drivers, and bcrypt (the C++ implementation -- the pure JS bcryptjs works fine).
- node-gyp dependencies fail entirely. If a package needs compilation during install, it won't work.
- require() patterns that use dynamic paths computed at runtime can fail, because Deno's module resolution is ahead-of-time.

But here's the thing: these same limitations apply to Bun, to serverless environments, to edge runtimes. The ecosystem has already been moving away from native addons toward pure JavaScript or WebAssembly alternatives. Deno 2's compatibility covers 95% of what most projects actually need.
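The dynamic-require limitation usually has a mechanical fix. Here's a sketch (the handler names are mine, purely illustrative) of replacing a runtime-computed require() with a static lookup table that ahead-of-time resolution can analyze:

```typescript
// Before: `require(`./handlers/${name}`)` -- invisible to static analysis.
// After: every candidate is referenced statically and selected at runtime.
type Handler = (input: string) => string;

const handlers: Record<string, Handler> = {
  upper: (s) => s.toUpperCase(),
  lower: (s) => s.toLowerCase(),
};

export function dispatch(name: string, input: string): string {
  const handler = handlers[name];
  if (handler === undefined) {
    throw new Error(`Unknown handler: ${name}`);
  }
  return handler(input);
}
```

The same shape works with static `import` statements instead of inline functions; the point is that the module graph is fixed at load time.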
Let me walk through what the actual workflow looks like when you're using npm packages in Deno 2.
You can use npm packages without any configuration file at all:
```ts
// server.ts
import { Hono } from "npm:hono@4";
import { z } from "npm:zod@3";

const app = new Hono();

const UserSchema = z.object({
  name: z.string().min(1),
  email: z.string().email(),
});

app.post("/users", async (c) => {
  const body = await c.req.json();
  const result = UserSchema.safeParse(body);
  if (!result.success) {
    return c.json({ errors: result.error.flatten() }, 400);
  }
  return c.json({ user: result.data }, 201);
});

export default app;
```

Run it with deno run --allow-net --allow-read server.ts. Deno downloads the packages, caches them globally, and runs. No install step. No node_modules. The first run is slow (downloading), subsequent runs are fast (cached).
If you prefer a more Node.js-like workflow, or if you're migrating an existing project, Deno 2 supports package.json:
```json
{
  "dependencies": {
    "hono": "^4.0.0",
    "zod": "^3.22.0",
    "drizzle-orm": "^0.30.0"
  },
  "devDependencies": {
    "@types/node": "^20.0.0"
  }
}
```

With a package.json present, you don't need the npm: prefix:

```ts
// This works when package.json is present
import { Hono } from "hono";
import { z } from "zod";
```

Run deno install to install packages, and they'll land in node_modules (controlled by the nodeModulesDir setting in deno.json). This approach is best for migrating existing projects.
The sweet spot I've landed on is using deno.json with import maps and explicit npm specifiers:
```jsonc
{
  "imports": {
    "hono": "npm:hono@^4",
    "zod": "npm:zod@^3",
    "@std/path": "jsr:@std/path@^1",
    "@std/assert": "jsr:@std/assert@^1"
  },
  "tasks": {
    "dev": "deno run --watch --allow-net --allow-read --allow-env main.ts",
    "test": "deno test --allow-net --allow-read",
    "lint": "deno lint",
    "fmt": "deno fmt"
  }
}
```

This gives you clean imports without npm: prefixes, pinned versions, and a single configuration file. The jsr: specifier pulls from the JSR registry (more on that later).
JSR (JavaScript Registry) is Deno's answer to npm, but it's designed differently. It's not a replacement for npm. It's a complement that solves specific problems npm doesn't.
The key differences:

- TypeScript-first publishing: you upload .ts source directly, with no build step, and JSR generates documentation and type information from it.
- Mandatory scopes (@scope/name), so every package lives under a namespace.
- Runtime-agnostic: JSR packages can be consumed from Node.js and Bun, not just Deno.
Publishing to JSR is straightforward:
```ts
// mod.ts

/**
 * Calculates the rolling average of a numeric array.
 * @param values - The array of numbers
 * @param window - The rolling window size
 */
export function rollingAverage(values: number[], window: number): number[] {
  if (window <= 0 || window > values.length) {
    throw new RangeError("Window size must be between 1 and array length");
  }
  const result: number[] = [];
  for (let i = window - 1; i < values.length; i++) {
    const slice = values.slice(i - window + 1, i + 1);
    const avg = slice.reduce((a, b) => a + b, 0) / window;
    result.push(avg);
  }
  return result;
}
```

```jsonc
// deno.json
{
  "name": "@yourscope/rolling-stats",
  "version": "1.0.0",
  "exports": "./mod.ts"
}
```

```sh
deno publish
```

That's it. No tsc compilation step. No tsconfig.json to configure. No .npmignore. No prepublishOnly script. You publish the TypeScript, and JSR handles the rest.
The JSR standard library (@std/*) is where this really shines. Deno's standard library was always one of its best features, and on JSR it's even better:
```ts
import { join, resolve } from "@std/path";
import { parse } from "@std/csv";
import { assertEquals } from "@std/assert";
import { delay } from "@std/async";
import { encodeBase64 } from "@std/encoding";
```

These are high-quality, well-tested, well-documented modules. No left-pad situations. No abandoned packages with critical CVEs.
The honest downside: JSR's package count is tiny compared to npm. As of early 2026, JSR has around 8,000 packages. npm has over 2.5 million. For common utilities and standard library functionality, JSR is great. For domain-specific packages (that one library for parsing a specific file format, or the SDK for a specific cloud service), you're still going to npm.
Deno's permission system was always its most interesting security feature, and in Deno 2 it's become more practical to use.
The basic model: Deno denies all system access by default. Your code can't read files, make network requests, access environment variables, or spawn subprocesses unless you explicitly grant permission.
```sh
# Deny everything (default)
deno run server.ts

# Grant specific permissions
deno run --allow-net=0.0.0.0:3000,api.example.com --allow-read=./data --allow-env=DATABASE_URL server.ts

# Grant broad permissions (development mode)
deno run --allow-all server.ts
```

In Deno 1, the permission system felt like a tax. You'd spend time figuring out the right flags, get frustrated, and use --allow-all for everything. Deno 2 improves this with better defaults and a prompt system:
```sh
# If you don't grant a permission, Deno asks interactively
$ deno run server.ts
Deno requests net access to "0.0.0.0:3000". Allow? [y/n/A] (y = yes, n = no, A = allow all)
```

For production, you'd pin the exact permissions:
```jsonc
{
  "tasks": {
    "start": "deno run --allow-net=0.0.0.0:3000 --allow-read=./public,./data --allow-env=DATABASE_URL,PORT,NODE_ENV main.ts"
  }
}
```
}Here's where I've found the permission system genuinely valuable: dependency auditing. When you install an npm package in Node.js, it has full access to your system. It can read your SSH keys, make network requests to arbitrary servers, access your environment variables. Supply chain attacks exploit this. With Deno, if you grant --allow-net=api.example.com and a compromised dependency tries to phone home to a different domain, it fails. Loudly.
I've caught legitimate issues with this. A logging library I was using made an HTTP request to a telemetry endpoint on startup. In Node.js, I never would have noticed. In Deno, the permission system flagged it immediately.
The practical limitation: permissions are process-wide, not per-module. You can't say "this npm package gets network access but that one doesn't." It's all-or-nothing per permission type. The Deno team has discussed granular per-module permissions, but it's not here yet.
For development, I use --allow-all or the Deno task I've configured. For production, I lock down the permissions to exactly what the application needs. This is a meaningful security improvement over Node.js, even if it's not as granular as I'd like.
Every Deno installation comes with tools that would require 5-10 separate npm packages in a Node.js project. This is one of those things that sounds minor until you've lived with it.
### Formatter (deno fmt)

```sh
deno fmt
deno fmt --check    # CI mode
deno fmt src/       # format specific directory
```

It formats TypeScript, JavaScript, JSON, Markdown, and CSS. The formatter is opinionated (similar to Prettier) and fast. You cannot extensively configure it -- this is by design. Two-space indentation. Double quotes. Semicolons. Done.
```jsonc
// deno.json -- the few things you can configure
{
  "fmt": {
    "useTabs": false,
    "lineWidth": 100,
    "indentWidth": 2,
    "singleQuote": false,
    "proseWrap": "always",
    "exclude": ["vendor/"]
  }
}
```

I actually prefer this to Prettier. Not because the output is different (it's very similar), but because there's nothing to install, nothing to configure, no plugins to keep updated. It just works.
### Linter (deno lint)

```sh
deno lint
deno lint --rules   # list all available rules
deno lint src/      # lint specific directory
```

The built-in linter covers most of what ESLint does for a typical project. It has rules for common mistakes, suspicious patterns, and style issues:
```ts
// deno lint catches this
const x = 1;
if (x = 2) { // Suspicious assignment in condition
  console.log("oops");
}

// And this
const arr = [1, 2, 3];
for (const i in arr) { // for-in on array
  console.log(arr[i]);
}
```

You can configure rules in deno.json:
```jsonc
{
  "lint": {
    "rules": {
      "exclude": ["no-unused-vars"],
      "include": ["ban-untagged-todo"]
    }
  }
}
```
}What you don't get: the massive plugin ecosystem of ESLint. No eslint-plugin-react, no framework-specific rules, no custom organization-wide rules. For a server-side project, the built-in linter is plenty. For a complex frontend project with framework-specific patterns, you might still want ESLint.
### Test runner (deno test)

This is where Deno's built-in tooling genuinely shines. The test runner is fast, supports TypeScript natively, and has a clean API:
```ts
// math_test.ts
import { assertEquals, assertThrows } from "@std/assert";

Deno.test("rollingAverage computes correctly", () => {
  const result = rollingAverage([1, 2, 3, 4, 5], 3);
  assertEquals(result, [2, 3, 4]);
});

Deno.test("rollingAverage rejects invalid window", () => {
  assertThrows(
    () => rollingAverage([1, 2, 3], 0),
    RangeError,
    "Window size must be between 1 and array length",
  );
});

Deno.test({
  name: "async test with permissions",
  permissions: { net: true },
  fn: async () => {
    const response = await fetch("https://httpbin.org/get");
    assertEquals(response.status, 200);
    await response.body?.cancel();
  },
});
```

Notice the per-test permission grants. You can specify exactly which permissions each test needs. This is fantastic for ensuring your tests don't accidentally depend on ambient system state.
```sh
deno test                        # run all tests
deno test --filter "rollingAvg"  # filter by name
deno test --parallel             # run in parallel
deno test --coverage=./coverage  # with coverage
deno test --watch                # re-run on changes
deno test --doc                  # test code blocks in JSDoc/markdown
```

The --doc flag deserves special mention. It extracts code blocks from your documentation comments and runs them as tests. If you write a JSDoc example that shows how to use a function, Deno verifies that the example actually works. This is the kind of feature that keeps documentation honest.
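To make that concrete, here's a sketch of a doc-tested utility (the function and its example are mine, not from Deno's docs); the fenced block inside the JSDoc comment is what deno test --doc extracts and executes:

````typescript
/**
 * Clamp a number into the inclusive range [min, max].
 *
 * The example below is run by `deno test --doc`, so it can never
 * silently drift out of sync with the implementation.
 *
 * @example
 * ```ts
 * import { assertEquals } from "@std/assert";
 * import { clamp } from "./clamp.ts";
 *
 * assertEquals(clamp(15, 0, 10), 10);
 * assertEquals(clamp(-3, 0, 10), 0);
 * ```
 */
export function clamp(value: number, min: number, max: number): number {
  return Math.min(Math.max(value, min), max);
}
````

If the implementation changes in a way that breaks the example, the doc test fails in CI like any other test.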
### Benchmarking (deno bench)

```ts
// parse_bench.ts
Deno.bench("JSON.parse small object", () => {
  JSON.parse('{"name": "test", "value": 42}');
});

Deno.bench("JSON.parse large array", () => {
  JSON.parse(JSON.stringify(Array.from({ length: 1000 }, (_, i) => i)));
});

Deno.bench({
  name: "custom parser",
  group: "parsing",
  baseline: true,
  fn: () => {
    customParse('{"name": "test", "value": 42}');
  },
});
```

```sh
deno bench
```

The output is clean and includes statistical information (iterations/second, average time, percentiles). Having this built in means I actually write benchmarks, because the friction is zero. With Node.js, I'd need to install tinybench or benchmark.js, set up a runner, and maintain the configuration. With Deno, I create a _bench.ts file and run deno bench.
Deno's task runner replaces npm scripts:
```jsonc
{
  "tasks": {
    "dev": "deno run --watch --allow-all main.ts",
    "test": "deno test --allow-all --parallel",
    "build": "deno compile --output=myapp main.ts",
    "db:migrate": "deno run --allow-all scripts/migrate.ts",
    "check": "deno fmt --check && deno lint && deno test --allow-all"
  }
}
```

```sh
deno task dev
deno task check
```

One advantage over npm scripts: Deno tasks support cross-platform shell syntax. Pipes, redirects, && chaining, and even $(command) substitution work the same on Windows, macOS, and Linux. If you've ever dealt with cross-env or rimraf just to make npm scripts work on Windows, you'll appreciate this.
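As a sketch of what that buys you (task names are mine; the second task assumes git is on the PATH), the following runs identically on Windows, macOS, and Linux because Deno interprets the command line itself rather than delegating to cmd.exe or /bin/sh:

```jsonc
{
  "tasks": {
    // `rm` and `mkdir` are cross-platform built-ins of Deno's task shell
    "clean": "rm -rf dist && mkdir dist",
    // command substitution and redirects also work the same everywhere
    "stamp": "echo $(git rev-parse --short HEAD) > dist/VERSION"
  }
}
```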
Node.js has been adding TypeScript support gradually (the --experimental-strip-types flag in Node 22, native TypeScript in Node 23+), but it's still stripping types at parse time. Deno's TypeScript support is fundamentally different.
Deno uses TypeScript's compiler for full type checking. When you run deno check main.ts, you get the same type errors you'd get from tsc. But you don't need a tsconfig.json. Deno has sensible defaults:
```jsonc
// Deno's default TypeScript config (you can override in deno.json)
{
  "compilerOptions": {
    "strict": true,
    "noImplicitAny": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true,
    "exactOptionalPropertyTypes": true
  }
}
```
}These are strict defaults. Deno chose to make the strict mode the default, which means new Deno projects start with good type safety out of the box. In Node.js, most tsconfig.json files I see in the wild have strict: false or haven't configured it at all.
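A small illustration of what the strict defaults buy you (my own example, not from Deno's docs): Array.prototype.find returns T | undefined, and under strict: true the compiler refuses to let you dereference the result unchecked:

```typescript
interface User {
  id: number;
  name: string;
}

export function findUserName(users: User[], id: number): string {
  const user = users.find((u) => u.id === id);
  // Without this guard, `user.name` is a compile error under strict mode,
  // because `user` has type `User | undefined`. With `strict: false` it
  // compiles -- and crashes at runtime on a missing id.
  if (user === undefined) {
    return "unknown";
  }
  return user.name;
}
```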
Where this matters practically: you can import .ts files directly. No compilation step. No source maps to configure. No ts-node or tsx to install.
```ts
// utils.ts
export function parseConnectionString(url: string): {
  host: string;
  port: number;
  database: string;
} {
  const parsed = new URL(url);
  return {
    host: parsed.hostname,
    port: parseInt(parsed.port, 10),
    database: parsed.pathname.slice(1),
  };
}

// main.ts
import { parseConnectionString } from "./utils.ts"; // .ts extension required in Deno

const config = parseConnectionString(Deno.env.get("DATABASE_URL")!);
console.log(`Connecting to ${config.host}:${config.port}/${config.database}`);
```

The .ts extension requirement is worth mentioning. In Node.js with TypeScript, you import from ./utils (no extension) or ./utils.js (even though the source is .ts). Deno requires you to use the actual file extension: ./utils.ts. This is more explicit and avoids the confusing Node.js convention of importing .js when the source is .ts.
Deno's commitment to web standard APIs is one of its most practical advantages, especially if you also write code for browsers or edge runtimes.
```ts
// All of these are built-in, matching browser APIs exactly

// Fetch API
const response = await fetch("https://api.example.com/data");
const data = await response.json();

// Web Streams
const readable = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("Hello "));
    controller.enqueue(new TextEncoder().encode("World"));
    controller.close();
  },
});

// WebSocket
const ws = new WebSocket("wss://echo.websocket.org");
ws.onmessage = (event) => console.log(event.data);

// Crypto API
const key = await crypto.subtle.generateKey(
  { name: "AES-GCM", length: 256 },
  true,
  ["encrypt", "decrypt"],
);
const iv = crypto.getRandomValues(new Uint8Array(12));
const encrypted = await crypto.subtle.encrypt(
  { name: "AES-GCM", iv },
  key,
  new TextEncoder().encode("sensitive data"),
);

// URL Pattern (the new way to match URLs)
const pattern = new URLPattern({ pathname: "/users/:id" });
const match = pattern.exec("https://example.com/users/42");
console.log(match?.pathname.groups.id); // "42"

// Structured Clone
const original = { date: new Date(), map: new Map([["key", "value"]]) };
const clone = structuredClone(original);

// AbortController
const controller = new AbortController();
setTimeout(() => controller.abort(), 5000);
const res = await fetch("https://slow-api.example.com", {
  signal: controller.signal,
});

// Cache API
const cache = await caches.open("v1");
await cache.put("/api/data", new Response(JSON.stringify(data)));
```

In Node.js, some of these APIs exist (fetch was added in Node 18, Web Crypto is available), but they're often subtly different from the browser specs or incomplete. Deno's implementations match the browser standards exactly. Code you write using these APIs in Deno will work in a browser with minimal changes, and vice versa.
This matters for a concrete reason: if you write a library using fetch, Request, Response, ReadableStream, TextEncoder, and crypto.subtle, that library works in Deno, in the browser, in Cloudflare Workers, in Vercel Edge Functions, and in any other runtime that supports web standards. You've written portable code without trying.
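As a small illustration, here's a hypothetical helper built only on web-standard globals (URL, URLSearchParams) -- nothing in it is Deno-specific, so the same file runs unchanged in Deno, Node 18+, browsers, and edge runtimes:

```typescript
// Build a canonical cache key for a URL: sort the query parameters so
// logically identical URLs map to the same key, and drop the fragment.
export function canonicalCacheKey(rawUrl: string): string {
  const url = new URL(rawUrl);
  const params = [...url.searchParams.entries()].sort(
    ([a], [b]) => a.localeCompare(b),
  );
  url.search = new URLSearchParams(params).toString();
  url.hash = "";
  return url.toString();
}
```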
Let me walk through what an actual migration looks like. I had a Node.js API server with this structure:
```
my-api/
  src/
    index.ts
    routes/
      users.ts
      health.ts
    middleware/
      auth.ts
      rateLimit.ts
    db/
      client.ts
      queries.ts
    utils/
      logger.ts
      validation.ts
  package.json
  tsconfig.json
  .eslintrc.json
  .prettierrc
  jest.config.ts
```

Here's what the migration to Deno 2 looked like.
The tsconfig.json, .eslintrc.json, .prettierrc, and jest.config.ts all collapse into one deno.json:
```jsonc
{
  "imports": {
    "hono": "npm:hono@^4",
    "drizzle-orm": "npm:drizzle-orm@^0.30",
    "postgres": "npm:postgres@^3",
    "zod": "npm:zod@^3",
    "jose": "npm:jose@^5",
    "@std/assert": "jsr:@std/assert@^1",
    "@std/log": "jsr:@std/log@^0.224"
  },
  "tasks": {
    "dev": "deno run --watch --allow-net --allow-env --allow-read main.ts",
    "start": "deno run --allow-net=0.0.0.0:3000 --allow-env=DATABASE_URL,JWT_SECRET --allow-read=./public main.ts",
    "test": "deno test --allow-all --parallel",
    "check": "deno fmt --check && deno lint && deno check main.ts"
  },
  "fmt": {
    "lineWidth": 100
  },
  "lint": {
    "rules": {
      "exclude": ["no-explicit-any"]
    }
  }
}
```

That's five config files reduced to one. The cognitive overhead reduction is real.
The biggest mechanical change. Every bare import needs to either use the import map or the npm: specifier:
```ts
// Before (Node.js)
import express from "express";
import { Pool } from "pg";
import jwt from "jsonwebtoken";

// After (Deno with import map)
import { Hono } from "hono";
import postgres from "postgres";
import * as jose from "jose";
```

I also switched some libraries. Express to Hono (better TypeScript, web standard APIs). pg to postgres (ESM-native, better types). jsonwebtoken to jose (web standard crypto, no native dependency). These weren't required changes -- Express works fine with npm:express -- but if you're migrating anyway, you might as well pick the libraries that align better with Deno's model.
Some Node.js patterns need updating:
```ts
// Before (Node.js)
import { readFile } from "fs/promises";
import { join } from "path";
import { createHash } from "crypto";

const configPath = join(process.cwd(), "config.json");
const config = JSON.parse(await readFile(configPath, "utf-8"));
const hash = createHash("sha256").update("data").digest("hex");
console.log(`PID: ${process.pid}`);

// After (Deno -- using web standards where possible)
const configPath = new URL("./config.json", import.meta.url);
const config = JSON.parse(await Deno.readTextFile(configPath));
const hashBuffer = await crypto.subtle.digest(
  "SHA-256",
  new TextEncoder().encode("data"),
);
const hash = Array.from(new Uint8Array(hashBuffer))
  .map((b) => b.toString(16).padStart(2, "0"))
  .join("");
console.log(`PID: ${Deno.pid}`);
```

You can also use the node: specifier to keep the Node.js APIs if you prefer:
```ts
import { readFile } from "node:fs/promises";
import { join } from "node:path";
import { createHash } from "node:crypto";
```

This is the pragmatic approach for large codebases. You don't need to rewrite everything to use Deno APIs. The node: compatibility layer works fine.
```ts
// Before (Jest)
import { describe, it, expect } from "@jest/globals";

describe("UserService", () => {
  it("should create a user", async () => {
    const user = await createUser({ name: "Test", email: "test@example.com" });
    expect(user.id).toBeDefined();
    expect(user.name).toBe("Test");
  });
});

// After (Deno test)
import { assertEquals, assertExists } from "@std/assert";

Deno.test("UserService - should create a user", async () => {
  const user = await createUser({ name: "Test", email: "test@example.com" });
  assertExists(user.id);
  assertEquals(user.name, "Test");
});

// Or with test steps for grouping:
Deno.test("UserService", async (t) => {
  await t.step("should create a user", async () => {
    const user = await createUser({ name: "Test", email: "test@example.com" });
    assertExists(user.id);
    assertEquals(user.name, "Test");
  });

  await t.step("should reject duplicate email", async () => {
    // ...
  });
});
```

The migration from Jest/Vitest to Deno's test runner was the most time-consuming part. Not because it's conceptually hard, but because there's a lot of mechanical rewriting. expect(x).toBe(y) becomes assertEquals(x, y). expect(fn).toThrow() becomes assertThrows(fn). Mock patterns are different.
Deno's mocking story is less mature than Jest's. There's no jest.mock() equivalent for mocking entire modules. You can use dependency injection, Deno.test with explicit setup/teardown, or the @std/testing/mock module for function spies. For complex mocking scenarios, I found myself restructuring code to be more testable through dependency injection rather than fighting with mock systems. Arguably a better outcome, but more upfront work.
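A sketch of that restructuring (all names invented for illustration): the service receives its collaborator through the constructor, so a test swaps in a hand-written fake instead of reaching for a jest.mock()-style module mock:

```typescript
// The service depends on an interface, not on a concrete mailer module.
interface Mailer {
  send(to: string, body: string): void;
}

class WelcomeService {
  constructor(private mailer: Mailer) {}

  greet(email: string): void {
    this.mailer.send(email, "Welcome aboard!");
  }
}

// In a test, a fake records calls -- no mocking library required.
const sent: string[] = [];
const fakeMailer: Mailer = { send: (to) => sent.push(to) };
new WelcomeService(fakeMailer).greet("test@example.com");
```

Production code constructs the service with a real SMTP-backed Mailer; tests pass the fake. The same pattern also plays well with @std/testing/mock spies when you want call-count assertions.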
Not everything was smooth. Here's the list of actual issues I hit:
- bcrypt native bindings -- replaced with bcryptjs (pure JS). Performance is slightly worse for hashing but irrelevant for my use case.
- sharp for image processing -- no native addon support. I moved image processing to a separate Node.js microservice. For simpler cases, you could use a WebAssembly-based alternative.
- Dynamic require() calls -- a utility file used require(path) where path was computed at runtime. Refactored to use dynamic import().
- __dirname and __filename -- these Node.js globals don't exist in Deno. Replace with import.meta.dirname and import.meta.filename (available in Deno 2) or use import.meta.url with URL parsing.
- process.env without node:process -- Deno doesn't expose process globally by default. Either import process from "node:process" or use Deno.env.get().
- Jest-specific patterns -- snapshot testing, module mocks, and custom matchers all needed rewriting.
Total migration time for a 4,000-line project: about two days. Most of that was test rewriting. The application code itself took maybe four hours.
Deno Deploy is Deno's edge hosting platform, and it's worth discussing because it's where Deno's advantages compound.
Deploy runs your Deno code on edge servers distributed globally. Cold start times are under 200ms. It supports the full Deno API, including npm packages (with some limitations on native addons, as expected).
The deployment model is simple:
```sh
# Install deployctl
deno install -Arf jsr:@deno/deployctl

# Deploy
deployctl deploy --project=my-api main.ts
```

Or connect your GitHub repository for automatic deployments on push.
What makes Deploy compelling compared to alternatives like Cloudflare Workers or Vercel Edge Functions is that it runs the full Deno API rather than a restricted subset: you can pull in npm packages, read Deno.env, run the full standard library, and use Deploy-integrated primitives like Deno KV and Deno.cron.

```ts
// A complete API with database, scheduled tasks, and edge deployment
const kv = await Deno.openKv();

// HTTP handler
Deno.serve(async (req: Request) => {
  const url = new URL(req.url);
  if (url.pathname === "/api/visits") {
    const key = ["visits", "total"];
    const result = await kv.get<number>(key);
    const count = (result.value ?? 0) + 1;
    await kv.set(key, count);
    return Response.json({ visits: count });
  }
  return new Response("Not Found", { status: 404 });
});

// Cron job -- runs on Deploy
Deno.cron("cleanup old data", "0 0 * * *", async () => {
  const cutoff = Date.now() - 30 * 24 * 60 * 60 * 1000;
  const entries = kv.list({ prefix: ["logs"] });
  for await (const entry of entries) {
    if (entry.value && (entry.value as any).timestamp < cutoff) {
      await kv.delete(entry.key);
    }
  }
});
```

The limitation: Deno Deploy is a managed service with its own pricing. You can't self-host it. If you want the edge deployment model but on your own infrastructure, you're looking at Deno running on your servers with a reverse proxy, which is a fine option but doesn't give you the global edge distribution.
Deno KV is also Deno-specific. If you build your data layer on Deno KV and later want to switch runtimes, you'll need to migrate to a different database. This is a real lock-in concern. For side projects and small services, KV is fantastic. For larger applications where you want runtime portability, use a standard database.
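One mitigation worth considering -- a sketch under my own naming, not an existing Deno API -- is hiding the store behind a minimal interface so call sites never touch Deno.openKv() directly. Swapping KV for Postgres or Redis later then means writing one adapter, not auditing every call site:

```typescript
// Application code depends on this interface only.
interface KeyValueStore {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
  delete(key: string): Promise<void>;
}

// In-memory adapter: useful in tests and as a template for a Deno KV,
// Postgres, or Redis adapter with the same shape.
class MemoryStore implements KeyValueStore {
  private data = new Map<string, string>();

  get(key: string): Promise<string | null> {
    return Promise.resolve(this.data.get(key) ?? null);
  }
  set(key: string, value: string): Promise<void> {
    this.data.set(key, value);
    return Promise.resolve();
  }
  delete(key: string): Promise<void> {
    this.data.delete(key);
    return Promise.resolve();
  }
}
```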
Fresh is Deno's full-stack web framework, and it takes a genuinely different approach from Next.js or Remix. The core idea: no JavaScript is sent to the client by default. Pages are server-rendered, and you opt in to client-side interactivity on a per-component basis using "islands."
```tsx
// routes/index.tsx -- server-rendered, zero JS sent to client
export default function Home() {
  return (
    <div>
      <h1>Welcome to Fresh</h1>
      <p>This page sends zero bytes of JavaScript to the browser.</p>
      <Counter start={0} />
    </div>
  );
}
```

```tsx
// islands/Counter.tsx -- this component hydrates on the client
import { useSignal } from "@preact/signals";

export default function Counter({ start }: { start: number }) {
  const count = useSignal(start);
  return (
    <div>
      <p>Count: {count}</p>
      <button onClick={() => count.value++}>Increment</button>
    </div>
  );
}
```

Fresh uses Preact instead of React, which means smaller bundle sizes but a slightly different ecosystem. Tailwind CSS is supported out of the box. File-system routing. API routes. Middleware.
I built a Fresh application for a client dashboard and the performance numbers were impressive. Lighthouse scores of 98-100 without any optimization work. The "no JS by default" approach means you're not fighting against a 200KB React bundle on every page.
The downsides: Fresh is tied to Deno and Preact. If you need React-specific libraries (like many UI component libraries), you'll need alternatives. The Fresh ecosystem is small. The documentation is good but the community knowledge base (Stack Overflow answers, blog posts, tutorials) is thin compared to Next.js.
Would I choose Fresh for a new project? For content-heavy sites where performance matters and interactivity is minimal, yes. For complex web applications with heavy client-side state, I'd still reach for Next.js or a similar React-based framework.
Performance benchmarks without context are meaningless. So let me give you context: I benchmarked the same HTTP server across all three runtimes, measuring JSON serialization throughput, database queries (PostgreSQL), and cold start time.
**JSON serialization:**

| Runtime | Req/s (p50) | Latency (p99) | Memory |
|---|---|---|---|
| Node.js 22 (Fastify) | 48,200 | 4.1ms | 82MB |
| Deno 2 (Hono) | 52,800 | 3.7ms | 71MB |
| Bun 1.2 (Hono) | 71,400 | 2.8ms | 64MB |
**Database queries (PostgreSQL):**

| Runtime | Req/s (p50) | Latency (p99) | Memory |
|---|---|---|---|
| Node.js 22 (Fastify) | 8,400 | 23ms | 105MB |
| Deno 2 (Hono) | 8,100 | 25ms | 94MB |
| Bun 1.2 (Hono) | 9,200 | 21ms | 88MB |
**Cold start:**

| Runtime | Cold Start |
|---|---|
| Node.js 22 | 180ms |
| Deno 2 | 140ms |
| Bun 1.2 | 85ms |
The takeaway: Bun is fastest across the board. Deno and Node.js are comparable, with Deno having a slight edge in throughput and memory usage. For database-bound workloads (which is most real applications), the differences between all three runtimes are negligible.
Do not choose a runtime based on these numbers. The performance differences will be dwarfed by your application logic, database queries, and network latency. Choose based on developer experience, ecosystem, and operational requirements.
Where Deno has a notable advantage is TypeScript compilation speed. Because Deno uses SWC (a Rust-based compiler) for TypeScript transpilation, the initial startup with TypeScript files is faster than Node.js with ts-node. Compared to tsx (which also uses esbuild/SWC), the difference is smaller.
Deno 2 also added the deno compile command, which bundles your application into a single self-contained binary:
```sh
deno compile --output=myapp --allow-net --allow-env main.ts
```

This produces a standalone executable that includes the Deno runtime and your application code. No runtime installation needed on the target machine. The binary size is around 80-120MB depending on your code, which is large, but the deployment simplicity is hard to beat. Copy one file. Run it. Done.
I want to be honest about where Deno 2 still falls short, because every "Deno is amazing" article glosses over these:
- ORM support -- Prisma works with Deno, but the experience is second-class. Some Prisma features assume Node.js. Drizzle ORM works well and feels more natural in Deno. TypeORM and Sequelize have various compatibility issues.
- Monitoring and APM -- Datadog, New Relic, and other APM tools have Node.js agents that don't work in Deno. OpenTelemetry support is improving but not at parity with Node.js. If your team relies on a specific APM tool, verify Deno compatibility before committing.
- Native addons -- Anything that compiles C/C++ code during npm install won't work. This is a hard limitation. The workarounds (WebAssembly alternatives, separate microservices, FFI) all have tradeoffs.
- Corporate support and LTS -- Node.js has a predictable LTS release cycle backed by the OpenJS Foundation and major corporations. Deno is backed by Deno Land Inc., a startup. If Deno the company has financial trouble, the runtime's future is uncertain. This matters for enterprises making long-term technology choices.
- IDE support -- VS Code support is good (the Deno extension handles formatting, linting, and IntelliSense). JetBrains support is decent. Other editors vary. The main friction point is that IDEs need to know whether a project is a Deno project or a Node.js project, and getting this wrong breaks autocompletion and type checking.
- Library documentation -- npm packages often have examples written for Node.js. When you're using them in Deno with the npm: specifier, you sometimes need to translate the examples. This is getting better as more library authors acknowledge Deno as a target, but it's still a friction point.
- Server frameworks -- Express works but feels legacy. Hono is great and targets multiple runtimes. Oak (Deno-native) is solid but has a smaller community. There's no equivalent of the massive Next.js/Remix/Nuxt ecosystem.
After months of real usage, here's my framework for when Deno 2 is the right choice:
Choose Deno when:

- You're starting a new CLI tool, API server, or utility script and want TypeScript, formatting, linting, testing, and benchmarking with zero setup.
- The permission system's security guarantees matter to you -- for example, constraining what dependencies can read or reach over the network.
- You want single-binary distribution via deno compile.
- Your dependencies are pure JavaScript/TypeScript, with no native addons.

Stick with Node.js when:

- You depend on native addons (sharp, the C++ bcrypt, certain database drivers) or node-gyp builds.
- Your team relies on specific APM tooling, or your frontend toolchain is deeply wired into the Node.js ecosystem.
- You need the predictability of an LTS release cycle and broad corporate backing.

Consider Bun when:

- Raw throughput, startup time, and install speed are the deciding factors.
- You want maximum npm compatibility with a faster runtime and package manager.
Deno 2 is the first version of Deno that I can recommend without caveats for new projects. The npm compatibility is real. The tooling is excellent. The TypeScript experience is the best in the JavaScript ecosystem. The permission system provides meaningful security benefits.
But let me temper that with reality. Deno 2 is not going to replace Node.js. That's not what's happening, and framing it that way does both runtimes a disservice. What Deno 2 does is give you a viable alternative that makes different tradeoffs. Fewer dependencies. Better defaults. Tighter security. More opinionated tooling. In exchange, you accept a smaller ecosystem, less corporate backing, and occasional compatibility friction.
The JSR registry is the most interesting part of the story, and it's the one most people are underrating. If JSR gains critical mass as the TypeScript-first package registry, it changes the dynamics of the entire JavaScript ecosystem. Even Node.js and Bun can consume JSR packages. The registry is not tied to the runtime.
For my own projects, I'm now choosing Deno for new CLI tools, API servers, and utility scripts. I'm keeping Node.js for projects with complex frontend toolchains and native dependency requirements. I'm using Bun as a package manager even in some Deno projects (for the speed) and as a runtime for performance-critical workloads.
The JavaScript runtime landscape is better with three strong options. Competition is pushing all three to improve faster than any single runtime would alone. Node.js 22 added TypeScript support partly because Deno proved developers wanted it. Bun's speed forced both Node.js and Deno to take performance seriously. Deno's security model inspired discussions about permission systems in Node.js.
Deno 2 might not win the runtime war. But it already won the argument about what a JavaScript runtime should look like: TypeScript-native, secure by default, batteries included, aligned with web standards. Every runtime is moving in that direction. Deno just got there first.