
ofetch vs undici in 2026: Server-Side HTTP Compared

PkgPulse Team

TL;DR

ofetch for universal Nuxt/Nitro apps; undici for maximum Node.js HTTP performance. ofetch (~5M weekly downloads) is the unified HTTP client from the Nuxt team — works in browsers, Node.js, and edge runtimes with consistent behavior. undici (~18M downloads) is Node.js's official HTTP/1.1 client built for performance — faster than Node's built-in http module and the basis for native fetch in Node.js 18+.

Key Takeaways

  • undici: ~18M weekly downloads — ofetch: ~5M (npm, March 2026)
  • undici is Node.js official — it's the implementation behind Node's built-in fetch
  • ofetch works universally — browser, Node.js, and edge runtimes
  • undici is faster — designed for high-throughput Node.js servers
  • ofetch has better DX — automatic JSON, retries, error handling

What Each Is For

undici:
  - Node.js official HTTP client (from Node.js core team)
  - Maximum HTTP performance for Node.js servers
  - Low-level control over connection pools, pipelining
  - Powers Node.js 18+ built-in fetch
  - Best for: high-throughput server-to-server requests

ofetch:
  - Universal fetch wrapper from Nuxt team
  - Consistent behavior across environments
  - Used internally by Nuxt, Nitro, and H3
  - Smart defaults (JSON, error handling, retries)
  - Best for: universal apps, Nuxt projects, readable code

The key distinction is which environments each library targets. undici is a Node.js-specific library — it uses Node.js stream APIs and connection-pooling features that don't exist in browsers or edge runtimes. ofetch is environment-agnostic — it builds on the Fetch API standard and adds developer-friendly defaults on top, running identically in a browser, in Cloudflare Workers, or in Node.js.


API Comparison

// undici — low-level but fast
import { fetch, request, pipeline, stream } from 'undici';

// Using undici's high-performance request
const { statusCode, body } = await request('https://api.example.com/users', {
  method: 'GET',
  headers: { Authorization: `Bearer ${token}` },
});
const data = await body.json();

// Connection pool for high-throughput (key undici advantage)
import { Pool } from 'undici';
const pool = new Pool('https://api.example.com', {
  connections: 10,       // 10 concurrent connections
  pipelining: 3,         // HTTP pipelining
});

// 100 requests reusing the pool:
const results = await Promise.all(
  ids.map(id => pool.request({ path: `/users/${id}`, method: 'GET' })
    .then(({ body }) => body.json())
  )
);

// ofetch — developer-friendly universal client
import { ofetch } from 'ofetch';

// GET — automatic JSON, type-safe
const users = await ofetch<User[]>('/api/users', {
  baseURL: 'https://api.example.com',
  headers: { Authorization: `Bearer ${token}` },
});
// users is User[] — typed

// POST — automatic JSON serialization
const user = await ofetch<User>('/api/users', {
  method: 'POST',
  body: { name: 'Alice', email: 'alice@example.com' },
  // body is automatically stringified to JSON
});

// Error handling — throws FetchError on 4xx/5xx
try {
  await ofetch('/api/missing');
} catch (err) {
  if (err.response) {
    console.log(err.status); // 404
    console.log(err.data);   // Parsed error response body
  }
}

The developer experience difference is significant for day-to-day code. ofetch's automatic JSON parsing means you never forget to call .json() on a response body. Its automatic error throwing on 4xx/5xx responses (with the response body attached to the error) eliminates the need for manual status code checking. These defaults are opinionated but correct for the vast majority of API calls.
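To make that concrete, here is roughly what those defaults replace when you call bare fetch yourself. This is a sketch, not ofetch's implementation: the throwaway local server, its port, and the hand-built error shape are invented for illustration.

```typescript
// Sketch: the manual steps ofetch's defaults replace when you use
// bare fetch (Node 18+). The local server stands in for a real API
// returning a 404 with a JSON body.
import { createServer } from 'node:http';

const server = createServer((req, res) => {
  res.writeHead(404, { 'content-type': 'application/json' });
  res.end(JSON.stringify({ error: 'Not found' }));
});
await new Promise<void>((resolve) => server.listen(0, resolve));
const { port } = server.address() as { port: number };

let caught: any;
try {
  const res = await fetch(`http://localhost:${port}/api/missing`);
  if (!res.ok) {
    // fetch does NOT throw on 4xx/5xx: you check the status and
    // parse the error body yourself, then build the error by hand
    const data = await res.json().catch(() => undefined);
    throw Object.assign(new Error(`HTTP ${res.status}`), {
      status: res.status,
      data,
    });
  }
} catch (err) {
  caught = err;
}
server.close();
server.closeAllConnections();
console.log(caught.status, caught.data); // 404 { error: 'Not found' }
```

With ofetch, the entire `if (!res.ok)` branch collapses into catching `FetchError`, as shown in the block above.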

undici's API is lower-level by design. The request() function returns a stream you must explicitly consume — body.json() or body.text(). This gives you more control (you can stream large responses without loading them into memory) but requires more code for simple use cases.
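That streaming path can be sketched with Node 18+'s built-in fetch, which (as noted above) is powered by undici. The local server and the 1 MB payload are stand-ins for a real large response.

```typescript
// Sketch: consuming a large response chunk by chunk instead of
// buffering it. The local server and payload size are illustrative.
import { createServer } from 'node:http';

const server = createServer((req, res) => {
  const chunk = Buffer.alloc(64 * 1024, 'x');
  for (let i = 0; i < 16; i++) res.write(chunk); // 16 x 64 KB = 1 MB
  res.end();
});
await new Promise<void>((resolve) => server.listen(0, resolve));
const { port } = server.address() as { port: number };

const res = await fetch(`http://localhost:${port}/`);
let received = 0;
// Iterate the body stream: peak memory stays at one chunk, not 1 MB
for await (const chunk of res.body!) {
  received += (chunk as Uint8Array).byteLength;
}
server.close();
server.closeAllConnections();
console.log(received); // 1048576 bytes, never all in memory at once
```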


Nuxt / Nitro Integration

// ofetch is used internally by Nuxt and $fetch is the global
// No import needed in Nuxt components:
const data = await $fetch('/api/users');

// In Nuxt server routes (auto-imported):
export default defineEventHandler(async (event) => {
  // $fetch works on both server and client
  const user = await $fetch<User>(`/api/users/${event.context.params.id}`);
  return user;
});

// useFetch composable (wraps ofetch + reactivity):
const { data, pending, error } = await useFetch('/api/users');

If you're using Nuxt 3, ofetch is already your HTTP client — it's the implementation behind $fetch and useFetch. The consistent behavior is the key benefit: when a Nuxt page uses $fetch('/api/users'), it runs on the server during SSR (making a local function call) and on the client during navigation (making an HTTP request to the same URL). The behavior is identical in both environments.


Performance

Benchmark: 10,000 POST requests to localhost (Node.js)

Client          | Req/sec  | Notes
----------------|----------|--------
undici Pool     | 45,000   | Connection pooling
undici fetch    | 38,000   | Standard fetch API
node-fetch      | 22,000   | Legacy wrapper
ofetch          | 20,000   | Universal, some overhead
Axios           | 15,000   | Node http adapter

undici is 2-3x faster than alternatives for server-to-server.
For typical web app use, the difference is negligible.

The performance advantage of undici's Pool is most significant for services that make many repeated requests to the same host. A Node.js service that calls a backend API on every incoming request benefits enormously from connection pooling — rather than establishing a new TCP connection per call, it reuses persistent connections from the pool.

For applications making fewer than 1,000 requests per second to external APIs, the difference between undici and ofetch is negligible. Both are fast enough that HTTP client overhead won't be your bottleneck.
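A minimal sketch of that reuse, using Node 18+'s built-in fetch (undici's pooled keep-alive agent under the hood) against a throwaway local server that counts TCP connections; the server and request count are illustrative:

```typescript
// Sketch: watching connection reuse from Node 18+'s built-in fetch.
// The local server counts how many TCP connections are opened.
import { createServer } from 'node:http';

let connections = 0;
const server = createServer((req, res) => res.end('ok'));
server.on('connection', () => connections++);
await new Promise<void>((resolve) => server.listen(0, resolve));
const { port } = server.address() as { port: number };

// 20 sequential requests to the same origin...
for (let i = 0; i < 20; i++) {
  await (await fetch(`http://localhost:${port}/`)).text();
}
server.close();
server.closeAllConnections();

// ...typically ride one persistent connection rather than opening 20.
// undici's Pool exposes explicit control over this same mechanism
// via its `connections` and `pipelining` options.
console.log(`TCP connections opened: ${connections}`);
```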


Error Handling Patterns

// ofetch — throws on 4xx/5xx with response attached
async function getUser(id) {
  try {
    return await ofetch(`/api/users/${id}`);
  } catch (err) {
    if (err.status === 404) return null;
    if (err.status === 401) throw new AuthenticationError();
    throw err; // Re-throw unexpected errors
  }
}

// ofetch — retry configuration
const api = ofetch.create({
  baseURL: 'https://api.example.com',
  retry: 3,           // Retry up to 3 times
  retryDelay: 500,    // 500ms between retries
  retryStatusCodes: [429, 500, 502, 503, 504], // Status codes to retry
});

// undici — manual error handling (request() needs an absolute URL)
import { request } from 'undici';

async function getUser(id) {
  const { statusCode, body } = await request(`https://api.example.com/api/users/${id}`);

  // dump() discards the unread body so the connection can be reused
  if (statusCode === 404) { await body.dump(); return null; }
  if (statusCode === 401) { await body.dump(); throw new AuthenticationError(); }
  if (statusCode >= 500) { await body.dump(); throw new ServerError(statusCode); }

  return body.json();
}

ofetch's built-in retry logic is a practical advantage for production applications that call external APIs. Network hiccups, rate limit errors, and transient server errors are common, and implementing retry logic correctly (with exponential backoff, jitter, and status code filtering) is non-trivial. ofetch handles this with a few configuration options.
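For comparison, here is roughly what that retry behavior looks like implemented by hand around any request function. This is a sketch, not ofetch's internals; the names `withRetry`, `baseDelay`, and `retryable` are invented for illustration.

```typescript
// Sketch: hand-rolled retry with exponential backoff and jitter,
// approximating what ofetch's retry options do for you.
async function withRetry<T>(
  fn: () => Promise<T>,
  { retries = 3, baseDelay = 500, retryable = [429, 500, 502, 503, 504] } = {},
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err: any) {
      const status = err?.status;
      // Give up after `retries` attempts, or on non-retryable statuses
      if (attempt >= retries || (status && !retryable.includes(status))) {
        throw err;
      }
      // Exponential backoff (500 ms, 1 s, 2 s...) with up to 50% jitter
      const delay = baseDelay * 2 ** attempt * (0.5 + Math.random());
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Wrapping a call looks like `await withRetry(() => ofetch('/api/users'))`, though with ofetch itself you would simply pass the `retry` options shown above.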


Creating API Clients

// ofetch — create a configured instance
import { $fetch } from 'ofetch';

export const apiClient = $fetch.create({
  baseURL: process.env.API_BASE_URL,
  headers: { 'X-API-Key': process.env.API_KEY },
  retry: 2,
  onResponseError({ response }) {
    if (response.status === 401) {
      // Handle token refresh
    }
  },
});

// Usage:
const users = await apiClient<User[]>('/users');
const post = await apiClient<Post>('/posts', { method: 'POST', body: data });

// undici — Pool-based client
import { Pool } from 'undici';

// Pool takes connection options only; it has no default-headers
// option, so send headers with each request
const api = new Pool(process.env.API_BASE_URL, { connections: 5 });
const defaultHeaders = { 'X-API-Key': process.env.API_KEY };

async function get<T>(path: string): Promise<T> {
  const { statusCode, body } = await api.request({
    path,
    method: 'GET',
    headers: defaultHeaders,
  });
  if (statusCode >= 400) throw new Error(`API error: ${statusCode}`);
  return body.json() as Promise<T>;
}

When to Choose

Choose undici when:

  • High-throughput Node.js microservice (10K+ requests/second)
  • You need connection pooling for repeated requests to same host
  • Maximum raw HTTP performance is the priority
  • You're tuning HTTP pipelining or opting into undici's HTTP/2 support

Choose ofetch when:

  • Building with Nuxt or Nitro framework
  • Universal app that runs on browser and server
  • Developer ergonomics over raw performance
  • You want consistent behavior across environments
  • You need automatic retry, JSON parsing, and error handling

Choose native fetch when:

  • Node.js 18+ and you need basic HTTP with no additional features
  • Edge runtime compatibility is required
  • Zero dependencies is a hard constraint

Compare ofetch and undici package health on PkgPulse. Also see our Axios vs ky vs undici comparison and best Node.js API frameworks.
