middy vs AWS Lambda Powertools vs serverless-http: Node.js Lambda Middleware in 2026
TL;DR
Node.js Lambda functions grow fast — and without middleware, every handler becomes a copy-paste graveyard of auth checks, error handling, and JSON parsing. In 2026, three packages dominate this space: middy (the most popular Lambda middleware framework, Express-style middleware for AWS Lambda), @aws-lambda-powertools (AWS's official suite for observability, tracing, and structured logging), and serverless-http (runs your existing Express/Fastify/Hono app on Lambda with zero rewrites). They solve different problems and are often used together: middy for cross-cutting concerns, Powertools for observability, serverless-http if you're migrating an existing HTTP framework.
Key Takeaways
- middy is the Express-style middleware framework for raw Lambda handlers — wraps your handler with a pipeline of middleware for auth, validation, error handling, CORS, warm-up
- @aws-lambda-powertools is AWS's official toolkit for production Lambda functions — structured logging (JSON), tracing (X-Ray), metrics (CloudWatch), idempotency, batch processing
- serverless-http wraps any Node.js HTTP framework (Express, Fastify, Hono, Koa) to run on Lambda — no Lambda-specific rewrites needed
- Cold start impact: middy adds ~2-5ms; Powertools adds ~10-15ms; serverless-http adds ~5-10ms plus your framework's overhead
- They compose: `middy(handler).use(injectLambdaContext(logger))` — Powertools middleware integrates natively into middy's pipeline
- Weekly downloads: middy ~800K; @aws-lambda-powertools/logger ~500K; serverless-http ~900K
The Lambda Middleware Problem
A raw AWS Lambda handler is just a function:
import type { APIGatewayProxyEventV2 } from 'aws-lambda'

export const handler = async (event: APIGatewayProxyEventV2) => {
// You're responsible for everything:
// - Parsing the JSON body
// - Validating the body schema
// - Authenticating the request
// - Handling errors consistently
// - Returning the right status codes
// - Logging with correlation IDs
// - Tracing with X-Ray
// - Returning CORS headers
const body = JSON.parse(event.body ?? '{}')
// ... 50 more lines of boilerplate
}
Without middleware, every handler duplicates this boilerplate. At 10 handlers it's manageable; at 50 it's a maintenance nightmare.
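The fix is to factor that boilerplate into reusable wrappers. As a dependency-free sketch of the idea (this is not middy itself; the `wrap` helper, `Middleware` type, and `jsonBodyParser` here are invented for illustration), a middleware pipeline is just a function that runs `before` hooks ahead of the handler and error hooks after it:

```typescript
type LambdaEvent = { body?: string; parsedBody?: unknown }
type LambdaResult = { statusCode: number; body: string }
type Handler = (event: LambdaEvent) => Promise<LambdaResult>
type Middleware = {
  before?: (event: LambdaEvent) => LambdaResult | void // may short-circuit
  onError?: (err: unknown) => LambdaResult | void
}

// Run each `before` in order (it may return early, e.g. for a warm-up ping),
// then the handler, and on a thrown error walk the chain in reverse
const wrap = (handler: Handler, middlewares: Middleware[]): Handler =>
  async (event) => {
    try {
      for (const mw of middlewares) {
        const early = mw.before?.(event)
        if (early) return early
      }
      return await handler(event)
    } catch (err) {
      for (const mw of [...middlewares].reverse()) {
        const handled = mw.onError?.(err)
        if (handled) return handled
      }
      throw err
    }
  }

// One reusable JSON parser instead of JSON.parse in every handler
const jsonBodyParser: Middleware = {
  before: (event) => { event.parsedBody = JSON.parse(event.body ?? '{}') },
}
```

Every handler in the codebase then shrinks to its business logic; this composition pattern is exactly what middy packages up, with lifecycle hooks, error handling, and an ecosystem of prebuilt middleware.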
middy: The Lambda Middleware Framework
middy v5 is the most widely-used Node.js Lambda middleware framework. It wraps your handler with a composable pipeline — identical conceptually to Express middleware but purpose-built for Lambda's event/response model.
Core Architecture
import middy from '@middy/core'
import httpJsonBodyParser from '@middy/http-json-body-parser'
import httpErrorHandler from '@middy/http-error-handler'
import httpCors from '@middy/http-cors'
import validator from '@middy/validator'
import { transpileSchema } from '@middy/validator/transpile'
// Your actual business logic — no boilerplate
const baseHandler = async (event) => {
const { name, email } = event.body // Already parsed and validated
const user = await createUser({ name, email })
return { statusCode: 201, body: JSON.stringify(user) }
}
const schema = {
type: 'object',
properties: {
body: {
type: 'object',
properties: {
name: { type: 'string', minLength: 1 },
email: { type: 'string', format: 'email' },
},
required: ['name', 'email'],
},
},
}
export const handler = middy(baseHandler)
.use(httpJsonBodyParser()) // Parse JSON body
.use(validator({ eventSchema: transpileSchema(schema) })) // Validate
.use(httpCors()) // CORS headers on every response
.use(httpErrorHandler()) // Consistent error responses
The Official Middleware Ecosystem
middy ships a rich ecosystem of official middleware packages:
import httpSecurityHeaders from '@middy/http-security-headers'
import inputOutputLogger from '@middy/input-output-logger'
import warmUp from '@middy/warmup'
import ssm from '@middy/ssm'
import secretsManager from '@middy/secrets-manager'
import rdsSigner from '@middy/rds-signer'
export const handler = middy(baseHandler)
// Respond immediately to keep-warm pings; attach first so pings
// short-circuit before the SSM/Secrets fetches below run
.use(warmUp())
// Fetch SSM params at cold start, cache for 5 minutes
.use(ssm({
fetchData: {
DB_URL: '/prod/myapp/database-url',
},
cacheExpiry: 5 * 60 * 1000, // 5 minutes
setToContext: true,
}))
// Fetch Secrets Manager secrets
.use(secretsManager({
fetchData: {
stripeKey: 'prod/myapp/stripe-secret',
},
setToContext: true,
}))
// Security headers on all HTTP responses
.use(httpSecurityHeaders())
// Log all inputs and outputs (with redaction)
.use(inputOutputLogger({
logger: (message) => console.log(JSON.stringify(message)),
omitPaths: ['event.body.password', 'event.headers.authorization'],
}))
Error Handling with createError
middy's error handling pattern uses @middy/util's createError to produce HTTP-compatible errors that httpErrorHandler converts to proper API responses:
import { createError } from '@middy/util'
const baseHandler = async (event) => {
const userId = event.pathParameters?.id
const user = await db.users.findById(userId)
if (!user) {
throw createError(404, 'User not found', {
expose: true, // Safe to show to clients
})
}
if (!user.isActive) {
throw createError(403, 'Account suspended', { expose: true })
}
return {
statusCode: 200,
body: JSON.stringify(user),
}
}
export const handler = middy(baseHandler)
.use(httpErrorHandler())
// Result: 404 → { statusCode: 404, body: '{"message":"User not found"}' }
// Result: 500 → { statusCode: 500, body: '{"message":"Internal Server Error"}' }
// (unexposed errors become generic 500s)
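The mechanics behind this pattern are simple enough to sketch without middy: an error carrying a `statusCode` and an `expose` flag maps to a client-facing response, and everything else collapses to a generic 500 so internal details never leak. The `toErrorResponse` helper below is invented for illustration, not httpErrorHandler's actual implementation:

```typescript
type HttpishError = Error & { statusCode?: number; expose?: boolean }

// Map a thrown error to an API Gateway-style response,
// hiding the message unless the error was explicitly exposed
const toErrorResponse = (err: unknown) => {
  const e = err as HttpishError
  if (e?.statusCode && e.expose) {
    return { statusCode: e.statusCode, body: JSON.stringify({ message: e.message }) }
  }
  return { statusCode: 500, body: JSON.stringify({ message: 'Internal Server Error' }) }
}
```

The `expose` flag is the important safety property: a thrown database error with a sensitive message becomes a bland 500 unless someone deliberately opted in.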
@aws-lambda-powertools: Production Observability
@aws-lambda-powertools is AWS's official TypeScript toolkit for production-grade Lambda functions. While middy handles request/response middleware, Powertools handles observability — structured logging, distributed tracing, and custom metrics.
Logger: Structured JSON Logging
Powertools Logger outputs structured JSON with Lambda context automatically injected:
import { Logger } from '@aws-lambda-powertools/logger'
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware'
import middy from '@middy/core'
const logger = new Logger({
serviceName: 'user-service',
logLevel: 'INFO',
})
const baseHandler = async (event) => {
logger.info('Processing user request', {
userId: event.pathParameters.id,
action: 'get-user',
})
const user = await getUser(event.pathParameters.id)
logger.info('User fetched successfully', {
userId: user.id,
plan: user.plan,
})
return { statusCode: 200, body: JSON.stringify(user) }
}
export const handler = middy(baseHandler)
.use(injectLambdaContext(logger, { clearState: true }))
The injectLambdaContext middleware enriches every log line with:
{
"level": "INFO",
"message": "Processing user request",
"service": "user-service",
"timestamp": "2026-03-09T10:00:00.000Z",
"xray_trace_id": "1-abc123-def456",
"cold_start": true,
"function_name": "user-service-prod",
"function_memory_size": "512",
"function_arn": "arn:aws:lambda:...",
"function_request_id": "uuid-here",
"userId": "usr_abc123",
"action": "get-user"
}
This makes CloudWatch Logs Insights queries trivial:
fields @timestamp, level, message, userId, cold_start
| filter level = "ERROR"
| stats count(*) by userId
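The core idea is simple: every log line is a single JSON object, with service-level and per-request context merged into each entry. A minimal sketch of that output shape (this illustrates the concept, not Powertools' implementation; `makeLogger` is invented, though `appendKeys` mirrors the real Logger method of the same name):

```typescript
// Minimal structured logger: one JSON object per line,
// base context and appended keys merged into every entry
const makeLogger = (base: Record<string, unknown>) => {
  const persistent: Record<string, unknown> = {}
  return {
    // Attach per-request context that appears on every later log line
    appendKeys(keys: Record<string, unknown>) { Object.assign(persistent, keys) },
    info(message: string, extra: Record<string, unknown> = {}): string {
      const line = JSON.stringify({ level: 'INFO', message, ...base, ...persistent, ...extra })
      console.log(line)
      return line
    },
  }
}
```

Because every field is a real JSON key rather than text embedded in a message string, CloudWatch Logs Insights can filter and aggregate on it directly, which is what makes the query above work.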
Tracer: X-Ray Distributed Tracing
import { Tracer } from '@aws-lambda-powertools/tracer'
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware'
import { DynamoDBClient } from '@aws-sdk/client-dynamodb'

const tracer = new Tracer({ serviceName: 'user-service' })
// Auto-patch the AWS SDK client so its calls appear as X-Ray subsegments
const dynamodb = tracer.captureAWSv3Client(new DynamoDBClient({}))
// Outgoing HTTP(S) calls are traced automatically when tracing is active
// (disable with POWERTOOLS_TRACER_CAPTURE_HTTPS_REQUESTS=false)
const baseHandler = async (event) => {
// Custom subsegment for business-logic tracing
const segment = tracer.getSegment()
const subsegment = segment?.addNewSubsegment('## validateUser')
if (subsegment) tracer.setSegment(subsegment) // make it the active segment
try {
const user = await db.users.findById(event.pathParameters.id)
tracer.putAnnotation('userId', user.id) // Searchable in X-Ray
tracer.putMetadata('user', user) // Non-searchable detail
return { statusCode: 200, body: JSON.stringify(user) }
} catch (err) {
subsegment?.addError(err as Error)
throw err
} finally {
subsegment?.close()
if (segment) tracer.setSegment(segment) // restore the parent segment
}
}
export const handler = middy(baseHandler)
.use(captureLambdaHandler(tracer))
Metrics: CloudWatch Custom Metrics
import { Metrics, MetricUnit } from '@aws-lambda-powertools/metrics'
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware'
const metrics = new Metrics({
namespace: 'MyApp',
serviceName: 'user-service',
})
const baseHandler = async (event) => {
const startTime = Date.now()
const user = await processUserRequest(event)
metrics.addMetric('UserRequestProcessed', MetricUnit.Count, 1)
metrics.addMetric('ProcessingTime', MetricUnit.Milliseconds, Date.now() - startTime)
metrics.addDimension('plan', user.plan) // Filter metrics by plan in CloudWatch
return { statusCode: 200, body: JSON.stringify(user) }
}
export const handler = middy(baseHandler)
.use(logMetrics(metrics, { captureColdStartMetric: true }))
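Notably, logMetrics never calls the CloudWatch API: it prints a single Embedded Metric Format (EMF) JSON blob to stdout, and CloudWatch extracts the metrics asynchronously, which is why adding metrics costs microseconds rather than an API round-trip. A sketch of the blob's shape (the field names follow the EMF specification; the `buildEmfBlob` helper itself is invented for illustration):

```typescript
type Metric = { name: string; unit: string; value: number }

// Build one EMF blob: metric values sit at the top level,
// and the _aws block tells CloudWatch how to interpret them
const buildEmfBlob = (
  namespace: string,
  dimensions: Record<string, string>,
  metrics: Metric[],
  timestamp = Date.now(),
) => ({
  _aws: {
    Timestamp: timestamp,
    CloudWatchMetrics: [{
      Namespace: namespace,
      Dimensions: [Object.keys(dimensions)],
      Metrics: metrics.map((m) => ({ Name: m.name, Unit: m.unit })),
    }],
  },
  ...dimensions,
  ...Object.fromEntries(metrics.map((m) => [m.name, m.value])),
})
```

Printing `JSON.stringify(buildEmfBlob(...))` on a Lambda log line is, conceptually, all the flush step does.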
Idempotency: Preventing Duplicate Processing
Powertools' idempotency utility prevents duplicate Lambda executions — critical for payment processing, email sends, and any non-idempotent operation:
import { makeHandlerIdempotent } from '@aws-lambda-powertools/idempotency/middleware'
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb'
const persistenceStore = new DynamoDBPersistenceLayer({
tableName: 'IdempotencyTable',
})
export const handler = middy(baseHandler)
.use(makeHandlerIdempotent({
persistenceStore,
expiresAfterSeconds: 3600, // Results cached for 1 hour
}))
// If the same request ID is sent twice within 1 hour:
// → First call: runs handler, caches result
// → Second call: returns cached result immediately, no handler execution
serverless-http: Run Any Framework on Lambda
serverless-http wraps existing Node.js HTTP frameworks to run on Lambda. Instead of rewriting your Express/Fastify/Hono app as Lambda handlers, serverless-http translates between API Gateway events and HTTP requests:
import serverless from 'serverless-http'
import express from 'express'
const app = express()
// Your entire Express app — routes, middleware, error handlers
app.use(express.json())
// `authenticate`, `validateBody`, `createUserSchema`, and `db` are
// app-specific helpers assumed to be defined elsewhere
app.get('/users/:id', authenticate, async (req, res) => {
const user = await db.users.findById(req.params.id)
if (!user) return res.status(404).json({ error: 'Not found' })
res.json(user)
})
app.post('/users', authenticate, validateBody(createUserSchema), async (req, res) => {
const user = await db.users.create(req.body)
res.status(201).json(user)
})
app.use((err, req, res, next) => {
console.error(err)
res.status(err.status ?? 500).json({ error: err.message })
})
// Wrap the Express app — returns a Lambda handler
export const handler = serverless(app)
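What serverless-http actually does is translate shapes in both directions. A sketch of the request half for an API Gateway v1 (REST) event — the event field names are real, while the `eventToRequest` helper is invented to illustrate the translation (the library goes further, building a full http.IncomingMessage):

```typescript
type ApiGatewayV1Event = {
  httpMethod: string
  path: string
  headers?: Record<string, string>
  queryStringParameters?: Record<string, string> | null
  body?: string | null
  isBase64Encoded?: boolean
}

// Translate an API Gateway event into the method/url/headers/body
// an ordinary Node.js HTTP framework expects to receive
const eventToRequest = (event: ApiGatewayV1Event) => {
  const query = new URLSearchParams(event.queryStringParameters ?? {}).toString()
  return {
    method: event.httpMethod,
    url: query ? `${event.path}?${query}` : event.path,
    headers: event.headers ?? {},
    body: event.isBase64Encoded
      ? Buffer.from(event.body ?? '', 'base64') // binary payloads arrive base64-encoded
      : Buffer.from(event.body ?? ''),
  }
}
```

The response half runs the same translation in reverse, collecting the framework's status, headers, and body into the `{ statusCode, headers, body }` object Lambda returns to API Gateway.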
Hono on Lambda via serverless-http
Hono's edge-native design makes it ideal for Lambda — small bundle, fast cold starts:
import serverless from 'serverless-http'
import { Hono } from 'hono'
import { zValidator } from '@hono/zod-validator'
import { z } from 'zod'
const app = new Hono()
app.get('/users/:id', async (c) => {
const user = await db.users.findById(c.req.param('id'))
if (!user) return c.json({ error: 'Not found' }, 404)
return c.json(user)
})
app.post(
'/users',
zValidator('json', z.object({
name: z.string().min(1),
email: z.string().email(),
})),
async (c) => {
const data = c.req.valid('json')
const user = await db.users.create(data)
return c.json(user, 201)
}
)
export const handler = serverless(app)
serverless-http Configuration
export const handler = serverless(app, {
// Map API Gateway binary media types to buffers
binary: ['image/*', 'application/pdf'],
// Custom request/response transformations
request(request, event, context) {
request.lambdaEvent = event
request.lambdaContext = context
},
response(response, event, context) {
response.headers['X-Lambda-Request-Id'] = context.awsRequestId
},
})
Performance Comparison: Cold Start Impact
Cold start performance matters for API latency, especially on new deployments:
| Package | Added Cold Start Time | Notes |
|---|---|---|
| middy (core only) | ~2ms | Negligible |
| middy + 5 middleware | ~8-15ms | Middleware initialization cost |
| @aws-lambda-powertools (all 3) | ~10-20ms | X-Ray patching is the heavy part |
| serverless-http + Express | ~40-80ms | Express initialization |
| serverless-http + Hono | ~8-15ms | Hono's minimal startup |
| serverless-http + Fastify | ~25-40ms | Fastify plugin loading |
Cost and latency tip: set the POWERTOOLS_TRACER_CAPTURE_RESPONSE=false and POWERTOOLS_TRACER_CAPTURE_ERROR=false environment variables to stop Tracer from serializing response and error payloads into trace metadata in cost-sensitive functions (these flags reduce per-invocation overhead rather than cold start time).
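Cold starts are also easy to observe from inside the function, which is how Powertools' `cold_start` log field and captureColdStartMetric work conceptually: module scope runs once per execution environment, so a module-scope flag distinguishes the first invocation from warm ones. A minimal sketch:

```typescript
// Module scope executes once per execution environment,
// i.e. exactly once per cold start
let coldStart = true

const handler = async () => {
  const isColdStart = coldStart
  coldStart = false // every later invocation in this container is warm
  return { statusCode: 200, body: JSON.stringify({ cold_start: isColdStart }) }
}
```

Logging this flag alongside duration lets you separate cold-start latency from steady-state latency in CloudWatch Logs Insights, which is essential when comparing the middleware overheads in the table above.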
Combining All Three: The Production Pattern
In production, using all three packages together is common and they compose cleanly:
import middy from '@middy/core'
import httpJsonBodyParser from '@middy/http-json-body-parser'
import httpErrorHandler from '@middy/http-error-handler'
import httpCors from '@middy/http-cors'
import ssm from '@middy/ssm'
import { Logger } from '@aws-lambda-powertools/logger'
import { Tracer } from '@aws-lambda-powertools/tracer'
import { Metrics, MetricUnit } from '@aws-lambda-powertools/metrics'
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware'
import { captureLambdaHandler } from '@aws-lambda-powertools/tracer/middleware'
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware'
const logger = new Logger({ serviceName: 'order-service' })
const tracer = new Tracer({ serviceName: 'order-service' })
const metrics = new Metrics({ namespace: 'Ecommerce', serviceName: 'order-service' })
const baseHandler = async (event, context) => {
const { productId, quantity } = event.body
const userId = context.user.id // Attached to context by a custom auth middleware (not shown)
logger.info('Creating order', { userId, productId, quantity })
const order = await orderService.create({ userId, productId, quantity })
metrics.addMetric('OrderCreated', MetricUnit.Count, 1)
metrics.addMetric('OrderValue', MetricUnit.None, order.total)
return {
statusCode: 201,
body: JSON.stringify(order),
}
}
export const handler = middy(baseHandler)
// Powertools observability (order matters: logger first for context)
.use(injectLambdaContext(logger, { clearState: true }))
.use(captureLambdaHandler(tracer))
.use(logMetrics(metrics, { captureColdStartMetric: true }))
// Request processing
.use(httpJsonBodyParser())
.use(httpCors({ origin: 'https://myapp.com' }))
// Fetch secrets at cold start
.use(ssm({
fetchData: { STRIPE_KEY: '/prod/stripe-secret' },
cacheExpiry: 15 * 60 * 1000,
setToContext: true,
}))
// Error handling (always last)
.use(httpErrorHandler({ fallbackMessage: 'An error occurred' }))
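The baseHandler above reads context.user, which no built-in middleware provides; that comes from a small custom middleware. middy middleware are plain objects with optional before/after/onError hooks that receive a request object holding the event and context. The sketch below follows that shape, but `verifyToken`, the user fields, and the middleware itself are hypothetical stand-ins, not a real auth implementation:

```typescript
type Request = {
  event: { headers?: Record<string, string> }
  context: Record<string, unknown>
}

// Hypothetical token check; stands in for real JWT/OAuth verification
const verifyToken = (token: string) =>
  token === 'valid-token' ? { id: 'usr_123', plan: 'pro' } : null

// middy-style middleware object: `before` runs ahead of the handler,
// rejects unauthenticated requests, and attaches the user to context
const authMiddleware = {
  before: async (request: Request) => {
    const token = request.event.headers?.authorization?.replace('Bearer ', '')
    const user = token ? verifyToken(token) : null
    if (!user) {
      throw Object.assign(new Error('Unauthorized'), { statusCode: 401, expose: true })
    }
    request.context.user = user
  },
}
```

Attached with `.use(authMiddleware)` before the handlers that need context.user, the thrown 401 flows back through httpErrorHandler like any other exposed HTTP error.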
When to Use Each
Use middy when:
- You're building raw Lambda handlers (not migrating an existing Express app)
- You need a composable middleware pipeline for validation, auth, error handling
- You want the official `@middy/*` packages for SSM, Secrets Manager, and other integrations
- Your team thinks in middleware terms (similar to Express patterns)
Use @aws-lambda-powertools when:
- You need structured logging that plays well with CloudWatch Logs Insights
- You want X-Ray tracing without manual segment management
- You need idempotency for payment processing or email sends
- You're processing SQS/SNS/Kinesis batches and want partial batch failure handling
Use serverless-http when:
- You have an existing Express/Fastify/Hono app you want to deploy to Lambda
- Your team prefers standard HTTP framework patterns over Lambda-specific code
- You're lifting-and-shifting an existing application to serverless
- You want to run the same app locally (via Express) and on Lambda (via serverless-http)
Ecosystem and Community
middy has been around since 2017 and has over 3,700 GitHub stars. The @middy scope on npm contains more than 40 individual middleware packages, covering everything from SQS batch processing to RDS IAM authentication. While not backed by a major cloud provider, middy is independently maintained and widely adopted across the AWS CDK, SST, and Serverless Framework communities.
AWS Lambda Powertools started as a Python library in 2020 before the TypeScript port launched in 2022. The TypeScript version has grown rapidly — each of the three main packages (@aws-lambda-powertools/logger, tracer, metrics) gets around 400-500K weekly downloads. AWS itself uses and recommends Powertools in its Well-Architected Framework workshops. The project is open-source under Apache 2.0 despite being an official AWS product.
serverless-http is a lean, focused package maintained by Doug Moscrop. It doesn't try to be more than what it is: a translation layer between API Gateway events and Node.js HTTP server interfaces. With ~900K weekly downloads, it's the most downloaded of the three on npm, driven by the massive installed base of Express developers moving to Lambda. See also Hono vs Elysia 2026 to decide which lightweight framework pairs best with serverless-http.
Real-World Adoption
middy is standard in the Serverless Framework and SST communities. Most full-stack serverless applications — built with tools like SST v3 or AWS CDK — use middy as the middleware layer in their Lambda handlers. Companies that publish serverless boilerplates and templates almost universally include middy. The @middy/ssm middleware is particularly widely used because fetching SSM parameters at cold start (with caching) is a common pattern for Lambda configuration management. For the AWS SDK v3 that these Lambda handlers call, see @aws-sdk v3 vs v2 migration guide 2026.
AWS Lambda Powertools is mandatory in AWS Well-Architected workshops and recommended in the Serverless Application Lens whitepaper. Any Lambda-heavy AWS shop that has gone through a Well-Architected review has likely been told to adopt structured logging at minimum. Companies running large Lambda fleets — thousands of functions — rely on the structured JSON output for CloudWatch Logs Insights aggregation and X-Ray for distributed tracing across service boundaries.
serverless-http powers a significant portion of the "Express on Lambda" pattern that became popular when teams first moved to serverless but wanted to keep their existing codebases. At scale, companies have found the "one Lambda per app" pattern (with serverless-http) has limitations compared to "one Lambda per route" (with middy), but for small-to-medium APIs it remains a pragmatic and widely used approach.
Developer Experience Deep Dive
middy's TypeScript support improved substantially in v4 and v5. Middleware packages are typed with generics that flow through the handler, so your IDE understands what shape event.body takes after httpJsonBodyParser runs. The middleware chain is readable and maintainable — you can see exactly what a handler does by reading its .use() calls. The primary DX challenge is ordering: some middleware pairs have required ordering (e.g., httpJsonBodyParser before validator, httpErrorHandler last), and violations produce subtle bugs rather than clear errors.
Powertools TypeScript is well-documented and has excellent IDE integration. The structured logger's appendKeys() method is particularly useful for adding per-request context that appears on every subsequent log line. One DX quirk: @aws-lambda-powertools/tracer requires the active-tracing flag on your Lambda function (or a dev flag to disable tracing), otherwise it silently does nothing — easy to miss in local development where X-Ray daemon isn't running.
serverless-http has essentially no configuration complexity — the API is just serverless(app). The tradeoff is debugging: stack traces can be harder to follow when an exception bubbles from your Express middleware through serverless-http's translation layer. HTTP frameworks designed for serverless-first environments like Hono have better cold start performance than Express when used with serverless-http, which matters for latency-sensitive APIs.
Migration Guide
Adopting middy in an existing Lambda project is incremental — wrap one handler at a time. Start with httpJsonBodyParser and httpErrorHandler to normalize error handling, then add ssm or secretsManager to clean up environment variable management. The key insight is that middy doesn't change how Lambda invokes your function; it just wraps the handler function before AWS sees it.
Adding Powertools to a middy-based project is the natural progression. The Powertools middleware integrates directly into the middy pipeline, so you're just adding .use() calls. Add the logger middleware first (it gives you request correlation IDs immediately), then tracer if you have X-Ray enabled, then metrics as your Lambda fleet matures.
Migrating from serverless-http to native middy handlers makes sense when cold start performance becomes critical or when you want per-route Lambda functions instead of one function per app. The migration is conceptual: break each Express route into its own Lambda handler wrapped with middy. If you need shared logic (auth, logging), that becomes middy middleware rather than Express middleware.
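The mechanical step of that migration can be sketched in a few lines: the body of an Express route callback becomes a standalone handler, path matching moves into API Gateway configuration, and `req`/`res` become `event` and a returned result. Everything here (`findUser`, the user shape) is a placeholder for your own code:

```typescript
// Before: app.get('/users/:id', async (req, res) => { ... })
// After: the same logic as a standalone Lambda handler

type PathEvent = { pathParameters?: { id?: string } }

// Placeholder database lookup standing in for a real db call
const findUser = async (id: string) =>
  id === 'u1' ? { id, name: 'Ada' } : null

const getUserHandler = async (event: PathEvent) => {
  const id = event.pathParameters?.id
  const user = id ? await findUser(id) : null
  // res.status(404).json(...) becomes a returned response object
  if (!user) return { statusCode: 404, body: JSON.stringify({ error: 'Not found' }) }
  return { statusCode: 200, body: JSON.stringify(user) }
}
```

Once each route lives in its own handler like this, wrapping it with middy and moving shared auth/logging into middleware is the straightforward final step.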
Final Verdict 2026
Use middy for Lambda handlers that need clean, composable middleware. Use Powertools for observability in any production Lambda function regardless of middleware approach. Use serverless-http when you have existing HTTP framework code or prefer that programming model. The happy path in 2026 is all three together: middy provides the middleware pipeline, Powertools provides production observability, and if you're running an HTTP API on Lambda, serverless-http provides the framework compatibility layer. None of these packages conflicts with the others — the ecosystem is designed for composition.
Performance Benchmarks and Lambda Optimization
Beyond cold start numbers, the ongoing execution overhead of each package matters for cost-sensitive workloads. AWS Lambda pricing is based on duration in milliseconds, so middleware that adds meaningful latency per invocation multiplies across millions of calls.
middy's per-invocation overhead is essentially zero beyond the middleware logic itself. The framework itself is a thin loop over an array of middleware functions — there's no reflection, no dependency injection container, and no runtime code generation. Benchmarks from the middy GitHub repository show less than 0.5ms overhead per invocation for the core framework with five middleware packages active. The SSM and SecretsManager middleware add network latency on first invocation (the SSM call takes 10-30ms), but the built-in caching means subsequent invocations within the cache window have zero overhead.
Powertools' logger is the most performance-impactful component because it runs on every invocation. The structured JSON serialization adds approximately 1-2ms per log statement. For functions that log extensively, this can add up — a function that logs 10 events per invocation at 1M invocations per day generates 10M log statements. The logLevel environment variable (LOG_LEVEL=WARN) lets you reduce logging verbosity in production without code changes. Powertools Tracer's X-Ray patching adds about 5-10ms per cold start for patching the AWS SDK — subsequent invocations run at native speed because the patching is done at initialization.
serverless-http's execution overhead is framework-dependent. The translation between API Gateway events and the Node.js http.IncomingMessage format takes under 1ms. However, frameworks add their own routing and middleware time: Express's router scans registered routes linearly, so apps with hundreds of routes pay more per request than apps with dozens, while Hono's trie-based and compiled-regex routers match in roughly constant time regardless of route count. That routing difference, combined with Hono's much smaller initialization cost, is why the Hono + serverless-http combination beats Express + serverless-http on both cold start and per-request performance.
For production Lambda cost optimization: use middy's warmUp middleware to handle keep-warm pings without executing business logic, set POWERTOOLS_TRACER_CAPTURE_RESPONSE=false to avoid X-Ray recording large response payloads, and use Hono (rather than Express) if bundle size and routing performance matter.
Methodology
- Download data from npmjs.com API, March 2026 weekly averages
- Cold start measurements from AWS Lambda documentation and community benchmarks (us-east-1, Node.js 22 runtime, 512MB)
- Versions: middy v5.x, @aws-lambda-powertools v2.x, serverless-http v3.x
- Sources: middy.js.org, docs.powertools.aws.dev, AWS Lambda documentation
Compare Lambda and serverless packages on PkgPulse — health scores, download trends, and dependency analysis.
Related: Best Node.js Background Job Libraries 2026, Hono vs Elysia 2026, Best Monorepo Tools 2026