Vercel Queues Public Beta Launch
Vercel has moved Vercel Queues out of experimental status and into public beta, making it available to all teams. Built on Vercel's Fluid compute infrastructure, this durable event streaming system addresses a critical need: handling asynchronous work reliably and ensuring that long-running tasks run to completion.
How It Works
Queues follows a clean pub/sub model:
- Message publishing: Send messages from any route handler to a durable topic
- Fan-out delivery: The queue distributes messages to subscribed consumer groups
- Independent processing: Each consumer group processes messages independently with automatic retries
- Guaranteed delivery: Messages are redelivered until successfully processed or expired
The system provides at-least-once delivery semantics with built-in handling for common failure scenarios like function crashes and new deployments rolling out.
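The flow above can be pictured with a small in-process model: one durable topic, several consumer groups, and per-group redelivery until a message is processed or attempts run out. This is illustrative only, not the Vercel Queues SDK; the names here (`Topic`, `subscribe`, `publish`) are assumptions for the sketch.

```typescript
type Handler = (msg: string) => boolean; // true = processed successfully

class Topic {
  private groups: Array<{ name: string; handler: Handler; maxAttempts: number }> = [];

  // Register a consumer group; each group gets its own copy of every message.
  subscribe(name: string, handler: Handler, maxAttempts = 3): void {
    this.groups.push({ name, handler, maxAttempts });
  }

  // Fan-out delivery with at-least-once semantics: a failing handler is
  // retried (the message is redelivered) until it succeeds or maxAttempts
  // is reached. Returns the attempt count per consumer group.
  publish(msg: string): Record<string, number> {
    const attempts: Record<string, number> = {};
    for (const g of this.groups) {
      let n = 0;
      let ok = false;
      while (!ok && n < g.maxAttempts) {
        n++;
        ok = g.handler(msg);
      }
      attempts[g.name] = n;
    }
    return attempts;
  }
}
```

Note that each group retries independently: a crash in one consumer does not delay or block delivery to the others, which is the property the fan-out model is buying.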
Core Features
Vercel Queues includes:
- Synchronous replication across multiple availability zones
- Customizable visibility timeout and delayed delivery
- Idempotency keys for deduplication
- Concurrency control per consumer
- Per-deployment topic partitioning
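Of the features above, idempotency keys are the easiest to picture: a message published twice with the same key is enqueued only once. A toy model of that deduplication, not the Vercel Queues API (the class and method names are made up for the sketch):

```typescript
// Toy model of idempotency-key deduplication. In the real system the
// key store would be durable and replicated; a Set suffices here.
class DedupingQueue {
  private seenKeys = new Set<string>();
  readonly messages: string[] = [];

  // Returns true if the message was enqueued, false if it was
  // deduplicated because the idempotency key was already seen.
  enqueue(msg: string, idempotencyKey: string): boolean {
    if (this.seenKeys.has(idempotencyKey)) return false;
    this.seenKeys.add(idempotencyKey);
    this.messages.push(msg);
    return true;
  }
}
```

This is the producer-side complement to at-least-once delivery: the queue may deliver a message more than once, but publishers retrying a send with the same key will not multiply it.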
Consumer routes configured via vercel.json become private endpoints: they have no public URL and can be invoked only by Vercel's queue infrastructure.
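To see what that guarantee replaces, consider the hand-rolled pattern teams use when a consumer route *is* publicly reachable: checking a shared secret before processing. A generic sketch of that fallback (the header name and secret handling are illustrative; they are not how Vercel's internal invocation works, since private consumer routes have no public URL to protect in the first place):

```typescript
// Generic guard for a publicly reachable queue-consumer route: reject
// any request that does not present the expected shared secret.
// Vercel's private endpoints make this check unnecessary.
function isAuthorizedConsumerCall(
  headers: Map<string, string>,
  secret: string, // illustrative; a real route would load this from config
): boolean {
  // A timing-safe comparison is advisable in production code; plain
  // equality keeps the sketch short.
  const provided = headers.get("x-queue-consumer-secret");
  return secret.length > 0 && provided === secret;
}
```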
Pricing and Integration
Queues billing starts at $0.60 per 1M API operations, making it accessible for a range of workloads. Functions invoked by Queues in push mode use existing Fluid compute rates. Queues also powers Vercel's Workflow feature, which provides higher-level multi-step orchestration on top of the messaging primitives.
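The $0.60-per-1M rate makes back-of-envelope cost estimates simple. A one-line helper (the Fluid compute charges for the functions Queues invokes are billed separately and not modeled here):

```typescript
// Estimate queue operation costs at the published beta rate of
// $0.60 per 1M API operations. Compute charges are excluded.
const DOLLARS_PER_MILLION_OPS = 0.6;

function queueOpsCost(operations: number): number {
  return (operations / 1_000_000) * DOLLARS_PER_MILLION_OPS;
}
```

For example, a service performing 50 million queue operations a month would pay about $30 for those operations, before any compute charges.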
Developers can get started with the Queues documentation and deploy using a free Vercel account.