# Using with Next.js
No special Next.js configuration is needed to use `experimental-agent`: define your agent, create API routes, and start building. For production agents that need durable workflows surviving reconnects and timeouts, opt in with `"use workflow"`.
## Installation
```bash
npm i experimental-agent ai
```

- `experimental-agent` — The agent SDK
- `ai` — Vercel AI SDK for tools and streaming
To use durable workflows, also add `workflow`:

```bash
npm i workflow
```

## Define Your Agent
```ts
import { agent } from "experimental-agent";
import { vercelSandbox } from "experimental-agent/sandbox";

export const myAgent = agent("my-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful assistant.",
  sandbox: vercelSandbox(),
});
```

Agent files can live anywhere under `src/` or `app/`:

- `src/agent.ts`
- `src/agents/code-assistant.ts`
- `app/agent.ts`
Tip: For projects with multiple agents, use file-based routing to auto-discover agents from the filesystem and serve them without writing route handlers.
## API Route Patterns
### POST Handler — Send and Stream
```ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  await session.send(message);

  const stream = await session.stream();
  return createUIMessageStreamResponse({ stream });
}
```

### GET Handler — Reconnect
```ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const session = myAgent.session(chatId);
  const stream = await session.stream();
  return createUIMessageStreamResponse({ stream });
}
```

When the client disconnects mid-stream, it can call the GET endpoint to reconnect and continue receiving the same stream. See Streaming for details.
## Full Example with Context
```ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  await session.send(message, {
    context: {
      authToken: req.headers.get("authorization") ?? "",
    },
  });

  const stream = await session.stream();
  return createUIMessageStreamResponse({ stream });
}

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const session = myAgent.session(chatId);
  const stream = await session.stream();
  return createUIMessageStreamResponse({ stream });
}
```

## Durable Workflows (Opt-in)
For production agents that need to survive reconnects, timeouts, and long-running tool calls, opt into durable workflows using the Vercel Workflow runtime.
### 1. Create a Workflow Function

The `"use workflow"` directive goes inside the function body. The function receives the session ID and send arguments, and calls `session.send()` — it is the workflow:
```ts
import type { SessionSendArgs } from "experimental-agent";
import { myAgent } from "@/agent";

export async function agentWorkflow(
  sessionId: string,
  ...args: SessionSendArgs<typeof myAgent>
) {
  "use workflow";
  return await myAgent.session(sessionId).send(...args);
}
```

### 2. Start from Your API Route
Call `start()` from `workflow/api` to kick off the workflow, then stream from the result:
```ts
import { createUIMessageStreamResponse } from "ai";
import { start } from "workflow/api";
import { myAgent } from "@/agent";
import { agentWorkflow } from "./workflow";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  const result = await start(agentWorkflow, [chatId, message]);

  const stream = await session.stream(result);
  return createUIMessageStreamResponse({ stream });
}
```

### 3. Configure Next.js for Workflow
Wrap your Next.js config with `withWorkflow`:

```ts
import { withWorkflow } from "workflow/next";

export default withWorkflow({
  // your existing Next.js config
});
```

This enables the Vercel Workflow runtime. Only needed when opting into durable workflows.
## Environment Variables
| Variable | Purpose |
|---|---|
| `ANTHROPIC_API_KEY` | API key for Anthropic models |
| `OPENAI_API_KEY` | API key for OpenAI models |
| `VERCEL_OIDC_TOKEN` | Auto-set on Vercel; used for sandbox auth |
Add your model provider key to `.env.local` for development and to your Vercel project settings for production.
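For example, a minimal `.env.local` for local development (the key value below is a placeholder, not a real credential):

```bash
# .env.local (loaded automatically by Next.js in development)
ANTHROPIC_API_KEY=your-anthropic-api-key
```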
## Development vs Production
| Feature | Development | Production (Vercel) |
|---|---|---|
| Sandbox | Local (host filesystem) | Vercel Sandbox |
| Transition | Automatic | Automatic |
If you omit `sandbox` on the agent, it infers the type from the environment:

- Dev — no `VERCEL_OIDC_TOKEN` → `sandbox: { type: "local" }`
- Prod — `VERCEL_OIDC_TOKEN` set → `sandbox: { type: "vercel" }`
Override explicitly if you need different behavior (e.g. `sandbox: { type: "vercel" }` in dev to test the cloud sandbox).
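As a sketch, forcing the cloud sandbox during local development reuses the same `agent()` options shown earlier, assuming `sandbox` accepts the `{ type }` object form directly:

```typescript
import { agent } from "experimental-agent";

export const myAgent = agent("my-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful assistant.",
  // Explicit override: use the Vercel sandbox even when
  // VERCEL_OIDC_TOKEN is not set (i.e. in local development).
  sandbox: { type: "vercel" },
});
```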
## Other Frameworks
The core APIs — `agent()`, `session.send()`, `session.stream()` — work with any JavaScript runtime.
See Framework Guides for other runtimes.
## Next Steps
- File-Based Routing — Auto-discover agents from the filesystem with `withAgents`
- Build Your First Agent — End-to-end tutorial from install to deployment
- API Routes — Detailed API route patterns and error handling
- Concepts — Sessions, tools, sandbox, storage, and streaming
- Quickstart — Get a working agent in 5 minutes