Guides

Using with Next.js

No special Next.js configuration is needed to use experimental-agent. Define your agent, create API routes, and start building. For durable workflows that survive reconnects and timeouts, opt in with "use workflow".

Installation

npm i experimental-agent ai
  • experimental-agent — The agent SDK
  • ai — Vercel AI SDK for tools and streaming

To use durable workflows, also add workflow:

npm i workflow

Define Your Agent

src/agent.ts
import { agent } from "experimental-agent";
import { vercelSandbox } from "experimental-agent/sandbox";

export const myAgent = agent("my-agent", {
  model: "anthropic/claude-opus-4.6",
  system: "You are a helpful assistant.",
  sandbox: vercelSandbox(),
});

Agent files can live anywhere under src/ or app/:

  • src/agent.ts
  • src/agents/code-assistant.ts
  • app/agent.ts

Tip: For projects with multiple agents, use file-based routing to auto-discover agents from the filesystem and serve them without writing route handlers.

API Route Patterns

POST Handler — Send and Stream

app/api/chat/[chatId]/route.ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  await session.send(message);

  const stream = await session.stream();

  return createUIMessageStreamResponse({ stream });
}

GET Handler — Reconnect

app/api/chat/[chatId]/route.ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const session = myAgent.session(chatId);
  const stream = await session.stream();

  return createUIMessageStreamResponse({ stream });
}

When the client disconnects mid-stream, it can call the GET endpoint to reconnect and continue receiving the same stream. See Streaming for details.
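On the client, reconnecting is a plain GET to the same route. A minimal sketch, assuming the GET handler above is mounted at /api/chat/[chatId]; the chunk handling here is an assumption, so adapt the parsing to your UI message protocol:

```typescript
// Client-side reconnect sketch. Assumes the GET route above; chunk
// parsing is left to the caller.
export function chatStreamUrl(chatId: string): string {
  return `/api/chat/${encodeURIComponent(chatId)}`;
}

export async function reconnectToStream(
  chatId: string,
  onChunk: (text: string) => void,
): Promise<void> {
  // GET resumes the in-progress stream; no new message is sent.
  const res = await fetch(chatStreamUrl(chatId));
  if (!res.ok || !res.body) {
    throw new Error(`Reconnect failed with status ${res.status}`);
  }
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```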

Full Example with Context

app/api/chat/[chatId]/route.ts
import { createUIMessageStreamResponse } from "ai";
import { myAgent } from "@/agent";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  await session.send(message, {
    context: {
      authToken: req.headers.get("authorization") ?? "",
    },
  });

  const stream = await session.stream();

  return createUIMessageStreamResponse({ stream });
}

export async function GET(
  _req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const session = myAgent.session(chatId);
  const stream = await session.stream();

  return createUIMessageStreamResponse({ stream });
}
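A matching client call passes the token in the Authorization header that the POST handler reads. A sketch under assumptions: the message shape and the Bearer prefix are illustrative, so match whatever your chat client actually sends:

```typescript
// Client-side sketch: build the POST request that pairs with the route
// above. The message shape and Bearer prefix are assumptions.
export function buildChatRequest(message: unknown, authToken: string) {
  return {
    method: "POST" as const,
    headers: {
      "content-type": "application/json",
      // Read server-side via req.headers.get("authorization")
      authorization: `Bearer ${authToken}`,
    },
    body: JSON.stringify({ message }),
  };
}

// Usage:
// await fetch(`/api/chat/${chatId}`, buildChatRequest({ text: "Hi" }, token));
```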

Durable Workflows (Opt-in)

For production agents that need to survive reconnects, timeouts, and long-running tool calls, opt into durable workflows using the Vercel Workflow runtime.

1. Create a Workflow Function

The "use workflow" directive goes inside the function body. The function receives the session ID and send arguments, and calls session.send() — it is the workflow:

app/api/chat/[chatId]/workflow.ts
import type { SessionSendArgs } from "experimental-agent";
import { myAgent } from "@/agent";

export async function agentWorkflow(
  sessionId: string,
  ...args: SessionSendArgs<typeof myAgent>
) {
  "use workflow";
  return await myAgent.session(sessionId).send(...args);
}

2. Start from Your API Route

Call start() from workflow/api to kick off the workflow, then stream from the result:

app/api/chat/[chatId]/route.ts
import { createUIMessageStreamResponse } from "ai";
import { start } from "workflow/api";
import { myAgent } from "@/agent";
import { agentWorkflow } from "./workflow";

export async function POST(
  req: Request,
  { params }: { params: Promise<{ chatId: string }> }
) {
  const { chatId } = await params;
  const { message } = await req.json();

  const session = myAgent.session(chatId);
  const result = await start(agentWorkflow, [chatId, message]);
  const stream = await session.stream(result);

  return createUIMessageStreamResponse({ stream });
}

3. Configure Next.js for Workflow

Wrap your Next.js config with withWorkflow:

next.config.ts
import { withWorkflow } from "workflow/next";

export default withWorkflow({});

This enables the Vercel Workflow runtime. Only needed when opting into durable workflows.

Environment Variables

Variable             Purpose
ANTHROPIC_API_KEY    API key for Anthropic models
OPENAI_API_KEY       API key for OpenAI models
VERCEL_OIDC_TOKEN    Auto-set on Vercel; used for sandbox auth

Add your model provider key to .env.local for development and to your Vercel project settings for production.
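For example, a minimal .env.local for local development (the key value is a placeholder):

```shell
# .env.local
# Placeholder value; use your real provider key.
ANTHROPIC_API_KEY=sk-ant-your-key-here
```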

Development vs Production

Feature      Development                Production (Vercel)
Sandbox      Local (host filesystem)    Vercel Sandbox
Transition   Automatic                  Automatic

If you omit sandbox on the agent, it infers the type from the environment:

  • Dev — VERCEL_OIDC_TOKEN not set → sandbox: { type: "local" }
  • Prod — VERCEL_OIDC_TOKEN set → sandbox: { type: "vercel" }

Override explicitly if you need different behavior (e.g. sandbox: { type: "vercel" } in dev to test the cloud sandbox).
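The inference described above can be sketched as a small helper. This is an illustration of the documented behavior, not the library's actual implementation:

```typescript
type SandboxConfig = { type: "local" | "vercel" };

// Mirrors the documented default: VERCEL_OIDC_TOKEN is auto-set on
// Vercel, so its presence selects the cloud sandbox.
export function inferSandbox(
  env: Record<string, string | undefined>,
): SandboxConfig {
  return env.VERCEL_OIDC_TOKEN ? { type: "vercel" } : { type: "local" };
}
```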

Other Frameworks

The core APIs — agent(), session.send(), session.stream() — work with any JavaScript runtime.

See Framework Guides for other runtimes.

Next Steps