# Streaming
Real-time streaming with the AI SDK's UIMessageStream protocol. Reconnect on disconnect, handle status updates, and integrate with `useChat`.

Text deltas, tool calls, tool results, reasoning, and status updates flow to the client as they happen. If the client disconnects, call `session.stream()` again to reconnect to the current stream.
## Streaming a Response
After sending a message, get the response stream:
```ts
import { myAgent } from "@/agent";
import { createUIMessageStreamResponse } from "ai";

const session = myAgent.session(chatId);
await session.send(message);

const stream = await session.stream();
return createUIMessageStreamResponse({ stream });
```

`session.stream()` returns a `ReadableStream<UIMessageChunk>`. The stream contains text deltas, tool calls, tool results, reasoning, and transient status updates.
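Instead of returning the stream as a response, you can also consume it directly with the standard Web Streams API, for example to log or post-process output on the server. A minimal sketch, with `UIMessageChunk` narrowed to just the fields read here:

```typescript
// UIMessageChunk narrowed to the fields this sketch reads.
type UIMessageChunk = { type: string; delta?: string };

// Drain the stream and concatenate text deltas, ignoring other chunk
// types (tool calls, reasoning, status updates, etc.).
async function collectText(
  stream: ReadableStream<UIMessageChunk>
): Promise<string> {
  let text = "";
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value.type === "text-delta" && value.delta) text += value.delta;
  }
  return text;
}
```

Note that a `ReadableStream` can only be consumed once, so drain it like this only when you are not also returning it to the client.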
## Workflow Streaming
When using Vercel Workflow, pass the workflow result to `session.stream()`:
```ts
import { start } from "workflow/api";
import { agentWorkflow } from "./workflow";

const session = myAgent.session(chatId);
const result = await start(agentWorkflow, [chatId, message, opts]);

const stream = await session.stream(result);
return createUIMessageStreamResponse({ stream });
```

The workflow result implements `WorkflowRunLike`: `session.stream()` calls `result.getReadable()` to tap into the durable stream. This stream survives crashes, timeouts, and deploys.
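Based on the description above, `WorkflowRunLike` amounts to anything exposing a `getReadable()` method. A sketch of that shape, with a hypothetical stub you might use in tests:

```typescript
// Minimal shape session.stream() relies on, per the text above.
interface WorkflowRunLike {
  getReadable(): ReadableStream;
}

// Hypothetical stub satisfying the interface, e.g. for unit tests.
const stubRun: WorkflowRunLike = {
  getReadable: () =>
    new ReadableStream({
      start(controller) {
        controller.close(); // empty stream
      },
    }),
};
```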
## Reconnection
If the client disconnects mid-stream, call `session.stream()` again to reconnect:
```ts
const session = myAgent.session(chatId);
const stream = await session.stream();
return createUIMessageStreamResponse({ stream });
```

Without a workflow, this reconnects to the active in-memory run (same process only). With a workflow, it reconnects to the durable workflow stream, even after a restart.
If no active stream exists, `session.stream()` throws. Fall back to `session.history()` to load completed state from storage.
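One way to handle that fallback, sketched against a hypothetical `SessionLike` slice of the session API (the real session object may differ):

```typescript
// Hypothetical slice of the session API used by this sketch.
type SessionLike = {
  stream(): Promise<ReadableStream>;
  history(): Promise<unknown[]>;
};

// Try to reconnect to a live stream; on failure, fall back to the
// persisted history so the client can render completed state.
async function streamOrHistory(session: SessionLike) {
  try {
    return { kind: "stream" as const, stream: await session.stream() };
  } catch {
    return { kind: "history" as const, messages: await session.history() };
  }
}
```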
## Status Updates
During long operations, the agent emits `AgentStatus` objects as transient `data-status` chunks. These are not persisted; they're for real-time UI feedback.
| Status Type | When |
|---|---|
| `sandbox-setup` | Sandbox is initializing |
| `sandbox-setup-cold` | Cold start (no snapshot) |
| `loading-skills` | Loading skill instructions |
| `processing-approvals` | Checking approval rules |
| `needs-approval` | Waiting for user approval |
| `thinking` | Model is generating |
| `custom` | Custom status from your code |
Use the status hook for logging or analytics. Parse `data-status` chunks in your frontend for loading indicators and progress UI.
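For example, a small helper that reduces a chunk sequence to the most recent status for a loading indicator (the chunk shapes here are simplified assumptions, not the full protocol types):

```typescript
// Simplified chunk shapes; real chunks carry more fields.
type StatusChunk = {
  type: "data-status";
  data: { type: string; message?: string };
};
type Chunk = StatusChunk | { type: string };

function isStatusChunk(chunk: Chunk): chunk is StatusChunk {
  return chunk.type === "data-status";
}

// The latest data-status chunk wins; returns undefined before the
// first status arrives.
function currentStatus(chunks: Chunk[]): string | undefined {
  const statuses = chunks.filter(isStatusChunk);
  return statuses.length ? statuses[statuses.length - 1].data.type : undefined;
}
```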
## Interruption
Stop the current assistant response:
```ts
await session.send("New message", { interruptIfStreaming: true });
```

Or explicitly:

```ts
await session.interrupt();
```

Pending approvals are auto-rejected when interrupted. See Sessions for details.
## Frontend Integration
The stream works with `@ai-sdk/react`'s `useChat` using a custom transport. Configure one endpoint for sending messages and another for reconnection. The AI SDK handles parsing `UIMessageChunk` and assembling messages.
See the Frontend guide for a complete setup.