Concepts

Streaming

Real-time streaming with the AI SDK's UIMessageStream protocol. Reconnect on disconnect, handle status updates, and integrate with useChat.

Text deltas, tool calls, tool results, reasoning, and status updates flow to the client as they happen. If the client disconnects, call session.stream() again to reconnect to the current stream.

Streaming a Response

After sending a message, get the response stream:

src/chat.ts
import { myAgent } from "@/agent";
import { createUIMessageStreamResponse } from "ai";

const session = myAgent.session(chatId);
await session.send(message);
const stream = await session.stream();
return createUIMessageStreamResponse({ stream });

session.stream() returns a ReadableStream<UIMessageChunk>. The stream contains text deltas, tool calls, tool results, reasoning, and transient status updates.
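The snippet above hands the stream straight to an HTTP response, but you can also consume it directly. A minimal sketch, assuming chunks are objects with a type field and that text deltas carry a delta string — the real UIMessageChunk union has more variants and fields:

```typescript
// Simplified chunk shape for illustration; the real union is richer.
type Chunk = { type: string; delta?: string };

// Drain a chunk stream and concatenate the text deltas.
async function collectText(stream: ReadableStream<Chunk>): Promise<string> {
  const reader = stream.getReader();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value.type === "text-delta" && value.delta) text += value.delta;
  }
  return text;
}

// Synthetic stream standing in for session.stream():
function makeDemo(): ReadableStream<Chunk> {
  return new ReadableStream({
    start(controller) {
      controller.enqueue({ type: "text-delta", delta: "Hel" });
      controller.enqueue({ type: "data-status" });
      controller.enqueue({ type: "text-delta", delta: "lo" });
      controller.close();
    },
  });
}

collectText(makeDemo()).then((text) => console.log(text)); // prints "Hello"
```

In practice you rarely need this loop — createUIMessageStreamResponse and useChat consume the stream for you — but it shows what is flowing over the wire.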

Workflow Streaming

When using Vercel Workflow, pass the workflow result to session.stream():

src/chat.ts
import { start } from "workflow/api";
import { agentWorkflow } from "./workflow";

const session = myAgent.session(chatId);
const result = await start(agentWorkflow, [chatId, message, opts]);
const stream = await session.stream(result);
return createUIMessageStreamResponse({ stream });

The workflow result implements WorkflowRunLike; session.stream() calls result.getReadable() to tap into the durable stream. This stream survives crashes, timeouts, and deploys.
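Based on that description, the shape session.stream() relies on can be sketched as a one-method interface — the name WorkflowRunLike comes from the text above, but the SDK's actual type may carry more:

```typescript
// Assumed minimal shape: session.stream(result) only needs a way to
// obtain the durable readable stream from the workflow run.
interface WorkflowRunLike {
  getReadable(): ReadableStream<unknown>;
}

// Any object exposing getReadable() qualifies, e.g. a thin wrapper:
function asRunLike(stream: ReadableStream<unknown>): WorkflowRunLike {
  return { getReadable: () => stream };
}
```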

Reconnection

If the client disconnects mid-stream, call session.stream() again to reconnect:

src/chat.ts
const session = myAgent.session(chatId);
const stream = await session.stream();
return createUIMessageStreamResponse({ stream });

Without workflow, this reconnects to the active in-memory run (same process only). With workflow, it reconnects to the durable workflow stream — even after a restart.

If no active stream exists, session.stream() throws. Fall back to session.history() to load completed state from storage.
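That reconnect-or-fallback logic can be sketched as follows, using a stand-in interface with the same method names (the return shape of session.history() here is an assumption):

```typescript
type Message = { role: string; content: string };

// Stand-in for the real session object from myAgent.session(chatId).
interface SessionLike {
  stream(): Promise<ReadableStream<unknown>>;
  history(): Promise<Message[]>;
}

// Try to reconnect to the live stream; if none exists, load the
// completed messages from storage instead.
async function streamOrHistory(session: SessionLike) {
  try {
    return { kind: "stream" as const, stream: await session.stream() };
  } catch {
    return { kind: "history" as const, messages: await session.history() };
  }
}
```

The caller branches on kind: render the live stream when a run is active, or the stored messages when it has already finished.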

Status Updates

During long operations, the agent emits AgentStatus objects as transient data-status chunks. These are not persisted — they're for real-time UI feedback.

Status Type           When
sandbox-setup         Sandbox is initializing
sandbox-setup-cold    Cold start (no snapshot)
loading-skills        Loading skill instructions
processing-approvals  Checking approval rules
needs-approval        Waiting for user approval
thinking              Model is generating
custom                Custom status from your code

Use the status hook for logging or analytics. Parse data-status chunks in your frontend for loading indicators and progress UI.
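A loading indicator usually only needs the most recent status. A sketch, assuming data-status chunks carry the AgentStatus object in a data field (the exact payload shape is an assumption):

```typescript
type AgentStatus = { type: string; message?: string };
type UIChunk = { type: string; data?: AgentStatus };

// Scan received chunks and keep the most recent transient status.
function latestStatus(chunks: UIChunk[]): AgentStatus | undefined {
  let status: AgentStatus | undefined;
  for (const chunk of chunks) {
    if (chunk.type === "data-status" && chunk.data) status = chunk.data;
  }
  return status;
}

const status = latestStatus([
  { type: "data-status", data: { type: "sandbox-setup" } },
  { type: "text-delta" },
  { type: "data-status", data: { type: "thinking" } },
]);
console.log(status?.type); // prints "thinking"
```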

Interruption

Stop the current assistant response:

await session.send("New message", { interruptIfStreaming: true });

Or explicitly:

await session.interrupt();

Pending approvals are auto-rejected when interrupted. See Sessions for details.

Frontend Integration

The stream works with @ai-sdk/react's useChat using a custom transport. Configure one endpoint for sending messages and another for reconnection. The AI SDK handles parsing UIMessageChunk and assembling messages.
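Server-side, the two endpoints can share one handler. A sketch using fetch-style Request/Response, with a stand-in session interface and a plain streaming Response in place of createUIMessageStreamResponse — whether you split this into two routes or branch on method is up to your framework:

```typescript
// Stand-in for the real session from myAgent.session(chatId).
interface SessionLike {
  send(message: unknown): Promise<void>;
  stream(): Promise<ReadableStream<Uint8Array>>;
}

async function handleChat(req: Request, session: SessionLike): Promise<Response> {
  if (req.method === "POST") {
    // Send endpoint: persist the user message, then stream the response.
    const body = (await req.json()) as { message: unknown };
    await session.send(body.message);
  }
  // GET doubles as the reconnect endpoint: re-attach to the active stream.
  return new Response(await session.stream(), {
    headers: { "content-type": "text/event-stream" },
  });
}
```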

See the Frontend guide for a complete setup.

Next Steps

  • Sessions — send, stream, interrupt
  • Hooks — status hook for observability
  • Approvals — needs-approval status and resolution
  • Frontend — useChat and custom transport