File: create-ui-message-stream-response.md | Updated: 11/15/2025
createUIMessageStreamResponse
===============================================================================================================================================
The `createUIMessageStreamResponse` function creates a `Response` object that streams UI messages to the client.
```ts
import { openai } from '@ai-sdk/openai';
import {
  createUIMessageStream,
  createUIMessageStreamResponse,
  streamText,
} from 'ai';

const response = createUIMessageStreamResponse({
  status: 200,
  statusText: 'OK',
  headers: {
    'Custom-Header': 'value',
  },
  stream: createUIMessageStream({
    execute({ writer }) {
      // Write custom data
      writer.write({
        type: 'data',
        value: { message: 'Hello' },
      });

      // Write text content
      writer.write({
        type: 'text',
        value: 'Hello, world!',
      });

      // Write source information
      writer.write({
        type: 'source-url',
        value: {
          type: 'source',
          id: 'source-1',
          url: 'https://example.com',
          title: 'Example Source',
        },
      });

      // Merge with LLM stream
      const result = streamText({
        model: openai('gpt-4'),
        prompt: 'Say hello',
      });
      writer.merge(result.toUIMessageStream());
    },
  }),
});
```
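For intuition about what the resulting `Response` carries, the chunks written above reach the client as a Server-Sent Events stream. The following is a minimal, hand-rolled sketch of that framing, not the SDK's actual implementation; the `UIMessageChunk` shape here is simplified and the `sseResponse` helper is hypothetical:

```typescript
// Simplified chunk shape (assumption; the SDK's real chunk types are richer).
type UIMessageChunk = { type: string; value?: unknown };

// Hypothetical helper: frame chunks as SSE events inside a Response,
// roughly what a UI message stream response looks like on the wire.
function sseResponse(chunks: UIMessageChunk[]): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of chunks) {
        // One SSE event per chunk: `data: <json>` followed by a blank line.
        controller.enqueue(
          encoder.encode(`data: ${JSON.stringify(chunk)}\n\n`),
        );
      }
      controller.close();
    },
  });
  return new Response(stream, {
    status: 200,
    headers: { 'Content-Type': 'text/event-stream' },
  });
}

async function main() {
  const res = sseResponse([{ type: 'text', value: 'Hello, world!' }]);
  console.log(await res.text());
}

main();
```

Runs on any runtime with WHATWG `Response` and `ReadableStream` globals (Node 18+, browsers, edge runtimes).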
Parameters
----------

- `stream` (`ReadableStream<UIMessageChunk>`): The UI message stream to send to the client.
- `status` (`number`, optional): The status code for the response. Defaults to `200`.
- `statusText` (`string`, optional): The status text for the response.
- `headers` (`Headers | Record<string, string>`, optional): Additional headers for the response.
- `consumeSseStream` (`(options: { stream: ReadableStream<string> }) => PromiseLike<void> | void`, optional): Optional callback to consume the Server-Sent Events stream.

Returns
-------

`Response`: A `Response` object that streams UI message chunks with the specified status, headers, and content.
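The optional consume callback receives the Server-Sent Events stream as a `ReadableStream<string>`. As a rough illustration of what consuming such a stream involves, here is a hedged sketch of an SSE parser; `consumeSse` is a hypothetical helper, and it assumes each event is a single `data: <json>` line, whereas a real stream may carry other SSE fields:

```typescript
// Sketch: collect JSON payloads from `data:` lines in an SSE text stream.
// Assumption: one JSON document per `data:` line, events separated by a
// blank line, as in the basic SSE wire format.
async function consumeSse(stream: ReadableStream<string>): Promise<unknown[]> {
  const events: unknown[] = [];
  const reader = stream.getReader();
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += value;
    let idx: number;
    // A blank line ("\n\n") terminates each SSE event.
    while ((idx = buffer.indexOf('\n\n')) !== -1) {
      const raw = buffer.slice(0, idx);
      buffer = buffer.slice(idx + 2);
      for (const line of raw.split('\n')) {
        if (line.startsWith('data: ')) {
          events.push(JSON.parse(line.slice('data: '.length)));
        }
      }
    }
  }
  return events;
}

async function main() {
  const stream = new ReadableStream<string>({
    start(controller) {
      controller.enqueue('data: {"type":"text","value":"Hello"}\n\n');
      controller.close();
    },
  });
  console.log(await consumeSse(stream));
}

main();
```

A callback built along these lines could, for example, log or persist each chunk as it streams past without interfering with the response sent to the client.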