File: ai-stream.md | Updated: 11/15/2025
AIStream
==================================================================================
`AIStream` has been removed in AI SDK 4.0. Use `streamText.toDataStreamResponse()` instead.
Creates a readable stream for AI responses. It is built on the `Response` objects returned by `fetch` and serves as the basis for `OpenAIStream` and `AnthropicStream`. It lets you handle AI response streams in a controlled, customizable way that works with `useChat` and `useCompletion`.
`AIStream` will throw an error if the response doesn't have a 2xx status code. This ensures that a stream is only created for successful responses.
```tsx
import { AIStream } from "ai";
```
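The 2xx guard described above can be sketched as follows. This is an illustrative helper, not part of the `ai` package; the exact error message `AIStream` throws may differ.

```typescript
// Hypothetical sketch of the status check performed before streaming:
// non-2xx responses are rejected up front, so a stream is never created
// for a failed request.
function ensureSuccess(response: Response): Response {
  if (!response.ok) {
    // `ok` is true only for status codes in the 200-299 range.
    throw new Error(
      `Failed to convert the response to stream. Received status code: ${response.status}.`,
    );
  }
  return response;
}

// Usage: a 500 response is rejected before any stream work happens.
try {
  ensureSuccess(new Response("oops", { status: 500 }));
} catch (err) {
  console.log((err as Error).message);
}
```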
- `response: Response` — The response object returned by `fetch`. It is used as the source of the readable stream.
- `customParser: (AIStreamParser) => void` — A function used to parse the events in the stream. It should return a function that receives a stringified chunk from the LLM and extracts the message content. That function is expected to return a string or nothing (`void`).
  - `AIStreamParser: (data: string) => string | void`
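A parser matching the `AIStreamParser` signature might look like the sketch below. The chunk shape (`{"text": "..."}` with a `[DONE]` sentinel) is hypothetical; real providers each use their own wire format.

```typescript
// Stand-alone copy of the parser signature from the reference above.
type AIStreamParser = (data: string) => string | void;

// Illustrative parser for an assumed provider that sends JSON chunks
// like {"text":"..."} and signals the end of the stream with "[DONE]".
const parseMyProviderChunk: AIStreamParser = (data) => {
  if (data === "[DONE]") {
    // Returning nothing (void) skips the chunk entirely.
    return;
  }
  const json = JSON.parse(data) as { text?: string };
  // Return only the message content; the stream enqueues the string.
  return json.text;
};

console.log(parseMyProviderChunk('{"text":"Hello"}')); // → "Hello"
```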
- `callbacks: AIStreamCallbacksAndOptions` — An object containing the stream lifecycle callbacks:
  - `onStart: () => Promise<void>` — An optional function called at the start of stream processing.
  - `onCompletion: (completion: string) => Promise<void>` — An optional function called for every completion. It is passed the completion as a string.
  - `onFinal: (completion: string) => Promise<void>` — An optional function called once when the stream is closed, with the final completion message.
  - `onToken: (token: string) => Promise<void>` — An optional function called for each token in the stream. It is passed the token as a string.
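To illustrate when each callback fires, the sketch below drives a locally defined copy of the callbacks shape with a mock token stream. The interface mirrors the fields described above but is declared locally, so the example runs without the `ai` package.

```typescript
// Local stand-in for AIStreamCallbacksAndOptions (illustrative only).
interface StreamCallbacks {
  onStart?: () => Promise<void>;
  onToken?: (token: string) => Promise<void>;
  onCompletion?: (completion: string) => Promise<void>;
  onFinal?: (completion: string) => Promise<void>;
}

// Mock driver showing the firing order: onStart once, onToken per token,
// then onCompletion/onFinal with the accumulated text.
async function runMockStream(
  tokens: string[],
  cb: StreamCallbacks,
): Promise<string> {
  await cb.onStart?.();
  let completion = "";
  for (const token of tokens) {
    completion += token;
    await cb.onToken?.(token);
  }
  await cb.onCompletion?.(completion);
  await cb.onFinal?.(completion);
  return completion;
}

const events: string[] = [];
runMockStream(["Hel", "lo"], {
  onStart: async () => { events.push("start"); },
  onToken: async (t) => { events.push(`token:${t}`); },
  onFinal: async (c) => { events.push(`final:${c}`); },
}).then(() => console.log(events.join(" ")));
// → "start token:Hel token:lo final:Hello"
```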