File: openai-stream.md | Updated: 11/15/2025
# OpenAIStream

> `OpenAIStream` has been removed in AI SDK 4.0.
`OpenAIStream` is part of the legacy OpenAI integration and is not compatible with the AI SDK 3.1 functions. We recommend using the AI SDK OpenAI provider instead.
Transforms the response from OpenAI's language models into a `ReadableStream`.

Note: prior to v4, the official OpenAI SDK did not support the Edge Runtime and only worked in serverless environments. The `openai-edge` package is based on `fetch` instead of `axios` (and thus works in the Edge Runtime), so we recommend using `openai` v4+ or `openai-edge`.
## Import

```ts
import { OpenAIStream } from "ai";
```
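`OpenAIStream` itself is gone from current releases, but the transformation it performed can be sketched with Web-standard APIs: read an OpenAI-style server-sent-events body from a `Response` and emit the text deltas as a `ReadableStream<string>`. Everything below (the `toTextStream` helper and the fabricated SSE payload) is illustrative, not the library's actual implementation.

```typescript
// Illustrative sketch (not the real OpenAIStream): turn a Response whose
// body is an OpenAI-style SSE stream into a ReadableStream of text deltas.
// Assumes each SSE event arrives as a "data: ..." line ending in "\n\n";
// the real implementation handled chunk boundaries and more event shapes.
function toTextStream(response: Response): ReadableStream<string> {
  const decoder = new TextDecoder();
  let buffer = '';
  return new ReadableStream<string>({
    async start(controller) {
      // Node 18+ web ReadableStreams are async-iterable.
      for await (const chunk of response.body as any) {
        buffer += decoder.decode(chunk, { stream: true });
        let end: number;
        while ((end = buffer.indexOf('\n\n')) !== -1) {
          const data = buffer.slice(0, end).replace(/^data: /, '');
          buffer = buffer.slice(end + 2);
          if (data === '[DONE]') continue; // OpenAI's end-of-stream marker
          const delta = JSON.parse(data).choices[0].delta?.content;
          if (delta) controller.enqueue(delta);
        }
      }
      controller.close();
    },
  });
}

// Exercise the helper with a fabricated SSE response.
const sse = [
  'data: {"choices":[{"delta":{"content":"Hello"}}]}',
  'data: {"choices":[{"delta":{"content":" world"}}]}',
  'data: [DONE]',
]
  .map((line) => line + '\n\n')
  .join('');

let output = '';
const reader = toTextStream(new Response(sse)).getReader();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  output += value;
}
console.log(output); // Hello world
```

In a real route handler you would pass `openai.createChatCompletion(...)`'s `Response` rather than a fabricated one; the parsing step is what `OpenAIStream` abstracted away.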
## Parameters

### `response: Response`

The response object returned by a call made by the provider SDK.

### `callbacks?: AIStreamCallbacksAndOptions`

An object containing callback functions to handle the start, each token, and completion of the AI response. If this parameter is omitted, default behavior is implemented.

#### `onStart?: () => Promise<void>`

An optional function that is called at the start of stream processing.

#### `onCompletion?: (completion: string) => Promise<void>`

An optional function that is called for every completion. It is passed the completion as a string.

#### `onFinal?: (completion: string) => Promise<void>`

An optional function that is called once when the stream is closed, with the final completion message.

#### `onToken?: (token: string) => Promise<void>`

An optional function that is called for each token in the stream. It is passed the token as a string.
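The callbacks fire in a fixed order over the life of a stream. The sketch below declares the callback type locally (since `OpenAIStream` is no longer exported) and drives it with a hypothetical `simulate()` helper, which is a stand-in rather than library code, to show that order: start once, then each token, then the full completion when the stream closes.

```typescript
// Callback shape as documented above, declared locally for illustration.
type AIStreamCallbacksAndOptions = {
  onStart?: () => Promise<void>;
  onToken?: (token: string) => Promise<void>;
  onCompletion?: (completion: string) => Promise<void>;
  onFinal?: (completion: string) => Promise<void>;
};

// Hypothetical driver, not library code: fires the callbacks in the
// documented order over a list of tokens.
async function simulate(
  tokens: string[],
  callbacks: AIStreamCallbacksAndOptions,
): Promise<void> {
  await callbacks.onStart?.(); // once, before any tokens
  let completion = '';
  for (const token of tokens) {
    completion += token;
    await callbacks.onToken?.(token); // for each token
  }
  await callbacks.onCompletion?.(completion); // for every completion
  await callbacks.onFinal?.(completion); // once, when the stream closes
}

const events: string[] = [];
await simulate(['Hel', 'lo', '!'], {
  onStart: async () => { events.push('start'); },
  onToken: async (token) => { events.push('token:' + token); },
  onFinal: async (completion) => { events.push('final:' + completion); },
});
console.log(events.join(' | '));
// start | token:Hel | token:lo | token:! | final:Hello!
```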