File: llamaindex-adapter.md | Updated: 11/15/2025
LlamaIndex Adapter
==================
The `@ai-sdk/llamaindex` package provides helper functions to transform LlamaIndex output streams into data streams and data stream responses.
It supports the following functions, all exported from `@ai-sdk/llamaindex`:

`toDataStream`
`(stream: AsyncIterable<EngineResponse>, callbacks?: AIStreamCallbacksAndOptions) => AIStream`
Converts LlamaIndex output streams to a data stream.

`toDataStreamResponse`
`(stream: AsyncIterable<EngineResponse>, options?: { init?: ResponseInit; data?: StreamData; callbacks?: AIStreamCallbacksAndOptions }) => Response`
Converts LlamaIndex output streams to a data stream response.

`mergeIntoDataStream`
`(stream: AsyncIterable<EngineResponse>, options: { dataStream: DataStreamWriter; callbacks?: StreamCallbacks }) => void`
Merges LlamaIndex output streams into an existing data stream.
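To make the conversion concrete, here is a minimal sketch of what turning an output stream into a data stream response involves. This is not the package's actual implementation; `EngineResponseLike` and `toDataStreamResponseSketch` are simplified stand-ins that assume each chunk carries a `delta` text field, and that text parts are encoded as `0:<JSON string>` lines of the AI SDK data stream protocol.

```typescript
// Simplified stand-in for LlamaIndex's EngineResponse (assumption: only
// the text delta matters here).
type EngineResponseLike = { delta: string };

// Hypothetical sketch, not the library's implementation: wrap an async
// iterable of chunks in a streaming Response.
function toDataStreamResponseSketch(
  stream: AsyncIterable<EngineResponseLike>,
  init?: ResponseInit,
): Response {
  const encoder = new TextEncoder();
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of stream) {
        // Text parts are JSON-encoded so newlines inside deltas stay escaped.
        controller.enqueue(encoder.encode(`0:${JSON.stringify(chunk.delta)}\n`));
      }
      controller.close();
    },
  });
  return new Response(body, {
    ...init,
    headers: { 'content-type': 'text/plain; charset=utf-8' },
  });
}
```

The sketch runs on any runtime with web-standard `Response` and `ReadableStream` (Node.js 18+, edge runtimes), which is why the adapter's output can be returned directly from a route handler.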
Convert LlamaIndex ChatEngine Stream
------------------------------------

app/api/completion/route.ts

```ts
import { OpenAI, SimpleChatEngine } from 'llamaindex';
import { toDataStreamResponse } from '@ai-sdk/llamaindex';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const llm = new OpenAI({ model: 'gpt-4o' });
  const chatEngine = new SimpleChatEngine({ llm });

  const stream = await chatEngine.chat({
    message: prompt,
    stream: true,
  });

  return toDataStreamResponse(stream);
}
```
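On the client, this response is normally consumed by an AI SDK hook such as `useCompletion`, but the wire format can also be decoded by hand, for example in a test. The sketch below is a hypothetical helper (`readTextParts` is not part of any package) that assumes text parts arrive as `0:`-prefixed JSON-string lines, as in the protocol sketch above:

```typescript
// Hypothetical helper: reassemble the text deltas from a data stream
// response by parsing its `0:` text-part lines.
async function readTextParts(res: Response): Promise<string> {
  const raw = await res.text(); // buffer the whole stream for simplicity
  let text = '';
  for (const line of raw.split('\n')) {
    if (line.startsWith('0:')) {
      text += JSON.parse(line.slice(2)); // each text part is a JSON string
    }
  }
  return text;
}
```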