
File: testing.md | Updated: 11/15/2025

Source: https://ai-sdk.dev/docs/ai-sdk-core/testing


Testing

===============================================================

Testing language models can be challenging because they are non-deterministic, and calling them is slow and expensive.

To enable you to unit test code that uses the AI SDK, AI SDK Core includes mock providers and test helpers, such as `MockLanguageModelV2`, which you can import from `ai/test`.

With mock providers and test helpers, you can control the output of the AI SDK and test your code in a repeatable and deterministic way without actually calling a language model provider.

Examples


You can use the test helpers with the AI SDK Core functions in your unit tests:

generateText

```ts
import { generateText } from 'ai';
import { MockLanguageModelV2 } from 'ai/test';

const result = await generateText({
  model: new MockLanguageModelV2({
    doGenerate: async () => ({
      finishReason: 'stop',
      usage: { inputTokens: 10, outputTokens: 20, totalTokens: 30 },
      content: [{ type: 'text', text: `Hello, world!` }],
      warnings: [],
    }),
  }),
  prompt: 'Hello, test!',
});
```

streamText

```ts
import { streamText, simulateReadableStream } from 'ai';
import { MockLanguageModelV2 } from 'ai/test';

const result = streamText({
  model: new MockLanguageModelV2({
    doStream: async () => ({
      stream: simulateReadableStream({
        chunks: [
          { type: 'text-start', id: 'text-1' },
          { type: 'text-delta', id: 'text-1', delta: 'Hello' },
          { type: 'text-delta', id: 'text-1', delta: ', ' },
          { type: 'text-delta', id: 'text-1', delta: 'world!' },
          { type: 'text-end', id: 'text-1' },
          {
            type: 'finish',
            finishReason: 'stop',
            logprobs: undefined,
            usage: { inputTokens: 3, outputTokens: 10, totalTokens: 13 },
          },
        ],
      }),
    }),
  }),
  prompt: 'Hello, test!',
});
```
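The streamed text is assembled from the `text-delta` chunks emitted between `text-start` and `text-end`. As a self-contained sketch of that assembly (plain TypeScript with a simplified chunk shape, no SDK dependency):

```ts
// Simplified chunk union for illustration only; the SDK's real stream part
// types carry additional fields (usage, logprobs, etc.).
type StreamChunk =
  | { type: 'text-start'; id: string }
  | { type: 'text-delta'; id: string; delta: string }
  | { type: 'text-end'; id: string }
  | { type: 'finish'; finishReason: string };

// Concatenate all text deltas into the final text.
function collectText(chunks: StreamChunk[]): string {
  let text = '';
  for (const chunk of chunks) {
    if (chunk.type === 'text-delta') text += chunk.delta;
  }
  return text;
}

const chunks: StreamChunk[] = [
  { type: 'text-start', id: 'text-1' },
  { type: 'text-delta', id: 'text-1', delta: 'Hello' },
  { type: 'text-delta', id: 'text-1', delta: ', ' },
  { type: 'text-delta', id: 'text-1', delta: 'world!' },
  { type: 'text-end', id: 'text-1' },
  { type: 'finish', finishReason: 'stop' },
];

console.log(collectText(chunks)); // Hello, world!
```

This is the same reduction you would observe when iterating `result.textStream` from the example above.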

generateObject

```ts
import { generateObject } from 'ai';
import { MockLanguageModelV2 } from 'ai/test';
import { z } from 'zod';

const result = await generateObject({
  model: new MockLanguageModelV2({
    doGenerate: async () => ({
      finishReason: 'stop',
      usage: { inputTokens: 10, outputTokens: 20, totalTokens: 30 },
      content: [{ type: 'text', text: `{"content":"Hello, world!"}` }],
      warnings: [],
    }),
  }),
  schema: z.object({ content: z.string() }),
  prompt: 'Hello, test!',
});
```
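Note that the mocked text content must be valid JSON that matches the schema, since `generateObject` parses and validates it. A quick self-contained check of that payload (plain TypeScript, no SDK or zod involved):

```ts
// The mock returns this string as the model output; it must parse to an
// object with a string `content` property to satisfy the schema above.
const mockText = `{"content":"Hello, world!"}`;
const obj = JSON.parse(mockText) as { content: string };

console.log(obj.content); // Hello, world!
```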

streamObject

```ts
import { streamObject, simulateReadableStream } from 'ai';
import { MockLanguageModelV2 } from 'ai/test';
import { z } from 'zod';

const result = streamObject({
  model: new MockLanguageModelV2({
    doStream: async () => ({
      stream: simulateReadableStream({
        chunks: [
          { type: 'text-start', id: 'text-1' },
          { type: 'text-delta', id: 'text-1', delta: '{ ' },
          { type: 'text-delta', id: 'text-1', delta: '"content": ' },
          { type: 'text-delta', id: 'text-1', delta: `"Hello, ` },
          { type: 'text-delta', id: 'text-1', delta: `world` },
          { type: 'text-delta', id: 'text-1', delta: `!"` },
          { type: 'text-end', id: 'text-1' },
          {
            type: 'finish',
            finishReason: 'stop',
            logprobs: undefined,
            usage: { inputTokens: 3, outputTokens: 10, totalTokens: 13 },
          },
        ],
      }),
    }),
  }),
  schema: z.object({ content: z.string() }),
  prompt: 'Hello, test!',
});
```
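The deltas above concatenate to one complete JSON document, which is what allows `streamObject` to produce the final object. A self-contained check (no SDK involved) that the concatenation parses to the expected shape:

```ts
// The delta strings from the mocked stream, in order.
const deltas = ['{ ', '"content": ', '"Hello, ', 'world', '!"', ' }'];

// Joined, they form `{ "content": "Hello, world!" }`.
const json = deltas.join('');
const parsed = JSON.parse(json) as { content: string };

console.log(parsed.content); // Hello, world!
```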

Simulate UI Message Stream Responses

You can also simulate UI Message Stream responses for testing, debugging, or demonstration purposes.
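`simulateReadableStream` emits the chunks you give it, with configurable delays before the first chunk and between chunks. As a rough, self-contained sketch of that behavior (a hypothetical re-implementation for illustration, not the SDK's actual code):

```ts
// Hypothetical sketch: emit each chunk after a fixed delay, then close.
function delayedStream<T>(chunks: T[], chunkDelayInMs: number): ReadableStream<T> {
  let i = 0;
  return new ReadableStream<T>({
    async pull(controller) {
      if (i >= chunks.length) {
        controller.close();
        return;
      }
      await new Promise(resolve => setTimeout(resolve, chunkDelayInMs));
      controller.enqueue(chunks[i++]);
    },
  });
}

// Helper to drain a stream into an array.
async function readAll<T>(stream: ReadableStream<T>): Promise<T[]> {
  const reader = stream.getReader();
  const out: T[] = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    out.push(value as T);
  }
  return out;
}

readAll(delayedStream(['a', 'b', 'c'], 10)).then(chunks => {
  console.log(chunks.join('')); // abc
});
```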

Here is a Next.js route handler example:

route.ts

```ts
import { simulateReadableStream } from 'ai';

export async function POST(req: Request) {
  return new Response(
    simulateReadableStream({
      initialDelayInMs: 1000, // Delay before the first chunk
      chunkDelayInMs: 300, // Delay between chunks
      chunks: [
        `data: {"type":"start","messageId":"msg-123"}\n\n`,
        `data: {"type":"text-start","id":"text-1"}\n\n`,
        `data: {"type":"text-delta","id":"text-1","delta":"This"}\n\n`,
        `data: {"type":"text-delta","id":"text-1","delta":" is an"}\n\n`,
        `data: {"type":"text-delta","id":"text-1","delta":" example."}\n\n`,
        `data: {"type":"text-end","id":"text-1"}\n\n`,
        `data: {"type":"finish"}\n\n`,
        `data: [DONE]\n\n`,
      ],
    }).pipeThrough(new TextEncoderStream()),
    {
      status: 200,
      headers: {
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        Connection: 'keep-alive',
        'x-vercel-ai-ui-message-stream': 'v1',
      },
    },
  );
}
```
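Each chunk in the response above is a server-sent event: a line starting with `data: ` carrying a JSON payload, with `data: [DONE]` marking the end of the stream. Decoding such lines on the receiving side might look like this (a hypothetical helper for illustration, not an SDK API):

```ts
// Hypothetical SSE-line parser: extract each JSON payload, stop at [DONE].
function parseSseLines(raw: string): unknown[] {
  const events: unknown[] = [];
  for (const line of raw.split('\n')) {
    if (!line.startsWith('data: ')) continue; // skip blank separator lines
    const payload = line.slice('data: '.length);
    if (payload === '[DONE]') break; // end-of-stream sentinel
    events.push(JSON.parse(payload));
  }
  return events;
}

const raw =
  `data: {"type":"start","messageId":"msg-123"}\n\n` +
  `data: {"type":"text-delta","id":"text-1","delta":"This"}\n\n` +
  `data: [DONE]\n\n`;

console.log(parseSseLines(raw).length); // 2
```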
