File: default-settings-middleware.md | Updated: 11/15/2025
defaultSettingsMiddleware
=========================
defaultSettingsMiddleware is a middleware function that applies default settings to language model calls. This is useful when you want to establish consistent default parameters across multiple model invocations.
```ts
import { defaultSettingsMiddleware } from 'ai';

const middleware = defaultSettingsMiddleware({
  settings: {
    temperature: 0.7,
    maxOutputTokens: 1000,
    // other settings...
  },
});
```
The middleware accepts a configuration object with the following properties:

- `settings`: An object containing default parameter values to apply to language model calls. These can include any valid `LanguageModelV2CallOptions` properties as well as optional provider metadata.

It returns a middleware object that merges the default settings into the parameters of each language model call; parameters provided at the call site take precedence over the defaults.
```ts
import { streamText, wrapLanguageModel, defaultSettingsMiddleware } from 'ai';
import { openai } from '@ai-sdk/openai';

// Create a model with default settings
const modelWithDefaults = wrapLanguageModel({
  model: openai('gpt-4'),
  middleware: defaultSettingsMiddleware({
    settings: {
      temperature: 0.5,
      maxOutputTokens: 800,
      providerMetadata: {
        openai: {
          tags: ['production'],
        },
      },
    },
  }),
});

// Use the model - default settings will be applied
const result = await streamText({
  model: modelWithDefaults,
  prompt: 'Your prompt here',
  // These parameters will override the defaults
  temperature: 0.8,
});
```
The middleware merges the default settings into the call parameters, so any parameter passed directly to the call (such as `temperature: 0.8` above) overrides the corresponding default.
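The merge precedence can be sketched in plain TypeScript. This is a simplified illustration of the behavior described above, not the library's actual implementation; the `applyDefaults` helper and `CallSettings` type are hypothetical names introduced here:

```typescript
// Hypothetical sketch: call-site parameters win, defaults fill the gaps.
type CallSettings = {
  temperature?: number;
  maxOutputTokens?: number;
};

function applyDefaults(
  defaults: CallSettings,
  params: CallSettings,
): CallSettings {
  // Spread defaults first so explicitly provided parameters override them.
  return { ...defaults, ...params };
}

const merged = applyDefaults(
  { temperature: 0.5, maxOutputTokens: 800 },
  { temperature: 0.8 },
);
// merged.temperature is 0.8 (call-site value), merged.maxOutputTokens is 800 (default)
```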