File: use-chat-custom-request-options.md | Updated: 11/15/2025
Custom headers, body, and credentials not working with useChat
==============================================================
When using the useChat hook, custom request options like headers, body fields, and credentials configured directly on the hook are not being sent with the request:
```tsx
// These options are not sent with the request
const { messages, sendMessage } = useChat({
  headers: {
    Authorization: 'Bearer token123',
  },
  body: {
    user_id: '123',
  },
  credentials: 'include',
});
```
The useChat hook has changed its API for configuring request options. Direct options like headers, body, and credentials on the hook itself are no longer supported. Instead, you need to use the transport configuration with DefaultChatTransport or pass options at the request level.
There are three ways to properly configure request options with useChat:
Option 1: Request-Level Configuration (Recommended for Dynamic Values)
----------------------------------------------------------------------

For dynamic values that change over time, the recommended approach is to pass options when calling sendMessage:
```tsx
const { messages, sendMessage } = useChat();

// Send options with each message
sendMessage(
  { text: input },
  {
    headers: {
      Authorization: `Bearer ${getAuthToken()}`, // Dynamic auth token
      'X-Request-ID': generateRequestId(),
    },
    body: {
      temperature: 0.7,
      max_tokens: 100,
      user_id: getCurrentUserId(), // Dynamic user ID
      sessionId: getCurrentSessionId(), // Dynamic session
    },
  },
);
```
This approach ensures that the most up-to-date values are always sent with each request.
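On the server, these extra body fields arrive merged into the same JSON payload as the messages. A minimal framework-free sketch of that payload shape (the `ChatRequestBody` type, `parseChatBody` helper, and field names are illustrative, mirroring the client example above, and are not part of the SDK):

```typescript
// Illustrative shape of the JSON body the route handler receives:
// custom fields sit alongside `messages` in one flat object.
interface ChatRequestBody {
  messages: unknown[];
  temperature?: number;
  max_tokens?: number;
  user_id?: string;
  sessionId?: string;
}

// Hypothetical helper: parse the raw request body into the shape above.
function parseChatBody(raw: string): ChatRequestBody {
  return JSON.parse(raw) as ChatRequestBody;
}

// Simulate what the client sends when sendMessage includes a custom body.
const body = parseChatBody(
  JSON.stringify({
    messages: [{ role: 'user', content: 'Hi' }],
    temperature: 0.7,
    user_id: '123',
  }),
);

console.log(body.user_id); // '123'
console.log(body.temperature); // 0.7
```

In a real route handler you would typically destructure these fields from `await req.json()` before invoking the model.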
Option 2: Hook-Level Configuration with Static Values
-----------------------------------------------------

For static values that don't change during the component lifecycle, use the DefaultChatTransport:
```tsx
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
    headers: {
      'X-API-Version': 'v1', // Static API version
      'X-App-ID': 'my-app', // Static app identifier
    },
    body: {
      model: 'gpt-4o', // Default model
      stream: true, // Static configuration
    },
    credentials: 'include', // Static credentials policy
  }),
});
```
Option 3: Hook-Level Configuration with Resolvable Functions
------------------------------------------------------------

If you need dynamic values at the hook level, you can use functions that return configuration values. However, request-level configuration is generally preferred for better reliability:
```tsx
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
    headers: () => ({
      Authorization: `Bearer ${getAuthToken()}`,
      'X-User-ID': getCurrentUserId(),
    }),
    body: () => ({
      sessionId: getCurrentSessionId(),
      preferences: getUserPreferences(),
    }),
    credentials: () => (isAuthenticated() ? 'include' : 'same-origin'),
  }),
});
```
For component state that changes over time, request-level configuration (Option 1) is recommended. If using hook-level functions, consider using useRef to store current values and reference ref.current in your configuration function.
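The reason the ref pattern works is that the configuration function reads the value lazily, at request time, rather than capturing it once. A framework-free sketch of that idea (the `userIdRef` name and header are illustrative; in React you would use `useRef` instead of a plain object):

```typescript
// A mutable "ref" object holds the latest value; the configuration
// function reads ref.current only when it is actually invoked.
const userIdRef = { current: 'user-1' };

// Equivalent in spirit to `headers: () => ({ ... })` on DefaultChatTransport.
const headers = () => ({ 'X-User-ID': userIdRef.current });

// Later, component state changes and the ref is updated...
userIdRef.current = 'user-2';

// ...so the next invocation picks up the fresh value, not a stale closure.
console.log(headers()['X-User-ID']); // 'user-2'
```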
Combining Hook and Request Level Options
----------------------------------------

Request-level options take precedence over hook-level options:
```tsx
// Hook-level default configuration
const { messages, sendMessage } = useChat({
  transport: new DefaultChatTransport({
    api: '/api/chat',
    headers: {
      'X-API-Version': 'v1',
    },
    body: {
      model: 'gpt-4o',
    },
  }),
});
```
```tsx
// Override or add options per request
sendMessage(
  { text: input },
  {
    headers: {
      'X-API-Version': 'v2', // This overrides the hook-level header
      'X-Request-ID': '123', // This is added
    },
    body: {
      model: 'gpt-4o-mini', // This overrides the hook-level body field
      temperature: 0.5, // This is added
    },
  },
);
```
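The observable behavior is equivalent to a shallow, per-field merge in which request-level values win. A minimal sketch of that merge semantics (illustrative only, not the SDK's actual implementation):

```typescript
type Options = Record<string, unknown>;

// Shallow merge: request-level fields overwrite hook-level fields with
// the same key; fields unique to either side are kept.
function effectiveOptions(hookLevel: Options, requestLevel: Options): Options {
  return { ...hookLevel, ...requestLevel };
}

const mergedHeaders = effectiveOptions(
  { 'X-API-Version': 'v1' },
  { 'X-API-Version': 'v2', 'X-Request-ID': '123' },
);

console.log(mergedHeaders); // { 'X-API-Version': 'v2', 'X-Request-ID': '123' }
```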
For more details on request configuration, see the Request Configuration documentation.