File: use-chat-stale-body-data.md | Updated: 11/15/2025
Stale body values with useChat
==================================================================================================================================
When using useChat and passing dynamic information via the body parameter at the hook level, the data remains stale and only reflects the value from the initial component render. This occurs because the body configuration is captured once when the hook is initialized and doesn't update with subsequent component re-renders.
```tsx
// Problematic code - body data will be stale
export default function Chat() {
  const [temperature, setTemperature] = useState(0.7);
  const [userId, setUserId] = useState('user123');

  // This body configuration is captured once and won't update
  const { messages, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
      body: {
        temperature, // Always the initial value (0.7)
        userId, // Always the initial value ('user123')
      },
    }),
  });

  // Even if temperature or userId change, requests will still use the initial values
  return (
    <div>
      <input
        type="range"
        value={temperature}
        onChange={e => setTemperature(parseFloat(e.target.value))}
      />
      {/* Chat UI */}
    </div>
  );
}
```
The hook-level body configuration is evaluated once during the initial render and doesn't re-evaluate when component state changes.
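The capture problem can be shown without React at all. In this minimal plain-TypeScript sketch (the `createTransport` helper is illustrative, not the SDK's internals), an options object is captured once at creation time, so later reassignments of the local variable never reach it:

```typescript
// Illustrative stand-in for a transport that captures its body once.
function createTransport(body: Record<string, unknown>) {
  // `body` is captured here, at creation time
  return {
    send: () => ({ ...body }), // every request reuses the captured snapshot
  };
}

let temperature = 0.7;
const transport = createTransport({ temperature });

temperature = 1.0; // a later "state update"
const sent = transport.send();
console.log(sent.temperature); // still 0.7 — the snapshot, not the current value
```

This is the same mechanism at work in `useChat`: the hook-level `body` object is a snapshot of the first render's values.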
Pass dynamic variables via the second argument of the sendMessage function instead of at the hook level. Request-level options are evaluated on each call and take precedence over hook-level options.
```tsx
export default function Chat() {
  const [temperature, setTemperature] = useState(0.7);
  const [userId, setUserId] = useState('user123');
  const [input, setInput] = useState('');

  const { messages, sendMessage } = useChat({
    // Static configuration only
    transport: new DefaultChatTransport({
      api: '/api/chat',
    }),
  });

  return (
    <div>
      <input
        type="range"
        value={temperature}
        onChange={e => setTemperature(parseFloat(e.target.value))}
      />

      <form
        onSubmit={event => {
          event.preventDefault();
          if (input.trim()) {
            // Pass dynamic values as request-level options
            sendMessage(
              { text: input },
              {
                body: {
                  temperature, // Current value at request time
                  userId, // Current value at request time
                },
              },
            );
            setInput('');
          }
        }}
      >
        <input value={input} onChange={e => setInput(e.target.value)} />
      </form>
    </div>
  );
}
```
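The precedence rule can be pictured as a shallow merge where request-level keys win. The following sketch is illustrative only (`mergeBodies` is a hypothetical helper, not the SDK's actual merge logic):

```typescript
// Hypothetical sketch of the precedence rule: request-level body keys
// override hook-level defaults at send time.
type RequestOptions = { body?: Record<string, unknown> };

function mergeBodies(hookLevel: RequestOptions, requestLevel: RequestOptions) {
  return { ...hookLevel.body, ...requestLevel.body };
}

const merged = mergeBodies(
  { body: { temperature: 0.7, userId: 'user123' } },
  { body: { temperature: 1.0 } },
);
console.log(merged); // { temperature: 1, userId: 'user123' }
```

Because the request-level options are built fresh inside the submit handler, they always read the current component state.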
Alternative: Dynamic Hook-Level Configuration
---------------------------------------------

If you need hook-level configuration that responds to changes, you can pass a function that returns the configuration values. For component state, however, you'll need `useRef` so the function can read the current value:
```tsx
export default function Chat() {
  const temperatureRef = useRef(0.7);

  const { messages, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
      body: () => ({
        temperature: temperatureRef.current, // Access via ref.current
        sessionId: getCurrentSessionId(), // Function calls work directly
      }),
    }),
  });

  // ...
}
```
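Why the function form stays fresh can be shown in plain TypeScript: a function-valued body is evaluated at request time, so it reads whatever the ref holds at that moment. (The `{ current }` object here stands in for React's `useRef` container.)

```typescript
// Stand-in for React's useRef: a mutable container read at call time.
const temperatureRef = { current: 0.7 };

const body = () => ({ temperature: temperatureRef.current });

temperatureRef.current = 1.0; // later update via ref mutation
console.log(body().temperature); // 1 — resolved at call time, not capture time
```

The function closes over the ref object (which never changes identity), not over the value inside it, so each call sees the latest value.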
Recommendation: Request-level configuration is simpler and more reliable for component state. Use it whenever you need to pass dynamic values that change during the component lifecycle.
On your server side, retrieve the custom fields by destructuring the request body:
```ts
// app/api/chat/route.ts
export async function POST(req: Request) {
  const { messages, temperature, userId } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages: convertToModelMessages(messages),
    temperature, // Use the dynamic temperature from the request
    // ... other configuration
  });

  return result.toUIMessageStreamResponse();
}
```
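Since these fields come from the client, it can be worth applying defaults and bounds before forwarding them to the model. This is a hypothetical helper (not part of the SDK); the `0–2` range matches OpenAI's documented temperature bounds:

```typescript
// Hypothetical validation helper: default and clamp client-supplied fields.
interface ChatBody {
  messages: unknown[];
  temperature?: number;
  userId?: string;
}

function parseChatBody(json: ChatBody) {
  const { messages, temperature = 0.7, userId = 'anonymous' } = json;
  return {
    messages,
    userId,
    temperature: Math.min(Math.max(temperature, 0), 2), // clamp to OpenAI's 0–2 range
  };
}

console.log(parseChatBody({ messages: [] }).temperature); // 0.7
console.log(parseChatBody({ messages: [], temperature: 5 }).temperature); // 2
```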
For more information, see the chatbot request configuration documentation.