📄 ai-sdk/docs/troubleshooting/use-chat-stale-body-data

File: use-chat-stale-body-data.md | Updated: 11/15/2025

Source: https://ai-sdk.dev/docs/troubleshooting/use-chat-stale-body-data


Stale body values with useChat

==================================================================================================================================

Issue


When using useChat and passing dynamic information via the body parameter at the hook level, the data remains stale and only reflects the value from the initial component render. This occurs because the body configuration is captured once when the hook is initialized and doesn't update with subsequent component re-renders.

```tsx
import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

// Problematic code - body data will be stale
export default function Chat() {
  const [temperature, setTemperature] = useState(0.7);
  const [userId, setUserId] = useState('user123');

  // This body configuration is captured once and won't update
  const { messages, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
      body: {
        temperature, // Always the initial value (0.7)
        userId, // Always the initial value ('user123')
      },
    }),
  });

  // Even if temperature or userId change, the body in requests will still use the initial values
  return (
    <div>
      <input
        type="range"
        value={temperature}
        onChange={e => setTemperature(parseFloat(e.target.value))}
      />
      {/* Chat UI */}
    </div>
  );
}
```

Background


The hook-level body configuration is evaluated once during the initial render and doesn't re-evaluate when component state changes.
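This is ordinary JavaScript closure capture. A minimal sketch (not AI SDK code; `createTransport` here is a stand-in for the real transport) shows why the first render's values stick:

```typescript
// Sketch (not AI SDK code): an options object captured once, the way
// useChat's hook-level transport config is captured on the first render.
function createTransport(options: { body: Record<string, unknown> }) {
  // `options` is the object from the first call; the closure keeps
  // referencing it no matter what the caller's variables do later.
  return { send: () => JSON.stringify(options.body) };
}

let temperature = 0.7;
const transport = createTransport({ body: { temperature } });

temperature = 0.9; // the "state" changes...
console.log(transport.send()); // → {"temperature":0.7} — still the captured value
```

Each React re-render builds a fresh `body` object, but the hook keeps using the one it captured initially.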

Solution


Pass dynamic variables via the second argument of the sendMessage function instead of at the hook level. Request-level options are evaluated on each call and take precedence over hook-level options.

```tsx
import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

export default function Chat() {
  const [temperature, setTemperature] = useState(0.7);
  const [userId, setUserId] = useState('user123');
  const [input, setInput] = useState('');

  const { messages, sendMessage } = useChat({
    // Static configuration only
    transport: new DefaultChatTransport({
      api: '/api/chat',
    }),
  });

  return (
    <div>
      <input
        type="range"
        value={temperature}
        onChange={e => setTemperature(parseFloat(e.target.value))}
      />

      <form
        onSubmit={event => {
          event.preventDefault();
          if (input.trim()) {
            // Pass dynamic values as request-level options
            sendMessage(
              { text: input },
              {
                body: {
                  temperature, // Current value at request time
                  userId, // Current value at request time
                },
              },
            );
            setInput('');
          }
        }}
      >
        <input value={input} onChange={e => setInput(e.target.value)} />
      </form>
    </div>
  );
}
```

Alternative: Dynamic Hook-Level Configuration

If you need hook-level configuration that responds to changes, you can use functions that return configuration values. However, for component state, you'll need to use useRef to access current values:

```tsx
import { useRef } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

export default function Chat() {
  const temperatureRef = useRef(0.7);

  const { messages, sendMessage } = useChat({
    transport: new DefaultChatTransport({
      api: '/api/chat',
      body: () => ({
        temperature: temperatureRef.current, // Access via ref.current
        sessionId: getCurrentSessionId(), // Function calls work directly
      }),
    }),
  });

  // ...
}
```

Recommendation: Request-level configuration is simpler and more reliable for component state. Use it whenever you need to pass dynamic values that change during the component lifecycle.

Server-side handling

On the server, retrieve the custom fields by destructuring the request body:

```ts
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText, convertToModelMessages } from 'ai';

export async function POST(req: Request) {
  const { messages, temperature, userId } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages: convertToModelMessages(messages),
    temperature, // Use the dynamic temperature from the request
    // ... other configuration
  });

  return result.toUIMessageStreamResponse();
}
```
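Request-level `body` fields are merged into the top level of the JSON payload alongside `messages`, which is why the destructuring above works. A rough sketch of the resulting request body (exact message shape and extra fields vary by SDK version):

```json
{
  "messages": [
    { "role": "user", "parts": [{ "type": "text", "text": "Hello" }] }
  ],
  "temperature": 0.7,
  "userId": "user123"
}
```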

For more information, see the chatbot request configuration documentation.
