📄 ai-sdk/docs/ai-sdk-rsc/loading-state

File: loading-state.md | Updated: 11/15/2025

Source: https://ai-sdk.dev/docs/ai-sdk-rsc/loading-state


Handling Loading State
==================================================================================================

AI SDK RSC is currently experimental. We recommend using AI SDK UI for production. For guidance on migrating from RSC to UI, see our migration guide.

Given that responses from language models can often take a while to complete, it's crucial to be able to show loading state to users. This provides visual feedback that the system is working on their request and helps maintain a positive user experience.

There are three approaches you can take to handle loading state with the AI SDK RSC:

  • Managing loading state similar to how you would in a traditional Next.js application. This involves setting a loading state variable in the client and updating it when the response is received.
  • Streaming loading state from the server to the client. This approach allows you to track loading state on a more granular level and provide more detailed feedback to the user.
  • Streaming a loading component from the server to the client. This approach allows you to stream a React Server Component to the client while awaiting the model's response.

Handling Loading State on the Client


Client

Let's create a simple Next.js page that calls the `generateResponse` function when the form is submitted. The function takes the user's prompt (`input`) and generates a response. To handle the loading state, introduce a `loading` state variable: when the form is submitted, set `loading` to `true`, and when the response has finished streaming, set it back to `false`. While the response is being streamed, the input field is disabled.

app/page.tsx

```tsx
'use client';

import { useState } from 'react';
import { generateResponse } from './actions';
import { readStreamableValue } from '@ai-sdk/rsc';

// Force the page to be dynamic and allow streaming responses up to 30 seconds
export const maxDuration = 30;

export default function Home() {
  const [input, setInput] = useState<string>('');
  const [generation, setGeneration] = useState<string>('');
  const [loading, setLoading] = useState<boolean>(false);

  return (
    <div>
      <div>{generation}</div>
      <form
        onSubmit={async e => {
          e.preventDefault();
          setLoading(true);
          const response = await generateResponse(input);

          let textContent = '';

          for await (const delta of readStreamableValue(response)) {
            textContent = `${textContent}${delta}`;
            setGeneration(textContent);
          }
          setInput('');
          setLoading(false);
        }}
      >
        <input
          type="text"
          value={input}
          disabled={loading}
          className="disabled:opacity-50"
          onChange={event => {
            setInput(event.target.value);
          }}
        />
        <button>Send Message</button>
      </form>
    </div>
  );
}
```

Server

Now let's implement the `generateResponse` function. Use the `streamText` function to generate a response to the input.

app/actions.ts

```ts
'use server';

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { createStreamableValue } from '@ai-sdk/rsc';

export async function generateResponse(prompt: string) {
  const stream = createStreamableValue();

  (async () => {
    const { textStream } = streamText({
      model: openai('gpt-4o'),
      prompt,
    });

    for await (const text of textStream) {
      stream.update(text);
    }

    stream.done();
  })();

  return stream.value;
}
```

Streaming Loading State from the Server


If you want to track loading state on a more granular level, you can create a second streamable value to hold a custom loading variable and read it on the frontend. Let's update the example to stream this loading state alongside the response:

Server

app/actions.ts

```ts
'use server';

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { createStreamableValue } from '@ai-sdk/rsc';

export async function generateResponse(prompt: string) {
  const stream = createStreamableValue();
  const loadingState = createStreamableValue({ loading: true });

  (async () => {
    const { textStream } = streamText({
      model: openai('gpt-4o'),
      prompt,
    });

    for await (const text of textStream) {
      stream.update(text);
    }

    stream.done();
    loadingState.done({ loading: false });
  })();

  return { response: stream.value, loadingState: loadingState.value };
}
```

Client

app/page.tsx

```tsx
'use client';

import { useState } from 'react';
import { generateResponse } from './actions';
import { readStreamableValue } from '@ai-sdk/rsc';

// Force the page to be dynamic and allow streaming responses up to 30 seconds
export const maxDuration = 30;

export default function Home() {
  const [input, setInput] = useState<string>('');
  const [generation, setGeneration] = useState<string>('');
  const [loading, setLoading] = useState<boolean>(false);

  return (
    <div>
      <div>{generation}</div>
      <form
        onSubmit={async e => {
          e.preventDefault();
          setLoading(true);
          const { response, loadingState } = await generateResponse(input);

          let textContent = '';

          for await (const responseDelta of readStreamableValue(response)) {
            textContent = `${textContent}${responseDelta}`;
            setGeneration(textContent);
          }
          for await (const loadingDelta of readStreamableValue(loadingState)) {
            if (loadingDelta) {
              setLoading(loadingDelta.loading);
            }
          }
          setInput('');
          setLoading(false);
        }}
      >
        <input
          type="text"
          value={input}
          disabled={loading}
          className="disabled:opacity-50"
          onChange={event => {
            setInput(event.target.value);
          }}
        />
        <button>Send Message</button>
      </form>
    </div>
  );
}
```

This allows you to provide more detailed feedback about the generation process to your users.
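Conceptually, a streamable value is an async sequence of snapshots: the server pushes updates and the client iterates them as they arrive. The following is a minimal standalone sketch of that pattern using a plain async generator (a hypothetical `loadingStates` helper, not the real `createStreamableValue` implementation):

```typescript
// Hypothetical stand-in for the streamable loading-state pattern:
// emit { loading: true } immediately, then { loading: false } once
// the blocking work (e.g. the model's text stream) has completed.
type LoadingState = { loading: boolean };

async function* loadingStates(
  work: Promise<unknown>,
): AsyncGenerator<LoadingState> {
  yield { loading: true }; // snapshot sent as soon as the request starts
  await work; // wait for the server-side work to finish
  yield { loading: false }; // mirrors loadingState.done({ loading: false })
}
```

The client-side `for await` loop over `readStreamableValue(loadingState)` consumes snapshots in exactly this fashion.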

Streaming Loading Components with streamUI


If you are using the `streamUI` function, you can stream the loading state to the client in the form of a React component. `streamUI` supports JavaScript generator functions, which allow you to yield a value (in this case a React component) while other blocking work completes.
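Stripped of React, the yield-then-return shape that `streamUI`'s `text` generator relies on looks like this (a minimal sketch with a hypothetical `withLoading` helper; the yielded string stands in for the loading component):

```typescript
// Minimal sketch of the generator pattern: yield a placeholder value
// immediately, then return the final value once the blocking work
// (here, an arbitrary promise) has resolved.
async function* withLoading(work: Promise<string>) {
  yield 'loading...'; // surfaced right away, like the loading component
  return await work; // replaces the placeholder when the work finishes
}
```

In `streamUI`, the consumer of this generator renders each yielded component until the generator returns, at which point the returned component replaces it.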

Server


app/actions.tsx

```tsx
'use server';

import { openai } from '@ai-sdk/openai';
import { streamUI } from '@ai-sdk/rsc';

export async function generateResponse(prompt: string) {
  const result = await streamUI({
    model: openai('gpt-4o'),
    prompt,
    text: async function* ({ content }) {
      yield <div>loading...</div>;
      return <div>{content}</div>;
    },
  });

  return result.value;
}
```

Remember to update the file from .ts to .tsx because you are defining a React component in the streamUI function.

Client


app/page.tsx

```tsx
'use client';

import { useState } from 'react';
import { generateResponse } from './actions';

// Force the page to be dynamic and allow streaming responses up to 30 seconds
export const maxDuration = 30;

export default function Home() {
  const [input, setInput] = useState<string>('');
  const [generation, setGeneration] = useState<React.ReactNode>();

  return (
    <div>
      <div>{generation}</div>
      <form
        onSubmit={async e => {
          e.preventDefault();
          const result = await generateResponse(input);
          setGeneration(result);
          setInput('');
        }}
      >
        <input
          type="text"
          value={input}
          onChange={event => {
            setInput(event.target.value);
          }}
        />
        <button>Send Message</button>
      </form>
    </div>
  );
}
```
