File: call-tools.md | Updated: 11/15/2025
Some models allow developers to provide a list of tools that the model can call at any time during a generation. This is useful for extending the capabilities of a language model, letting it use logic or data from systems external to the model.
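Before wiring this up with the AI SDK, the core idea can be sketched in plain TypeScript. This is a minimal, hypothetical sketch (the shapes `ToolCall` and `handleModelOutput` are illustrative, not the SDK's types): the model returns either text or a tool call, and the application runs the matching function and uses its result.

```typescript
// Hypothetical shapes for illustration only — not the AI SDK's API.
type ToolCall = { toolName: string; args: Record<string, string> };

// A registry of tools the application exposes to the model.
const tools: Record<string, (args: Record<string, string>) => string> = {
  currentTime: () => new Date().toISOString(),
};

// Dispatch: plain text passes through; a tool call runs the matching tool.
function handleModelOutput(output: { text?: string; toolCall?: ToolCall }): string {
  if (output.toolCall) {
    const tool = tools[output.toolCall.toolName];
    return tool ? tool(output.toolCall.args) : 'Unknown tool';
  }
  return output.text ?? '';
}
```

The AI SDK handles this dispatch for you: you declare the tools, and `generateText` invokes them when the model requests a call.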
The finished example renders a simple chat at http://localhost:3000:

> User: How is it going?
> Assistant: All good, how may I help you?

with a text input (e.g. "What is 24 celsius in fahrenheit?") and a "Send Message" button.
Let's create a simple conversation between a user and the model, and place a button that will call `continueConversation`.
app/page.tsx
```tsx
'use client';

import { useState } from 'react';
import { Message, continueConversation } from './actions';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export default function Home() {
  const [conversation, setConversation] = useState<Message[]>([]);
  const [input, setInput] = useState<string>('');

  return (
    <div>
      <div>
        {conversation.map((message, index) => (
          <div key={index}>
            {message.role}: {message.content}
          </div>
        ))}
      </div>

      <div>
        <input
          type="text"
          value={input}
          onChange={event => {
            setInput(event.target.value);
          }}
        />
        <button
          onClick={async () => {
            const { messages } = await continueConversation([
              ...conversation,
              { role: 'user', content: input },
            ]);

            setConversation(messages);
          }}
        >
          Send Message
        </button>
      </div>
    </div>
  );
}
```
Now, let's implement the `continueConversation` action that uses `generateText` to generate a response to the user's question. We will use the `tools` parameter to specify our own function, `celsiusToFahrenheit`, which converts a user-provided Celsius value to Fahrenheit.

We will use `zod` to specify the schema for the `celsiusToFahrenheit` tool's input.
app/actions.ts
```ts
'use server';

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

export interface Message {
  role: 'user' | 'assistant';
  content: string;
}

export async function continueConversation(history: Message[]) {
  'use server';

  const { text, toolResults } = await generateText({
    model: openai('gpt-3.5-turbo'),
    system: 'You are a friendly assistant!',
    messages: history,
    tools: {
      celsiusToFahrenheit: {
        description: 'Converts celsius to fahrenheit',
        inputSchema: z.object({
          value: z.string().describe('The value in celsius'),
        }),
        execute: async ({ value }) => {
          const celsius = parseFloat(value);
          const fahrenheit = celsius * (9 / 5) + 32;
          return `${celsius}°C is ${fahrenheit.toFixed(2)}°F`;
        },
      },
    },
  });

  return {
    messages: [
      ...history,
      {
        role: 'assistant' as const,
        content:
          text || toolResults.map(toolResult => toolResult.result).join('\n'),
      },
    ],
  };
}
```
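The conversion inside the `execute` handler is plain arithmetic (F = C × 9/5 + 32), so it can be extracted and unit-tested in isolation. This standalone sketch mirrors the handler's logic; the free-standing `celsiusToFahrenheit` function here is just for illustration, not part of the SDK:

```typescript
// Same logic as the tool's execute handler above, as a pure function.
function celsiusToFahrenheit(value: string): string {
  const celsius = parseFloat(value);
  const fahrenheit = celsius * (9 / 5) + 32;
  return `${celsius}°C is ${fahrenheit.toFixed(2)}°F`;
}

console.log(celsiusToFahrenheit('24')); // → 24°C is 75.20°F
```

Keeping tool logic in small pure functions like this makes it easy to test without calling the model at all.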