📄 ai-sdk/docs/foundations/providers-and-models

File: providers-and-models.md | Updated: 11/15/2025

Source: https://ai-sdk.dev/docs/foundations/providers-and-models


Providers and Models

======================================================================================================

Companies such as OpenAI and Anthropic (providers) offer access to a range of large language models (LLMs) with differing strengths and capabilities through their own APIs.

Each provider typically has its own unique method for interfacing with their models, complicating the process of switching providers and increasing the risk of vendor lock-in.

To solve these challenges, AI SDK Core offers a standardized approach to interacting with LLMs through a language model specification that abstracts differences between providers. This unified interface allows you to switch between providers with ease while using the same API for all providers.

In the AI SDK provider architecture, a shared language model specification sits between your application code and each provider's API.

AI SDK Providers
----------------


The AI SDK comes with a wide range of providers that you can use to interact with different language models.

You can also use the OpenAI Compatible provider with any OpenAI-compatible API.

Our language model specification is published as an open-source package, which you can use to create custom providers.

The open-source community has also created a number of additional providers.

Self-Hosted Models
------------------


You can access self-hosted models through several of these providers.

Additionally, any self-hosted provider that supports the OpenAI specification can be used with the OpenAI Compatible Provider.

Model Capabilities
------------------


The AI providers support different language models with various capabilities. Here are the capabilities of popular models:

| Provider | Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- | --- |
| xAI Grok | grok-4 | | | | |
| xAI Grok | grok-3 | | | | |
| xAI Grok | grok-3-fast | | | | |
| xAI Grok | grok-3-mini | | | | |
| xAI Grok | grok-3-mini-fast | | | | |
| xAI Grok | grok-2-1212 | | | | |
| xAI Grok | grok-2-vision-1212 | | | | |
| xAI Grok | grok-beta | | | | |
| xAI Grok | grok-vision-beta | | | | |
| Vercel | v0-1.0-md | | | | |
| OpenAI | gpt-5 | | | | |
| OpenAI | gpt-5-mini | | | | |
| OpenAI | gpt-5-nano | | | | |
| OpenAI | gpt-5.1-chat-latest | | | | |
| OpenAI | gpt-5.1-codex-mini | | | | |
| OpenAI | gpt-5.1-codex | | | | |
| OpenAI | gpt-5.1 | | | | |
| OpenAI | gpt-5-codex | | | | |
| OpenAI | gpt-5-chat-latest | | | | |
| Anthropic | claude-opus-4-1 | | | | |
| Anthropic | claude-opus-4-0 | | | | |
| Anthropic | claude-sonnet-4-0 | | | | |
| Anthropic | claude-3-7-sonnet-latest | | | | |
| Anthropic | claude-3-5-haiku-latest | | | | |
| Mistral | pixtral-large-latest | | | | |
| Mistral | mistral-large-latest | | | | |
| Mistral | mistral-medium-latest | | | | |
| Mistral | mistral-medium-2505 | | | | |
| Mistral | mistral-small-latest | | | | |
| Mistral | pixtral-12b-2409 | | | | |
| Google Generative AI | gemini-2.0-flash-exp | | | | |
| Google Generative AI | gemini-1.5-flash | | | | |
| Google Generative AI | gemini-1.5-pro | | | | |
| Google Vertex | gemini-2.0-flash-exp | | | | |
| Google Vertex | gemini-1.5-flash | | | | |
| Google Vertex | gemini-1.5-pro | | | | |
| DeepSeek | deepseek-chat | | | | |
| DeepSeek | deepseek-reasoner | | | | |
| Cerebras | llama3.1-8b | | | | |
| Cerebras | llama3.1-70b | | | | |
| Cerebras | llama3.3-70b | | | | |
| Groq | meta-llama/llama-4-scout-17b-16e-instruct | | | | |
| Groq | llama-3.3-70b-versatile | | | | |
| Groq | llama-3.1-8b-instant | | | | |
| Groq | mixtral-8x7b-32768 | | | | |
| Groq | gemma2-9b-it | | | | |

This table is not exhaustive. Additional models can be found in the provider documentation pages and on the provider websites.

