Vercel AI SDK Pricing 2026
Complete pricing guide with plans, hidden costs, and cost analysis
Vercel AI SDK has a free plan. The only self-serve paid plan is Pro at $20 per month (the Vercel Pro plan); Enterprise pricing is custom.
Vercel AI SDK costs between free and $20 per month (the Vercel Pro plan) as of April 2026, with three plans available: a free Hobby tier, Pro at $20 per month, and Enterprise with pricing available on request. Your total depends on your chosen tier, contract length, and negotiated discounts.
Vercel AI SDK offers three pricing tiers: Hobby (Free), Pro, and Enterprise. A free plan is available, and the paid Pro plan costs $20/month. Pro is aimed at teams building AI applications on Vercel who want built-in LLM observability and caching.
Compared with other LLM API provider software, Vercel AI SDK is positioned at the budget-friendly end of the price range.
How much does Vercel AI SDK cost?
Vercel AI SDK Pricing Overview
Vercel AI SDK has three pricing plans, including a free tier. The Hobby plan is free and best for individual developers using Vercel AI SDK for personal projects. The Pro plan costs $20 per month (the Vercel Pro plan) and is best for teams building AI applications on Vercel who want built-in LLM observability and caching. The Enterprise plan requires contacting sales for a custom quote and is designed for enterprises with strict compliance requirements and high-volume AI workloads on Vercel.
There are at least 3 documented hidden costs beyond Vercel AI SDK's list price, including implementation, training, and add-on fees.
This pricing was last verified on April 15, 2026 from one independent source.
Vercel AI Gateway is a built-in LLM proxy within the Vercel platform, available as part of the Vercel AI SDK. It provides a unified endpoint for routing to OpenAI, Anthropic, Google, and other providers with automatic caching, observability dashboards, and rate limiting. The gateway itself adds no per-token markup — you pay the underlying provider's rates plus your Vercel plan. The Vercel AI SDK (open-source) is free; the gateway observability features are included in Vercel Pro ($20/mo) and Enterprise plans. This makes it most useful for teams already deployed on Vercel who want built-in LLM observability without a separate provider like LangSmith or Helicone.
All Vercel AI SDK Plans & Pricing
| Plan | Monthly | Annual | Best For |
|---|---|---|---|
| Hobby (Free) | Free | Free | Individual developers using Vercel AI SDK for personal projects |
| Pro | $20/month | $216/year | Teams building AI applications on Vercel who want built-in LLM observability and caching |
| Enterprise | Contact Sales | Contact Sales | Enterprises with strict compliance requirements and high-volume AI workloads on Vercel |
Hobby observability is limited to basic request logs. The Pro price is a $20/month platform fee; LLM usage is billed directly to your provider accounts.
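The Pro row above implies an annual-billing discount: $216/year works out cheaper than twelve months at $20. A quick arithmetic check (numbers taken from the table; verify against Vercel's current pricing page):

```typescript
// Pro plan prices from the table above (USD); illustrative, not authoritative.
const monthly = 20;   // billed monthly
const annual = 216;   // billed annually, per year

const effectiveMonthly = annual / 12;              // 18
const discount = 1 - effectiveMonthly / monthly;   // ~0.10, i.e. ~10% off

console.log(effectiveMonthly, (discount * 100).toFixed(0) + "%"); // 18 10%
```

So annual billing effectively prices Pro at $18/month, a roughly 10% discount over monthly billing.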
View all features by plan
Hobby (Free)
- Vercel AI SDK (open-source) fully available
- Basic AI Gateway routing
- Access to all providers via SDK
- Limited observability features
Pro
- Full AI Gateway observability dashboard
- Request caching (reduce duplicate LLM costs)
- Rate limiting per user/team
- Provider routing (OpenAI, Anthropic, Google, Cohere, Mistral)
- Usage analytics and cost tracking
- No per-token markup
- Team collaboration features
Enterprise
- Everything in Pro
- Custom SLAs
- Advanced security controls
- SSO and SAML
- Dedicated support
- Custom usage limits
- HIPAA compliance options
Usage-Based Rates
Per-unit pricing for Vercel AI SDK API usage.
Pro
| Model | Unit | Rate | Notes |
|---|---|---|---|
| OpenAI GPT-4o (via gateway) | 1M input tokens | $2.50 | OpenAI passthrough, no Vercel markup |
| OpenAI GPT-4o (via gateway) | 1M output tokens | $10.00 | OpenAI passthrough, no Vercel markup |
| OpenAI GPT-4o mini (via gateway) | 1M input tokens | $0.15 | OpenAI passthrough |
| OpenAI GPT-4o mini (via gateway) | 1M output tokens | $0.60 | OpenAI passthrough |
| Anthropic Claude 3.5 Sonnet (via gateway) | 1M input tokens | $3.00 | Anthropic passthrough |
| Anthropic Claude 3.5 Sonnet (via gateway) | 1M output tokens | $15.00 | Anthropic passthrough |
| Google Gemini 1.5 Flash (via gateway) | 1M input tokens | $0.075 | Google passthrough |
| Google Gemini 1.5 Flash (via gateway) | 1M output tokens | $0.30 | Google passthrough |
| Mistral Large (via gateway) | 1M input tokens | $2.00 | Mistral passthrough |
| Mistral Large (via gateway) | 1M output tokens | $6.00 | Mistral passthrough |
- Vercel AI Gateway adds NO per-token markup — you pay provider rates directly
- Gateway features (caching, observability) are included in the $20/mo Pro plan
- LLM API keys must be configured in your Vercel project environment variables
- Verify current gateway features at vercel.com/docs/ai/ai-gateway
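Because the gateway passes provider rates through without markup, estimating monthly spend is plain arithmetic: tokens divided by one million, times the per-million rate, plus the flat Pro fee. A minimal sketch with rates hard-coded from the table above (illustrative only; provider prices change frequently):

```typescript
// Passthrough rates from the table above, USD per 1M tokens.
// Treat these as illustrative snapshots, not current provider pricing.
const RATES: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 2.5, output: 10.0 },
  "gpt-4o-mini": { input: 0.15, output: 0.6 },
  "claude-3.5-sonnet": { input: 3.0, output: 15.0 },
  "gemini-1.5-flash": { input: 0.075, output: 0.3 },
};

const PRO_PLATFORM_FEE = 20; // flat Vercel Pro fee, USD/month

// Total monthly cost: per-model token spend plus the platform fee.
function estimateMonthlyCost(
  usage: Record<string, { inputTokens: number; outputTokens: number }>
): number {
  let total = PRO_PLATFORM_FEE;
  for (const [model, u] of Object.entries(usage)) {
    const rate = RATES[model];
    if (!rate) throw new Error(`unknown model: ${model}`);
    total += (u.inputTokens / 1e6) * rate.input;
    total += (u.outputTokens / 1e6) * rate.output;
  }
  return total;
}

// Example: 10M input / 2M output tokens on GPT-4o mini in a month.
const cost = estimateMonthlyCost({
  "gpt-4o-mini": { inputTokens: 10_000_000, outputTokens: 2_000_000 },
});
console.log(cost.toFixed(2)); // 20 + 10*0.15 + 2*0.60 = 22.70
```

At low volume the $20 platform fee dominates; at high volume it becomes a rounding error next to token spend.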
Compare Vercel AI SDK vs Alternatives
Before committing to Vercel AI SDK, compare pricing with these five alternatives in the same category.
How Vercel AI SDK Pricing Compares
| Software | Starting Price | Top Price |
|---|---|---|
| Vercel AI SDK | Free | $20/month (Vercel Pro plan) |
| Amazon Bedrock | $0.07 / 1M tokens | $75 / 1M tokens |
| Anyscale | $0.15 / 1M tokens | $5 / 1M tokens |
| Baidu ERNIE API | $0.10 / 1M tokens | $10 / 1M tokens |
| Cerebras Inference API | $0.10 / 1M tokens | $6 / 1M tokens |
| Claude API | $0.03 / 1M tokens | $75 / 1M tokens |
Vercel AI SDK Pricing FAQ
01 How much does Vercel AI Gateway cost?
The Vercel AI Gateway is included in the Vercel Pro plan ($20/month). The gateway itself adds no per-token markup — you pay your underlying LLM provider (OpenAI, Anthropic, Google) at their standard rates. On the free Hobby plan, basic routing is available but advanced observability features require Pro.
02 What is Vercel AI Gateway?
Vercel AI Gateway is a built-in LLM proxy in the Vercel platform. It routes requests to multiple AI providers (OpenAI, Anthropic, Google, Mistral, Cohere) through a unified endpoint, with caching to reduce duplicate costs, usage analytics, and rate limiting — all integrated with your Vercel deployment.
03 Does Vercel AI Gateway add a markup on LLM costs?
No. Vercel AI Gateway passes through provider pricing at cost — there is no per-token markup. You pay the underlying model provider's rates. The value is the observability, caching, and routing features bundled into your Vercel plan.
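A rough way to judge whether the $20/month Pro fee pays for itself is to compare it against what gateway caching saves on duplicate requests. The sketch below treats the cache hit rate as a hypothetical input you would measure in your own traffic, not a published Vercel figure:

```typescript
// Cached hits cost ~$0 in provider tokens, so monthly savings are
// roughly: monthly LLM spend x cache hit rate. Both inputs are
// assumptions you should measure for your own workload.
function cachingBreakEven(monthlyLlmSpend: number, cacheHitRate: number) {
  const proFee = 20; // Vercel Pro platform fee, USD/month
  const savings = monthlyLlmSpend * cacheHitRate;
  return {
    savings,
    netBenefit: savings - proFee,
    paysForItself: savings >= proFee,
  };
}

// Example: $500/mo in LLM spend with 10% of requests served from cache.
const r = cachingBreakEven(500, 0.1);
console.log(r.savings, r.paysForItself); // 50 true
```

Under these assumptions, a team spending $500/month with even a 10% hit rate recoups the Pro fee from caching alone; below roughly $200/month of spend at that hit rate, the fee is buying observability rather than savings.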
04 What providers does Vercel AI Gateway support?
Vercel AI Gateway supports OpenAI, Anthropic, Google AI (Gemini), Mistral, Cohere, and others supported by the Vercel AI SDK. The full provider list is at sdk.vercel.ai/providers.
05 Vercel AI Gateway vs OpenRouter: what's the difference?
OpenRouter is a standalone LLM aggregator with 300+ models and its own billing. Vercel AI Gateway is integrated into Vercel and requires no separate account or billing — but it's only useful if you're already hosting on Vercel. For non-Vercel deployments, OpenRouter is the better standalone choice.
Is this pricing incorrect? Let us know and we'll verify and update it.