Connect Integrations to AI SDK

Build advanced AI agents with AI SDK. Connect 600+ integrations, automate workflows, and deploy with ease using Metorial.


Switching Between AI Providers

Metorial works with all major AI providers through the Vercel AI SDK. Learn how to switch between different models.

Supported Providers

The AI SDK supports multiple providers out of the box:

  • OpenAI (GPT-4, GPT-4o, GPT-3.5)
  • Anthropic (Claude)
  • Google (Gemini)
  • Mistral
  • And many more

Using Different Providers

OpenAI

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

let result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Your prompt here',
  // Metorial tools from your provider session (see the full example below)
  tools: session.tools
});

Anthropic Claude

import { anthropic } from '@ai-sdk/anthropic';
import { generateText } from 'ai';

let result = await generateText({
  model: anthropic('claude-3-5-sonnet-20241022'),
  prompt: 'Your prompt here',
  tools: session.tools
});

Google Gemini

import { google } from '@ai-sdk/google';
import { generateText } from 'ai';

let result = await generateText({
  model: google('gemini-1.5-pro'),
  prompt: 'Your prompt here',
  tools: session.tools
});
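
Mistral

Mistral follows the same pattern. A minimal sketch, assuming the @ai-sdk/mistral provider package and the mistral-large-latest model id:

import { mistral } from '@ai-sdk/mistral';
import { generateText } from 'ai';

let result = await generateText({
  model: mistral('mistral-large-latest'),
  prompt: 'Your prompt here',
  tools: session.tools
});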

The Same Tools, Different Models

The beauty of Metorial is that your integrations work with any AI provider:

// `metorial` is your Metorial client and `metorialAiSdk` the AI SDK adapter,
// set up as described in the AI SDK overview.
metorial.withProviderSession(
  metorialAiSdk,
  { serverDeployments: ['your-deployment'] },
  async session => {
    // Try with GPT-4o
    let gptResult = await generateText({
      model: openai('gpt-4o'),
      prompt: 'Summarize my GitHub issues',
      tools: session.tools
    });

    // Try with Claude
    let claudeResult = await generateText({
      model: anthropic('claude-3-5-sonnet-20241022'),
      prompt: 'Summarize my GitHub issues',
      tools: session.tools
    });

    // Same tools, different models!
  }
);

Choosing the Right Model

Different models have different strengths (a small helper for picking one per task is sketched after this list):

  • GPT-4o: Fast, cost-effective, great for most use cases
  • Claude: Excellent for complex reasoning and long documents
  • Gemini Pro: Strong multimodal capabilities
  • GPT-3.5: Budget-friendly for simple tasks
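
Because session.tools is identical for every provider, switching models can be reduced to a single helper function. A minimal sketch; the task-to-model mapping below is an illustrative assumption, not a recommendation:

import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';
import { generateText } from 'ai';

// Illustrative mapping from task type to model; adjust to your own workloads.
function modelFor(task: 'simple' | 'reasoning' | 'multimodal') {
  switch (task) {
    case 'reasoning':
      return anthropic('claude-3-5-sonnet-20241022');
    case 'multimodal':
      return google('gemini-1.5-pro');
    default:
      return openai('gpt-4o');
  }
}

let result = await generateText({
  model: modelFor('reasoning'),
  prompt: 'Your prompt here',
  tools: session.tools
});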

Setting Up Provider API Keys

Each provider requires its own API key. Set them as environment variables:

OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_anthropic_key
GOOGLE_GENERATIVE_AI_API_KEY=your_google_key

The AI SDK automatically picks up these environment variables.
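
If a key is stored under a different variable name, you can pass it explicitly using the provider's factory function. A minimal sketch, assuming a custom OPENAI_KEY variable:

import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Assumption: the key lives in OPENAI_KEY instead of the default
// OPENAI_API_KEY that the AI SDK reads automatically.
const openai = createOpenAI({ apiKey: process.env.OPENAI_KEY });

let result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Your prompt here'
});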

Provider-Specific Configuration

Some providers support additional options:

let result = await generateText({
  model: openai('gpt-4o', {
    // Provider-specific model settings go here
  }),
  prompt: 'Your prompt here',
  // Standard settings like temperature and maxTokens work across providers
  temperature: 0.7,
  maxTokens: 1000,
  tools: session.tools
});

Best Practices

  • Test Multiple Models: Different models may perform better for different tasks
  • Consider Costs: Compare pricing across providers for your use case
  • Monitor Performance: Track response times and quality
  • Have Fallbacks: Consider implementing fallback logic if one provider is unavailable (see the sketch below)
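
A minimal fallback sketch, assuming any error from the primary call (rate limit, outage) should trigger a retry against a second provider:

import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { generateText } from 'ai';

// Try GPT-4o first; if the call throws, retry the same prompt and tools with Claude.
async function generateWithFallback(prompt: string, tools: any) {
  try {
    return await generateText({ model: openai('gpt-4o'), prompt, tools });
  } catch {
    return await generateText({
      model: anthropic('claude-3-5-sonnet-20241022'),
      prompt,
      tools
    });
  }
}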

For more details, see the Vercel AI SDK documentation.

AI SDK on Metorial

Connect Vercel AI SDK to Metorial and unlock instant access to over 600 integrations for your AI-powered applications. Our open-source, MCP-powered platform makes it effortless to add tools, APIs, and services to your AI SDK projects without writing complex integration code. With Metorial's TypeScript SDK, you can integrate calendars, databases, communication tools, and hundreds of other services in just a couple of lines of code. Whether you're building chatbots, AI assistants, or intelligent workflows with Vercel's AI SDK, Metorial eliminates integration headaches so you can focus on creating exceptional user experiences. Our developer-friendly approach means less time wrestling with authentication, API documentation, and maintenance—and more time innovating. Join developers who are shipping AI applications faster by letting Metorial handle the integration layer while you concentrate on what makes your app unique.

About Metorial

Metorial provides developers with instant access to 600+ MCP servers for building AI agents that can interact with real-world tools and services. Built on MCP, Metorial simplifies agent tool integration by offering pre-configured connections to popular platforms like Google Drive, Slack, GitHub, Notion, and hundreds of other APIs. Our platform supports all major AI agent frameworks—including LangChain, AutoGen, CrewAI, and LangGraph—enabling developers to add tool calling capabilities to their agents in just a few lines of code. By eliminating the need for custom integration code, Metorial helps AI developers move from prototype to production faster while maintaining security and reliability. Whether you're building autonomous research agents, customer service bots, or workflow automation tools, Metorial's MCP server library provides the integrations you need to connect your agents to the real world.
