Build advanced AI agents with TogetherAI. Connect 600+ integrations, automate workflows, and deploy with ease using Metorial.
Learn how to work with different AI models from various providers while using Metorial integrations.
Different AI models have different strengths. With Metorial, you can use any AI provider while keeping access to the same 600+ integrations.
The AI SDK makes it easy to switch between providers:
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';

// Use OpenAI
const gpt4Result = await generateText({
  model: openai('gpt-4'),
  messages,
  tools: session.tools
});

// Use Anthropic
const claudeResult = await generateText({
  model: anthropic('claude-3-opus-20240229'),
  messages,
  tools: session.tools
});

// Use Google
const geminiResult = await generateText({
  model: google('gemini-1.5-pro'),
  messages,
  tools: session.tools
});
Choose models dynamically based on task requirements:
function selectModel(taskType: string) {
  switch (taskType) {
    case 'complex-reasoning':
      return openai('gpt-4');
    case 'analysis':
      return anthropic('claude-3-opus-20240229');
    case 'quick-response':
      return openai('gpt-3.5-turbo');
    default:
      return openai('gpt-4');
  }
}
await metorial.withProviderSession(
  metorialAiSdk,
  { serverDeployments: [process.env.METORIAL_DEPLOYMENT_ID!] },
  async session => {
    const result = await generateText({
      model: selectModel('complex-reasoning'),
      messages,
      tools: session.tools
    });
    return result.text;
  }
);
Implement fallbacks for reliability:
async function generateWithFallback(
  messages: any[],
  tools: any[]
) {
  const models = [
    openai('gpt-4'),
    anthropic('claude-3-opus-20240229'),
    openai('gpt-3.5-turbo')
  ];

  for (const model of models) {
    try {
      const result = await generateText({
        model,
        messages,
        tools,
        maxToolRoundtrips: 5
      });
      return result;
    } catch (error) {
      // Log the model's identifier; the model object itself doesn't stringify usefully.
      console.error(`Model ${model.modelId} failed, trying next...`);
      continue;
    }
  }

  throw new Error('All models failed');
}
await metorial.withProviderSession(
  metorialAiSdk,
  { serverDeployments: [process.env.METORIAL_DEPLOYMENT_ID!] },
  async session => {
    return await generateWithFallback(messages, session.tools);
  }
);
Use cheaper models for simpler tasks:
async function generateCostEffective(
  complexity: 'simple' | 'medium' | 'complex',
  messages: any[],
  tools: any[]
) {
  const modelMap = {
    simple: openai('gpt-3.5-turbo'),
    medium: openai('gpt-4'),
    complex: openai('gpt-4-turbo')
  };

  return await generateText({
    model: modelMap[complexity],
    messages,
    tools
  });
}
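If complexity isn't known up front, a simple heuristic can route requests. This is a sketch under stated assumptions: the `classifyComplexity` name and the character-count thresholds are illustrative, not part of the Metorial SDK.

```typescript
// Sketch: pick a complexity tier from rough request characteristics.
// The thresholds are illustrative assumptions, not tuned values.
type Complexity = 'simple' | 'medium' | 'complex';

function classifyComplexity(messages: { content: string }[]): Complexity {
  const totalChars = messages.reduce((sum, m) => sum + m.content.length, 0);
  if (totalChars < 500) return 'simple';   // short prompts: cheapest tier
  if (totalChars < 4000) return 'medium';  // mid-size context
  return 'complex';                        // long, multi-turn context
}

// Example: a short question routes to the cheap tier.
console.log(classifyComplexity([{ content: 'What time is it in UTC?' }])); // → 'simple'
```

The returned tier can be passed straight into `generateCostEffective` above.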
Different models may need different configurations:
function getModelConfig(modelType: string) {
  // Typed as a string-keyed record so lookups with an arbitrary key compile under strict mode.
  const configs: Record<string, { maxToolRoundtrips: number; temperature: number }> = {
    'gpt-4': {
      maxToolRoundtrips: 5,
      temperature: 0.7
    },
    'claude-3-opus': {
      maxToolRoundtrips: 5,
      temperature: 0.8
    },
    'gpt-3.5-turbo': {
      maxToolRoundtrips: 3,
      temperature: 0.5
    }
  };

  return configs[modelType] ?? configs['gpt-4'];
}
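The returned config can then be spread into the call options. A minimal sketch, with `getModelConfig` reproduced so the snippet stands alone; unknown model names fall back to the `gpt-4` defaults:

```typescript
// Reproduced from above so this sketch is self-contained.
const configs: Record<string, { maxToolRoundtrips: number; temperature: number }> = {
  'gpt-4': { maxToolRoundtrips: 5, temperature: 0.7 },
  'claude-3-opus': { maxToolRoundtrips: 5, temperature: 0.8 },
  'gpt-3.5-turbo': { maxToolRoundtrips: 3, temperature: 0.5 }
};

function getModelConfig(modelType: string) {
  return configs[modelType] ?? configs['gpt-4'];
}

// Spread per-model settings into the generateText options.
// `messages` and `tools` stand in for the values from the surrounding examples.
const options = {
  messages: [] as any[],
  tools: {},
  ...getModelConfig('claude-3-opus')
};

console.log(options.temperature); // → 0.8
```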
Make sure to install the necessary AI SDK providers:
# For OpenAI
npm install @ai-sdk/openai
# For Anthropic
npm install @ai-sdk/anthropic
# For Google
npm install @ai-sdk/google
And configure their API keys:
# .env.local
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_GENERATIVE_AI_API_KEY=your-google-key
METORIAL_API_KEY=your-metorial-key
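A small startup check can fail fast when a key is missing. This is a sketch: the `missingKeys` helper is an assumption, not a Metorial API, and it only checks the variable names from the `.env.local` above.

```typescript
// Sketch: verify the required API keys are set before starting the app.
// The helper name and structure are illustrative, not part of any SDK.
const REQUIRED_KEYS = [
  'OPENAI_API_KEY',
  'ANTHROPIC_API_KEY',
  'GOOGLE_GENERATIVE_AI_API_KEY',
  'METORIAL_API_KEY'
];

function missingKeys(env: Record<string, string | undefined>): string[] {
  // Treat unset and empty-string values as missing.
  return REQUIRED_KEYS.filter(key => !env[key]);
}

// Example: warn at startup instead of failing later mid-request.
const missing = missingKeys(process.env);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(', ')}`);
}
```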
The beauty of using Metorial is that all your integrations work with any AI provider. Switch models freely without rebuilding your integration layer—Metorial handles the complexity so you can focus on choosing the best model for each task.
Build powerful AI applications with TogetherAI and Metorial's comprehensive integration platform. Connect TogetherAI's diverse collection of open-source language models to over 600 integrations through our MCP-powered, open-source SDKs. Metorial makes it effortless to give your TogetherAI-based agents access to calendars, databases, communication tools, project management platforms, and hundreds of other services in just a couple of lines of Python or TypeScript code. Whether you're leveraging Llama, Mistral, or other models available through TogetherAI's platform, Metorial eliminates integration complexity so you can focus on building intelligent features. Our developer-first approach handles authentication, API management, error handling, and rate limiting automatically—no more maintaining brittle integration code or debugging OAuth flows. With Metorial's open-core model, you get the transparency and flexibility of open source with the reliability and support you need for production applications. Stop wasting engineering cycles on integration plumbing and start shipping AI-powered features that differentiate your product and delight your users. Let Metorial handle the connections while you concentrate on creating breakthrough AI experiences.
Let's take your AI-powered applications to the next level, together.
Metorial provides developers with instant access to 600+ MCP servers for building AI agents that can interact with real-world tools and services. Built on MCP, Metorial simplifies agent tool integration by offering pre-configured connections to popular platforms like Google Drive, Slack, GitHub, Notion, and hundreds of other APIs. Our platform supports all major AI agent frameworks—including LangChain, AutoGen, CrewAI, and LangGraph—enabling developers to add tool calling capabilities to their agents in just a few lines of code. By eliminating the need for custom integration code, Metorial helps AI developers move from prototype to production faster while maintaining security and reliability. Whether you're building autonomous research agents, customer service bots, or workflow automation tools, Metorial's MCP server library provides the integrations you need to connect your agents to the real world.