Learn how to properly handle tool calls from your AI model and process the responses from Metorial integrations.
When using Metorial with the AI SDK, the typical flow is: the model requests tool calls, Metorial executes them against your integrations, and the results are fed back into the conversation until the model produces a final answer.

generateText with Tools

The simplest approach uses the AI SDK's generateText function:
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// `metorial` and `metorialAiSdk` are assumed to be initialized beforehand
// (see the Metorial SDK setup instructions).
await metorial.withProviderSession(
  metorialAiSdk,
  { serverDeployments: ['your-deployment-id'] },
  async session => {
    const result = await generateText({
      model: openai('gpt-4'),
      messages: [
        { role: 'user', content: 'Schedule a meeting for tomorrow at 2pm' }
      ],
      tools: session.tools,
      maxToolRoundtrips: 5 // Allow up to 5 rounds of tool calls
    });

    console.log(result.text);
  }
);
The AI SDK automatically handles tool calls and responses when using generateText
with tools.
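If you want to see what the model actually did during those automatic roundtrips, the generateText result also exposes the tool activity. A minimal sketch; note that which properties are populated (for example toolCalls, toolResults, or a per-step breakdown) varies with your AI SDK version:

const result = await generateText({
  model: openai('gpt-4'),
  messages: [
    { role: 'user', content: 'Schedule a meeting for tomorrow at 2pm' }
  ],
  tools: session.tools,
  maxToolRoundtrips: 5
});

// Tool calls and results from the generation (exact shape depends on AI SDK version)
console.log(result.toolCalls);
console.log(result.toolResults);
console.log(result.text);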
For more control, you can manually handle the tool call loop:
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const messages = [
  { role: 'user', content: 'What tasks are due today?' }
];

await metorial.withProviderSession(
  metorialAiSdk,
  { serverDeployments: ['your-deployment-id'] },
  async session => {
    // Cap the loop so a misbehaving model can't run forever
    for (let i = 0; i < 10; i++) {
      const result = await generateText({
        model: openai('gpt-4'),
        messages,
        tools: session.tools,
        maxToolRoundtrips: 0 // Disable automatic tool calling
      });

      // Check if the model wants to call tools
      if (result.toolCalls && result.toolCalls.length > 0) {
        // Call tools through Metorial
        const toolResponses = await session.callTools(result.toolCalls);

        // Add the tool calls and their results to the message history
        messages.push(
          { role: 'assistant', toolCalls: result.toolCalls },
          ...toolResponses
        );
        continue;
      }

      // No more tool calls - we have the final answer
      console.log(result.text);
      break;
    }
  }
);
Tool responses from Metorial contain the results of integration calls. The session.callTools
method automatically formats these responses for the AI SDK.
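If you are unsure what these formatted responses look like for your deployment, a quick way to find out is to log the return value of session.callTools before pushing it onto the message history:

const toolResponses = await session.callTools(result.toolCalls);

// Inspect the exact message shape Metorial produces for the AI SDK
console.log(JSON.stringify(toolResponses, null, 2));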
Always wrap your tool calls in try-catch blocks:
try {
  const result = await generateText({
    model: openai('gpt-4'),
    messages,
    tools: session.tools
  });

  console.log(result.text);
} catch (error) {
  console.error('Error during generation:', error.message);
  // Handle the error appropriately
}
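For transient failures such as rate limits or network errors, you may also want to retry the generation. Below is a minimal sketch of a hypothetical withRetries helper; the attempt count and backoff values are illustrative and not part of the Metorial or AI SDK APIs:

// Hypothetical helper: retry an async operation with exponential backoff
async function withRetries<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      // Illustrative backoff: 500ms, 1s, 2s, ...
      await new Promise(resolve => setTimeout(resolve, 500 * 2 ** attempt));
    }
  }
  throw lastError;
}

const result = await withRetries(() =>
  generateText({
    model: openai('gpt-4'),
    messages,
    tools: session.tools
  })
);
console.log(result.text);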
Also, set a sensible maxToolRoundtrips limit to prevent infinite loops when letting the AI SDK handle tool calls automatically.