Handling Function Calls from Gemini

When Google Gemini decides to use one of Metorial's integrations, it returns function calls that you need to handle. This guide shows you how.

How Function Calling Works

  1. You send a prompt to Gemini with Metorial tools available
  2. If Gemini needs to use an integration, it returns function calls instead of text
  3. You execute those function calls through Metorial
  4. You can send the results back to Gemini for further processing

Basic Function Call Handling

Here's how to detect and handle function calls:

// Assumes `metorial`, `metorialGoogle`, and `genAI` were initialized earlier.
await metorial.withProviderSession(
  metorialGoogle,
  { serverDeployments: ['your-server-deployment-id'] },
  async session => {
    let response = await genAI.models.generateContent({
      model: 'gemini-1.5-pro-latest',
      contents: [{
        role: 'user',
        parts: [{ 
          text: 'Search for the metorial/websocket-explorer repository on GitHub' 
        }]
      }],
      config: {
        tools: session.tools
      }
    });

    // Check if Gemini wants to call functions
    let functionCalls = response.candidates?.[0]?.content?.parts
      ?.filter(part => part.functionCall)
      .map(part => part.functionCall!);

    if (functionCalls && functionCalls.length > 0) {
      // Execute the function calls through Metorial
      let toolResponses = await session.callTools(functionCalls);
      console.log('Tool responses:', toolResponses);
    } else {
      // No function calls, just regular text response
      let text = response.candidates?.[0]?.content?.parts?.[0]?.text;
      console.log(text);
    }
  }
);
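The optional-chaining pattern above is used in both examples in this guide, so it can be worth factoring into a small helper. The following is a sketch: the `FunctionCall` and `GeminiResponse` interfaces are simplified assumptions for illustration, not the real `@google/genai` types.

```typescript
// Simplified shapes for illustration; the actual SDK types are richer.
interface FunctionCall {
  name: string;
  args?: Record<string, unknown>;
}

interface GeminiResponse {
  candidates?: Array<{
    content?: {
      parts?: Array<{ text?: string; functionCall?: FunctionCall }>;
    };
  }>;
}

// Pull every functionCall part out of the first candidate; return [] if none.
function extractFunctionCalls(response: GeminiResponse): FunctionCall[] {
  return (
    response.candidates?.[0]?.content?.parts
      ?.filter(part => part.functionCall)
      .map(part => part.functionCall!) ?? []
  );
}
```

With a helper like this, the branching logic reduces to checking `extractFunctionCalls(response).length` before calling `session.callTools(...)`.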

Complete Example with Error Handling

await metorial.withProviderSession(
  metorialGoogle,
  { serverDeployments: ['your-server-deployment-id'] },
  async session => {
    try {
      let response = await genAI.models.generateContent({
        model: 'gemini-1.5-pro-latest',
        contents: [{
          role: 'user',
          parts: [{ text: 'Find my next meeting on the calendar' }]
        }],
        config: {
          tools: session.tools
        }
      });

      let functionCalls = response.candidates?.[0]?.content?.parts
        ?.filter(part => part.functionCall)
        .map(part => part.functionCall!);

      if (functionCalls && functionCalls.length > 0) {
        let toolResponses = await session.callTools(functionCalls);
        
        // Use the tool responses as needed
        toolResponses.forEach(toolResponse => {
          console.log('Tool result:', toolResponse);
        });
      }
    } catch (error) {
      console.error('Error:', error);
    }
  }
);

What Happens Behind the Scenes

When you call session.callTools(functionCalls), Metorial:

  1. Validates the function calls
  2. Executes them against the appropriate integrations
  3. Returns the results in a format Gemini can understand

Metorial handles all the complexity of authentication, API calls, and error handling automatically.
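Step 4 of the flow above, sending results back to Gemini for further processing, means wrapping each tool result in a `functionResponse` content part before the next `generateContent` call. The sketch below assumes a simplified shape for Metorial's tool responses (a name plus a result payload); check the SDK's actual return type before relying on it.

```typescript
// Assumed shape of one Metorial tool response; verify against the SDK.
interface ToolResponse {
  name: string;    // name of the function that was called
  result: unknown; // whatever the integration returned
}

// A Gemini `functionResponse` content part for the follow-up turn.
interface FunctionResponsePart {
  functionResponse: {
    name: string;
    response: Record<string, unknown>;
  };
}

// Convert tool results into parts that can be appended to `contents`.
function toFunctionResponseParts(responses: ToolResponse[]): FunctionResponsePart[] {
  return responses.map(r => ({
    functionResponse: {
      name: r.name,
      // Gemini expects an object here, so wrap non-object results.
      response:
        typeof r.result === 'object' && r.result !== null
          ? (r.result as Record<string, unknown>)
          : { output: r.result },
    },
  }));
}
```

The resulting parts are appended to the conversation history and sent in a second `generateContent` call, letting Gemini turn the raw tool output into a natural-language answer.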

Google Gemini on Metorial

Build exceptional AI applications with Google Gemini and Metorial's comprehensive integration platform. Connect Gemini's state-of-the-art multimodal AI models to over 600 integrations including Google Workspace, Slack, GitHub, Salesforce, and hundreds more through our MCP-powered SDKs. Metorial eliminates the complexity of building and maintaining integrations, allowing you to add powerful capabilities to your Gemini-based agents in just a couple of lines of Python or TypeScript code. Whether you're creating virtual assistants, data analysis tools, or intelligent workflow automation with Google's advanced AI, Metorial provides the integration infrastructure you need to ship faster. Our open-source platform handles authentication flows, API versioning, rate limiting, and error handling automatically, so you can focus on crafting intelligent behaviors and delightful user experiences. Stop reinventing the wheel for every integration—let Metorial manage the connections while you concentrate on building innovative AI solutions. With Metorial, your Gemini agents can seamlessly interact with the tools and platforms your users depend on daily.


About Metorial

Metorial provides developers with instant access to 600+ MCP servers for building AI agents that can interact with real-world tools and services. Built on MCP, Metorial simplifies agent tool integration by offering pre-configured connections to popular platforms like Google Drive, Slack, GitHub, Notion, and hundreds of other APIs. Our platform supports all major AI agent frameworks—including LangChain, AutoGen, CrewAI, and LangGraph—enabling developers to add tool calling capabilities to their agents in just a few lines of code. By eliminating the need for custom integration code, Metorial helps AI developers move from prototype to production faster while maintaining security and reliability. Whether you're building autonomous research agents, customer service bots, or workflow automation tools, Metorial's MCP server library provides the integrations you need to connect your agents to the real world.
