Module 2 (Gemini) - Lesson 2g: Tools and Function Calling with Gemini
Define tools with functionDeclarations and handle Gemini function calls.
Published: 1/15/2026
Lesson 2g: Function/Tool Calling with Google Gemini
Learn how to give Gemini access to external tools and APIs through function calling. Gemini's approach uses functionDeclarations; it is similar in concept to OpenAI's and Anthropic's tool calling, but the syntax differs from both.
Key Differences from OpenAI and Anthropic
OpenAI: Tools array with function definitions

```typescript
tools: [{ type: "function", function: { name: "...", parameters: {...} } }]
```

Anthropic: Tools with input_schema

```typescript
tools: [{ name: "...", input_schema: { type: "object", properties: {...} } }]
```

Gemini: Uses functionDeclarations in config.tools

```typescript
config: { tools: [{ functionDeclarations: [{ name: "...", parameters: { type: Type.OBJECT, properties: {...} } }] }] }
```
Code Example
Create src/gemini/tools-prompt.ts:
```typescript
import { GoogleGenAI, ApiError, Content, Tool, Type } from "@google/genai";
import dotenv from "dotenv";

// Load environment variables
dotenv.config();

// Check for weather tool type from CLI argument
const input = process.argv[2]?.toUpperCase();
if (input && input !== "ACTUAL" && input !== "MOCK") {
  throw new Error(`Invalid weather tool type: "${process.argv[2]}". Must be "ACTUAL" or "MOCK".`);
}
const weatherToolType: "ACTUAL" | "MOCK" = input === "ACTUAL" ? "ACTUAL" : "MOCK";

// Define tools for Gemini
const tools: Tool[] = [
  {
    functionDeclarations: [
      {
        name: "get_mock_weather",
        description: "Get the mock weather for today for a location.",
        parameters: {
          type: Type.OBJECT,
          properties: {
            location: {
              type: Type.STRING,
              description: "A location to get the weather for, e.g., London, UK",
            },
          },
          required: ["location"],
        },
      },
      {
        name: "get_actual_weather",
        description: "Get the actual weather for today for a location.",
        parameters: {
          type: Type.OBJECT,
          properties: {
            location: {
              type: Type.STRING,
              description: "A location to get the weather for, e.g., London, UK",
            },
          },
          required: ["location"],
        },
      },
    ],
  },
];

// Mock weather function
function getMockWeather(location: string): string {
  return `${location}: sunny with a high of 25°C.`;
}

// Actual weather function (requires WEATHER_API_KEY in .env)
async function getActualWeather(location: string): Promise<string> {
  const response = await fetch(
    `https://api.weatherapi.com/v1/current.json?q=${location}&aqi=no&key=${process.env.WEATHER_API_KEY}`
  );
  const data = await response.json();
  return JSON.stringify(data);
}

// Execute local function based on name
async function executeLocalFunction(
  name: string,
  args: Record<string, unknown>
): Promise<string> {
  switch (name) {
    case "get_mock_weather":
      return getMockWeather(args.location as string);
    case "get_actual_weather":
      return await getActualWeather(args.location as string);
    default:
      return `Error: Unknown function ${name}`;
  }
}

// Create Gemini client
const gemini = new GoogleGenAI({});

// Main function
async function toolsPrompt(weatherType: "ACTUAL" | "MOCK"): Promise<void> {
  try {
    console.log("Testing Gemini connection...");
    console.log(`Using weather tool type: ${weatherType}`);

    const weatherRequest = weatherType === "ACTUAL" ? "actual" : "mock";

    // Initial conversation
    let messages: Content[] = [
      {
        role: "user",
        parts: [
          {
            text: `Suggest a travel destination within Europe where there is a Christmas market that is famous but is not in a big city. I would like to go somewhere that is less than 2 hours from a major airport and has good public transport links, please provide the ${weatherRequest} weather as of today there`,
          },
        ],
      },
    ];

    // First API call - Gemini decides whether to use tools
    const response = await gemini.models.generateContent({
      model: "gemini-3-flash-preview",
      contents: messages,
      config: {
        systemInstruction:
          "You are a helpful travel assistant. Provide detailed travel suggestions based on user preferences and give a guide to the destination and include distance from the airport. Get mocked weather data for locations when requested using the get_mock_weather tool and actual weather data using the get_actual_weather tool.",
        tools,
      },
    });

    // Check for function calls in response
    const candidate = response.candidates?.[0];
    const parts = candidate?.content?.parts ?? [];
    const functionResponses: {
      functionResponse: { name: string; response: { content: string } };
    }[] = [];

    // Process any function calls
    for (const part of parts) {
      if (part.functionCall) {
        const { name, args } = part.functionCall;
        console.log(`Calling function: ${name} with args:`, args);
        const result = await executeLocalFunction(name, args as Record<string, unknown>);
        console.log(`Function result:`, result);
        functionResponses.push({
          functionResponse: {
            name,
            response: { content: result },
          },
        });
      }
    }

    // If there were function calls, send results back to get final response
    if (functionResponses.length > 0) {
      const finalResult = await gemini.models.generateContent({
        model: "gemini-3-flash-preview",
        contents: [
          ...messages,
          candidate?.content ?? { role: "model", parts: [] },
          { role: "user", parts: functionResponses },
        ],
        config: {
          systemInstruction:
            "You are a helpful travel assistant. Use the function results to provide a complete answer.",
          tools,
        },
      });

      console.log("Tools Prompt Success!");
      console.log(
        "AI Response:",
        finalResult.candidates?.[0]?.content?.parts?.[0]?.text
      );
      console.log("Tokens used:");
      console.dir(finalResult.usageMetadata, { depth: null });
    } else {
      // No function calls - direct response
      console.log("Tools Prompt Success!");
      console.log("AI Response:", parts[0]?.text);
      console.log("Tokens used:");
      console.dir(response.usageMetadata, { depth: null });
    }
  } catch (error) {
    if (error instanceof ApiError) {
      console.log("API Error:", error.status, error.message);
    } else if (error instanceof Error) {
      console.log("Error:", error.message);
    } else {
      console.log("Unknown error occurred");
    }
  }
}

// Run the test
toolsPrompt(weatherToolType).catch((error) => {
  console.error("Error:", error);
});
```
Run It
```bash
# With mock weather
pnpm tsx src/gemini/tools-prompt.ts MOCK

# With actual weather (requires WEATHER_API_KEY in .env)
pnpm tsx src/gemini/tools-prompt.ts ACTUAL
```
Understanding Tool Calling Flow
1. Define Tools
Define what tools Gemini can use with functionDeclarations:
```typescript
import { Tool, Type } from "@google/genai";

const tools: Tool[] = [
  {
    functionDeclarations: [
      {
        name: "get_weather",
        description: "Get current weather for a location",
        parameters: {
          type: Type.OBJECT,
          properties: {
            location: { type: Type.STRING, description: "City name, e.g., 'London, UK'" },
            unit: { type: Type.STRING, enum: ["celsius", "fahrenheit"], description: "Temperature unit" }
          },
          required: ["location"]
        }
      }
    ]
  }
];
```
2. Initial Request
Gemini decides if it needs to use a tool:
```typescript
const response = await gemini.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: "What's the weather in Paris?",
  config: { tools }
});

// Check for function calls
const parts = response.candidates?.[0]?.content?.parts ?? [];
for (const part of parts) {
  if (part.functionCall) {
    console.log("Function call:", part.functionCall);
    // { name: "get_weather", args: { location: "Paris" } }
  }
}
```
3. Execute Tool
Run the actual function:
```typescript
const functionCall = parts.find(p => p.functionCall)?.functionCall;

if (functionCall) {
  const { name, args } = functionCall;
  const result = await executeFunction(name, args);

  // Prepare function response
  const functionResponse = {
    functionResponse: {
      name: name,
      response: { content: result }
    }
  };
}
```
4. Send Results Back
Gemini uses the tool result to generate the final response:
```typescript
const finalResponse = await gemini.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: [
    ...originalMessages,
    candidate.content,                            // Include the function call
    { role: "user", parts: [functionResponse] }   // Function result
  ],
  config: { tools }
});

console.log(finalResponse.candidates?.[0]?.content?.parts?.[0]?.text);
```
Complete Example: Calculator Tool
```typescript
import { GoogleGenAI, Tool, Type, Content } from "@google/genai";

const gemini = new GoogleGenAI({});

// Define calculator tool
const tools: Tool[] = [
  {
    functionDeclarations: [
      {
        name: "calculate",
        description: "Perform mathematical calculations",
        parameters: {
          type: Type.OBJECT,
          properties: {
            expression: { type: Type.STRING, description: "Math expression to evaluate, e.g., '2 + 2'" }
          },
          required: ["expression"]
        }
      }
    ]
  }
];

// Calculator implementation
function calculate(expression: string): string {
  try {
    // Use Function constructor for safer eval
    const result = new Function(`return ${expression}`)();
    return String(result);
  } catch (error) {
    return `Error: ${error instanceof Error ? error.message : String(error)}`;
  }
}

// Main function
async function mathWithGemini(question: string) {
  let messages: Content[] = [
    { role: "user", parts: [{ text: question }] }
  ];

  // Initial request
  const response = await gemini.models.generateContent({
    model: "gemini-3-flash-preview",
    contents: messages,
    config: { tools }
  });

  // Check for function call
  const candidate = response.candidates?.[0];
  const functionCall = candidate?.content?.parts?.find(p => p.functionCall)?.functionCall;

  if (functionCall && candidate?.content) {
    // Execute function
    const result = calculate(functionCall.args?.expression as string);

    // Send result back
    const finalResponse = await gemini.models.generateContent({
      model: "gemini-3-flash-preview",
      contents: [
        ...messages,
        candidate.content,
        {
          role: "user",
          parts: [{ functionResponse: { name: functionCall.name, response: { content: result } } }]
        }
      ],
      config: { tools }
    });

    return finalResponse.candidates?.[0]?.content?.parts?.[0]?.text;
  }

  return candidate?.content?.parts?.[0]?.text;
}

// Usage
const answer = await mathWithGemini("What is 234 × 567?");
console.log(answer); // "234 × 567 equals 132,678"
```
Multiple Tools Example
```typescript
const tools: Tool[] = [
  {
    functionDeclarations: [
      {
        name: "get_weather",
        description: "Get current weather",
        parameters: {
          type: Type.OBJECT,
          properties: { location: { type: Type.STRING } },
          required: ["location"]
        }
      },
      {
        name: "get_time",
        description: "Get current time",
        parameters: {
          type: Type.OBJECT,
          properties: { timezone: { type: Type.STRING } },
          required: ["timezone"]
        }
      },
      {
        name: "search_flights",
        description: "Search for flights",
        parameters: {
          type: Type.OBJECT,
          properties: {
            from: { type: Type.STRING },
            to: { type: Type.STRING },
            date: { type: Type.STRING }
          },
          required: ["from", "to", "date"]
        }
      }
    ]
  }
];

// Gemini will choose the right tool(s) based on the query
const response = await gemini.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: "I want to fly from London to Paris tomorrow. What's the weather there?",
  config: { tools }
});

// May call multiple tools: search_flights AND get_weather
```
Handling Tool Errors
```typescript
async function executeToolSafely(
  name: string,
  args: Record<string, unknown>
): Promise<{ content: string; isError: boolean }> {
  try {
    let result: string;

    switch (name) {
      case "get_weather":
        result = await getWeather(args.location as string);
        break;
      case "calculate":
        result = calculate(args.expression as string);
        break;
      default:
        return { content: `Unknown tool: ${name}`, isError: true };
    }

    return { content: result, isError: false };
  } catch (error) {
    return {
      content: `Error executing ${name}: ${error instanceof Error ? error.message : String(error)}`,
      isError: true
    };
  }
}

// Use in conversation
const { content, isError } = await executeToolSafely(functionCall.name, functionCall.args);

const functionResponse = {
  functionResponse: {
    name: functionCall.name,
    response: { content }
  }
};

// Gemini will handle errors gracefully
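```

The isError flag isn't used in the snippet above. Because the response object inside functionResponse is free-form JSON, one option is to pass the failure through explicitly so Gemini can explain or retry. This is a sketch, not something the SDK requires; the error key name is just a choice.

```typescript
// Sketch: surface tool failures to the model explicitly.
// The shape of `response` is up to you; Gemini just sees JSON.
const toolResult = await executeToolSafely(functionCall.name, functionCall.args);

const functionResponse = {
  functionResponse: {
    name: functionCall.name,
    response: toolResult.isError
      ? { error: toolResult.content }    // e.g. "Error executing get_weather: ..."
      : { content: toolResult.content }, // normal successful result
  },
};
```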
Provider Comparison
OpenAI Approach
```typescript
const response = await openai.chat.completions.create({
  model: "gpt-5-nano",
  tools: [{
    type: "function",
    function: { name: "get_weather", parameters: { type: "object", properties: {...} } }
  }],
  messages: [...]
});

// Tool call in response
const toolCall = response.choices[0].message.tool_calls?.[0];

// Add the assistant message containing the tool call, then the tool result
messages.push(response.choices[0].message);
messages.push({ role: "tool", tool_call_id: toolCall.id, content: result });
```
Anthropic Approach
```typescript
const response = await anthropic.messages.create({
  model: "claude-haiku-4-5",
  max_tokens: 1000,
  tools: [{
    name: "get_weather",
    input_schema: { type: "object", properties: {...} }
  }],
  messages: [...]
});

// Tool use in content
const toolUse = response.content.find(b => b.type === "tool_use");

// Add tool result
messages.push({ role: "assistant", content: response.content });
messages.push({
  role: "user",
  content: [{ type: "tool_result", tool_use_id: toolUse.id, content: result }]
});
```
Gemini Approach
```typescript
const response = await gemini.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: [...],
  config: {
    tools: [{
      functionDeclarations: [{
        name: "get_weather",
        parameters: { type: Type.OBJECT, properties: {...} }
      }]
    }]
  }
});

// Function call in parts
const functionCall = response.candidates?.[0]?.content?.parts?.find(
  p => p.functionCall
)?.functionCall;

// Send function response
contents.push(candidate.content);
contents.push({
  role: "user",
  parts: [{ functionResponse: { name: functionCall.name, response: { content: result } } }]
});
```
Key Differences Table
| Feature | OpenAI | Anthropic | Gemini |
|---|---|---|---|
| Tool definition | tools array | tools array | config.tools |
| Schema location | function.parameters | input_schema | parameters with Type |
| Call location | response.tool_calls | content array | parts array |
| Result format | role: "tool" | type: "tool_result" | functionResponse |
| Type system | JSON Schema | JSON Schema | Type enum |
Best Practices
1. Clear Tool Descriptions
{ name: "search_database", description: "Search the customer database by name, email, or ID. Returns customer details including order history. Use this when the user asks about a specific customer.", parameters: {...} }
2. Validate Tool Input
```typescript
if (functionCall) {
  const { name, args } = functionCall;

  // Validate before executing
  if (name === "get_weather" && !args?.location) {
    throw new Error("Missing required parameter: location");
  }

  const result = await executeFunction(name, args);
}
```
3. Handle Multiple Function Calls
```typescript
const parts = response.candidates?.[0]?.content?.parts ?? [];
const functionResponses = [];

for (const part of parts) {
  if (part.functionCall) {
    const result = await executeFunction(part.functionCall.name, part.functionCall.args);
    functionResponses.push({
      functionResponse: {
        name: part.functionCall.name,
        response: { content: result }
      }
    });
  }
}

if (functionResponses.length > 0) {
  // Send all results back
  const finalResponse = await gemini.models.generateContent({
    model: "gemini-3-flash-preview", // model is required on every call
    contents: [
      ...messages,
      candidate.content,
      { role: "user", parts: functionResponses }
    ],
    config: { tools }
  });
}
```
4. Provide Context in System Instruction
```typescript
config: {
  systemInstruction: `You are a travel assistant with access to:
- get_weather: Current weather conditions
- search_flights: Flight availability
- book_hotel: Hotel reservations

Always check weather before suggesting destinations.
Use search_flights to find the best options.`,
  tools
}
```
Common Use Cases
1. API Integration
```typescript
async function callExternalAPI(endpoint: string, params: Record<string, unknown>) {
  const response = await fetch(`https://api.example.com/${endpoint}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(params)
  });
  return await response.json();
}
```
2. Database Queries
```typescript
async function queryDatabase(query: string) {
  // `db` is a placeholder for your database client
  const results = await db.execute(query);
  return JSON.stringify(results);
}
```
3. File Operations
```typescript
import fs from 'fs/promises';

async function readFile(path: string) {
  const content = await fs.readFile(path, 'utf-8');
  return content;
}
```
Key Takeaways
- Define tools with `functionDeclarations` in `config.tools`
- Use the `Type` enum from `@google/genai` for parameter types
- Function calls appear in `candidates[0].content.parts`
- Send results back as `functionResponse` in the parts array
- Gemini decides when to use tools based on context
- Gemini can handle multiple tools in one conversation
Module 2 (Gemini) Complete!
Congratulations! You've completed Module 2 for Google Gemini and learned how to use the Gemini API. You now know how to:
- Make basic API calls to Gemini
- Use system prompts with `systemInstruction`
- Control response creativity with temperature
- Craft detailed extended prompts
- Stream responses in real-time
- Get validated structured output with JSON mode
- Integrate external tools and APIs with function calling
What's Next?
- Compare OpenAI, Anthropic, and Gemini implementations side-by-side
- Build a multi-provider application
- Explore advanced features and optimizations
- Apply your skills to real projects!
Back to Module 2 (Gemini) Overview
Quick Reference
```typescript
// Tool calling pattern
import { GoogleGenAI, Tool, Type, Content } from "@google/genai";

const tools: Tool[] = [{
  functionDeclarations: [{
    name: "function_name",
    description: "What this function does",
    parameters: {
      type: Type.OBJECT,
      properties: {
        param: { type: Type.STRING, description: "Parameter description" }
      },
      required: ["param"]
    }
  }]
}];

const response = await gemini.models.generateContent({
  model: "gemini-3-flash-preview",
  contents: messages,
  config: { tools }
});

const functionCall = response.candidates?.[0]?.content?.parts?.find(
  p => p.functionCall
)?.functionCall;

if (functionCall) {
  const result = await executeFunction(functionCall.name, functionCall.args);

  const finalResponse = await gemini.models.generateContent({
    model: "gemini-3-flash-preview",
    contents: [
      ...messages,
      response.candidates[0].content,
      {
        role: "user",
        parts: [{ functionResponse: { name: functionCall.name, response: { content: result } } }]
      }
    ],
    config: { tools }
  });
}
```
Common Pitfalls
- Forgetting to import `Type` from `@google/genai`
- Using JSON Schema syntax instead of the `Type` enum
- Not including the original function call in the follow-up request
- Forgetting to wrap the function response in the correct structure
- Not handling cases where Gemini doesn't call any tools
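To make the schema pitfall concrete, here is a minimal wrong-versus-right sketch; the `get_weather` declaration and the `getWeatherDeclaration` variable name are only for illustration:

```typescript
import { Type } from "@google/genai";

// Pitfall: plain JSON Schema strings, as in the OpenAI/Anthropic examples above
// parameters: { type: "object", properties: { location: { type: "string" } } }

// Correct for the @google/genai SDK: use the Type enum
const getWeatherDeclaration = {
  name: "get_weather",
  description: "Get current weather for a location",
  parameters: {
    type: Type.OBJECT,
    properties: {
      location: { type: Type.STRING, description: "City name, e.g., 'London, UK'" },
    },
    required: ["location"],
  },
};
```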