Module 1 - Lesson 3g: Function/Tool Calling
Integrating external APIs and tools with function calling.
Published: 1/3/2026
Example 7: Function/Tool Calling
Function calling (also called tool calling) lets the model request calls to functions you define during a conversation; your code executes them and returns the results. This gives the AI access to external data, APIs, databases, and more.
What It Does
Defines tools (functions) that the AI can call, then executes them and includes the results in the conversation. In this example, we create weather tools (mock and actual).
Code Snippet
Create src/tools-prompt.ts:
```typescript
import OpenAI from "openai";
import dotenv from "dotenv";
import { Responses } from "openai/resources.mjs";

dotenv.config();

// Parse the command line argument for the tool type
const input = process.argv[2]?.toUpperCase();
if (input && input !== "ACTUAL" && input !== "MOCK") {
  throw new Error(
    `Invalid weather tool type: "${process.argv[2]}". Must be "ACTUAL" or "MOCK".`
  );
}
const weatherToolType: "ACTUAL" | "MOCK" =
  input === "ACTUAL" ? "ACTUAL" : "MOCK";

// 1. Define the tools (functions) the AI can call
const tools: Responses.Tool[] = [
  {
    type: "function",
    name: "get_mock_weather",
    description: "Get the mock weather for today for a location.",
    parameters: {
      type: "object",
      properties: {
        location: {
          type: "string",
          description: "A location to get the weather for, e.g., London, UK",
        },
      },
      required: ["location"],
      additionalProperties: false,
    },
    strict: true,
  },
  {
    type: "function",
    name: "get_actual_weather",
    description: "Get the actual weather for today for a location.",
    parameters: {
      type: "object",
      properties: {
        location: {
          type: "string",
          description: "A location to get the weather for, e.g., London, UK",
        },
      },
      required: ["location"],
      additionalProperties: false,
    },
    strict: true,
  },
];

// 2. Implement the functions
function getMockWeather(location: string): string {
  return `In ${location} the weather is sunny with a high of 25°C.`;
}

export const getWeather = async (location: string) => {
  const response = await fetch(
    `https://api.weatherapi.com/v1/current.json?q=${location}&aqi=no&key=${process.env.WEATHER_API_KEY}`
  );
  const data = await response.json();
  return data;
};

const openai = new OpenAI();

async function toolsPrompt(weatherToolType: "ACTUAL" | "MOCK"): Promise<void> {
  try {
    console.log("Testing OpenAI connection...");
    console.log(`Using weather tool type: ${weatherToolType}`);

    const weatherRequest = weatherToolType === "ACTUAL" ? "actual" : "mock";

    let messages: any[] = [
      {
        role: "system",
        content:
          "You are a helpful travel assistant. Provide detailed travel suggestions based on user preferences, give a guide to the destination, and include the distance from the airport. Get mocked weather data for locations when requested using the get_mock_weather tool and actual weather data using the get_actual_weather tool.",
      },
      {
        role: "user",
        content: `Suggest a travel destination within Europe where there is a Christmas market that is famous but is not in a big city. I would like to go somewhere that is less than 2 hours from a major airport and has good public transport links. Please provide the ${weatherRequest} weather as of today there.`,
      },
    ];

    // 3. First API call - the AI decides if it needs to call a tool
    const response = await openai.responses.create({
      model: "gpt-5-nano",
      tools: tools,
      input: messages,
    });

    // 4. Add all output items (including function calls) to messages
    messages.push(...response.output);

    // 5. Check whether the AI called a function and execute it
    for (const item of response.output) {
      if (item.type === "function_call") {
        if (item.name === "get_mock_weather") {
          const args = JSON.parse(item.arguments);
          const weatherInfo = getMockWeather(args.location);
          // Add the function result back to messages
          messages.push({
            type: "function_call_output",
            call_id: item.call_id,
            output: JSON.stringify({ weatherInfo }),
          });
        }
        if (item.name === "get_actual_weather") {
          const args = JSON.parse(item.arguments);
          const weatherInfo = await getWeather(args.location);
          // Add the function result back to messages
          messages.push({
            type: "function_call_output",
            call_id: item.call_id,
            output: JSON.stringify({ weatherInfo }),
          });
        }
      }
    }

    // 6. Second API call - the AI uses the function results
    const finalResponse = await openai.responses.create({
      model: "gpt-5-nano",
      tools: tools,
      input: messages,
    });

    console.log("✅ Function Calling Success!");
    console.log("AI Response:", finalResponse.output_text);
    console.log("Tokens used:");
    console.dir(finalResponse.usage, { depth: null });
  } catch (error) {
    if (error instanceof OpenAI.APIError) {
      console.log("❌ API Error:", error.status, error.message);
    } else if (error instanceof Error) {
      console.log("❌ Error:", error.message);
    }
  }
}

toolsPrompt(weatherToolType).catch(console.error);
```
Run It
```shell
# Use mock weather (no API key needed)
pnpm tsx src/tools-prompt.ts MOCK

# Use actual weather (requires WEATHER_API_KEY in .env)
# See "Get Weather API Key" section in Project Setup above
pnpm tsx src/tools-prompt.ts ACTUAL
```
Note: To use actual weather data, you'll need a Weather API key from weatherapi.com. See the Project Setup section at the beginning of this lesson for detailed instructions.
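Both paths read their keys from `.env` via `dotenv.config()`. A sketch of what that file is expected to contain (the key names come from the code above; the values are placeholders you replace with your own):

```shell
# .env — loaded by dotenv.config() at startup
OPENAI_API_KEY=sk-...    # your OpenAI API key
WEATHER_API_KEY=...      # key from weatherapi.com (ACTUAL mode only)
```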
Key Points
- Tool definition: Describe functions the AI can call
- AI decides: Model chooses when to call functions
- Function execution: Your code runs the functions
- Result integration: Function results go back to AI
- Use case: External APIs, databases, calculations, real-time data
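The "function execution" and "result integration" points can be distilled into one small helper. This is an illustrative sketch, not part of the SDK: the `FunctionCallItem` type and the `impls` dispatcher are names invented here. It parses the JSON arguments the model chose, runs your implementation, and builds the `function_call_output` item that goes back into the conversation:

```typescript
// Minimal shape of a function_call item found in response.output
type FunctionCallItem = {
  type: "function_call";
  name: string;
  call_id: string;   // ties the result back to this specific call
  arguments: string; // JSON-encoded arguments chosen by the model
};

// Look up the named implementation, run it, and wrap the result
// in the shape the next API call expects.
function executeFunctionCall(
  item: FunctionCallItem,
  impls: Record<string, (args: { location: string }) => unknown>
) {
  const args = JSON.parse(item.arguments);
  const result = impls[item.name](args);
  return {
    type: "function_call_output",
    call_id: item.call_id,
    output: JSON.stringify(result),
  };
}
```

With a dispatcher like this, adding a new tool means registering one more entry in `impls` instead of growing the `if`/`else` chain in the main loop.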
Function Calling Flow
1. User asks question
↓
2. AI sees question needs external data
↓
3. AI requests a function call (e.g., get_mock_weather)
↓
4. Your code executes function
↓
5. Function result added to conversation
↓
6. AI uses result to answer question
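The steps above generalize to a loop: keep calling the model until it answers instead of requesting a tool. The sketch below makes that control flow visible without an API key by using `fakeModel` as a stand-in for `openai.responses.create` (all names and the canned answer here are illustrative, not real API behavior):

```typescript
type OutputItem =
  | { type: "function_call"; name: string; call_id: string; arguments: string }
  | { type: "message"; text: string };

// Stand-in model: requests weather first, answers once the result is in history
function fakeModel(messages: any[]): OutputItem[] {
  const hasResult = messages.some((m) => m.type === "function_call_output");
  if (!hasResult) {
    return [{
      type: "function_call",
      name: "get_mock_weather",
      call_id: "call_1",
      arguments: JSON.stringify({ location: "Colmar, France" }),
    }];
  }
  return [{ type: "message", text: "Colmar, France: sunny with a high of 25°C." }];
}

function runConversation(question: string): string {
  const messages: any[] = [{ role: "user", content: question }];
  // Loop steps 2-6 until the model produces a final message
  for (let round = 0; round < 5; round++) {
    const output = fakeModel(messages);
    messages.push(...output); // keep function_call items in the history
    const calls = output.filter(
      (o): o is Extract<OutputItem, { type: "function_call" }> =>
        o.type === "function_call"
    );
    if (calls.length === 0) {
      const msg = output.find((o) => o.type === "message");
      return msg && msg.type === "message" ? msg.text : "";
    }
    for (const call of calls) {
      const args = JSON.parse(call.arguments); // execute the function...
      messages.push({                          // ...and feed the result back
        type: "function_call_output",
        call_id: call.call_id,
        output: JSON.stringify(`${args.location}: sunny`),
      });
    }
  }
  return "";
}
```

The round cap (`round < 5` here) is a common safeguard: it bounds cost if the model keeps requesting tools instead of answering.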
Tool Definition Best Practices
```typescript
// ✅ Good: Clear description
{
  name: "get_weather",
  description:
    "Get current weather for a location. Returns temperature, condition, and humidity.",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: "City name and country, e.g., 'Paris, France'",
      },
    },
    required: ["location"],
  },
}

// ❌ Bad: Vague description
{
  name: "weather",
  description: "Gets weather",
  // Missing parameter descriptions
}
```
Congratulations! 🎉
You've completed Module 1: OpenAI Prompts! You now have a solid foundation in prompt engineering with OpenAI's API, covering:
- ✅ Basic prompts and system prompts
- ✅ Temperature control for creativity
- ✅ Extended prompts with detailed context
- ✅ Real-time streaming responses
- ✅ Structured outputs with Zod validation
- ✅ Function/tool calling for external integrations
What's Next?
Coming Soon: Advanced AI SDK Techniques
The next modules will dive deeper into professional AI development:
Module 2: Advanced Prompt Engineering (Coming Soon)
Explore advanced prompting strategies and patterns:
- Chain-of-thought prompting
- Few-shot learning techniques
- Prompt templating and composition
- Multi-turn conversations and context management
- Error handling and retry strategies
Module 3: Multi-Provider AI Development (Coming Soon)
Learn to work with multiple AI providers:
- Anthropic Claude API - Advanced reasoning and long context windows
- Google Gemini API - Multimodal capabilities and broad knowledge
- Vercel AI SDK - Universal interface for all providers
- Provider comparison and selection strategies
- Building provider-agnostic applications
Module 4: Production AI Applications (Coming Soon)
Build production-ready AI systems:
- Cost optimization and token management
- Rate limiting and error handling
- Caching strategies
- Testing AI applications
- Monitoring and analytics
Keep Practicing!
The best way to master AI development is to build projects. Try these ideas:
- Travel Planner: Combine structured output with function calling
- Content Generator: Use temperature control for creative writing
- Data Analyzer: Parse structured data from documents
- Chat Assistant: Implement streaming with conversation history
- Research Tool: Integrate multiple external APIs
Ready to explore more? Stay tuned for upcoming modules! 🚀