How to use OpenAI Function Calling with Next.js and the Vercel AI SDK

In June 2023, OpenAI announced function calling, which allows developers to define functions for GPT-4 and GPT-3.5 Turbo to call.

What is OpenAI function calling?

For context, GPT-4 and GPT-3.5 Turbo are Large Language Models (LLMs) that have been trained by OpenAI on a vast corpus of text data. They excel at a wide range of language tasks, including text completion, summarization, and even creative writing.

With function calling, LLMs can now receive function definitions and output a JSON object with the necessary arguments for invoking those functions. This makes it easier to integrate GPT's features with external tools and APIs, allowing developers to build intuitive, AI-powered experiences by leveraging the power of LLMs.
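To make this concrete, here is the shape of the message a function-calling model returns. The get_current_weather function and its arguments below are purely illustrative, not part of any specific API — the point is that the model responds with a function name and a JSON-encoded argument string instead of text content:

```typescript
// The assistant message returned by a function-calling model: instead of
// text content, it carries the name of the function to invoke and a
// JSON-encoded string of arguments.
const assistantMessage = {
  role: "assistant",
  content: null,
  function_call: {
    name: "get_current_weather",
    arguments: '{"location":"San Francisco","unit":"celsius"}',
  },
};

// The arguments arrive as a string, so parse them before invoking the function
const args = JSON.parse(assistantMessage.function_call.arguments);
console.log(args.location); // "San Francisco"
```

Your server code is responsible for parsing that argument string, running the actual function, and (optionally) sending the result back to the model for a natural-language summary.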

Types of function calling UX flows

We've identified three primary user experience flows that are essential for the new OpenAI Function Calling API:

1. Automatic Function Execution

This is the typical function calling execution flow:

  1. Client/user sends a message in natural language.
  2. On the server, the AI SDK sends the list of predefined functions along with the user input to OpenAI, which returns the JSON required for the function call.
  3. Server executes the function call.
  4. The AI SDK sends the function call output to OpenAI and gets a summarized output.
  5. AI SDK streams the output to the client via the edge.

2. Automatic Function Execution with Intent & Progress

While the function still executes automatically, this flow provides context to the user about the function progression:

  1. Client/user sends a message in natural language.
  2. On the server, the AI SDK sends the list of predefined functions along with the user input to OpenAI, which returns the JSON required for the function call.
  3. Server streams an intent to the client that there will be a function call execution via the AI SDK, and proceeds to execute the function call.
  4. While the function call is executing, the AI SDK streams the progress to the client.
  5. Once complete, AI SDK sends the function call output to OpenAI and gets a summarized output.
  6. AI SDK streams the output to the client via the edge.

3. Function Execution with User Approval

The function only executes given user confirmation:

  1. Client/user sends a message in natural language.
  2. On the server, the AI SDK sends the list of predefined functions along with the user input to OpenAI, which returns the JSON required for the function call.
  3. Server sends an intent to the client. If the client confirms, server executes the function call.
  4. The AI SDK sends the function call output to OpenAI and gets a summarized output.
  5. AI SDK streams the output to the client via the edge.

Use cases for OpenAI function calling

OpenAI function calling is a powerful tool for building AI-enhanced user experiences.

Here are some examples:

  1. Parsing and processing freeform data: Instead of manually writing complex regexes to parse and process freeform data, you can leverage OpenAI function calling to convert freeform data into type-safe, fully formatted function inputs that can be used to perform POST/PUT operations against your database.
  2. Chatbots that interact with external APIs: With function calling, you can quickly scaffold chatbots that can interact with third-party APIs using natural language. This is similar to how ChatGPT plugins work; take the WeatherGPT Plugin for instance – it converts a user's question about the weather into a structured JSON output that can be fed into a weather API of your choice.
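To make the first use case concrete, here is a sketch of a function definition that turns a freeform calendar note into structured input for a database write. The create_event name and its fields are hypothetical, chosen only for illustration:

```typescript
// A hypothetical function definition for extracting structured event data
// from freeform text. The model fills in these typed fields, which can then
// be validated and used in a POST/PUT against your database.
const createEventFunction = {
  name: "create_event",
  description: "Create a calendar event from a freeform description.",
  parameters: {
    type: "object",
    properties: {
      title: { type: "string", description: "Short title for the event." },
      date: { type: "string", description: "ISO 8601 date, e.g. 2023-06-25." },
      attendees: {
        type: "array",
        items: { type: "string" },
        description: "Names of attendees mentioned in the text.",
      },
    },
    required: ["title", "date"],
  },
};
```

Given a message like "lunch with Ada and Grace next Friday," the model would return a JSON object matching this schema, which you can validate before writing to your database.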

How to use OpenAI Function Calling?

To learn more about OpenAI Function Calling, we will build an AI chatbot that interacts with the Hacker News API in real time using OpenAI Function Calling and the Vercel AI SDK.

If you prefer not to start from scratch, we have prepared a template that you can clone locally and use that as a starting point instead.

Terminal
git clone https://github.com/steven-tey/chathn

Step 1: Creating a Chat API route

First, create an API route using the Edge runtime. We can use the default example in the AI SDK docs for now:

app/api/chat/route.ts
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

// Optional, but recommended: run on the edge runtime.
// See https://vercel.com/docs/concepts/functions/edge-functions
export const runtime = 'edge';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();

  // Request the OpenAI API for the response based on the prompt
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages: messages,
  });

  // Convert the response into a friendly text-stream
  const stream = OpenAIStream(response);

  // Respond with the stream
  return new StreamingTextResponse(stream);
}

Step 2: Defining your functions

Think of functions as an array of the possible operations that can be used to interact with your API – similar to the endpoints defined in an OpenAPI spec.

Below is an example of a function that fetches and returns the top stories on Hacker News. You can refer to the full list of functions in the ChatHN template here.

app/api/chat/functions.ts
import { CompletionCreateParams } from "openai/resources/chat/index";

export const functions: CompletionCreateParams.Function[] = [
  {
    name: "get_top_stories",
    description:
      "Get the top stories from Hacker News. Also returns the Hacker News URL to each story.",
    parameters: {
      type: "object",
      properties: {
        limit: {
          type: "number",
          description: "The number of stories to return. Defaults to 10.",
        },
      },
      required: [],
    },
  },
  ... // more functions
];

async function get_top_stories(limit: number = 10) {
  const response = await fetch(
    "https://hacker-news.firebaseio.com/v0/topstories.json",
  );
  const ids = await response.json();
  const stories = await Promise.all(
    ids.slice(0, limit).map((id: number) => get_story(id)),
  );
  return stories;
}

export async function runFunction(name: string, args: any) {
  switch (name) {
    case "get_top_stories":
      // Pass through the arguments the model provided
      return await get_top_stories(args.limit);
    ... // more functions
  }
}
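The get_story helper referenced above isn't shown in this snippet; a minimal sketch, assuming the public Hacker News item endpoint, might look like this:

```typescript
// Builds the Hacker News item URL for a given story ID
function storyUrl(id: number): string {
  return `https://hacker-news.firebaseio.com/v0/item/${id}.json`;
}

// Fetches a single story by ID from the Hacker News API.
// Assumes a runtime with a global `fetch` (Edge runtime or Node 18+).
async function get_story(id: number) {
  const response = await fetch(storyUrl(id));
  return response.json();
}
```

The full implementation, along with the rest of the function definitions, lives in the ChatHN template.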

If you prefer, you can safely colocate the functions file in the same directory as your API route, since only the content returned by page.js or route.js is sent to the client.

Step 3: Add function calling mechanism to API route

Once you define your functions, you can start adding function calling to the API route that you created in Step 1.

app/api/chat/route.ts
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import { functions, runFunction } from './functions';

// Optional, but recommended: run on the edge runtime.
// See https://vercel.com/docs/concepts/functions/edge-functions
export const runtime = 'edge';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();

  // Check if the conversation requires a function call to be made
  const initialResponse = await openai.chat.completions.create({
    model: "gpt-3.5-turbo-0613",
    messages,
    stream: true,
    functions,
    function_call: "auto",
  });

  const stream = OpenAIStream(initialResponse, {
    experimental_onFunctionCall: async (
      { name, arguments: args },
      createFunctionCallMessages,
    ) => {
      // Execute the function, then send the result back to OpenAI
      // for a natural-language summary
      const result = await runFunction(name, args);
      const newMessages = createFunctionCallMessages(result);
      return openai.chat.completions.create({
        model: "gpt-3.5-turbo-0613",
        stream: true,
        messages: [...messages, ...newMessages],
      });
    },
  });

  // Respond with the stream
  return new StreamingTextResponse(stream);
}

Step 4: Wire up the chat interface with the AI SDK

The final step is to wire up the chat interface with the Vercel AI SDK.

We've scaffolded a simple ChatGPT-esque design in the ChatHN repository that you can copy and paste into your application.
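If you'd rather wire it up by hand, a minimal chat page using the AI SDK's useChat hook looks roughly like the sketch below. By default, useChat posts the conversation to /api/chat (the route we built above) and streams the response into messages; the styling and layout here are placeholders:

```typescript
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat posts to /api/chat by default and streams tokens into `messages`
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask about Hacker News..."
        />
      </form>
    </div>
  );
}
```

Because the function call and summarization both happen server-side, the client stays a thin streaming UI; it never needs to know which functions exist.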


Once everything is ready, you can run the following command to start the app:

Terminal
npm i
npm run dev

Build AI-powered UX with function calling and AI SDK

All in all, the OpenAI function calling feature provides a powerful way to integrate AI chatbots with third-party APIs. By defining functions and utilizing the OpenAI SDK, you can easily interact with external APIs in real time.

With the example above, you can create a chatbot that fetches top stories from Hacker News and more. By following the steps outlined, you can incorporate function calling into your API route and connect it to the chat interface using the Vercel AI SDK. With this approach, you'll have a chatbot that can seamlessly interact with third-party APIs, enhancing its capabilities and providing more value to users.
