Define the Agent Skeleton

A regular LLM call is one turn: you send a prompt, you get a response. An agent is a loop. The model generates a response, and if that response includes a tool call, the agent executes the tool, feeds the result back, and lets the model decide what to do next. This continues until the model produces a final text response or hits a step limit.

The AI SDK's ToolLoopAgent handles this loop for you. You give it a model, tools, and instructions. It manages the back-and-forth:

prompt → LLM → tool call → execute tool → result → LLM → tool call → ... → final text

For your filesystem agent, the loop looks like this: the user asks "did anyone mention pricing?", the model calls bashTool with grep -r "pricing" calls/, reads the output, maybe runs another command to get more context, and then synthesizes a final answer. Each tool call is one step in the loop. The default limit is 20 steps before the agent stops.
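The loop described above can be sketched in a few lines of plain TypeScript. This is a conceptual illustration only, not the SDK's internals: `mockModel`, `runTool`, and the string-based history are all hypothetical stand-ins, with the model hard-coded to request one grep and then answer.

```typescript
// Conceptual sketch of the agent loop (hypothetical names, not SDK internals).
// Each iteration of the for-loop is one "step".

type ModelTurn =
  | { type: 'tool-call'; tool: string; input: string }
  | { type: 'text'; text: string };

// Stand-in for the LLM: asks for a grep first, then synthesizes an answer.
function mockModel(history: string[]): ModelTurn {
  if (!history.some((entry) => entry.startsWith('tool-result:'))) {
    return { type: 'tool-call', tool: 'bash', input: 'grep -r "pricing" calls/' };
  }
  return { type: 'text', text: 'Yes, pricing came up in two calls.' };
}

// Stand-in tool executor: returns a canned result instead of running anything.
function runTool(tool: string, input: string): string {
  return `ran ${tool}: ${input} -> calls/acme.md mentions pricing tiers`;
}

const MAX_STEPS = 20; // the default step limit mentioned above

function runLoop(prompt: string): string {
  const history: string[] = [`user:${prompt}`];
  for (let step = 0; step < MAX_STEPS; step++) {
    const turn = mockModel(history);
    if (turn.type === 'text') return turn.text; // final text ends the loop
    const result = runTool(turn.tool, turn.input);
    history.push(`tool-result:${result}`); // feed the result back to the model
  }
  return '(step limit reached)';
}
```

`ToolLoopAgent` handles all of this (plus streaming, message formatting, and parallel tool calls) so you only supply the model, tools, and instructions.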

Outcome

You have a minimal ToolLoopAgent exported from lib/agent.ts that compiles and responds (without tools) when called from the API route.

Fast Track

  1. Import ToolLoopAgent from ai in lib/agent.ts
  2. Create and export a ToolLoopAgent with the model set to anthropic/claude-opus-4.6
  3. Start the dev server and verify the app loads at localhost:3000

How the API route uses the agent

The starter repo's API route is already wired up to import your agent and stream its responses:

app/api/route.ts
import { agent } from '@/lib/agent';
 
export async function POST(request: Request) {
  // ... extract prompt from messages ...
 
  const stream = await agent.stream({ prompt });
  // ... stream response back to UI ...
}

The route calls agent.stream() with the user's message. The agent runs its loop, streaming text and tool calls back as they happen. Your job is to export an agent from lib/agent.ts that this route can use.

Hands-on Exercise 1.2

Create the agent skeleton in lib/agent.ts.

Requirements:

  1. Import ToolLoopAgent from the ai package
  2. Set the model to anthropic/claude-opus-4.6 (via AI Gateway)
  3. Pass empty instructions and tools for now
  4. Export the agent as a named export called agent

Implementation hints:

  • The ToolLoopAgent constructor takes an object with model, instructions, and tools
  • The model string 'anthropic/claude-opus-4.6' routes through AI Gateway automatically. No createGateway() call needed.
  • Use a MODEL constant so you can easily swap models later
  • The tools property takes an object. Pass {} for now.
  • The API route expects export const agent, not a default export

Try It

  1. Start the dev server:

    pnpm dev
  2. Open http://localhost:3000 and type a question like "hello". The agent responds with plain text (no tools, no file access). That's expected.

  3. Check the terminal for any compilation errors. If lib/agent.ts exports correctly, the app compiles without issues.

Generic responses are expected

Without tools or instructions, the agent is just a bare LLM. It can't explore files or answer questions about calls yet. You'll fix that in the next two lessons.

Commit

git add lib/agent.ts
git commit -m "feat(agent): add ToolLoopAgent skeleton"

Done-When

  • lib/agent.ts exports a ToolLoopAgent instance named agent
  • The dev server compiles without errors
  • The chat UI loads and the agent responds to messages (generic responses are fine)

Solution

lib/agent.ts
import { ToolLoopAgent } from 'ai';
 
const MODEL = 'anthropic/claude-opus-4.6';
 
export const agent = new ToolLoopAgent({
  model: MODEL,
  instructions: '',
  tools: {}
});