Define the Agent Skeleton
A regular LLM call is one turn: you send a prompt, you get a response. An agent is a loop. The model generates a response, and if that response includes a tool call, the agent executes the tool, feeds the result back, and lets the model decide what to do next. This continues until the model produces a final text response or hits a step limit.
The AI SDK's `ToolLoopAgent` handles this loop for you. You give it a model, tools, and instructions. It manages the back-and-forth:
prompt → LLM → tool call → execute tool → result → LLM → tool call → ... → final text
For your filesystem agent, the loop looks like this: the user asks "did anyone mention pricing?", the model calls `bashTool` with `grep -r "pricing" calls/`, reads the output, maybe runs another command to get more context, and then synthesizes a final answer. Each tool call is one step in the loop. The default limit is 20 steps before the agent stops.
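The loop described above can be sketched in a few lines of TypeScript. Everything here is a hypothetical stand-in (`fakeModel`, `runTool`, and the message format are stubs invented for illustration, not the AI SDK's API); the point is the control flow that `ToolLoopAgent` handles for you:

```typescript
// A model reply is either a tool call or final text.
type ModelReply =
  | { type: 'tool-call'; tool: string; args: string }
  | { type: 'text'; text: string };

// Stubbed "model": asks to grep once, then answers from the tool result.
function fakeModel(history: string[]): ModelReply {
  if (!history.some((m) => m.startsWith('tool-result:'))) {
    return { type: 'tool-call', tool: 'bash', args: 'grep -r "pricing" calls/' };
  }
  return { type: 'text', text: 'Yes, pricing came up in two calls.' };
}

// Stubbed "tool executor": pretend to run the command.
function runTool(tool: string, args: string): string {
  return `calls/acme.txt: asked about pricing tiers (${tool}: ${args})`;
}

// The loop: call model, execute tool calls, feed results back,
// stop on final text or when the step limit is hit.
function runAgent(prompt: string, maxSteps = 20): string {
  const history: string[] = [`user:${prompt}`];
  for (let step = 0; step < maxSteps; step++) {
    const reply = fakeModel(history);
    if (reply.type === 'text') return reply.text; // final answer ends the loop
    history.push(`tool-result:${runTool(reply.tool, reply.args)}`);
  }
  return '(step limit reached)';
}

console.log(runAgent('did anyone mention pricing?'));
```

The real agent does the same dance, except the model is an LLM call and the tool results are real command output.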
Outcome
You have a minimal `ToolLoopAgent` exported from `lib/agent.ts` that compiles and responds (without tools) when called from the API route.
Fast Track
- Import `ToolLoopAgent` from `ai` in `lib/agent.ts`
- Create and export a `ToolLoopAgent` with the model set to `anthropic/claude-opus-4.6`
- Start the dev server and verify the app loads at `localhost:3000`
How the API route uses the agent
The starter repo's API route is already wired up to import your agent and stream its responses:
```ts
import { agent } from '@/lib/agent';

export async function POST(request: Request) {
  // ... extract prompt from messages ...
  const stream = await agent.stream({ prompt });
  // ... stream response back to UI ...
}
```

The route calls `agent.stream()` with the user's message. The agent runs its loop, streaming text and tool calls back as they happen. Your job is to export an agent from `lib/agent.ts` that this route can use.
Hands-on Exercise 1.2
Create the agent skeleton in `lib/agent.ts`.
Requirements:
- Import `ToolLoopAgent` from the `ai` package
- Set the model to `anthropic/claude-opus-4.6` (via AI Gateway)
- Pass empty `instructions` and `tools` for now
- Export the agent as a named export called `agent`
Implementation hints:
- The `ToolLoopAgent` constructor takes an object with `model`, `instructions`, and `tools`
- The model string `'anthropic/claude-opus-4.6'` routes through AI Gateway automatically. No `createGateway()` call needed.
- Use a `MODEL` constant so you can easily swap models later
- The `tools` property takes an object. Pass `{}` for now.
- The API route expects `export const agent`, not a default export
Try It
- Start the dev server: `pnpm dev`
- Open `http://localhost:3000` and type a question like "hello". The agent responds with plain text (no tools, no file access). That's expected.
- Check the terminal for any compilation errors. If `lib/agent.ts` exports correctly, the app compiles without issues.
Without tools or instructions, the agent is just a bare LLM. It can't explore files or answer questions about calls yet. You'll fix that in the next two lessons.
Commit
```bash
git add lib/agent.ts
git commit -m "feat(agent): add ToolLoopAgent skeleton"
```

Done-When

- `lib/agent.ts` exports a `ToolLoopAgent` instance named `agent`
- The dev server compiles without errors
- The chat UI loads and the agent responds to messages (generic responses are fine)
Solution
```ts
import { ToolLoopAgent } from 'ai';

const MODEL = 'anthropic/claude-opus-4.6';

export const agent = new ToolLoopAgent({
  model: MODEL,
  instructions: '',
  tools: {},
});
```