Using System Prompts to Shape AI Personality

Now that we have a professional chat interface with Elements, let's give our AI some personality!

System prompts are like permanent instructions that shape how your AI behaves throughout an entire conversation. While user messages change with each interaction, the system prompt remains constant, ensuring consistent personality and behavior.

Building on Elements

We'll be modifying the API route while keeping our beautiful Elements UI from the previous lesson. The visual impact of different personalities will be even more striking with professional message bubbles and markdown rendering!

What are System Prompts?

System prompts act like persistent instructions or "character notes" for an LLM. Unlike user prompts (which change each turn), a system prompt guides the LLM's overall behavior whenever it generates a response. Your system prompt will:

  • Define a persona: Sets the tone (e.g., formal, casual, witty, brand voice).
  • Set constraints: Tells the AI its boundaries (e.g., "Do not offer financial advice", "Only discuss product features").
  • Provide core context: Gives background relevant to all interactions (e.g., "You are a helpful assistant for Vercel products").

Your system prompt shapes how the LLM responds to every message in a conversation, regardless of what the user asks along the way. It is essential for branding, safety, and consistent bot behavior.
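
For example, a single prompt string can cover all three roles at once. The sketch below is only illustrative; the assistant name and wording are hypothetical:

TypeScript
const systemPrompt = [
  // Persona: sets the tone and voice
  'You are Vee, a friendly assistant for Vercel products.',
  // Constraints: boundaries on what the bot may discuss
  'Only discuss deployments, hosting, and framework questions; never give financial advice.',
  // Core context: background that applies to every interaction
  'Assume the user is a developer deploying a Next.js app.',
].join(' ');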

Implementation: The system Property

Let's modify our existing API route to add personality. Open app/api/chat/route.ts and add a system property to your streamText call:

TypeScript: app/api/chat/route.ts
import { streamText, convertToModelMessages } from 'ai';

export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: 'openai/gpt-4.1',
    // TODO: Add a system prompt here to define the AI's personality
    // system: 'Your personality instructions here',
    messages: convertToModelMessages(messages),
  });

  return result.toUIMessageStreamResponse();
}

Now add the system prompt:

TypeScript: app/api/chat/route.ts
import { streamText, convertToModelMessages } from "ai";

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
	try {
		const { messages } = await req.json();

		const result = streamText({
			model: "openai/gpt-4.1",
			system: "You are a helpful assistant.", // Initial basic prompt
			messages: convertToModelMessages(messages),
		});

		return result.toUIMessageStreamResponse();
// existing code ...

Test it out! Start your dev server if it isn't running:

pnpm dev

Navigate to http://localhost:3000/chat and ask "What is Next.js?"

With our Elements UI, notice how the AI's response appears in a professional message bubble with proper formatting. But the personality is generic. Let's change that!

Example 1: The Unhelpful Riddle Bot

It's possible to modify the system property to drastically change behavior for every single response.

Update route.ts and change the system prompt value:

TypeScript: app/api/chat/route.ts
  system: 'You are an unhelpful assistant that only responds to users with confusing riddles.',

Save and test: Refresh your chat page and ask "What is Next.js?" again. Watch how the same professional UI now delivers a completely different personality - the riddle appears in the same polished message bubble, making the contrast even more striking!

Screenshot of chat UI. User asks 'What is Next.js?'. AI responds with a confusing riddle instead of a direct answer.

Example 2: The 1984 Steve Jobs Bot

Models can adopt detailed personas, and the more detail you provide, the more closely they stick to the character.

Update route.ts and change the system string to a detailed persona:

TypeScript: app/api/chat/route.ts
  system: `You are Steve Jobs. Assume his character, both strengths and flaws.
  Respond exactly how he would, in exactly his tone.
  It is 1984 and you have just created the Macintosh.`,

Save and test: Refresh the page and try asking about modern technology like "What is Next.js?" The response will be fascinating - watch Steve Jobs from 1984 try to comprehend modern web frameworks!

Screenshot of chat UI. User asks 'What is Next.js?'. AI responds in a tone mimicking Steve Jobs in 1984.

Model Selection & System Prompts

More capable (and more expensive) models (like openai/gpt-5 or openai/o3) generally follow system prompts more precisely and maintain character more consistently. For production chatbots where persona is critical, test several models to find the right balance of adherence and cost.
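
One lightweight way to run that comparison, sketched here as an assumption rather than a required pattern: read the model id from an environment variable (CHAT_MODEL is just an illustrative name) so you can swap models without editing the route.

TypeScript
// Inside the POST handler from earlier; only the model line changes.
// CHAT_MODEL is a hypothetical environment variable for this sketch.
const model = process.env.CHAT_MODEL ?? 'openai/gpt-4.1';

const result = streamText({
  model,
  system: 'You are Steve Jobs. ...', // same persona prompt as above
  messages: convertToModelMessages(messages),
});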

Example 3: Practical Support Assistant

Let's define a persona and constraints for a more realistic application.

Update route.ts and change the system string to a business-focused prompt:

TypeScript: app/api/chat/route.ts
  system: `You are a support assistant for TechCorp's cloud platform.
  Focus on helping users troubleshoot deployment issues, API usage, and account settings.
  Be concise but thorough. Link to documentation at docs.techcorp.com when relevant.
  If a question is outside your knowledge area, politely redirect to contact@techcorp.com.`,

Save and test: Try various questions:

  • "How do I reset my password?"
  • "Tell me about pricing"
  • "What's your favorite color?"

Notice how the AI stays in character, provides helpful support responses, and politely deflects off-topic questions. The Elements UI makes these professional responses look even more credible!

System Prompt Length

While detailed system prompts improve behavior, very long prompts consume context-window space, which can hurt performance and increase cost. Keep production prompts concise yet clear.
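
As a rough sanity check, you could keep the prompt in a named constant and estimate its size; the ~4 characters per token figure below is only a common rule of thumb, not an exact count.

TypeScript
// A trimmed version of the support prompt, kept in one place for easy review.
const SUPPORT_SYSTEM_PROMPT = `You are a support assistant for TechCorp's cloud platform.
Focus on deployment issues, API usage, and account settings.
Be concise but thorough. For anything else, point users to contact@techcorp.com.`;

// Rough estimate: ~4 characters per token for English text.
const approxTokens = Math.ceil(SUPPORT_SYSTEM_PROMPT.length / 4);
console.log(`System prompt is roughly ${approxTokens} tokens.`);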

Key Takeaways

System prompts transform your chatbot from a generic assistant into a unique personality:

  • The system property in streamText sets persistent behavioral rules
  • Personas stick - The AI maintains character across the entire conversation
  • Details matter - More specific prompts lead to better adherence
  • Elements amplifies impact - Professional UI makes personality changes more striking

Reflection Prompt: Defining Your Bot's Persona

Imagine building a chatbot for a specific purpose (e.g., company support, technical documentation assistant, personal project). What system prompt would define its core personality, tone, and key constraints? Draft a 2-3 sentence system prompt.
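
If you want a starting point, here is one possible draft for a hypothetical internal documentation assistant (the product name and channel are made up):

TypeScript
// Hypothetical example draft; adapt the persona, tone, and constraints to your own project.
const system = `You are DocsBot, a friendly assistant for Acme's internal documentation site.
Answer only questions covered by the docs, and link to the relevant page when you can.
If a question is out of scope, say so briefly and suggest asking in the #help channel.`;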

What's Next?

Your chatbot now has personality, but it's still limited to conversation. In the next lesson, we'll give it superpowers by adding tool calling - letting it fetch real data, perform calculations, and interact with external APIs.

Imagine your Steve Jobs bot being able to actually look up modern technology, or your support assistant actually checking account statuses!