This template uses the Vercel AI Gateway to access multiple AI models through a unified interface. The default configuration includes xAI models (`grok-2-vision-1212`, `grok-3-mini-beta`) routed through the gateway.
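For reference, a call to one of these gateway-routed models with the AI SDK might look like the sketch below. The plain string model ID and its `xai/...` prefix are assumptions about the gateway's naming and about an AI SDK version that resolves string IDs through the gateway by default; check your installed versions before relying on it.

```ts
import { generateText } from 'ai';

// Sketch only: assumes the installed AI SDK resolves plain string model IDs
// (e.g. 'xai/grok-3-mini-beta') through the Vercel AI Gateway by default.
const { text } = await generateText({
  model: 'xai/grok-3-mini-beta', // assumed gateway model identifier
  prompt: 'Write a one-line greeting.',
});

console.log(text);
```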
**For Vercel deployments:** Authentication is handled automatically via OIDC tokens.

**For non-Vercel deployments:** You need to provide an AI Gateway API key by setting the `AI_GATEWAY_API_KEY` environment variable in your `.env.local` file.
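For example, a minimal `.env.local` entry for a non-Vercel deployment could look like the following; the value shown is a placeholder, not a real key:

```bash
# .env.local (only needed when not deploying to Vercel)
AI_GATEWAY_API_KEY=your-ai-gateway-api-key
```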
With the AI SDK, you can also switch to direct LLM providers like OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
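As a rough sketch, switching usually means installing the provider's package and changing the model reference. The package, model name, and environment variable below are illustrative examples rather than part of this template:

```ts
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Example: call OpenAI directly instead of routing through the gateway.
// Assumes `@ai-sdk/openai` is installed and OPENAI_API_KEY is set.
const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Write a one-line greeting.',
});

console.log(text);
```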
You can deploy your own version of the Next.js AI Chatbot to Vercel with one click.
You will need to use the environment variables defined in `.env.example` to run Next.js AI Chatbot. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.
> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.
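If you are not pulling variables from Vercel, one simple approach is to copy the example file and fill in your own values (a sketch; the actual variable names are listed in `.env.example`):

```bash
# Copy the example file, then edit .env to add your own secrets.
cp .env.example .env
```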
1. Install the Vercel CLI: `npm i -g vercel`
2. Link your local instance with your Vercel and GitHub accounts (this creates a `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
```bash
pnpm install
pnpm dev
```
Your app template should now be running on localhost:3000.