Gemini AI Chatbot

Gemini-powered chatbot with the Vercel AI SDK, Next.js, and React.

Features

  • Next.js App Router
  • React Server Components (RSCs), Suspense, and Server Actions
  • Vercel AI SDK for streaming chat UI
  • Support for Google Gemini (default), OpenAI, Anthropic, Cohere, Hugging Face, LangChain, or custom AI chat models
  • shadcn/ui
  • Chat history, rate limiting, and session storage with Vercel KV (see the sketch after this list)
  • NextAuth.js for authentication

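The rate limiting mentioned above is backed by Vercel KV. The following is a minimal sketch only, assuming the @vercel/kv client and a simple fixed-window counter; the file path, function name, and limits are illustrative and not taken from the template:

    // lib/rate-limit.ts (hypothetical path) — fixed-window rate limiter backed by Vercel KV.
    import { kv } from '@vercel/kv';

    const WINDOW_SECONDS = 60; // length of each rate-limit window
    const MAX_REQUESTS = 20;   // allowed requests per user per window

    export async function isRateLimited(userId: string): Promise<boolean> {
      // One counter key per user per window; the window index rolls over automatically.
      const windowIndex = Math.floor(Date.now() / 1000 / WINDOW_SECONDS);
      const key = `rate-limit:${userId}:${windowIndex}`;

      // INCR creates the key at 1 on first use and counts later requests in the window.
      const count = await kv.incr(key);
      if (count === 1) {
        // Expire the counter so stale windows are cleaned up.
        await kv.expire(key, WINDOW_SECONDS);
      }
      return count > MAX_REQUESTS;
    }
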
Model Providers

This template ships with Google Gemini (models/gemini-1.0-pro-001) as the default. However, thanks to the Vercel AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, or Hugging Face, or use LangChain, with just a few lines of code.
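
For illustration only, here is a rough sketch of what that switch can look like with the AI SDK's streamText helper and the @ai-sdk/google and @ai-sdk/openai provider packages. The exact imports, file paths, and model IDs depend on the SDK version the template pins, so treat the names below as assumptions rather than the template's actual code:

    // lib/chat.ts (hypothetical path) — provider selection with the Vercel AI SDK.
    import { streamText, type CoreMessage } from 'ai';
    import { google } from '@ai-sdk/google';
    // import { openai } from '@ai-sdk/openai';

    export function streamChat(messages: CoreMessage[]) {
      return streamText({
        // Default: Gemini. The Google provider reads GOOGLE_GENERATIVE_AI_API_KEY
        // from the environment.
        model: google('models/gemini-1.0-pro-001'),
        // Switching providers is typically a one-line change, e.g.:
        // model: openai('gpt-4-turbo'),
        messages,
      });
    }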

Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click using the Deploy button in the repository.

Running locally

You will need the environment variables defined in .env.example to run the Next.js AI Chatbot. It's recommended that you use Vercel Environment Variables for this, but a local .env file is all that is necessary.

Note: You should not commit your .env file or it will expose secrets that will allow others to control access to your various Google Cloud and authentication provider accounts.

  1. Install Vercel CLI: npm i -g vercel
  2. Link local instance with Vercel and GitHub accounts (creates .vercel directory): vercel link
  3. Download your environment variables: vercel env pull
  4. Install dependencies and start the local development server:

     pnpm install
     pnpm dev

Your app template should now be running on http://localhost:3000.

Authors

This library was created by Vercel and Next.js team members, with contributions from:
