Zola

Zola is a free, open-source AI chat app with multi-model support.


zola.chat

Zola is the open-source chat interface for all your models.

Features

  • Multi-model support: OpenAI, Mistral, Claude, Gemini, Ollama (local models)
  • Bring your own API key (BYOK) support via OpenRouter
  • File uploads
  • Clean, responsive UI with light/dark themes
  • Built with Tailwind CSS, shadcn/ui, and prompt-kit
  • Open-source and self-hostable
  • Customizable: user system prompt, multiple layout options
  • Local AI with Ollama: Run models locally with automatic model detection
  • Full MCP support (work in progress)
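As an illustration of BYOK via OpenRouter, the sketch below builds the kind of chat-completion request OpenRouter's API expects. The model name and placeholder key are assumptions for illustration only; Zola manages the key and request for you through its UI.

```shell
# Illustrative BYOK request shape for OpenRouter (Zola does this internally).
# OPENROUTER_API_KEY and the model name are placeholders, not values from this README.
OPENROUTER_API_KEY="${OPENROUTER_API_KEY:-sk-or-your-key}"
PAYLOAD='{"model": "mistralai/mistral-7b-instruct", "messages": [{"role": "user", "content": "Hello"}]}'
echo "$PAYLOAD"
# With a real key, the request would look like:
#   curl https://openrouter.ai/api/v1/chat/completions \
#     -H "Authorization: Bearer $OPENROUTER_API_KEY" \
#     -H "Content-Type: application/json" \
#     -d "$PAYLOAD"
```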

Quick Start

Option 1: With OpenAI (Cloud)

git clone https://github.com/ibelick/zola.git
cd zola
npm install
echo "OPENAI_API_KEY=your-key" > .env.local
npm run dev
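The `.env.local` written above is a plain key=value file. A quick sanity check before starting the dev server (a shell sketch, not part of Zola's tooling; "your-key" is the placeholder from the step above):

```shell
# Confirm the key is present in .env.local before running `npm run dev`.
echo "OPENAI_API_KEY=your-key" > .env.local
if grep -q '^OPENAI_API_KEY=' .env.local; then
  echo "API key configured"
else
  echo "Missing OPENAI_API_KEY in .env.local" >&2
  exit 1
fi
```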

Option 2: With Ollama (Local)

# Install and start Ollama
curl -fsSL https://ollama.ai/install.sh | sh
ollama pull llama3.2  # or any model you prefer

# Clone and run Zola
git clone https://github.com/ibelick/zola.git
cd zola
npm install
npm run dev

Zola will automatically detect your local Ollama models!
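Ollama serves a local HTTP API on port 11434 by default, which is how model detection works. A hedged reachability check (a helper sketch, not part of Zola) you can run before starting the app:

```shell
# Check whether a local Ollama server is reachable on its default port.
# /api/tags is Ollama's endpoint for listing locally pulled models.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  STATUS="running"
  echo "Ollama is running; Zola will list its models automatically."
else
  STATUS="not-running"
  echo "Ollama is not reachable on port 11434; start it with 'ollama serve'."
fi
```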

Option 3: Docker with Ollama

git clone https://github.com/ibelick/zola.git
cd zola
docker-compose -f docker-compose.ollama.yml up
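For orientation, the sketch below shows what a compose file like this typically wires together: an Ollama service plus the app pointed at it. All service names, ports, and the `OLLAMA_BASE_URL` variable here are assumptions; see the actual `docker-compose.ollama.yml` in the repository for the real definitions.

```yaml
# Hypothetical sketch only; the repository's docker-compose.ollama.yml is authoritative.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
  zola:
    build: .
    ports:
      - "3000:3000"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434  # assumed variable name
    depends_on:
      - ollama
```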

To unlock features like authentication and file uploads, see INSTALL.md.

Built with

Tailwind CSS, shadcn/ui, and prompt-kit.
License

Apache License 2.0

Notes

This is a beta release. The codebase is evolving and may change.

