Introducing MultiChat

The Problem

If you work with large language models, you know the drill: you have an OpenAI tab, an Anthropic tab, maybe a Google AI Studio tab, and you're constantly copy-pasting the same prompt between them to compare answers. Each provider has its own UI, its own conversation history, and its own quirks. It's tedious, and it makes systematic comparison nearly impossible.

What MultiChat Does

MultiChat gives you a single, unified chat interface for multiple LLM providers. Pick your models, type your prompt once, and see responses side-by-side. Every conversation is saved locally so you never lose context.
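The one-prompt-to-many-models flow can be sketched with `asyncio.gather`. The provider functions below are stubs standing in for the real SDK calls (names and return shapes are illustrative assumptions, not MultiChat's actual code):

```python
import asyncio

# Stub provider calls -- in the real app these would be the
# openai / anthropic / google SDK calls (hypothetical stand-ins here).
async def ask_openai(prompt: str) -> str:
    await asyncio.sleep(0)  # placeholder for network latency
    return f"openai answer to {prompt!r}"

async def ask_anthropic(prompt: str) -> str:
    await asyncio.sleep(0)
    return f"anthropic answer to {prompt!r}"

async def fan_out(prompt: str) -> dict:
    """Send one prompt to every selected provider concurrently."""
    providers = {"openai": ask_openai, "anthropic": ask_anthropic}
    results = await asyncio.gather(*(fn(prompt) for fn in providers.values()))
    return dict(zip(providers, results))

responses = asyncio.run(fan_out("What is 2 + 2?"))
for name, text in responses.items():
    print(f"[{name}] {text}")
```

The key point is that all providers are queried concurrently, so the slowest model, not the sum of all of them, bounds your wait time.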

Key Features

- One prompt, many models: type your question once and fan it out to every selected model.
- Side-by-side responses: compare answers from different providers in a single view.
- Multiple providers: OpenAI, Anthropic, and Google models behind one unified interface.
- Local history: every conversation is saved locally, so you never lose context.

Tech Stack

MultiChat is built with NiceGUI, a Python framework that lets you build web UIs entirely in Python — no JavaScript required. The backend uses the official SDKs from each LLM provider (openai, anthropic, google-generativeai), and conversations are persisted in SQLite for zero-config local storage.
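The persistence layer can be sketched as a small SQLite helper using only the standard library. The schema and function names here are illustrative assumptions, not the project's actual code:

```python
import sqlite3

def init_db(path: str = "multichat.db") -> sqlite3.Connection:
    # Zero-config local storage: SQLite creates the file on first use.
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               conversation TEXT NOT NULL,
               provider TEXT NOT NULL,
               role TEXT NOT NULL,          -- 'user' or 'assistant'
               content TEXT NOT NULL
           )"""
    )
    return conn

def save_message(conn, conversation, provider, role, content):
    conn.execute(
        "INSERT INTO messages (conversation, provider, role, content)"
        " VALUES (?, ?, ?, ?)",
        (conversation, provider, role, content),
    )
    conn.commit()

def history(conn, conversation):
    # Replay a conversation in insertion order.
    rows = conn.execute(
        "SELECT provider, role, content FROM messages"
        " WHERE conversation = ? ORDER BY id",
        (conversation,),
    )
    return rows.fetchall()

conn = init_db(":memory:")  # in-memory DB for the demo
save_message(conn, "demo", "openai", "user", "Hello")
save_message(conn, "demo", "openai", "assistant", "Hi there!")
print(history(conn, "demo"))
```

Because SQLite ships with Python, this is what makes the "zero-config" claim possible: there is no database server to install or configure.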

The entire app is a single Python project. No build step, no webpack, no node_modules. Just pip install and run.

Try It

MultiChat is live at multichat.daddaops.com. The source code is available on GitHub.