
Open WebUI Desktop

Your AI, right on your desktop. Open WebUI as a native app. Run models locally or connect to any server. No Docker, no terminal, no setup. Download, launch, chat.

Warning

Early Alpha. Things move fast and stuff might break. Report bugs or come hang out on Discord.

Download

Platform                 Installer
macOS (Apple Silicon)    Download .dmg
macOS (Intel)            Download .dmg
Windows x64              Download .exe
Linux (AppImage)         Download .AppImage
Linux (Debian/Ubuntu)    Download .deb
Linux (Snap)             Download .snap
Linux (Flatpak)          Download .flatpak

Internet required on first launch. After that, everything works offline. All releases →

How It Works

🖥️ Run locally. The app sets up Open WebUI and llama.cpp on your machine. Download models, chat offline, keep everything private. Nothing leaves your computer.

☁️ Connect remotely. Point the app at any Open WebUI server. Switch between multiple connections from the sidebar.

Use both at the same time.

Highlights

  • Spotlight. Hit Shift+Cmd+I (macOS) or Shift+Ctrl+I (Windows/Linux) to summon a floating chat bar over whatever you're doing. Drag to screenshot anything on screen.
  • 🎙️ Voice input. System-wide push-to-talk. Press the shortcut from any app to record, and your speech is transcribed and sent to your chat automatically.
  • 🧠 Local inference. Download and run models entirely on your hardware. Your data never leaves your machine.
  • 🎯 One-click setup. Everything installs itself. Just click "Get Started."
  • 🔌 Multiple connections. Juggle servers and switch between them instantly.
  • 🔄 Auto-updates. New releases land in the background.
  • 📡 Offline-ready. No internet needed after initial setup.
  • 💻 Cross-platform. macOS, Windows, and Linux.

System Requirements

        Local Models                            Remote Only
Disk    5 GB+                                   ~500 MB
RAM     16 GB+                                  4 GB
OS      macOS 12+, Windows 10+, modern Linux    Same

Note

Local models need serious RAM (7B ≈ 8 GB, 13B ≈ 16 GB). Lighter machine? Connect to a remote server instead.
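The figures above follow a rough rule of thumb: at 8-bit quantization a model needs about one byte per parameter, plus some overhead for the KV cache and runtime buffers. A minimal sketch (the function name and the ~20% overhead factor are illustrative assumptions, not part of the app; actual usage varies with quantization level and context length):

```python
def model_memory_gb(params_billion: float,
                    bytes_per_param: float = 1.0,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate for running a local model.

    bytes_per_param ~1.0 assumes 8-bit quantization; overhead
    covers KV cache and runtime buffers (illustrative factor).
    """
    return params_billion * bytes_per_param * overhead

# Roughly matches the note above: 7B -> ~8 GB, 13B -> ~16 GB
print(round(model_memory_gb(7), 1))   # 8.4
print(round(model_memory_gb(13), 1))  # 15.6
```

Heavier quantization (e.g. 4-bit) roughly halves the estimate, which is why smaller quantized models can fit on a 16 GB machine.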

Privacy

No telemetry. No tracking. No phone-home. Your conversations stay on your machine. Period.

Community

  • 💬 Discord - Come hang out
  • 🐛 Issues - Report bugs or request features
  • 🌐 Open WebUI - The main project
  • 📖 Docs - Full documentation

Contributing

Want to hack on it? Clone the repo, install dependencies, and start the dev server:

git clone https://github.com/open-webui/desktop.git
cd desktop
npm install
npm run dev

See CHANGELOG.md for release history. Licensed under AGPL-3.0.
