A blazing fast Terminal User Interface (TUI) client for Ollama with real-time streaming, markdown support, and smart scrolling.
Built with performance and user experience in mind
Responses are generated live, providing immediate feedback without waiting for complete generation.
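The idea behind live streaming can be sketched as follows: rather than waiting for the model to finish, each partial chunk is appended to the response buffer and the UI is redrawn immediately. This is a minimal illustration, not LazyLlama's actual implementation; the chunk iterator stands in for the HTTP stream from Ollama, and the render callback stands in for a Ratatui draw pass.

```rust
/// Consume a stream of partial-response chunks, redrawing after each one
/// so text appears as it is generated.
fn consume_stream<I, F>(chunks: I, mut render: F) -> String
where
    I: IntoIterator<Item = String>,
    F: FnMut(&str),
{
    let mut full = String::new();
    for chunk in chunks {
        full.push_str(&chunk); // accumulate the partial response
        render(&full);         // immediate feedback: redraw with what we have
    }
    full
}
```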
Automatic formatting for headers, lists, bold text, and syntax-highlighted code blocks with custom borders.
Automatic following of AI output, with a manual scroll lock for reading previous messages undisturbed.
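The follow/lock behavior can be modeled as a small piece of viewport state: scrolling up engages the lock so new output no longer yanks the view to the bottom, and scrolling back down to the bottom releases it. This is an illustrative sketch, not LazyLlama's actual types.

```rust
/// Hypothetical viewport state for smart scrolling.
struct Viewport {
    follow: bool,  // true: view sticks to the newest output
    offset: usize, // lines scrolled back from the bottom
}

impl Viewport {
    fn new() -> Self {
        Viewport { follow: true, offset: 0 }
    }

    /// Any manual scroll-up engages the scroll lock.
    fn scroll_up(&mut self, lines: usize) {
        self.follow = false;
        self.offset += lines;
    }

    /// Scrolling back down to the bottom re-enables auto-follow.
    fn scroll_down(&mut self, lines: usize) {
        self.offset = self.offset.saturating_sub(lines);
        if self.offset == 0 {
            self.follow = true;
        }
    }
}
```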
Easily switch between installed Ollama models using arrow keys with separate input/output buffers per model.
Each LLM maintains its own chat history, input text, and scroll position for seamless multi-model workflows.
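Per-model isolation like this is naturally a map from model name to session state. The sketch below shows one plausible shape for it, assuming hypothetical names (`ModelSession`, `App`) that are not LazyLlama's actual internals.

```rust
use std::collections::HashMap;

/// Hypothetical per-model state: each model keeps its own buffers.
#[derive(Default)]
struct ModelSession {
    history: Vec<String>, // chat messages exchanged with this model
    input: String,        // draft text in this model's input buffer
    scroll: usize,        // saved scroll position
}

#[derive(Default)]
struct App {
    sessions: HashMap<String, ModelSession>,
    active: String,
}

impl App {
    /// Switch the active model, creating empty buffers on first visit;
    /// the previous model's state stays untouched in the map.
    fn switch_to(&mut self, model: &str) -> &mut ModelSession {
        self.active = model.to_string();
        self.sessions.entry(model.to_string()).or_default()
    }
}
```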
Every chat session is automatically saved as text files with both combined and per-model histories.
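An autosave along these lines can be done with nothing but the standard library: write one text file per model, plus a combined transcript. The function and filenames below are illustrative assumptions, not the format LazyLlama actually writes.

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Hypothetical session autosave: one file per model plus a combined
/// transcript. `per_model` pairs a model name with its chat history.
fn save_session(dir: &Path, per_model: &[(&str, &str)]) -> io::Result<()> {
    fs::create_dir_all(dir)?;
    let mut combined = String::new();
    for (model, history) in per_model {
        combined.push_str(&format!("=== {} ===\n{}\n", model, history));
        fs::write(dir.join(format!("{}.txt", model)), history)?;
    }
    fs::write(dir.join("combined.txt"), combined)
}
```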
Familiar keyboard shortcuts for efficient navigation and control, designed for power users.
Ultra-low latency and minimal resource footprint using Rust and the Ratatui terminal framework.
Ultra-low-latency operations, well within the 60 FPS frame budget (<16 ms)
LazyLlama is built with performance as a top priority. All operations complete in microseconds or nanoseconds, ensuring a smooth and responsive user experience even with large conversation histories.
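As a rough illustration of what "within the frame budget" means in practice, the helper below times an operation against the 16 ms budget that 60 FPS allows. It is a sketch in the spirit of the benchmarks, not the project's actual benchmark harness.

```rust
use std::time::{Duration, Instant};

/// Time an operation and report whether it fits inside the
/// 16 ms frame budget required for 60 FPS rendering.
fn within_frame_budget<F: FnOnce()>(op: F) -> (Duration, bool) {
    let start = Instant::now();
    op();
    let elapsed = start.elapsed();
    (elapsed, elapsed < Duration::from_millis(16))
}
```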
Note: All benchmarks are measured on optimized release builds. Unicode operations are actually faster than ASCII due to efficient UTF-8 handling in Rust. Run cargo test --release --benches to execute the full benchmark suite yourself.
Last measured: 2 March 2026 · Release build · cargo test --release --benches
Get up and running in minutes
# From crates.io (recommended)
cargo install lazyllama
# Or build from source
git clone https://github.com/Pommersche92/lazyllama.git
cd lazyllama
cargo install --path .
lazyllama
That's it! Start chatting with your AI models right in your terminal.
Choose your preferred installation method
Universal Linux package that runs on any distribution
Install directly from AUR using your favorite AUR helper
Pre-built binary for Windows 10/11
Build from source using Rust's package manager
Clone the repository and build it yourself
Designed for efficiency and productivity
Everything you need to get the most out of LazyLlama