Clippy is an open-source desktop assistant that allows users to run modern large language models locally while presenting them through a nostalgic interface inspired by Microsoft’s classic Clippy assistant from the 1990s. The project serves as both a playful homage to the early days of personal computing and a practical demonstration of local AI inference. Clippy integrates with the llama.cpp runtime to run models directly on a user’s computer without requiring cloud-based AI services. It supports models in the GGUF format, which allows it to run many publicly available open-source LLMs efficiently on consumer hardware. Users interact with the system through a simple animated assistant interface that can answer questions, generate text, and perform conversational tasks. The application includes one-click installation support for several popular models such as Meta’s Llama, Google’s Gemma, and other open models.

Features

  • Local execution of large language models using llama.cpp
  • Support for GGUF model formats used by many open LLMs
  • Retro desktop assistant interface inspired by Microsoft Clippy
  • One-click installation for several popular open-source models
  • Offline AI interaction without reliance on cloud services
  • Lightweight desktop application focused on simplicity and privacy

License

MIT License

Additional Project Details

Programming Language

TypeScript

Related Categories

TypeScript, Large Language Models (LLM)

Registered

2026-03-09