Install Ollama.
Download and install Ollama for Windows. Once installed, it provides the local runtime that Cliprr can talk to.
Cliprr works without any model setup; its normal title workflow already runs on its own. This page is only for users who want to connect a local model through Ollama as an upgrade path to stronger title suggestions inside the app.
Current recommended local setup: Ollama + qwen2:7b.
This page is intentionally focused on the current local setup. Later versions of Cliprr can expand this into a broader model setup page for OpenAI and other providers without changing the basic flow.
The goal is not to turn users into local AI experts. It is just to make Cliprr title generation better.
After Ollama is installed, open Command Prompt or Terminal on your PC so you can pull the recommended model.
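Before pulling anything, you can confirm the install worked by asking Ollama for its version (this assumes the Windows installer added the ollama command to your PATH, which it normally does):
ollama --version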
Start qwen2:7b through Ollama by running the command below once Ollama has finished installing. This is what makes the local setup usable for Cliprr.
ollama run qwen2:7b
The first run may take a while because Ollama needs to download the model locally.
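Once the download finishes, you can double-check that the model is stored locally by listing the models Ollama knows about; qwen2:7b should appear in the output.
ollama list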
Once the model is running, go back to Cliprr and use the local AI title flow as configured in your app settings.
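Ollama exposes the local runtime through an HTTP API on port 11434 by default, which is the usual way an app like Cliprr would talk to it. If you want to see the model answer a request outside of Cliprr, you can send it a test prompt from Command Prompt (the prompt below is just an example, and PowerShell handles the quoting differently):
curl http://localhost:11434/api/generate -d "{\"model\": \"qwen2:7b\", \"prompt\": \"Suggest a short title for a cooking clip\", \"stream\": false}"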
Ollama is not required. Cliprr works without it; this setup is only for users who want stronger local title suggestions.
The current recommended local model for Cliprr is qwen2:7b.
This page is designed to grow into a broader AI setup page once Cliprr supports OpenAI keys and other model providers. When v1.1 adds provider choice in settings, it can expand to cover both local and hosted model setup.
If local title suggestions are not working, first confirm Cliprr is configured to use the local model path you expect, then verify that qwen2:7b was actually run through Ollama.
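If Cliprr still cannot reach the model, you can check that the Ollama service is running and that qwen2:7b is registered by querying Ollama's local API (11434 is the default port):
curl http://localhost:11434/api/tags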
AI setup should stay out of the core install flow. Keeping this as an optional setup page means normal users can install Cliprr without getting blocked by AI steps.