AI Setup

Optional AI setup for better titles.

Cliprr works without any model setup. This page is only for users who want to connect a local model through Ollama and improve title suggestions inside the app.

Current recommended local setup: Ollama + qwen2:7b.

Before you start

Do this only if you want enhanced titles.

Cliprr can already run its normal title workflow without Ollama. The AI setup is an upgrade path for users who want stronger local title suggestions.

This page is intentionally focused on the current local setup. Later versions of Cliprr may expand it into a broader model setup page for OpenAI and other providers without changing the basic flow.

Recommended path

Standard first. AI second.

1. Install and test Cliprr first.
2. Install Ollama only if you want better titles.
3. Run the recommended model once.
4. Return to Cliprr and enable the local model path.

Setup steps

Get the local model running in a few minutes.

The goal is not to turn users into local AI experts. It is just to make Cliprr title generation better.

01

Install Ollama.

Download and install Ollama for Windows. Once installed, it provides the local runtime that Cliprr can talk to.

02

Open Command Prompt.

After Ollama is installed, open Command Prompt or Terminal on your PC so you can pull the recommended model.

03

Run the model once.

Start qwen2:7b through Ollama. Starting it downloads the model on first run and makes the local setup usable for Cliprr.

Command

Use this in Command Prompt.

Run the command below after Ollama finishes installing.

ollama run qwen2:7b

The first run may take a while because Ollama needs to download the model locally.

After that, go back to Cliprr and use the local AI title flow according to your app settings.
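For readers curious what a local title request looks like under the hood, here is a minimal Python sketch that sends a prompt to Ollama's local HTTP API. The endpoint and JSON fields (`/api/generate`, `model`, `prompt`, `stream`) are Ollama's standard API; the prompt wording and the `suggest_title` helper are illustrative assumptions, not Cliprr's actual internals.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_title_payload(clip_description: str, model: str = "qwen2:7b") -> dict:
    """Build a non-streaming generate request asking for a short title."""
    return {
        "model": model,
        "prompt": f"Suggest a short, catchy title for this clip: {clip_description}",
        "stream": False,  # return one JSON response instead of a token stream
    }

def suggest_title(clip_description: str) -> str:
    """Send the prompt to the local Ollama server and return the generated text."""
    payload = json.dumps(build_title_payload(clip_description)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()

if __name__ == "__main__":
    # Requires Ollama running locally with qwen2:7b pulled; fails quietly otherwise.
    try:
        print(suggest_title("A 30-second gameplay clip of a last-second comeback"))
    except OSError:
        print("Ollama is not running on localhost:11434")
```

The `stream: False` flag matters here: without it, Ollama streams the answer token by token, which is useful for chat interfaces but unnecessary for a one-line title.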

What to expect

Normal setup behavior.

  • Ollama installs separately from Cliprr
  • The first model run downloads local files to your machine
  • Cliprr should still be usable even before you finish this step
  • You can keep this setup optional for non-technical users

Need to know
Do I need this to use Cliprr?

No. Cliprr works without Ollama. This setup is only for users who want stronger local title suggestions.

What model should I use right now?

The current recommended local model for Cliprr is qwen2:7b.

Future ready
Will this page change later?

Yes. This page is designed to grow into a broader AI setup page once Cliprr supports OpenAI keys and other model providers.

Will OpenAI go here later too?

Yes. When v1.1 adds provider choice in settings, this page can expand into local and hosted model setup instructions.

Support flow
What if Ollama is installed but titles still do not improve?

First, confirm Cliprr is configured to use the local model path you expect. Then verify that qwen2:7b was actually downloaded and run through Ollama.
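A quick way to verify both points is to ask the local Ollama server which models it has. The sketch below queries Ollama's standard `/api/tags` model-listing endpoint and returns None when the server is unreachable; the `ollama_models` helper name is ours, not part of Ollama or Cliprr.

```python
import json
import urllib.request

def ollama_models(base_url: str = "http://localhost:11434"):
    """Return the names of locally available Ollama models,
    or None if the server is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.loads(resp.read())
    except OSError:
        return None  # Ollama is not running (or not listening on this address)
    return [m["name"] for m in data.get("models", [])]

if __name__ == "__main__":
    models = ollama_models()
    if models is None:
        print("Ollama is not running - start it and try again.")
    elif any(name.startswith("qwen2:7b") for name in models):
        print("qwen2:7b is installed - check Cliprr's model settings next.")
    else:
        print("Ollama is running but qwen2:7b is missing - run: ollama run qwen2:7b")
```

Running `ollama list` in Command Prompt gives the same information without any code.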

Should this be on the main download path?

No. Keep this as an optional setup page so normal users can install Cliprr without getting blocked by AI steps.

Next step

Install Cliprr first, then add AI when you want it.

Keep the main product flow simple. Use this page as the optional upgrade path for local title generation.