Cerewro in local mode: private AI without the cloud

Cerewro can be configured to use local AI models (via Ollama, LM Studio, or Jan) instead of sending data to the cloud. Your conversations, files, and commands never leave your machine.

Configuring a local model
# In Cerewro's configuration file
provider: ollama
model: llama3.2:3b
base_url: http://localhost:11434/v1
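
As a minimal sketch of how a flat configuration file like the one above could be read, here is a hypothetical parser for simple `key: value` lines. The `parse_config` helper and its behavior are illustrative assumptions, not Cerewro's actual loader.

```python
# Hypothetical sketch: Cerewro's real config loader is not shown in the
# document. This parses flat "key: value" lines into a dict, skipping
# blank lines and "#" comments.

def parse_config(text: str) -> dict:
    """Parse flat 'key: value' lines into a dict."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Split on the FIRST colon only, so values such as
        # "http://localhost:11434/v1" keep their own colons.
        key, _, value = line.partition(":")
        config[key.strip()] = value.strip()
    return config

example = """
provider: ollama
model: llama3.2:3b
base_url: http://localhost:11434/v1
"""

cfg = parse_config(example)
print(cfg["provider"])  # → ollama
```

Splitting on the first colon only is what keeps the URL value intact; a naive `line.split(":")` would break `base_url`.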

Compatible local providers

Provider     Default URL              Recommended models
Ollama       http://localhost:11434   llama3.2, mistral, phi4
LM Studio    http://localhost:1234    Any GGUF model
Jan          http://localhost:1337    llama3, gemma2
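
The table above can be captured in code as a small lookup. The `DEFAULT_URLS` mapping and the `openai_base_url` helper are hypothetical illustrations (the provider names and ports come from the table; the assumption that each provider serves an OpenAI-compatible API under `/v1` matches the `base_url` shown earlier).

```python
# Hypothetical helper: maps each provider from the table to its default
# local endpoint. The mapping mirrors the table; the helper itself is an
# illustration, not part of Cerewro.

DEFAULT_URLS = {
    "ollama": "http://localhost:11434",
    "lm_studio": "http://localhost:1234",
    "jan": "http://localhost:1337",
}

def openai_base_url(provider: str) -> str:
    """Return the OpenAI-compatible /v1 base URL for a known provider."""
    try:
        return DEFAULT_URLS[provider] + "/v1"
    except KeyError:
        raise ValueError(f"unknown provider: {provider!r}") from None

print(openai_base_url("ollama"))  # → http://localhost:11434/v1
```

Raising on an unknown provider (rather than returning `None`) surfaces configuration typos immediately instead of at the first failed request.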

Advantages of local mode

  1. Total privacy: no data leaves your machine
  2. No usage costs: no per-token API fees or rate limits
  3. Works offline: no internet connection required
  4. Full control: use any open-source model