Your Smart Info Processing & Creative AI Partner

Harness AI to effortlessly manage web content—from deep understanding and smart conversations to instant translation—all under your control.

Install Now

Core Features

💡

Deep Insights

Intelligently analyzes the content of the current webpage: quickly generating summaries, organizing its logic, finding logical gaps, and rewriting articles. Highly configurable to fit how you process web information.

💬

Smart Chat

Cllama provides a smart chat experience that goes beyond traditional conversation. Chat freely, or use preset scenarios and roles, such as simulating a professional psychological counselor for deep conversation and self-exploration.

🌐

Text Selection Translation

Select any text on a webpage and instantly get a high-quality, AI-powered translation via the right-click menu. Cllama supports custom translation prompts so results better match your expectations.

Cllama Browser Extension Features

⚙️

Flexible Model Compatibility

Seamlessly connect to a local Ollama service to keep your data private and secure; Cllama is also compatible with the OpenAI API 1.0 protocol, so it can easily connect to a wide range of online AI services.

📝

Easy Function Expansion

The Insight module can be easily extended with new functions via prompts, such as webpage summaries, key-point organization, and content rewriting, to meet your personal needs.

🗣️

Multi-scenario Conversation

The Cllama extension lets you easily create and manage multiple conversation scenarios and switch between them flexibly, for more efficient and organized communication in different contexts.

🌍

Smart Translation

The Cllama extension supports multi-language translation and custom translation prompts, providing high-quality translation in the language pairs you need.

💾

Convenient Data Management

Supports data import and export for easy data migration, backup, and sharing.

Rich Resource Expansion

The Cllama extension offers a rich library of ready-made insight and scenario resources that you can import and use, covering a wider range of needs.

Feature Showcase

Cllama Plugin Installation

Firefox Add-ons

Click the button below to go to the Firefox Add-ons store and install the Cllama plugin.

Install Now (Firefox)
Chrome Web Store

Click the button below to go to the Chrome Web Store and install the Cllama plugin.

Install Now (Chrome)

Frequently Asked Questions

Why do I get a 403 error when connecting to Ollama?

The 403 error occurs because Ollama's default security settings do not allow cross-origin access. Here's how to fix it:

  1. Open the Ollama service configuration file (usually located at /etc/systemd/system/ollama.service)
  2. Add the following lines to the [Service] section:

    [Service]
    ...
    # Add the next line only if other machines need to access this Ollama service
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
    ...

  3. Save the configuration file
  4. Restart the Ollama service:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama

Note: Setting OLLAMA_ORIGINS=* allows requests from all origins. In a production environment, it is recommended to allow only specific origins.
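If you prefer not to edit the unit file directly, the same environment variables can be set through a systemd drop-in file. A minimal sketch (the drop-in path follows systemd's standard override convention; it is not specific to Ollama):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Allow other machines to reach the service (optional for local-only use)
Environment="OLLAMA_HOST=0.0.0.0"
# Permit cross-origin requests, e.g. from browser extensions
Environment="OLLAMA_ORIGINS=*"
```

Create the file with `sudo systemctl edit ollama`, then run `sudo systemctl daemon-reload` and `sudo systemctl restart ollama`. To confirm the service is answering, `curl http://localhost:11434/api/tags` should return the list of installed models.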

How do I connect to an Ollama service running on another machine on my local network?

Assuming the firewall and other network factors are not the cause, you need to add the following settings to the Ollama service configuration on the target host:

  1. Open the Ollama service configuration file on the target host
  2. Add the following lines to the [Service] section:

    [Service]
    ...
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
    ...

  3. Save the configuration file
  4. Restart the Ollama service on the target host:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama

After configuration, you can access it by entering the target host's IP address and port (usually 11434) in the Cllama settings.

Network Check: Please ensure that both devices are on the same local network and that the firewall allows communication on port 11434.

Can I solve the cross-origin issue without modifying the Ollama configuration?

Yes. Instead of modifying the Ollama configuration, you can use a reverse proxy such as Nginx to solve cross-origin issues:

  1. Add the following location block to the Nginx configuration file:

    location /api/chat {
        proxy_set_header Origin http://localhost:11434;
        proxy_pass http://localhost:11434;
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'Content-Type';
    }

  2. Save the configuration file
  3. Reload the Nginx configuration:

    sudo nginx -s reload

After configuration, set the Ollama address in Cllama to the Nginx proxy address.

Tip: This method is suitable for situations where you cannot directly modify the Ollama configuration, or when additional security control is required.
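For reference, the location block above can sit inside a minimal server block. A sketch, assuming Nginx listens on port 8080 (an arbitrary choice for this example) and Ollama runs locally on its default port; the match is widened from /api/chat to /api/ on the assumption that the extension may also call other Ollama endpoints:

```nginx
server {
    listen 8080;
    server_name localhost;

    location /api/ {
        # Present the request to Ollama as if it originated locally,
        # so Ollama's default origin check passes.
        proxy_set_header Origin http://localhost:11434;
        proxy_pass http://localhost:11434;
        # Allow the browser extension to call this endpoint cross-origin.
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'Content-Type';
    }
}
```

With this in place, the Ollama address in the Cllama settings would be http://localhost:8080.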

Feedback and Suggestions

Your feedback is the driving force for improvement. We look forward to your valuable suggestions!

Submit Feedback