ChatGPT support for Matomo.
Integrate AI-powered analytics insights and chat functionality into your Matomo instance using ChatGPT or any OpenAI-compatible API.
Get instant AI-generated insights for any Matomo report. The plugin adds an "Insights" button to all report widgets that analyzes your data and provides actionable recommendations.
A full-featured chat interface for asking questions about your analytics data.
Choose from preset models or specify custom model names:
Preset Models:
- GPT 5.5 (default)
- GPT 5.4 / GPT 5.4 Mini / GPT 5.4 Nano
- GPT 5.1
- GPT 5 Mini / GPT 5 Nano / GPT 5 (Latest)
- GPT 4.1 / GPT 4.1 Mini / GPT 4.1 Nano
- GPT 4o / GPT 4o Mini / GPT 4o (Latest)
- GPT 4 / GPT 4 Turbo
The preset list only includes conversational models suited for chatting about report data. Reasoning models (o-series) and *-pro variants are intentionally excluded — they are tuned for one-shot deep analysis rather than back-and-forth discussion and would produce a poor chat experience.
Custom Models: Specify any model name to use models not in the preset list, perfect for new OpenAI models, self-hosted LLMs, or other providers. If a custom model is rejected by the configured endpoint, the upstream error message will be shown in the chat.
Connect to any OpenAI-compatible API endpoint: - OpenAI (default) - Azure OpenAI - Self-hosted solutions (Ollama, LocalAI, vLLM, etc.) - Other providers (Anthropic via proxy, Mistral, etc.)
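All of these hosts accept the same chat-completions request shape, which is why switching providers only means changing the Host setting. A minimal sketch of that shared request body (the model name "llama3" and the message content are illustrative assumptions, not values the plugin hard-codes):

```python
import json

def build_chat_request(model, messages, stream=False):
    """Build an OpenAI-compatible /chat/completions request body.

    OpenAI, Azure OpenAI, Ollama, LocalAI, and vLLM all accept this
    same JSON shape, so one Host setting is enough to switch providers.
    """
    return json.dumps({
        "model": model,
        "messages": messages,
        "stream": stream,
    })

# Example: the body a request to a local Ollama instance would carry.
body = build_chat_request(
    "llama3",  # hypothetical local model name
    [{"role": "user", "content": "Summarise yesterday's visits."}],
)
```

Pointing the Host setting at, for example, a local Ollama instance's /v1/chat/completions route works because Ollama accepts this same JSON body.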
The chat and insight components use Matomo's native CSS theme variables, so the UI automatically follows your Matomo theme — both light and dark — without any additional configuration.
Navigate to Administration > General Settings > ChatGPT to configure:
- Host: API endpoint URL. Default: https://api.openai.com/v1/chat/completions
- API Key: Your OpenAI API key (required for OpenAI, optional for custom hosts)
- Model (Preset): Select from the available model presets
- Model (Custom): Override the preset with a custom model name
- Chat Base Prompt: System prompt for chat conversations
- Insight Base Prompt: System prompt for report insights
All system settings can be overridden per website in the site's Measurable Settings. Leave fields empty to use system defaults.
This is useful for: - Using different models for different sites - Customizing prompts for specific website contexts - Using separate API keys per site
The plugin provides the following API methods:
- ChatGPT.getResponse: Get an AI response for a set of messages (non-streaming)
- ChatGPT.getStreamingResponse: Get an AI response with SSE streaming
- ChatGPT.getInsight: Get AI insights for report data
- ChatGPT.getModels: Get the list of available preset models
ChatGPT.getResponse parameters:
- idSite - Site ID
- period - Period (day, week, month, year)
- date - Date string
- messages - Conversation messages in ChatGPT format
ChatGPT.getInsight parameters:
- idSite - Site ID
- period - Period (day, week, month, year)
- date - Date string
- reportId - Report identifier
- messages - Conversation messages
All API methods require appropriate view permissions for the requested site.
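Like any Matomo API method, these can be called over HTTP via index.php with module=API. A minimal sketch of building such a request URL (the instance URL and token are placeholders, and JSON-encoding the messages parameter is an assumption based on the ChatGPT message format described above):

```python
import json
from urllib.parse import urlencode

def chatgpt_get_response_url(base_url, id_site, period, date, messages, token_auth):
    """Build the request URL for ChatGPT.getResponse via Matomo's HTTP API.

    The messages parameter is assumed to be passed JSON-encoded in the
    ChatGPT/OpenAI message format the plugin uses for conversations.
    """
    params = {
        "module": "API",
        "method": "ChatGPT.getResponse",
        "idSite": id_site,
        "period": period,
        "date": date,
        "messages": json.dumps(messages),
        "format": "JSON",
        "token_auth": token_auth,
    }
    return base_url + "/index.php?" + urlencode(params)

url = chatgpt_get_response_url(
    "https://matomo.example.com",  # hypothetical instance
    1, "day", "yesterday",
    [{"role": "user", "content": "Which pages grew the most?"}],
    "YOUR_TOKEN",
)
```

The token_auth must belong to a user with view permission for the requested site, per the permission rule above.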
The plugin interface is available in: - English - German (Deutsch) - Spanish (Español) - French (Français) - Italian (Italiano) - Dutch (Nederlands) - Swedish (Svenska)
How do I install this plugin?
This plugin is available in the official Matomo Marketplace:
Alternatively, download the plugin from GitHub and extract it to your /plugins folder.
What do I need to make it work?
You need an OpenAI API key, which you can obtain at https://platform.openai.com/. If you're using a custom host (like a self-hosted LLM), an API key may be optional.
Can I use models other than OpenAI's?
Yes! The plugin supports any OpenAI-compatible API endpoint, including Azure OpenAI, self-hosted solutions (Ollama, LocalAI, vLLM, etc.), and other providers (Anthropic via proxy, Mistral, etc.).
Simply configure the custom host URL in the plugin settings.
Which models are supported?
The plugin includes presets for the conversational chat-completion models listed above, from GPT 5.5 (the default) through GPT 4 Turbo.
You can also specify any custom model name for models not in the preset list.
Why aren't reasoning models (o1, o3) or *-pro variants in the list?
Reasoning models and *-pro variants (gpt-5-pro, gpt-5.5-pro, o1-pro, o3-pro, etc.) are tuned for one-shot deep analysis with multi-second "thinking" latency, not for back-and-forth conversation. They were removed from the preset list because they produced a poor chat experience for discussing report data. They also use a different OpenAI endpoint (/v1/responses) that this plugin does not target. If you really want to try one, you can still type its name in the Model (Custom) field — any error returned by the API will be displayed directly in the chat.
What happens if my model name is wrong or the API returns an error?
The plugin now surfaces upstream API errors as a danger notice directly inside the chat (for both streaming and non-streaming requests). You'll see the actual error message returned by OpenAI (or your custom host) — for example, an invalid model name, quota issue, or authentication problem — instead of a silent failure.
Does the plugin support dark mode?
Yes. The chat and insight components are styled with Matomo's native CSS theme variables (--theme-color-background-contrast, --theme-color-border, etc.), so they automatically follow whichever Matomo theme is active — light or dark — with no extra configuration.
Is the plugin available to all users in my Matomo instance?
Yes, once activated, all users with view permissions can access the AI features for their permitted sites.
Can I configure different settings per website?
Yes! Use Measurable Settings to override the system-wide host, API key, model, and prompts for specific websites. Leave fields empty to use system defaults.
How do I get insights for a report?
Click the "Insights" button on any report widget. The plugin analyzes the report's data and returns actionable recommendations.
Does the plugin support streaming responses?
Yes, real-time streaming responses are supported. The plugin automatically falls back to non-streaming mode if your server doesn't support Server-Sent Events (SSE).
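With SSE, the response arrives as a stream of "data:" lines that a client reassembles into the full text. A minimal sketch of that reassembly (the exact delta format is an assumption based on the standard OpenAI streaming protocol, not taken from the plugin's source):

```python
import json

def collect_sse_text(raw_stream):
    """Reassemble an OpenAI-style SSE stream into the full response text.

    Each event line looks like 'data: {...}' and carries a content delta;
    the literal 'data: [DONE]' line terminates the stream.
    """
    parts = []
    for line in raw_stream.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

sample = (
    'data: {"choices":[{"delta":{"content":"Traffic "}}]}\n'
    'data: {"choices":[{"delta":{"content":"is up."}}]}\n'
    "data: [DONE]\n"
)
```

When SSE is unavailable, the non-streaming fallback simply returns the whole response in one body instead of in deltas.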
Can I customize the AI's behavior?
Yes, you can customize the Chat Base Prompt (how the AI responds in conversations) and the Insight Base Prompt (how the AI analyzes report data).
These can be set globally or per website.
What languages are supported?
The plugin interface is translated into English, German, Spanish, French, Italian, Dutch, and Swedish.
What are the requirements?
A Matomo instance with the plugin installed, plus an OpenAI API key or access to any OpenAI-compatible endpoint (an API key may be optional for self-hosted hosts).
Is my data sent to OpenAI?
When you use the Insights feature or Chat, the relevant report data and your messages are sent to the configured API endpoint (OpenAI by default). If you have data privacy concerns, consider using a self-hosted LLM solution.
How can I contribute to this plugin?
You can contribute by:
How long will this plugin be maintained?
The plugin is actively maintained. The developer uses Matomo on many projects and will continue to patch and improve the plugin.
View and download this plugin for a specific Matomo version: