MistralAI support for Matomo.
Integrate AI-powered analytics insights and chat functionality into your Matomo instance using MistralAI or any MistralAI-compatible API.
Get instant AI-generated insights for any Matomo report. The plugin adds an "Insights" button to all report widgets that analyzes your data and provides actionable recommendations.
A full-featured chat interface for asking questions about your analytics data.
Choose from preset models or specify custom model names:

Preset Models:
- Mistral Large
- Mistral Medium
- Mistral Small
- Open Mistral Nemo
- Codestral
- Pixtral Large
- Ministral 8B
- Ministral 3B

Custom Models: Specify any model name to use models not in the preset list, perfect for new Mistral models, self-hosted LLMs, or other providers.
Connect to any MistralAI-compatible API endpoint:
- MistralAI (default)
- Azure-hosted deployments
- Self-hosted solutions (Ollama, LocalAI, vLLM, etc.)
- Other providers exposed through a compatible proxy
Install the plugin through the Matomo Marketplace, or download it from GitHub and extract it to your /plugins folder. Then navigate to Administration > General Settings > MistralAI to configure:
| Setting | Description |
|---------|-------------|
| Host | API endpoint URL. Default: `https://api.mistral.ai/v1/chat/completions` |
| API Key | Your MistralAI API key (required for MistralAI, optional for custom hosts) |
| Model (Preset) | Select from available model presets |
| Model (Custom) | Override the preset with a custom model name |
| Chat Base Prompt | System prompt for chat conversations |
| Insight Base Prompt | System prompt for report insights |
All system settings can be overridden per website in the site's Measurable Settings. Leave fields empty to use system defaults.
This is useful for:
- Using different models for different sites
- Customizing prompts for specific website contexts
- Using separate API keys per site
The plugin provides the following API methods:

| Method | Description |
|--------|-------------|
| `MistralAI.getResponse` | Get AI response for messages (non-streaming) |
| `MistralAI.getStreamingResponse` | Get AI response with SSE streaming |
| `MistralAI.getInsight` | Get AI insights for report data |
| `MistralAI.getModels` | Get list of available preset models |
`MistralAI.getResponse` parameters:
- `idSite`: Site ID
- `period`: Period (day, week, month, year)
- `date`: Date string
- `messages`: Conversation messages in MistralAI format
`MistralAI.getInsight` parameters:
- `idSite`: Site ID
- `period`: Period (day, week, month, year)
- `date`: Date string
- `reportId`: Report identifier
- `messages`: Conversation messages
All API methods require appropriate view permissions for the requested site.
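The methods above are regular Matomo Reporting API methods, so they can be called over HTTP like any other. The sketch below builds the request URLs for `MistralAI.getResponse` and `MistralAI.getInsight`; the instance URL, site ID, and token are placeholders, and passing `messages` as a JSON string is an assumption about the parameter encoding.

```python
# Sketch: building Matomo Reporting API calls for the plugin's methods.
# MATOMO_URL and token_auth are placeholders; replace with your own values.
import json
from urllib.parse import urlencode

MATOMO_URL = "https://example.org/matomo/index.php"  # hypothetical instance

def build_request(method, extra):
    """Build the full URL for a Matomo API call with the common parameters."""
    params = {
        "module": "API",
        "method": method,
        "idSite": 1,
        "period": "day",
        "date": "today",
        "format": "JSON",
        "token_auth": "anonymous",  # replace with a real auth token
    }
    params.update(extra)
    return MATOMO_URL + "?" + urlencode(params)

# Chat: messages are assumed to be passed as a JSON-encoded array
# in MistralAI chat format.
chat_url = build_request("MistralAI.getResponse", {
    "messages": json.dumps([{"role": "user", "content": "Summarize today's visits"}]),
})

# Insight: reportId identifies the report to analyze.
insight_url = build_request("MistralAI.getInsight", {"reportId": "VisitsSummary.get"})

print(chat_url)
print(insight_url)
```

Fetching either URL (with a valid token) returns the AI response as JSON, subject to the view-permission check described above.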
The plugin interface is available in:
- English
- German (Deutsch)
- Spanish (Español)
- French (Français)
- Italian (Italiano)
- Dutch (Nederlands)
- Swedish (Svenska)
How do I install this plugin?
This plugin is available in the official Matomo Marketplace.
Alternatively, download the plugin from GitHub and extract it to your /plugins folder.
What do I need to make it work?
You need a MistralAI API key, which you can obtain at https://docs.mistral.ai/api. If you're using a custom host (such as a self-hosted LLM), an API key may be optional.
Can I use models other than MistralAI's?
Yes! The plugin supports any MistralAI-compatible API endpoint. You can connect to:
- Self-hosted solutions (Ollama, LocalAI, vLLM, etc.)
- Azure-hosted deployments
- Other providers exposed through a compatible proxy
Simply configure the custom host URL in the plugin settings.
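"MistralAI-compatible" means the endpoint accepts the standard chat-completions JSON body. The sketch below shows that body for a local Ollama server; the host URL and model name are examples, not plugin defaults.

```python
# Sketch: the JSON body a MistralAI-compatible chat endpoint expects.
# The Ollama URL and model name are examples; adjust them to your setup.
import json

host = "http://localhost:11434/v1/chat/completions"  # e.g. a local Ollama server

payload = {
    "model": "mistral",  # any model name your endpoint serves
    "messages": [
        {"role": "system", "content": "You are a web analytics assistant."},
        {"role": "user", "content": "What does a bounce rate of 80% suggest?"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

Because local endpoints like this usually accept unauthenticated requests, the plugin's API Key field can stay empty when pointing at them.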
Which models are supported?
The plugin includes presets for:
- Mistral Large
- Mistral Medium
- Mistral Small
- Open Mistral Nemo
- Codestral
- Pixtral Large
- Ministral 8B
- Ministral 3B
You can also specify any custom model name for models not in the preset list.
Is the plugin available to all users in my Matomo instance?
Yes, once activated, all users with view permissions can access the AI features for their permitted sites.
Can I configure different settings per website?
Yes! Use Measurable Settings to override the system-wide host, API key, model, and prompts for specific websites. Leave fields empty to use system defaults.
How do I get insights for a report?
Click the "Insights" button on any report widget. The plugin sends the report data to the configured model and displays AI-generated recommendations.
Does the plugin support streaming responses?
Yes, real-time streaming responses are supported. The plugin automatically falls back to non-streaming mode if your server doesn't support Server-Sent Events (SSE).
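With SSE, the response arrives as a sequence of `data:` events, each carrying a small JSON delta, terminated by `data: [DONE]`. This is a minimal sketch of parsing such a stream; the sample payload is illustrative, not captured from the plugin.

```python
# Sketch: reassembling text from a Server-Sent Events chat stream.
# The sample_stream below is a hypothetical example of the wire format.
import json

sample_stream = (
    'data: {"choices": [{"delta": {"content": "Traffic "}}]}\n\n'
    'data: {"choices": [{"delta": {"content": "is up."}}]}\n\n'
    "data: [DONE]\n\n"
)

def collect_text(stream: str) -> str:
    """Concatenate the content deltas from an SSE chat stream."""
    text = []
    for line in stream.splitlines():
        if not line.startswith("data: "):
            continue  # skip blank separator lines between events
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(data)["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

print(collect_text(sample_stream))  # Traffic is up.
```

When the server cannot deliver SSE, the plugin's fallback simply returns the fully assembled text in one non-streaming response instead.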
Can I customize the AI's behavior?
Yes, you can customize:
- Chat Base Prompt: how the AI responds in conversations
- Insight Base Prompt: how the AI analyzes report data
- Host, API key, and model selection
These can be set globally or per website.
What languages are supported?
The plugin interface is translated into:
- English
- German (Deutsch)
- Spanish (Español)
- French (Français)
- Italian (Italiano)
- Dutch (Nederlands)
- Swedish (Svenska)
What are the requirements?
A running Matomo instance and a MistralAI API key, or access to any MistralAI-compatible endpoint (which may not require a key).
Is my data sent to MistralAI?
When you use the Insights feature or Chat, the relevant report data and your messages are sent to the configured API endpoint (MistralAI by default). If you have data privacy concerns, consider using a self-hosted LLM solution.
How can I contribute to this plugin?
You can contribute by:
- Reporting issues and suggesting features on GitHub
- Submitting pull requests
- Improving translations
How long will this plugin be maintained?
The plugin is actively maintained. The developer uses Matomo on many projects and will continue to patch and improve the plugin.