MaxKB Version
MaxKB version: v2.8.1 (self-hosted Docker on Windows 11)
Please describe your needs or suggestions for improvements
Feature Request: Add Mistral AI as a native model provider
Problem
MaxKB supports OpenAI, DeepSeek, Ollama, and many other providers, but Mistral AI — one of Europe's leading AI labs — is missing from the built-in provider list.
Mistral's API is fully OpenAI-compatible (`/v1/chat/completions`, `/v1/models`) and uses Bearer token authentication, identical to other already-supported providers.
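To illustrate the compatibility claim, here is a minimal sketch of the request shape MaxKB would send. It only constructs the URL, headers, and body (no network call is made); the API key is a placeholder, and the helper name is illustrative, not part of MaxKB's codebase.

```python
# Sketch: Mistral's chat endpoint accepts the same request shape as
# OpenAI's /v1/chat/completions, with the same Bearer-token auth header.
MISTRAL_BASE_URL = "https://api.mistral.ai/v1"
API_KEY = "<API_KEY>"  # placeholder credential


def build_chat_request(model: str, prompt: str):
    """Return (url, headers, body) for an OpenAI-compatible chat completion."""
    url = f"{MISTRAL_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",  # same auth scheme as OpenAI
        "Content-Type": "application/json",
    }
    body = {
        "model": model,  # e.g. "mistral-large-latest"
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body


url, headers, body = build_chat_request("mistral-large-latest", "Hello")
```

Because the shape is identical, an existing OpenAI-compatible client wrapper could be reused with only the base URL and model list changed.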
Why Mistral?
- Mistral Large ranks among the top models for RAG/knowledge-base tasks
- Fully OpenAI-compatible API → minimal implementation effort
- Already supported in Dify as a native provider
- Growing European user base
Current Workarounds (Broken)
Users try:
- Adding Mistral as an "OpenAI" provider with a custom base URL → the model list loads, but verification and runtime calls fail
- Direct database insertion with RSA-encrypted credentials → unstable, breaks after UI changes
- Using Ollama's `mistral:7b` instead → limited to a local 7B model, no access to Mistral Large
Suggested Implementation
- Add `model_mistral_provider` following the same pattern as `model_deepseek_provider` and `model_openai_provider`
- Use LangChain's `langchain-mistralai` package or reuse the existing OpenAI-compatible wrapper
- Base URL: `https://api.mistral.ai/v1`
- Auth: `Authorization: Bearer <API_KEY>`
- Supported models: `mistral-large-latest`, `mistral-small-latest`, `pixtral-large-latest`
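The metadata a new provider module would need can be sketched as follows. The `ProviderInfo` class and its field names are hypothetical stand-ins for MaxKB's internal provider interface, shown only to make the scope of the change concrete.

```python
# Hypothetical provider-metadata sketch; MaxKB's actual internal
# provider API will differ, but the required information is the same.
from dataclasses import dataclass, field


@dataclass
class ProviderInfo:
    name: str
    base_url: str
    auth_header: str
    models: list = field(default_factory=list)


mistral_provider = ProviderInfo(
    name="model_mistral_provider",  # mirrors model_deepseek_provider naming
    base_url="https://api.mistral.ai/v1",
    auth_header="Authorization: Bearer <API_KEY>",  # placeholder key
    models=[
        "mistral-large-latest",
        "mistral-small-latest",
        "pixtral-large-latest",
    ],
)
```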
Context
We recently submitted a similar feature request to DeepChat — it was implemented and merged within 24 hours (PR #1598). MaxKB is a critical part of our AI stack for automotive diagnostics, and Mistral support would be a game-changer.
MaxKB version: v2.8.1 (Docker)
Deployment: Docker on Windows 11
Please describe the solution you suggest
No response
Additional Information
No response