
feat: add MiniMax as a supported LLM provider#815

Open
octo-patch wants to merge 1 commit into Cinnamon:main from octo-patch:feat/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add native MiniMax LLM provider support via their OpenAI-compatible API
  • Introduce ChatMiniMax class with MiniMax-specific parameter handling (temperature clamping to (0, 1.0], automatic response_format removal)
  • Register MiniMax as a first-class vendor in the LLM Manager UI
  • Supported models: MiniMax-M2.5 (default) and MiniMax-M2.5-highspeed, both with a 204K context window
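The MiniMax-specific parameter handling described above can be sketched roughly as follows. This is an illustrative reconstruction, not the PR's actual `minimax.py`; the function name, and the exact clamp bounds (0.01 lower, 1.0 upper), are assumptions based on the summary and test plan:

```python
# Illustrative sketch of the MiniMax-specific parameter handling
# (temperature clamping to (0, 1.0], response_format removal).
# Function name and clamp bounds are assumptions, not the PR's code.

def prepare_minimax_params(params: dict) -> dict:
    """Adapt OpenAI-style request parameters for MiniMax's API."""
    out = dict(params)  # copy so the caller's dict is untouched
    t = out.get("temperature")
    if t is not None:
        # MiniMax accepts temperature in (0, 1.0]; 0 is clamped to 0.01
        # per the PR's test plan, and values above 1.0 are capped at 1.0.
        out["temperature"] = min(max(t, 0.01), 1.0)
    # The MiniMax endpoint does not support response_format, so drop it.
    out.pop("response_format", None)
    return out
```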

Changes

| File | Change |
| --- | --- |
| libs/kotaemon/kotaemon/llms/chats/minimax.py | New ChatMiniMax class extending ChatOpenAI |
| libs/kotaemon/kotaemon/llms/chats/__init__.py | Export ChatMiniMax |
| libs/kotaemon/kotaemon/llms/__init__.py | Export ChatMiniMax |
| libs/ktem/ktem/llms/manager.py | Register ChatMiniMax as vendor |
| flowsettings.py | Add default MiniMax LLM configuration |
| .env.example | Add MINIMAX_API_KEY |
| README.md | Mention MiniMax in supported providers |
| libs/kotaemon/tests/test_llms_chat_models.py | Unit tests for ChatMiniMax |

Configuration

Users can enable MiniMax by setting the API key in .env:

MINIMAX_API_KEY=<your_key>

Both global (https://api.minimax.io/v1) and China (https://api.minimaxi.com/v1) endpoints are supported.
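As a rough illustration of the two endpoints, a hypothetical helper (the function name and region labels are assumptions, not part of the PR) could map a region to the matching base URL:

```python
# Hypothetical helper mapping a region to the MiniMax base URLs listed
# above; the function name and region keys are illustrative assumptions.

def minimax_base_url(region: str = "global") -> str:
    """Return the OpenAI-compatible MiniMax base URL for a region."""
    endpoints = {
        "global": "https://api.minimax.io/v1",
        "china": "https://api.minimaxi.com/v1",
    }
    return endpoints[region]
```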

Test plan

  • Syntax validation passes for all modified files
  • Unit test covers: instantiation, temperature clamping (0 → 0.01), response_format removal
  • Manual integration test with a valid MiniMax API key

🤖 Generated with Claude Code

Add native support for MiniMax large language models (MiniMax-M2.5,
MiniMax-M2.5-highspeed) via their OpenAI-compatible API endpoint.

Changes:
- Add ChatMiniMax class extending ChatOpenAI with MiniMax-specific
  defaults and parameter handling (temperature clamping, response_format
  removal)
- Register ChatMiniMax in LLMManager vendors for UI model selection
- Add default MiniMax configuration in flowsettings.py
- Add MINIMAX_API_KEY to .env.example
- Add unit test for ChatMiniMax
- Update README to mention MiniMax in supported providers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
