feat: add MiniMax as LLM provider#198

Open
octo-patch wants to merge 1 commit into virattt:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch
Summary

Add MiniMax as a first-class LLM provider, following the same OpenAI-compatible pattern used by xAI, Moonshot, and DeepSeek.

  • Provider registration in providers.ts with minimax: prefix routing and MINIMAX_API_KEY env var
  • Model factory in model/llm.ts using ChatOpenAI with https://api.minimax.io/v1 base URL
  • Model definitions in utils/model.ts: MiniMax M2.7 (1M context) and M2.7-highspeed
  • Environment config updated in env.example and README.md
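The registration-and-routing pattern described above can be sketched as follows. This is an illustrative sketch only: the type and function names (`ProviderConfig`, `resolveProvider`) are assumptions, not the actual identifiers in `providers.ts`.

```typescript
// Hypothetical sketch of prefix-based provider routing; field and
// function names are illustrative, not the repo's actual API.
type ProviderConfig = {
  prefix: string;   // model-id prefix used for routing, e.g. "minimax:"
  envVar: string;   // env var holding the API key
  baseURL: string;  // OpenAI-compatible endpoint
};

const PROVIDERS: Record<string, ProviderConfig> = {
  minimax: {
    prefix: "minimax:",
    envVar: "MINIMAX_API_KEY",
    baseURL: "https://api.minimax.io/v1",
  },
};

// Resolve a prefixed model id like "minimax:MiniMax-M2.7" to its provider.
function resolveProvider(modelId: string): ProviderConfig | undefined {
  return Object.values(PROVIDERS).find((p) => modelId.startsWith(p.prefix));
}

console.log(resolveProvider("minimax:MiniMax-M2.7")?.baseURL);
```

The same lookup shape works for any OpenAI-compatible provider, which is why xAI, Moonshot, and DeepSeek can share it.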

Models added

| Model | Context | Use case |
| --- | --- | --- |
| MiniMax-M2.7 | 1M tokens | Primary model for deep research |
| MiniMax-M2.7-highspeed | 1M tokens | Fast variant for summarization |
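As data, the two model definitions above might look like the sketch below. The `ModelDef` shape and field names are assumptions for illustration; the actual structure lives in `src/utils/model.ts`.

```typescript
// Illustrative sketch of the two MiniMax model entries; the ModelDef
// shape is an assumption, not the repo's actual type.
type ModelDef = {
  id: string;
  contextWindow: number;  // in tokens
  description: string;
};

const MINIMAX_MODELS: ModelDef[] = [
  {
    id: "minimax:MiniMax-M2.7",
    contextWindow: 1_000_000,
    description: "Primary model for deep research",
  },
  {
    id: "minimax:MiniMax-M2.7-highspeed",
    contextWindow: 1_000_000,
    description: "Fast variant for summarization",
  },
];

console.log(MINIMAX_MODELS.map((m) => m.id).join(", "));
```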

Files changed (9 files, 257 additions)

| File | Change |
| --- | --- |
| src/providers.ts | Add MiniMax provider definition |
| src/model/llm.ts | Add MiniMax factory (ChatOpenAI + baseURL) |
| src/utils/model.ts | Add M2.7 and M2.7-highspeed models |
| env.example | Add MINIMAX_API_KEY |
| README.md | Add MiniMax API key mention |
| src/providers.test.ts | 7 unit tests for provider registration + routing |
| src/utils/model.test.ts | 5 unit tests for model definitions |
| src/model/llm.test.ts | 5 unit tests for LLM factory |
| src/model/llm.integration.test.ts | 3 integration tests for end-to-end pipeline |
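The factory change in `src/model/llm.ts` boils down to building ChatOpenAI-style options with a custom `baseURL`. The sketch below models that behavior without pulling in LangChain; `createMiniMaxOptions` is a hypothetical name, and in the real code the resulting options would be passed to the `ChatOpenAI` constructor.

```typescript
// Hedged sketch of the factory behavior: read MINIMAX_API_KEY and build
// ChatOpenAI-style options pointed at the MiniMax endpoint. The function
// name and option shape are assumptions for illustration.
function createMiniMaxOptions(model: string) {
  const apiKey = process.env.MINIMAX_API_KEY;
  if (!apiKey) {
    throw new Error("MINIMAX_API_KEY is not set");
  }
  return {
    model,
    apiKey,
    configuration: { baseURL: "https://api.minimax.io/v1" },
  };
}

process.env.MINIMAX_API_KEY = "test-key"; // stubbed here for demonstration
const opts = createMiniMaxOptions("MiniMax-M2.7");
console.log(opts.configuration.baseURL);
```

Failing fast when the env var is missing mirrors how the existing providers surface a misconfigured key before any request is made.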

Test plan

  • All 20 new tests pass (bun test)
  • All 36 existing tests still pass (56 total)
  • TypeScript type check passes (tsc --noEmit)
  • Manual verification with MINIMAX_API_KEY set, selecting MiniMax via the /model command

How to test

```bash
# Set your MiniMax API key
export MINIMAX_API_KEY=your-key

# Run all tests
bun test

# Start Dexter and select MiniMax via /model command
bun start
```

Add MiniMax as a first-class LLM provider using the OpenAI-compatible
API at api.minimax.io/v1. Follows the existing pattern used by xAI,
Moonshot, and DeepSeek — routes via ChatOpenAI with custom baseURL.

- Register MiniMax in providers.ts with minimax: prefix routing
- Add MODEL_FACTORIES entry in llm.ts (ChatOpenAI + baseURL)
- Add M2.7 and M2.7-highspeed models in model.ts
- Add MINIMAX_API_KEY to env.example and README
- Add 20 tests (unit + integration)
