
feat: add MiniMax as LLM provider for film skill extraction#8

Open
octo-patch wants to merge 1 commit into Forget-C:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as an alternative LLM provider for film skill extraction. MiniMax provides an OpenAI-compatible API, enabling seamless integration via langchain-openai's ChatOpenAI with a custom base_url.

Changes

Backend (3 files):

  • config.py: Add MINIMAX_API_KEY, MINIMAX_BASE_URL, MINIMAX_MODEL, and LLM_PROVIDER env vars
  • dependencies.py: Refactor get_llm() to support provider auto-detection (OpenAI > MiniMax) and explicit selection via LLM_PROVIDER
  • .env.example: Add MiniMax configuration section with documentation
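The provider-resolution behavior described above (explicit `LLM_PROVIDER` wins, otherwise auto-detect with OpenAI taking precedence over MiniMax) could be sketched roughly like this. This is a minimal illustration, not the actual diff: the `Settings` class, field names, and error message are assumptions.

```python
from dataclasses import dataclass


@dataclass
class Settings:
    """Subset of config.py relevant to provider selection (names assumed)."""
    OPENAI_API_KEY: str = ""
    MINIMAX_API_KEY: str = ""
    MINIMAX_BASE_URL: str = "https://api.minimax.io/v1"
    MINIMAX_MODEL: str = "MiniMax-M2.7"
    LLM_PROVIDER: str = ""  # "", "openai", or "minimax"


def resolve_provider(settings: Settings) -> str:
    """Explicit LLM_PROVIDER wins; otherwise auto-detect (OpenAI > MiniMax)."""
    if settings.LLM_PROVIDER:
        return settings.LLM_PROVIDER.lower()
    if settings.OPENAI_API_KEY:
        return "openai"
    if settings.MINIMAX_API_KEY:
        return "minimax"
    raise RuntimeError(
        "No LLM provider configured: set OPENAI_API_KEY or MINIMAX_API_KEY"
    )
```

Because MiniMax exposes an OpenAI-compatible API, both branches can construct the same `ChatOpenAI` client, differing only in `base_url`, `api_key`, and model name.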

Frontend (2 files):

  • constants.ts: Add PROVIDER_PRESETS with Base URL and description for OpenAI, MiniMax, 火山引擎 (Volcengine), and 阿里百炼 (Alibaba Bailian)
  • ProvidersTab.tsx: Auto-populate Base URL and description when selecting a preset provider

Tests (2 files):

  • test_llm_provider.py: 18 unit tests + 2 integration tests for provider resolution, config defaults, error handling, and real MiniMax API calls
  • test_skills_integration.py: Update real LLM integration tests to support MINIMAX_API_KEY

Usage

Set in .env:
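For example (variable names from the config.py changes above; the base URL and model name follow the maintainer's comment below, so treat them as defaults to verify):

```
# .env — MiniMax is auto-detected when OPENAI_API_KEY is unset
MINIMAX_API_KEY=your-minimax-key
MINIMAX_BASE_URL=https://api.minimax.io/v1
MINIMAX_MODEL=MiniMax-M2.7
```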

Or explicitly select the provider:
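A sketch of the explicit form (the accepted value casing is an assumption):

```
# .env — force MiniMax even if OPENAI_API_KEY is also set
LLM_PROVIDER=minimax
```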

Test Plan

  • 18 unit tests pass (pytest tests/test_llm_provider.py -k "not Integration")
  • 2 integration tests pass with real MiniMax API (MINIMAX_API_KEY=... pytest tests/test_llm_provider.py -m integration)
  • Existing tests unaffected
  • Frontend: verify preset auto-fill in provider creation modal

@Forget-C
Owner

Forget-C commented Apr 1, 2026

Thank you for your submission. @octo-patch

I have updated the codebase; in the new version, model providers are managed in the database. If convenient, please merge in the latest code and resubmit.

- Add MiniMax chat model provider (OpenAI-compatible interface)
- Register MiniMax with default base URL https://api.minimax.io/v1
- Support MiniMax-M2.7 and MiniMax-M2.7-highspeed models
- Add MINIMAX_API_KEY environment variable support via existing Provider API
- Add unit tests for MiniMax provider registration and model names
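Under the maintainer's new database-backed provider model, registering MiniMax might look roughly like the sketch below. Everything here is hypothetical except the values themselves: `PROVIDER_REGISTRY` and `register_provider` are invented names standing in for the project's actual Provider API, while the base URL and model names come from the changelog above.

```python
# Hypothetical in-memory stand-in for the project's database-backed
# provider registry; PROVIDER_REGISTRY and register_provider are assumptions.
PROVIDER_REGISTRY: dict[str, dict] = {}


def register_provider(name: str, base_url: str, models: list[str]) -> None:
    """Register an OpenAI-compatible chat-model provider."""
    PROVIDER_REGISTRY[name] = {"base_url": base_url, "models": models}


# Values taken from the maintainer's changelog above.
register_provider(
    "minimax",
    base_url="https://api.minimax.io/v1",
    models=["MiniMax-M2.7", "MiniMax-M2.7-highspeed"],
)
```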
octo-patch force-pushed the feature/add-minimax-provider branch from e141960 to 02ecfaa on April 24, 2026 at 09:01.
