MiniMax Chat Model node added to AI agents

n8n users can now connect to MiniMax AI directly from AI Agent workflows, with a Hide Thinking option to strip reasoning tags from responses.
The n8n workflow automation platform now supports MiniMax AI as a chat model provider. A new sub-node allows AI Agent workflows to call MiniMax models directly, with seven available versions spanning the M2 through M2.7 families. A region selector handles endpoint routing between international and China deployments.
The node includes a "Hide Thinking" toggle that strips chain-of-thought reasoning tags from model responses. When enabled, users see only the final answer rather than interleaved `<think>` reasoning blocks. This works by passing a `reasoning_split` parameter to the underlying OpenAI-compatible API. The credential setup offers a zero-cost test by querying the files endpoint with a `voice_clone` purpose rather than making a text generation call.
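The PR implements Hide Thinking server-side via the `reasoning_split` parameter, but the observable effect is equivalent to removing `<think>` blocks from the output. As an illustration only (this is not n8n's actual implementation, and the function name is hypothetical), a client-side version of that stripping could look like:

```typescript
// Illustrative sketch: remove <think>…</think> reasoning blocks so only
// the final answer remains. Not the node's real implementation, which
// asks the API to split reasoning out via `reasoning_split`.
function stripThinking(text: string): string {
  return text.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
}
```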
This addition brings MiniMax alongside other supported providers like Mistral Cloud and Moonshot in the n8n LangChain node collection.
View Original GitHub Description
Summary
Adds a MiniMax Chat Model LangChain sub-node for the MiniMax AI platform (OpenAI-compatible API).
- Credential with region selector (International / China) and zero-cost credential test
- Chat model node using `ChatOpenAI` with 7 static models, default `MiniMax-M2.7`
- Hide Thinking option (default: on) to strip `<think>` reasoning from responses via `reasoning_split`
- Light/dark icons, Agent V1 allowlist entries, 10 unit tests
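The region selector above routes requests to different base URLs. A minimal sketch of that routing, assuming two regions and hypothetical endpoint URLs (the real ones should be taken from the MiniMax platform documentation, not from this example):

```typescript
// Sketch of region-based endpoint routing. The base URLs below are
// assumptions for illustration, not values taken from the PR.
type MiniMaxRegion = 'international' | 'china';

function baseUrlFor(region: MiniMaxRegion): string {
  return region === 'china'
    ? 'https://api.minimaxi.com/v1'   // assumed China endpoint
    : 'https://api.minimax.io/v1';    // assumed international endpoint
}
```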
How to test
- Add a MiniMax credential with an API key from platform.minimax.io
- Create an AI Agent workflow with the MiniMax Chat Model sub-node
- Send a message; the response should be clean (no `<think>` tags)
- Toggle Hide Thinking off in options; `<think>` tags should appear
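The zero-cost credential test described earlier queries the files endpoint with a `voice_clone` purpose instead of generating text. A sketch of how such a request URL might be built, assuming the path `/v1/files` (the exact path is an assumption, not confirmed by the PR):

```typescript
// Hypothetical sketch: build the URL for a zero-cost credential check
// against the files endpoint. Path and query shape are assumptions.
function credentialTestUrl(baseUrl: string): string {
  const url = new URL('/v1/files', baseUrl);
  url.searchParams.set('purpose', 'voice_clone');
  return url.toString();
}
```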
Related Linear tickets, Github issues, and Community forum posts
https://linear.app/n8n/issue/NODE-4773
Review / Merge checklist
- PR title and summary are descriptive. (conventions)
- Docs updated or follow-up ticket created.
- Tests included.
- PR Labeled with
release/backport(if the PR is an urgent fix that needs to be backported) - I have seen this code, I have run this code, and I take responsibility for this code.