# AI Service Subscriptions

This chapter collects the AI service subscription websites currently marked as recommended in `repos/docs/public/presets/claude-code/providers/*.json`. Choose a provider here first, then continue to Claude Code configuration.
As of March 9, 2026, this chapter covers: ZAI, MiniMax, Alibaba Cloud DashScope, and Volcengine Coding Plan.
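The recommendation list above is driven by the preset JSON files mentioned earlier. As a rough sketch of how such a list could be assembled programmatically, the snippet below scans a providers directory and collects entries flagged as recommended. The `recommended` and `name` fields, and the demo directory layout, are assumptions for illustration; the real preset schema may differ.

```python
# Sketch: collect providers marked "recommended" from preset JSON files.
# The directory layout mirrors repos/docs/public/presets/claude-code/providers/*.json;
# the "recommended" and "name" fields are assumed for illustration.
import json
import tempfile
from pathlib import Path

def load_recommended(providers_dir: Path) -> list[str]:
    """Return names of providers whose preset sets recommended=true."""
    names = []
    for preset in sorted(providers_dir.glob("*.json")):
        data = json.loads(preset.read_text(encoding="utf-8"))
        if data.get("recommended"):
            names.append(data.get("name", preset.stem))
    return names

# Demo with a temporary directory standing in for the real presets folder.
with tempfile.TemporaryDirectory() as tmp:
    d = Path(tmp)
    (d / "zai.json").write_text(json.dumps({"name": "ZAI", "recommended": True}))
    (d / "other.json").write_text(json.dumps({"name": "Other", "recommended": False}))
    recommended = load_recommended(d)

print(recommended)  # only presets with recommended=true survive the filter
```

Because the filter reads a boolean flag per file, adding or demoting a provider is a one-line change in its preset rather than an edit to the docs themselves.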
## Current recommended providers

| Provider | Strength | Good fit for | Detail page |
|---|---|---|---|
| ZAI | Coding-focused plans with tier mapping | Developers who want Claude Code as a daily driver | View details |
| MiniMax | One default model across tiers, simple onboarding | Teams optimizing for low-friction setup | View details |
| Alibaba Cloud DashScope | Cloud-console alignment and unified management | Teams already working in Alibaba Cloud | View details |
| Volcengine Coding Plan | Strong ecosystem extension and multi-model platform signals | Developers who want Volcengine ecosystem alignment | View details |
## Explore each provider

- **ZAI**: Review GLM Coding positioning, default model mapping, and the next setup step.
- **MiniMax**: See why a single mapped model can simplify Claude Code onboarding.
- **Alibaba Cloud DashScope**: A subscription path designed for teams already operating in Alibaba Cloud.
- **Volcengine Coding Plan**: Check the Volcengine-oriented subscription path and setup notes for Claude Code.
## Quick selection tips

- Need coding-focused plans and clearer tier mapping: start with ZAI
- Need the easiest setup path first: start with MiniMax
- Already run inside Alibaba Cloud: start with Alibaba Cloud DashScope
- Want Volcengine ecosystem alignment or multi-model platform reach: start with Volcengine Coding Plan
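Whichever provider you pick, the Claude Code configuration step usually comes down to pointing the CLI at the provider's Anthropic-compatible endpoint. A minimal sketch, assuming your provider exposes such an endpoint: the base URL and token values below are placeholders to replace with the values from your provider's detail page.

```shell
# Point Claude Code at a third-party Anthropic-compatible endpoint.
# Both values below are placeholders; substitute the endpoint and key
# from your chosen provider's detail page.
export ANTHROPIC_BASE_URL="https://example-provider.example.com/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-subscription-api-key"

# Then launch Claude Code as usual, e.g.:
# claude
```

Setting these in your shell profile keeps the provider choice out of per-project configuration, which makes it easy to switch providers later.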
## Next step

- **Open the Claude Code official install entry**: Jump to the Claude Code section in the unified AI Agent CLI hub, then continue with Anthropic's official install and reference docs.
- **Read the LLM guide**: If you still want to compare capability, cost, and model trade-offs, continue with the LLM guide.