# AI Service Subscriptions
This chapter collects the AI service subscription websites currently marked as recommended in repos/docs/public/presets/claude-code/providers/*.json, so you can choose a provider first and then move on to Claude Code configuration.
As of March 9, 2026, this chapter covers: ZAI, MiniMax, Alibaba Cloud DashScope, and Volcengine Coding Plan.
## Current recommended providers

| Provider | Strength | Good fit for | Detail page |
|---|---|---|---|
| ZAI | Coding-focused plans with tier mapping | Developers who want Claude Code as a daily driver | View details |
| MiniMax | One default model across tiers, simple onboarding | Teams optimizing for low-friction setup | View details |
| Alibaba Cloud DashScope | Cloud-console alignment and unified management | Teams already working in Alibaba Cloud | View details |
| Volcengine Coding Plan | Strong ecosystem extension and multi-model platform signals | Developers who want Volcengine ecosystem alignment | View details |
## Explore each provider

- ZAI: Review GLM Coding positioning, default model mapping, and the next setup step.
- MiniMax: See why a single mapped model can simplify Claude Code onboarding.
- Alibaba Cloud DashScope: A subscription path designed for teams already operating in Alibaba Cloud.
- Volcengine Coding Plan: Check the Volcengine-oriented subscription path and setup notes for Claude Code.
## Quick selection tips

- Need coding-focused plans and clearer tier mapping: start with ZAI
- Need the easiest setup path first: start with MiniMax
- Already run inside Alibaba Cloud: start with Alibaba Cloud DashScope
- Want Volcengine ecosystem alignment or multi-model platform reach: start with Volcengine Coding Plan
## Next step

- Continue with Claude Code setup: after choosing a provider, return to the setup guide to finish your settings.json or environment variable configuration.
- Read the LLM guide: if you still want to compare capability, cost, and model trade-offs, continue with the LLM guide.
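As a rough illustration of the environment-variable route mentioned above, the sketch below shows the general shape of pointing Claude Code at a provider's Anthropic-compatible endpoint. The URL and key here are placeholders, not real provider values; take the actual endpoint and token from the detail page of the provider you chose.

```shell
# Minimal sketch: route Claude Code through a subscription provider's
# Anthropic-compatible API. Both values below are placeholders
# (assumptions for illustration) -- replace them with the base URL and
# API key your chosen provider issues.
export ANTHROPIC_BASE_URL="https://api.example-provider.com"
export ANTHROPIC_AUTH_TOKEN="sk-your-provider-key"
```

The same two values can instead live under the `env` key of your settings.json if you prefer per-project configuration over shell-wide exports; the setup guide covers both paths.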