# Hagicode and GLM-5.1 Multi-CLI Integration Guide
## Background
In the Hagicode project, users can choose from multiple CLI tools to drive AI programming assistants, including Claude Code CLI, GitHub Copilot, OpenCode CLI, Codebuddy CLI, Hermes CLI, and more. These CLI tools are general-purpose AI programming tools in their own right, but through Hagicode's abstraction layer they can flexibly connect to different AI model providers.
Zhipu AI (ZAI) provides an interface compatible with the Anthropic Claude API, allowing these CLI tools to use Zhipu's GLM-series models directly. Among them, GLM-5.1 is Zhipu's latest large language model release, with significant improvements over GLM-5.0.
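To make "Anthropic-compatible" concrete, the sketch below builds a Messages-style request against the ZAI base URL. This is illustrative only: the `/v1/messages` path, header names, and body shape follow the public Anthropic Messages API, and the exact shape ZAI's gateway accepts may differ in detail.

```typescript
// Illustrative sketch of an Anthropic-style request aimed at ZAI's
// compatible endpoint. Names and payload shape are assumptions based on
// the public Anthropic Messages API, not Hagicode or ZAI source code.
interface MessagesRequest {
  url: string;
  headers: Record<string, string>;
  body: {
    model: string;
    max_tokens: number;
    messages: { role: "user" | "assistant"; content: string }[];
  };
}

function buildMessagesRequest(model: string, prompt: string, token: string): MessagesRequest {
  return {
    url: "https://open.bigmodel.cn/api/anthropic/v1/messages",
    headers: {
      "content-type": "application/json",
      "x-api-key": token, // the token from ANTHROPIC_AUTH_TOKEN
      "anthropic-version": "2023-06-01",
    },
    body: {
      model,
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    },
  };
}

const req = buildMessagesRequest("glm-5.1", "Explain this function.", "<ANTHROPIC_AUTH_TOKEN>");
console.log(req.body.model); // → glm-5.1
```

The point of the compatibility layer is that switching from `glm-5.0` to `glm-5.1` changes only the `model` field; everything else a Claude-speaking CLI sends stays the same.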
## Hagicode's CLI abstraction architecture
Hagicode defines 11 CLI provider types through the `AIProviderType` enum, covering mainstream AI programming CLI tools:
```csharp
public enum AIProviderType
{
    ClaudeCodeCli = 0,  // Claude Code CLI
    CodexCli = 1,       // GitHub Copilot Codex
    GitHubCopilot = 2,  // GitHub Copilot
    CodebuddyCli = 3,   // Codebuddy CLI
    OpenCodeCli = 4,    // OpenCode CLI
    IFlowCli = 5,       // IFlow CLI
    HermesCli = 6,      // Hermes CLI
    QoderCli = 7,       // Qoder CLI
    KiroCli = 8,        // Kiro CLI
    KimiCli = 9,        // Kimi CLI
    GeminiCli = 10,     // Gemini CLI
}
```

Each CLI has a corresponding model parameter configuration and supports the `model` and `reasoning` parameters:

```csharp
private static readonly IReadOnlyDictionary<AIProviderType, IReadOnlyList<string>> ManagedModelParameterKeysByProvider =
    new Dictionary<AIProviderType, IReadOnlyList<string>>
    {
        [AIProviderType.ClaudeCodeCli] = ["model", "reasoning"],
        [AIProviderType.CodexCli] = ["model", "reasoning"],
        [AIProviderType.OpenCodeCli] = ["model", "reasoning"],
        [AIProviderType.HermesCli] = ["model", "reasoning"],
        [AIProviderType.CodebuddyCli] = ["model", "reasoning"],
        [AIProviderType.QoderCli] = ["model", "reasoning"],
        [AIProviderType.KiroCli] = ["model", "reasoning"],
        [AIProviderType.GeminiCli] = ["model"], // Gemini does not support the reasoning parameter
        // ...
    };
```

## GLM model support system

Hagicode's Secondary Professions Catalog defines complete support for the GLM model series:
| Model ID | Name | Default Reasoning | Compatible CLI Families |
|---|---|---|---|
| `glm-4.7` | GLM 4.7 | high | claude, codebuddy, hermes, qoder, kiro |
| `glm-5` | GLM 5 | high | claude, codebuddy, hermes, qoder, kiro |
| `glm-5-turbo` | GLM 5 Turbo | high | claude, codebuddy, hermes, qoder, kiro |
| `glm-5.0` | GLM 5.0 (Legacy) | high | claude, codebuddy, hermes, qoder, kiro |
| `glm-5.1` | GLM 5.1 | high | claude, codebuddy, hermes, qoder, kiro |
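The catalog rows above amount to a model-to-CLI-family compatibility lookup. The sketch below is illustrative only; `glmCompatibility` and `isCompatible` are hypothetical names, not Hagicode identifiers.

```typescript
// The five GLM catalog entries and their compatible CLI families, as tabulated above.
const glmCompatibility: Record<string, string[]> = {
  "glm-4.7": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
  "glm-5": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
  "glm-5-turbo": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
  "glm-5.0": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
  "glm-5.1": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
};

// Can this GLM model drive this CLI family?
function isCompatible(modelId: string, cliFamily: string): boolean {
  return (glmCompatibility[modelId] ?? []).includes(cliFamily);
}

console.log(isCompatible("glm-5.1", "claude")); // → true
console.log(isCompatible("glm-5.1", "gemini")); // → false
```

Note that `gemini` is absent from every row, which matches the earlier point that Gemini CLI is handled differently (no `reasoning` parameter).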
## Key differences between GLM-5.1 and GLM-5.0
From the implementation in `AcpSessionModelBootstrapper.cs`, we can clearly see the differences between GLM-5.1 and GLM-5.0:
### Standalone implementation of GLM-5.1
GLM-5.1 is a standalone new model identifier with no legacy handling logic:
```csharp
private const string Glm51ModelValue = "glm-5.1";
```

Definition in the Secondary Professions Catalog:

```json
{
  "id": "secondary-glm-5-1",
  "name": "GLM 5.1",
  "family": "anthropic",
  "summary": "hero.professionCopy.secondary.glm51.summary",
  "sourceLabel": "hero.professionCopy.sources.aiSharedAnthropicModel",
  "sortOrder": 64,
  "supportsImage": true,
  "compatiblePrimaryFamilies": ["claude", "codebuddy", "hermes", "qoder", "kiro"],
  "defaultParameters": {
    "model": "glm-5.1",
    "reasoning": "high"
  }
}
```

## Model provider configuration

### Zhipu AI (ZAI)

Zhipu AI provides the most complete GLM model support:
```json
{
  "providerId": "zai",
  "name": "智谱 AI",
  "description": "智谱 AI 提供的 Claude API 兼容服务",
  "category": "china-providers",
  "apiUrl": {
    "codingPlanForAnthropic": "https://open.bigmodel.cn/api/anthropic"
  },
  "recommended": true,
  "region": "cn",
  "defaultModels": {
    "sonnet": "glm-4.7",
    "opus": "glm-5",
    "haiku": "glm-4.5-air"
  },
  "supportedModels": [
    "glm-4.7",
    "glm-5",
    "glm-4.5-air",
    "qwen3-coder-next",
    "qwen3-coder-plus"
  ],
  "features": ["experimental-agent-teams"],
  "authTokenEnv": "ANTHROPIC_AUTH_TOKEN",
  "referralUrl": "https://www.bigmodel.cn/claude-code?ic=14BY54APZA",
  "documentationUrl": "https://open.bigmodel.cn/dev/api"
}
```

Features:
- Supports the widest variety of GLM model variants
- Provides default mapping across the Sonnet/Opus/Haiku tiers
- Supports the `experimental-agent-teams` feature
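The `defaultModels` block in the provider config maps Claude's model tiers onto GLM models. A hypothetical helper (not Hagicode API) makes the tier resolution explicit:

```typescript
// Tier-to-model defaults taken from the ZAI provider configuration above.
// "resolveDefaultModel" is an illustrative name, not a Hagicode function.
type Tier = "sonnet" | "opus" | "haiku";

const zaiDefaultModels: Record<Tier, string> = {
  sonnet: "glm-4.7",
  opus: "glm-5",
  haiku: "glm-4.5-air",
};

// Which GLM model does a tier-based request fall back to?
function resolveDefaultModel(tier: Tier): string {
  return zaiDefaultModels[tier];
}

console.log(resolveDefaultModel("opus")); // → glm-5
```

A CLI that asks for an "opus"-class model therefore gets `glm-5` by default; users who want GLM-5.1 pin it explicitly via the `model` parameter, as the per-CLI sections below show.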
## Using GLM-5.1 in different CLIs
### 1. Claude Code CLI + GLM-5.1
Claude Code CLI is one of Hagicode's core CLIs and is configured through the Hero configuration system:
```json
{
  "primaryProfessionId": "profession-claude-code",
  "secondaryProfessionId": "secondary-glm-5-1",
  "model": "glm-5.1",
  "reasoning": "high"
}
```

Corresponding `HeroEquipmentCatalogItem` configuration:
```typescript
{
  id: 'secondary-glm-5-1',
  name: 'GLM 5.1',
  family: 'anthropic',
  kind: 'model',
  primaryFamily: 'claude',
  compatiblePrimaryFamilies: ['claude', 'codebuddy', 'hermes', 'qoder', 'kiro'],
  defaultParameters: { model: 'glm-5.1', reasoning: 'high' }
}
```

### 2. OpenCode CLI + GLM-5.1
OpenCode CLI is the most flexible CLI and supports specifying any model in the `provider/model` format:
**Method 1: Use the ZAI provider prefix**
```json
{
  "primaryProfessionId": "profession-opencode",
  "model": "zai/glm-5.1",
  "reasoning": "high"
}
```

**Method 2: Use the model ID directly**
```json
{
  "model": "glm-5.1"
}
```

**Method 3: Frontend configuration UI**
In `HeroModelEquipmentForm.tsx`, OpenCode CLI has a dedicated placeholder hint:
```typescript
const OPEN_CODE_MODEL_PLACEHOLDER = 'myprovider/glm-4.7';

const modelPlaceholder =
  primaryProviderType === PCode_Models_AIProviderType.OPEN_CODE_CLI
    ? OPEN_CODE_MODEL_PLACEHOLDER
    : 'gpt-5.4';
```

Users can enter:
- `zai/glm-5.1`
- `glm-5.1`

OpenCode CLI model parsing logic:
```csharp
internal OpenCodeModelSelection? ResolveModelSelection(string? rawModel)
{
    var normalized = NormalizeOptionalValue(rawModel);
    if (normalized == null)
        return null;

    var slashIndex = normalized.IndexOf('/');
    if (slashIndex < 0)
    {
        // No slash: use the model ID directly
        return new OpenCodeModelSelection
        {
            ProviderId = string.Empty,
            ModelId = normalized,
        };
    }

    // Slash exists: parse the provider/model format
    var providerId = normalized[..slashIndex].Trim();
    var modelId = normalized[(slashIndex + 1)..].Trim();

    return new OpenCodeModelSelection
    {
        ProviderId = providerId,
        ModelId = modelId,
    };
}
```

### 3. Codebuddy CLI + GLM-5.1
Codebuddy CLI has dedicated legacy handling logic:
```json
{
  "primaryProfessionId": "profession-codebuddy",
  "model": "glm-5.1",
  "reasoning": "high"
}
```

Note: Codebuddy retains special handling for GLM-5.0 and does not apply legacy normalization:
```csharp
// For CodebuddyCli, glm-5.0 is not normalized to glm-5-turbo
return !string.Equals(providerName, "CodebuddyCli", StringComparison.OrdinalIgnoreCase)
        && string.Equals(normalizedModel, LegacyGlm5TurboModelValue, StringComparison.OrdinalIgnoreCase)
    ? Glm5TurboModelValue
    : normalizedModel;
```

## Environment variable configuration
### Using Zhipu AI (ZAI)
Abschnitt betitelt „Using Zhipu AI ZAI“# Set the API keyexport ANTHROPIC_AUTH_TOKEN="***"
# Optional: specify the API endpoint (ZAI uses this endpoint by default)export ANTHROPIC_BASE_URL="https://open.bigmodel.cn/api/anthropic"Using Alibaba Cloud DashScope
Abschnitt betitelt „Using Alibaba Cloud DashScope“# Set the API keyexport ANTHROPIC_AUTH_TOKEN="your-a...-key"
# Specify the Alibaba Cloud endpointexport ANTHROPIC_BASE_URL="https://coding.dashscope.aliyuncs.com/apps/anthropic"Get an API key
- Zhipu AI: https://www.bigmodel.cn/claude-code?ic=14BY54APZA
- Alibaba Cloud: https://www.aliyun.com/benefit/ai/aistar?userCode=vmx5szbq
## Improvements in GLM-5.1
Compared with GLM-5.0, GLM-5.1 brings the following significant improvements:
### 1. Better reasoning capability
According to Zhipu's official release information, improvements in GLM-5.1 include:
- Stronger code understanding: More accurate analysis of complex code structures
- Longer context comprehension: Supports longer conversational context
- Enhanced tool calling: Higher success rate for MCP tool calls
- Output stability: Reduces randomness and hallucinations
### 2. Comprehensive multi-CLI compatibility
GLM-5.1 covers all mainstream CLIs supported by Hagicode:
```typescript
compatiblePrimaryFamilies: [
  "claude",    // Claude Code CLI
  "codebuddy", // Codebuddy CLI
  "hermes",    // Hermes CLI
  "qoder",     // Qoder CLI
  "kiro"       // Kiro CLI
]
```

## Usage notes

### 1. API key configuration
Make sure the `ANTHROPIC_AUTH_TOKEN` environment variable is set correctly. It is the required credential for every CLI to connect to the model.
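A minimal preflight check makes this failure mode explicit. The sketch is illustrative (Hagicode does not expose a `requireAuthToken` helper); it simply fails fast when the credential is missing:

```typescript
// Illustrative preflight: reject a launch when ANTHROPIC_AUTH_TOKEN is unset.
// The env parameter stands in for process.env so the check is testable.
function requireAuthToken(env: Record<string, string | undefined>): string {
  const token = env["ANTHROPIC_AUTH_TOKEN"];
  if (!token || token.trim() === "") {
    throw new Error("ANTHROPIC_AUTH_TOKEN is not set; the CLI cannot reach the model provider.");
  }
  return token;
}

console.log(requireAuthToken({ ANTHROPIC_AUTH_TOKEN: "test-token" })); // → test-token
```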
### 2. Model availability
GLM-5.1 needs to be enabled by the corresponding model provider:
- The Zhipu AI ZAI platform supports it by default
- Alibaba Cloud DashScope may require a separate application
### 3. OpenCode CLI format
When using the `provider/model` format, make sure the provider ID is correct:
- Zhipu AI: `zai` or `zhipuai`
- Alibaba Cloud: `aliyun` or `dashscope`
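The C# parsing logic shown earlier can be restated as a short TypeScript sketch. This is an illustrative re-implementation, and it omits the null/empty normalization step of the original:

```typescript
// Re-sketch of OpenCode's model parsing: no slash means the whole string
// is the model ID; with a slash, the part before the first "/" is the
// provider ID and the rest is the model ID.
interface ModelSelection {
  providerId: string;
  modelId: string;
}

function parseOpenCodeModel(raw: string): ModelSelection {
  const slash = raw.indexOf("/");
  if (slash < 0) {
    return { providerId: "", modelId: raw.trim() };
  }
  return {
    providerId: raw.slice(0, slash).trim(),
    modelId: raw.slice(slash + 1).trim(),
  };
}

console.log(parseOpenCodeModel("zai/glm-5.1").providerId); // → zai
console.log(parseOpenCodeModel("glm-5.1").modelId);        // → glm-5.1
```

So a wrong provider prefix (for example `zhipu/glm-5.1` where the platform expects `zai`) parses cleanly but points at a provider the CLI cannot resolve, which is why the IDs above matter.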
### 4. Reasoning parameter
- `high` is recommended for the best code generation results
- Gemini CLI does not support the `reasoning` parameter and will ignore this configuration automatically
## Conclusion
Through a unified abstraction layer, Hagicode enables flexible integration between GLM-5.1 and multiple CLIs. Developers can choose the CLI tool that best fits their preferences and usage scenarios, then use the latest GLM-5.1 model through simple configuration.
As Zhipu’s latest model version, GLM-5.1 offers clear improvements over GLM-5.0:
- An independent version identifier with no legacy burden
- Stronger reasoning and code understanding
- Broad multi-CLI compatibility
- Flexible reasoning level configuration
With the correct environment variables and Hero equipment configured, users can fully unlock the power of GLM-5.1 across different CLI environments.
## Continue With HagiCode
If you want to put GLM-5.1, multi-CLI orchestration, and HagiCode's configuration model into real use, these are the fastest entry points:
- Track the main project and latest implementation progress on GitHub: github.com/HagiCode-org/site
- Visit the official site to understand the product direction, capability boundaries, and install options: hagicode.com
- Start with the Docker Compose guide, then switch models and CLIs in a real environment: docs.hagicode.com/installation/docker-compose
- If you prefer a local desktop workflow, begin with the Desktop entry point: hagicode.com/desktop/
Once you compare Kimi, Claude Code, OpenCode, and other CLIs inside the same abstraction layer, questions about model switching, parameter mapping, and engineering boundaries tend to become much easier to reason about.