Full GLM-5.1 Support and Gemini CLI Integration: HagiCode's Path of Multi-Model Evolution
This article introduces two major recent updates to the HagiCode platform: full support for the Zhipu AI GLM-5.1 model and the successful integration of Gemini CLI as the tenth Agent CLI. Together, these updates further strengthen the platform’s multi-model capabilities and multi-CLI ecosystem.
Background
Time really does move fast. Large language models have been springing up like bamboo shoots after rain. Not long ago, we were still cheering for “an AI that can write code”; now we are already in an era of multi-model collaboration and multi-tool integration. Is that exciting? Perhaps. After all, what developers need has never been just the tool itself, but the ease of adapting to different scenarios and switching flexibly when needed.
As an AI-assisted coding platform, HagiCode has recently welcomed two important developments: first, the full integration of Zhipu AI’s GLM-5.1 model; second, the official addition of Gemini CLI as the tenth supported Agent CLI. These two updates may not sound earth-shaking, but they are unquestionably good news for the platform’s continued maturation.
GLM-5.1 is Zhipu AI’s latest flagship model. Compared with GLM-5.0, it offers stronger reasoning, deeper code understanding, and smoother tool calling. More importantly, it is the first GLM model to support image input. What does that mean? It means users can let the AI look directly at a screenshot instead of struggling to describe the problem in words. Once you’ve used that convenience, you immediately understand its value.
At the same time, through the HagiCode.Libs.Providers architecture, HagiCode successfully integrated Gemini CLI into the platform. This is now the tenth Agent CLI. To be honest, getting to this point does bring a modest sense of accomplishment.
It is also worth mentioning that HagiCode’s image upload feature lets users communicate with AI directly through screenshots. Even when running GLM 4.7, the platform still works well and has already helped complete many important build tasks. As for GLM-5.1, naturally, it goes one step further.
About HagiCode
The approach shared in this article comes from our practical experience in the HagiCode project. HagiCode is an open-source AI-assisted coding platform designed to provide developers with a flexible and powerful AI programming assistant through a multi-model, multi-CLI architecture. Project repository: github.com/HagiCode-org/site
Multi-CLI architecture design
One of HagiCode’s core strengths is its support for multiple AI programming CLI tools through a unified abstraction layer. The advantage of this design is actually quite simple: new tools can come in, old tools can stay, and the codebase does not turn into chaos. To be fair, that is how everyone would like life to work.
AIProviderType enum
The platform defines supported CLI provider types through the AIProviderType enum:
```csharp
public enum AIProviderType
{
    ClaudeCodeCli = 0,  // Claude Code CLI
    CodexCli = 1,       // GitHub Copilot Codex
    GitHubCopilot = 2,  // GitHub Copilot
    CodebuddyCli = 3,   // Codebuddy CLI
    OpenCodeCli = 4,    // OpenCode CLI
    IFlowCli = 5,       // IFlow CLI
    HermesCli = 6,      // Hermes CLI
    QoderCli = 7,       // Qoder CLI
    KiroCli = 8,        // Kiro CLI
    KimiCli = 9,        // Kimi CLI
    GeminiCli = 10,     // Gemini CLI (new)
}
```

As you can see, Gemini CLI joins this family as the tenth member. Each CLI has its own distinct characteristics and usage scenarios, so users can choose flexibly based on their needs. After all, many roads lead to Rome; some are simply easier than others.
Provider architecture
HagiCode.Libs.Providers provides a unified Provider interface that makes each CLI integration standardized and concise. Taking Gemini CLI as an example:
```csharp
public class GeminiProvider : ICliProvider<GeminiOptions>
{
    private static readonly string[] DefaultExecutableCandidates = ["gemini", "gemini-cli"];
    private const string ManagedBootstrapArgument = "--acp";

    public string Name => "gemini";

    public bool IsAvailable =>
        _executableResolver.ResolveFirstAvailablePath(DefaultExecutableCandidates) is not null;

    // ... remaining members omitted
}
```

The benefits of this design are:
- Integrating a new CLI only requires implementing one Provider class
- Unified lifecycle management and session pooling
- Automated alias resolution and executable discovery
Put plainly, this design turns complicated things into simpler ones and makes life a bit easier.
Provider Registry
The Provider Registry automatically handles alias mapping and registration:
```csharp
if (provider is GeminiProvider)
{
    registry.Register(provider.Name, provider, ["gemini-cli"]);
    continue;
}
```

This means users can invoke Gemini CLI with either gemini or gemini-cli, and the system will recognize it automatically. It is like a friend with both a formal name and a nickname: either way, people know who you mean.
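To make the alias idea concrete, here is a small self-contained sketch. This is not the actual HagiCode registry API; the dictionary and the Register helper below are stand-ins purely for illustration:

```csharp
using System;
using System.Collections.Generic;

// A case-insensitive map from every accepted name (canonical or alias)
// to the canonical provider name.
var registry = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

void Register(string name, string[] aliases)
{
    registry[name] = name;          // the canonical name resolves to itself
    foreach (var alias in aliases)
        registry[alias] = name;     // every alias resolves to the same entry
}

Register("gemini", new[] { "gemini-cli" });

Console.WriteLine(registry["gemini"]);     // gemini
Console.WriteLine(registry["gemini-cli"]); // gemini
```

Whichever name the user types, the lookup lands on the same registration, which is exactly the behavior described above.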
GLM-5.1 model support
GLM-5.1 is Zhipu AI’s latest flagship model, and HagiCode has completed full support for it.
Secondary Professions Catalog
HagiCode manages all supported models through the Secondary Professions Catalog. Here is the configuration for the GLM series:
| Model ID | Name | SupportsImage | Compatible CLI Families |
|---|---|---|---|
| glm-4.7 | GLM 4.7 | - | claude, codebuddy, hermes, qoder, kiro |
| glm-5 | GLM 5 | - | claude, codebuddy, hermes, qoder, kiro |
| glm-5-turbo | GLM 5 Turbo | - | claude, codebuddy, hermes, qoder, kiro |
| glm-5.0 | GLM 5.0 (Legacy) | - | claude, codebuddy, hermes, qoder, kiro |
| glm-5.1 | GLM 5.1 | true | claude, codebuddy, hermes, qoder, kiro |
The key characteristics of GLM-5.1 can be summarized as follows:
- A standalone version identifier with no legacy baggage
- The first GLM model to support image input
- Stronger reasoning and code understanding
- Broad multi-CLI compatibility
GLM-5.1 vs GLM-5.0
At the code level, the key difference between GLM-5.1 and GLM-5.0 is shown here:
```csharp
// GLM-5.0 (Legacy) - contains special retention logic
private const string Glm50CodebuddySecondaryProfessionId = "secondary-glm-5-codebuddy";
private const string Glm50CodebuddyModelValue = "glm-5.0";

// GLM-5.1 - standalone new model identifier
private const string Glm51SecondaryProfessionId = "secondary-glm-5-1";
private const string Glm51ModelValue = "glm-5.1";
```

GLM-5.0 carries the “Legacy” label because it is an old version identifier retained for backward compatibility. GLM-5.1, by contrast, is a brand-new standalone version with no historical burden. Some things stay in the past; others travel lighter and move faster.
Configure GLM-5.1
Here is a configuration example for using GLM-5.1 in HagiCode:
```json
{
  "primaryProfessionId": "profession-claude-code",
  "secondaryProfessionId": "secondary-glm-5-1",
  "model": "glm-5.1",
  "reasoning": "high"
}
```

Image upload feature
HagiCode’s image support is implemented through the SupportsImage property on SecondaryProfession:
```csharp
public class HeroSecondaryProfessionSettingDto
{
    public bool SupportsImage { get; set; }
}
```

In the Secondary Professions Catalog, the GLM-5.1 configuration looks like this:
```json
{
  "id": "secondary-glm-5-1",
  "supportsImage": true
}
```

This means users can upload screenshots directly for AI analysis, such as:
- Screenshots of error messages
- Problems in a UI screen
- Data visualization charts
- Code execution results
There is no longer any need to describe everything manually; just upload the screenshot. The convenience of this feature is obvious once you have used it. Sometimes one look says more than a long explanation.
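As a hedged sketch of how a client could act on this flag, the record below is a stand-in for the real DTO (only the Id and SupportsImage properties are taken from the document; everything else is illustrative):

```csharp
using System;

// Decide per profession whether the image-upload control should be shown.
var professions = new[]
{
    new SecondaryProfession("secondary-glm-5-1", SupportsImage: true),
    new SecondaryProfession("secondary-glm-5",   SupportsImage: false),
};

foreach (var p in professions)
    Console.WriteLine($"{p.Id}: image upload {(p.SupportsImage ? "enabled" : "hidden")}");

// Stand-in for the real profession DTO shown above.
record SecondaryProfession(string Id, bool SupportsImage);
```

Gating the UI on a single boolean keeps the behavior predictable: models without the flag simply never show the upload control.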
Gemini CLI integration
As the tenth Agent CLI, Gemini CLI is integrated into HagiCode through the standard Provider architecture.
Configuration options
Gemini CLI supports a rich set of configuration options:
```csharp
public class GeminiOptions
{
    public string? ExecutablePath { get; set; }
    public string? WorkingDirectory { get; set; }
    public string? SessionId { get; set; }
    public string? Model { get; set; }
    public string? AuthenticationMethod { get; set; }
    public string? AuthenticationToken { get; set; }
    public Dictionary<string, string?> AuthenticationInfo { get; set; }
    public Dictionary<string, string?> EnvironmentVariables { get; set; }
    public string[] ExtraArguments { get; set; }
    public TimeSpan? StartupTimeout { get; set; }
    public CliPoolSettings? PoolSettings { get; set; }
}
```

These options cover everything from basic setup to advanced features, giving users the flexibility to configure the CLI around their own needs. Everyone’s workflow is different, so a little flexibility is always welcome.
ACP communication protocol
Gemini CLI supports the ACP (Agent Communication Protocol), which is HagiCode’s unified CLI communication standard. Through ACP, different CLIs can interact with the platform in a consistent way, greatly simplifying integration work. In short, it standardizes the complicated parts so everyone can work more easily.
Environment configuration
To use Zhipu AI models, you need to configure the corresponding environment variables.
Zhipu AI ZAI platform
```shell
export ANTHROPIC_AUTH_TOKEN="***"
export ANTHROPIC_BASE_URL="https://open.bigmodel.cn/api/anthropic"
```

Alibaba Cloud DashScope
```shell
export ANTHROPIC_AUTH_TOKEN="your-a...-key"
export ANTHROPIC_BASE_URL="https://coding.dashscope.aliyuncs.com/apps/anthropic"
```

Once configured, HagiCode can call the GLM-5.1 model normally. It is neither especially hard nor especially easy; you just need to follow the setup as intended.
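Before starting the platform, it can help to fail fast if these variables are missing. A minimal sketch, assuming only the two variable names from the export commands above (the FindMissing helper is hypothetical, not part of HagiCode):

```csharp
using System;

var missing = FindMissing(Environment.GetEnvironmentVariable);
Console.WriteLine(missing.Length == 0
    ? "Environment looks ready for GLM-5.1."
    : "Missing: " + string.Join(", ", missing));

// Returns the required variable names for which the lookup yields nothing.
static string[] FindMissing(Func<string, string?> getEnv) =>
    Array.FindAll(
        new[] { "ANTHROPIC_AUTH_TOKEN", "ANTHROPIC_BASE_URL" },
        name => string.IsNullOrEmpty(getEnv(name)));
```

Taking the lookup as a delegate keeps the check trivially testable: pass a stub instead of the real environment.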
HagiCode’s own build practice
Speaking of real-world practice, the best example is the HagiCode platform’s own build workflow. HagiCode’s development process has already made full use of AI capabilities.
Works well even with GLM 4.7
HagiCode’s platform design is well optimized, so it can still provide a good development experience even with GLM 4.7. The platform has already helped complete multiple important build projects, including:
- Integration of multiple CLI Providers
- Implementation of the image upload feature
- Documentation generation and content publishing
That is actually a good thing. Not everyone needs the newest thing all the time. What suits you best is often what matters most.
GLM-5.1 delivers more with less effort
After upgrading to GLM-5.1, these capabilities become even stronger:
- Stronger code understanding, reducing back-and-forth communication
- More accurate dependency analysis, pointing in the right direction immediately
- More efficient error diagnosis, locating issues faster
- Support for image input, accelerating problem descriptions
It is like switching from a bicycle to a car. You can still reach the same destination, but the speed and comfort are not the same.
Best practices for multi-CLI integration
HagiCode.Libs.Providers provides a unified mechanism for registration and usage:
```csharp
services.AddHagiCodeLibs();

var gemini = serviceProvider.GetRequiredService<ICliProvider<GeminiOptions>>();
var codebuddy = serviceProvider.GetRequiredService<ICliProvider<CodebuddyOptions>>();
var hermes = serviceProvider.GetRequiredService<ICliProvider<HermesOptions>>();
```

This dependency injection design keeps usage across different CLIs very concise and also makes unit testing and mocking more convenient. Clean code is a way of being responsible to yourself.
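On top of this, a caller can probe availability before dispatching work. The sketch below uses a stand-in record rather than the real ICliProvider interface; only the Name and IsAvailable members are taken from the provider code shown earlier:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var providers = new List<ProviderInfo>
{
    new("gemini",    IsAvailable: false), // e.g. executable not on PATH
    new("codebuddy", IsAvailable: true),
};

// Pick the first CLI whose executable was actually resolved.
var first = providers.FirstOrDefault(p => p.IsAvailable);
Console.WriteLine(first?.Name ?? "no CLI available"); // codebuddy

// Stand-in for the real provider type; mirrors Name/IsAvailable above.
record ProviderInfo(string Name, bool IsAvailable);
```

Because IsAvailable is resolved from the executable candidates at startup, this kind of selection degrades gracefully when a CLI is simply not installed.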
There are a few things to keep in mind in actual use:
- API key configuration: make sure `ANTHROPIC_AUTH_TOKEN` is set correctly, or the model cannot be called
- Model availability: GLM-5.1 needs to be enabled by the corresponding model provider
- Image feature: only models with `supportsImage: true` can use image upload
- CLI installation: before using Gemini CLI, make sure `gemini` or `gemini-cli` is in the system PATH
These may be small details, but small details handled poorly can turn into big problems, so they are worth paying attention to.
Conclusion
With full support for GLM-5.1 and the successful integration of Gemini CLI, HagiCode further strengthens its capabilities as a multi-model, multi-CLI AI programming platform. These updates not only give users more choices, but also demonstrate HagiCode’s forward-looking architecture and scalability.
GLM-5.1’s image support, combined with HagiCode’s screenshot upload feature, makes it possible to let the AI “understand from the image” and greatly reduces the cost of describing problems. And with support for ten CLIs, users can flexibly choose the AI programming assistant that best fits their preferences and scenarios. More choice is almost always a good thing.
Most importantly, HagiCode’s own build practice proves that the platform can already run well and complete complex tasks even with GLM 4.7, while upgrading to GLM-5.1 can further improve development efficiency. Life is often like that too: you do not always need the absolute best, only what suits you. Of course, if what suits you can become even better, then so much the better.
If you are interested in a multi-model, multi-CLI AI programming platform, give HagiCode a try - open source, free, and still evolving. Trying it costs nothing, and it may turn out to be exactly what you need.
References
- HagiCode GitHub repository
- HagiCode official website
- Zhipu AI Open Platform
- Gemini CLI documentation
- Docker Compose Quick Installation
- Desktop installation
If this article helped you:
- Give it a Star on GitHub: github.com/HagiCode-org/site
- Visit the official website to learn more: hagicode.com
- Watch the 30-minute hands-on demo: www.bilibili.com/video/BV1pirZBuEzq/
- Try the one-click installation: docs.hagicode.com/installation/docker-compose
- Public beta has started, and you are welcome to install and try it
Copyright notice
Thank you for reading. If you found this article useful, feel free to like, bookmark, and share it to show your support. This content was created with AI-assisted collaboration, and the final content was reviewed and confirmed by the author.
- Author: newbe36524
- Original link: https://docs.hagicode.com/blog/2026-03-30-hagicode-glm-5-1-gemini-cli-update/
- Copyright: Unless otherwise stated, all blog posts on this site are licensed under BY-NC-SA. Please cite the source when reposting.