Practical Guide to Integrating CodeBuddy CLI into a C# Backend

This article walks through a complete approach to integrating CodeBuddy CLI into a C# backend project so you can deliver AI coding assistant capabilities end to end.

In modern AI coding assistant development, a single AI Provider often cannot satisfy complex and changing development scenarios. HagiCode, as a multifunctional AI coding assistant, needs to support multiple AI Providers to deliver a better user experience. Users should have enough freedom to choose. In early 2026, the project faced a key decision: how to restore CodeBuddy ACP (Agent Communication Protocol) integration capabilities in the C# backend.

The project had previously implemented CodeBuddy integration, but the related code was removed during a refactor. There is not much to complain about there; during iterative development, something always gets left behind. The goal of this technical solution was to fully restore that capability and improve the architecture so it would be more robust and maintainable.

If you are also considering connecting multiple AI coding assistants to your own project, the approach below may give you some ideas. It reflects lessons we summarized after stepping into plenty of pitfalls, and maybe it can help you avoid a few detours.

The approach shared in this article comes from our practical experience in the HagiCode project. HagiCode is an open-source AI coding assistant project that supports multiple AI Providers and cross-platform operation. To satisfy different user preferences, we need to switch flexibly among different AI coding assistants, which is exactly why we built the CodeBuddy integration described here.

HagiCode uses a modular design, with AI Providers implemented as pluggable components. This architecture lets us add new AI support easily without affecting existing features. When a design is done well up front, it saves a lot of trouble later. If you are interested in our technical architecture, you can view the full source code on GitHub.

The integration between C# and CodeBuddy uses a clear layered architecture. This design makes responsibilities explicit and makes long-term maintenance much easier:

┌──────────────────────────────────────────────────┐
│ Provider Contract Layer                          │
│   AIProviderType enum + extension methods        │
├──────────────────────────────────────────────────┤
│ Provider Factory Layer                           │
│   AIProviderFactory dependency injection factory │
├──────────────────────────────────────────────────┤
│ Provider Implementation Layer                    │
│   CodebuddyCliProvider concrete implementation   │
├──────────────────────────────────────────────────┤
│ ACP Infrastructure Layer                         │
│   ACPSessionManager / StdioAcpTransport          │
│   AcpRpcClient / AcpAgentClient                  │
└──────────────────────────────────────────────────┘

What are the benefits of this layering? Put simply, each layer stays out of the others’ way. If we later want to change the communication mechanism, for example from stdio to WebSocket, we only need to modify the bottom layer, and the business logic above it stays untouched. Nobody wants a communication change to ripple through the entire codebase.

The Provider contract layer is the foundation of the entire architecture. We define the AIProviderType enum, where CodebuddyCli = 3 is used as the enum value, and implement bidirectional mapping between strings and enums through extension methods. That allows strings in configuration files to be converted conveniently into enums, and enums to be converted back to strings for debugging output.

The Provider factory layer is responsible for creating the corresponding Provider instance based on configuration. It uses .NET dependency injection together with ActivatorUtilities.CreateInstance for dynamic creation. The advantage of the factory pattern is that when adding a new Provider, you only need to add the creation logic instead of modifying existing code.

The Provider implementation layer is where the actual work happens. CodebuddyCliProvider implements the IAIProvider interface and provides two invocation modes: ExecuteAsync for non-streaming calls and StreamAsync for streaming calls.

The ACP infrastructure layer provides the communication foundation underneath. This layer handles all protocol details, including process management, message serialization, and response parsing. It is the foundation that keeps everything above it stable.

CodeBuddy uses Stdio (standard input/output) to communicate with external processes. The startup command is simple:

codebuddy --acp

After that, JSON-RPC messages are exchanged through standard input and output. This approach has several advantages:

  1. Fast startup: local process communication avoids network latency
  2. Simple configuration: you only need to specify the executable path
  3. Environment isolation: each session runs in an independent process, so they do not affect one another

Environment variable injection is supported during communication. Common examples include:

  • CODEBUDDY_API_KEY: API key authentication
  • CODEBUDDY_INTERNET_ENVIRONMENT: network environment configuration

As with communication between people, it helps to choose a convenient channel first.
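As a concrete sketch, configuring such a CLI process with redirected stdio and injected environment variables could look like the following. This is an illustration, not the project's actual transport code (which lives in StdioAcpTransport); the executable name and values are placeholders:

```csharp
using System;
using System.Diagnostics;
using System.Text;

// Sketch: configure a CodeBuddy CLI process in ACP mode with redirected stdio.
// Paths and values are placeholders, not the project's actual configuration.
var psi = new ProcessStartInfo
{
    FileName = "codebuddy",        // or an absolute ExecutablePath from config
    Arguments = "--acp",
    RedirectStandardInput = true,  // JSON-RPC requests go to stdin
    RedirectStandardOutput = true, // responses are read from stdout
    RedirectStandardError = true,
    UseShellExecute = false,
    // Explicit UTF-8 avoids garbled non-ASCII output on Windows
    StandardOutputEncoding = Encoding.UTF8,
};

// Environment variable injection, as described above
psi.Environment["CODEBUDDY_API_KEY"] =
    Environment.GetEnvironmentVariable("CODEBUDDY_API_KEY") ?? string.Empty;
psi.Environment["CODEBUDDY_INTERNET_ENVIRONMENT"] = "production";

// Process.Start(psi) would then hand stdin/stdout to the JSON-RPC client
Console.WriteLine($"{psi.FileName} {psi.Arguments}");
```

Redirecting all three standard streams keeps the child process fully under the host's control, which is what makes the per-session isolation described above possible.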

ACP is based on JSON-RPC 2.0. The message format looks roughly like this:

// Request message
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "agent/prompt",
  "params": {
    "prompt": "Help me write a sorting algorithm",
    "sessionId": "session-123"
  }
}

// Response message
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": "Here is the AI response..."
  }
}

In the real implementation, we encapsulate all of these protocol details so the upper business layer only needs to care about the prompt and response.
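To make the message format concrete, here is a minimal sketch of serializing the request above with System.Text.Json. The anonymous-object shape is illustrative; the project's actual request types may differ:

```csharp
using System;
using System.Text.Json;

// Sketch: build and serialize an "agent/prompt" JSON-RPC 2.0 request.
// "@params" escapes the C# keyword so the property serializes as "params".
var request = new
{
    jsonrpc = "2.0",
    id = 1,
    method = "agent/prompt",
    @params = new
    {
        prompt = "Help me write a sorting algorithm",
        sessionId = "session-123"
    }
};

// One serialized message per line is written to the CLI's stdin
string line = JsonSerializer.Serialize(request);
Console.WriteLine(line);
```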

First, restore the CodeBuddy type in the enum file:

PCode.Models/AIProviderType.cs
public enum AIProviderType
{
    ClaudeCodeCli = 0,
    CodexCli = 1,
    GitHubCopilot = 2,
    CodebuddyCli = 3, // Restore this enum value
    OpenCodeCli = 4,
    IFlowCli = 5,
}

Then add string mapping in the extension methods so the configuration file can specify the Provider by string:

AIProviderTypeExtensions.cs
private static readonly Dictionary<string, AIProviderType> _typeMap = new(
    StringComparer.OrdinalIgnoreCase)
{
    ["CodebuddyCli"] = AIProviderType.CodebuddyCli,
    ["Codebuddy"] = AIProviderType.CodebuddyCli,
    ["codebuddy"] = AIProviderType.CodebuddyCli,
    // ... Mappings for other providers
};

Add a CodeBuddy creation branch in the factory class:

AIProviderFactory.cs
private IAIProvider? CreateProvider(AIProviderType providerType, ProviderConfiguration config)
{
    return providerType switch
    {
        AIProviderType.CodebuddyCli =>
            ActivatorUtilities.CreateInstance<CodebuddyCliProvider>(
                _serviceProvider,
                Options.Create(config)),
        // ... Other providers
        _ => throw new NotSupportedException($"Provider {providerType} not supported")
    };
}

This uses dependency injection through ActivatorUtilities, which automatically handles constructor parameter injection and is very convenient.

Below is the core implementation of CodebuddyCliProvider, covering both streaming and non-streaming invocation modes:

public class CodebuddyCliProvider : IAIProvider
{
    private readonly ILogger<CodebuddyCliProvider> _logger;
    private readonly IACPSessionManager _sessionManager;
    private readonly ProviderConfiguration _config;

    public string Name => "CodebuddyCli";
    public bool SupportsStreaming => true;
    public ProviderCapabilities Capabilities { get; }

    public CodebuddyCliProvider(
        ILogger<CodebuddyCliProvider> logger,
        IACPSessionManager sessionManager,
        IOptions<ProviderConfiguration> config)
    {
        _logger = logger;
        _sessionManager = sessionManager;
        _config = config.Value;

        // Define the capabilities of the current Provider
        Capabilities = new ProviderCapabilities
        {
            SupportsStreaming = true,
            SupportsTools = true,
            SupportsSystemMessages = true,
            SupportsArtifacts = false,
            MaxTokens = 8192
        };
    }

    // Non-streaming call: return all results together after completion
    public async Task<AIResponse> ExecuteAsync(
        AIRequest request,
        CancellationToken cancellationToken = default)
    {
        // Create an independent session for the request
        var session = await _sessionManager.CreateSessionAsync(
            "CodebuddyCli",
            request.WorkingDirectory,
            cancellationToken,
            request.SessionId);
        try
        {
            var fullPrompt = BuildPrompt(request);
            await session.SendPromptAsync(fullPrompt, cancellationToken);

            var responseBuilder = new StringBuilder();
            var toolCalls = new List<AIToolCall>();

            // Collect all response chunks
            await foreach (var chunk in StreamFromSession(session, cancellationToken))
            {
                if (!string.IsNullOrEmpty(chunk.Content))
                {
                    responseBuilder.Append(chunk.Content);
                }
                // Handle tool calls...
            }

            return new AIResponse
            {
                Content = AIResultContentSanitizer.SanitizeResultContent(
                    responseBuilder.ToString()),
                ToolCalls = toolCalls,
                Provider = Name,
                Model = string.Empty
            };
        }
        finally
        {
            // Release session resources
            await session.DisposeAsync();
        }
    }

    // Streaming call: return response chunks in real time
    public async IAsyncEnumerable<AIStreamingChunk> StreamAsync(
        AIRequest request,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        var session = await _sessionManager.CreateSessionAsync(
            "CodebuddyCli",
            request.WorkingDirectory,
            cancellationToken);
        try
        {
            var fullPrompt = BuildPrompt(request);
            await session.SendPromptAsync(fullPrompt, cancellationToken);

            await foreach (var chunk in StreamFromSession(session, cancellationToken))
            {
                yield return chunk;
            }
        }
        finally
        {
            await session.DisposeAsync();
        }
    }

    private async IAsyncEnumerable<AIStreamingChunk> StreamFromSession(
        IACPSession session,
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        // Iterate through all updates in the session
        await foreach (var notification in session.ReceiveUpdatesAsync(cancellationToken))
        {
            switch (notification.Update)
            {
                case AgentMessageChunkSessionUpdate agentMessage:
                    // Handle text content chunks
                    if (agentMessage.Content is AcpImp.TextContentBlock textContent)
                    {
                        yield return new AIStreamingChunk
                        {
                            Content = textContent.Text,
                            Type = StreamingChunkType.ContentDelta,
                            IsComplete = false
                        };
                    }
                    break;

                case ToolCallSessionUpdate toolCall:
                    // Handle tool calls
                    yield return new AIStreamingChunk
                    {
                        Content = string.Empty,
                        Type = StreamingChunkType.ToolCallDelta,
                        ToolCallDelta = new AIToolCallDelta
                        {
                            Id = toolCall.ToolCallId,
                            Name = toolCall.Kind.ToString(),
                            Arguments = toolCall.RawInput?.ToString()
                        }
                    };
                    break;

                case AcpImp.PromptCompletedSessionUpdate:
                    // Response complete
                    yield break;
            }
        }
    }

    // Build the full prompt
    private string BuildPrompt(AIRequest request, string? embeddedCommandPrompt = null)
    {
        var sb = new StringBuilder();

        // Embedded command prompt, if present
        if (!string.IsNullOrEmpty(embeddedCommandPrompt))
        {
            sb.AppendLine(embeddedCommandPrompt);
            sb.AppendLine();
        }

        // System message
        if (!string.IsNullOrEmpty(request.SystemMessage))
        {
            sb.AppendLine(request.SystemMessage);
            sb.AppendLine();
        }

        // User prompt
        sb.Append(request.Prompt);
        return sb.ToString();
    }
}

There are several key points in this code:

  1. Session management: each request creates an independent session and releases resources after the request completes. This is a lesson learned through trial and error. If session reuse is not handled well, state pollution appears easily.

  2. Streaming processing: IAsyncEnumerable allows the response to be returned while it is still being generated, instead of waiting for all content to finish. This is especially important for long-text scenarios and significantly improves the user experience.

  3. Tool calls: CodeBuddy supports tool calling (Function Calling), handled through ToolCallSessionUpdate. This capability is critical for complex code editing tasks.

  4. Content filtering: AIResultContentSanitizer is used to filter Think block content and keep the output clean.
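As an illustration of point 4, a simple sanitizer could strip think blocks with a regular expression. This is a hypothetical sketch: the marker and method name are assumptions, and the actual AIResultContentSanitizer may use different markers and rules:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical sketch: remove <think>...</think> blocks from model output.
// Singleline lets "." match across line breaks inside a think block.
static string StripThinkBlocks(string content) =>
    Regex.Replace(content, "<think>.*?</think>", string.Empty,
        RegexOptions.Singleline | RegexOptions.IgnoreCase).Trim();

string raw = "<think>planning the answer...</think>Here is the sorted list.";
Console.WriteLine(StripThinkBlocks(raw)); // → Here is the sorted list.
```

The lazy quantifier `.*?` matters here: a greedy `.*` would swallow everything between the first `<think>` and the last `</think>` when a response contains several blocks.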

Add the related services during module registration:

PCodeClaudeHelperModule.cs
public void ConfigureModule(IServiceCollection services)
{
    // Register Provider
    services.AddTransient<CodebuddyCliProvider>();

    // Register ACP infrastructure
    services.AddSingleton<IACPSessionManager, ACPSessionManager>();
    services.AddSingleton<IAcpPlatformConfigurationResolver, AcpPlatformConfigurationResolver>();
    services.AddSingleton<IAIRequestToAcpMapper, AIRequestToAcpMapper>();
    services.AddSingleton<IAcpToAIResponseMapper, AcpToAIResponseMapper>();
}

Add CodeBuddy-related configuration to appsettings.json:

appsettings.json
{
  "AI": {
    // Default Provider to use
    "DefaultProvider": "CodebuddyCli",
    // Provider configuration
    "Providers": {
      "CodebuddyCli": {
        "Type": "CodebuddyCli",
        "WorkingDirectory": "C:/projects/my-app",
        "ExecutablePath": "C:/tools/codebuddy.cmd"
      }
    },
    // Platform-specific configuration
    "PlatformConfigurations": {
      "CodebuddyCli": {
        "ExecutablePath": "C:/tools/codebuddy.cmd",
        "Arguments": "--acp",
        "StartupTimeoutMs": 5000,
        "EnvironmentVariables": {
          "CODEBUDDY_API_KEY": "${CODEBUDDY_API_KEY}",
          "CODEBUDDY_INTERNET_ENVIRONMENT": "production"
        }
      }
    }
  }
}

The corresponding configuration model definition:

public class CodebuddyPlatformConfiguration : IAcpPlatformConfiguration
{
    public string ProviderName => "CodebuddyCli";
    public AcpTransportType TransportType => AcpTransportType.Stdio;

    public string ExecutablePath { get; set; } = "codebuddy";
    public string Arguments { get; set; } = "--acp";
    public int StartupTimeoutMs { get; set; } = 5000;
    public Dictionary<string, string?>? EnvironmentVariables { get; set; }
}

We ran into several typical pitfalls during implementation, and sharing them here may help others avoid the same detours:

  1. Session leak issue: at first, sessions were not released correctly, which exhausted process resources. The solution was to use try-finally to ensure resources are released for every request.

  2. Environment variable passing: Windows and Linux use different environment variable syntax, so we later standardized on Dictionary<string, string?> to handle this.

  3. Timeout configuration: CLI startup takes time, so we set a 5-second startup timeout to avoid fast request failures.

  4. Encoding issues: on Windows, the default encoding may cause garbled Chinese text, so UTF-8 encoding is explicitly specified when starting the process.

Beyond correctness, a few performance optimization directions are worth considering:

  1. Session pool: for frequent short requests, consider implementing a session pool to reuse processes
  2. Connection cache: the factory class already supports caching Provider instances
  3. Async first: use asynchronous programming throughout to avoid blocking threads

Performance is always worth optimizing. The longer users wait, the worse the experience becomes.

This article introduced a complete solution for integrating CodeBuddy CLI into a C# backend, covering the entire process from architecture design to concrete implementation. Through a layered architecture, we separate protocol details from business logic, making the code clearer and easier to maintain.

Key takeaways:

  • Use a layered architecture with a Provider contract layer, factory layer, implementation layer, and infrastructure layer
  • Use JSON-RPC over Stdio for inter-process communication
  • Implement flexible configuration and extensibility through dependency injection
  • Provide both streaming and non-streaming invocation modes

This approach is not only suitable for CodeBuddy; adding new AI Providers can follow the same pattern. If you are also building a similar multi-AI-Provider integration, I hope this article gives you a useful reference.


