
DevOps

2 posts with the tag “DevOps”

How to Automate Steam Releases with GitHub Actions


This article shares the complete solution we implemented for automated Steam releases in the HagiCode Desktop project, covering the end-to-end automation flow from GitHub Release to the Steam platform, including key technical details such as Steam Guard authentication and multi-platform Depot uploads.

The release workflow on Steam is actually quite different from traditional application distribution. Steam has its own complete update delivery system. Developers need to use SteamCMD to upload build artifacts to Steam’s CDN network, rather than simply dropping in a download link like on other platforms.

The HagiCode Desktop project is preparing for a Steam release, which introduced a few new challenges to our release workflow:

  1. We needed to convert existing build artifacts into a Steam-compatible format.
  2. We had to upload them to Steam through SteamCMD.
  3. We also had to handle Steam Guard authentication.
  4. We needed to support multi-platform Depot uploads for Linux, Windows, and macOS.
  5. We wanted a fully automated flow from GitHub Release to Steam.

The project had already implemented “portable version mode,” which allows the application to detect fixed service payloads packaged in the extra directory. Our goal was to integrate that portable version mode seamlessly with Steam distribution.

The solution shared in this article comes from our practical experience in the HagiCode project. HagiCode is an AI coding assistant that supports desktop usage. Since we are actively working toward launching on Steam, we needed to establish a reliable automated release workflow.

The core of the entire Steam release process is a GitHub Actions workflow that divides the process into three main stages:

┌─────────────────────────────────────────────────────────────┐
│ GitHub Actions Workflow (Steam Release) │
├─────────────────────────────────────────────────────────────┤
│ 1. Preparation Stage: │
│ - Check out portable-version code │
│ - Download build artifacts from GitHub Release │
│ - Extract and prepare the Steam content directory │
│ │
│ 2. SteamCMD Setup: │
│ - Install or reuse SteamCMD │
│ - Authenticate with Steam Guard │
│ │
│ 3. Release Stage: │
│ - Generate Depot VDF configuration files │
│ - Generate App Build VDF configuration files │
│ - Invoke SteamCMD to upload to Steam │
└─────────────────────────────────────────────────────────────┘
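A skeletal workflow matching these three stages might look like the following. This is an illustrative sketch only — the job layout, step names, and script paths are assumptions, not the project's actual workflow file:

```yaml
# Illustrative skeleton — step names and script paths are assumptions.
name: Steam Release
on:
  workflow_dispatch:
    inputs:
      release:
        description: 'Version tag to release (for example v1.0.0)'
        required: true
jobs:
  steam-release:
    runs-on: [self-hosted, Linux, X64, steam]
    steps:
      # 1. Preparation stage: check out code, fetch release artifacts
      - uses: actions/checkout@v4
      - run: node scripts/prepare-steam-release-input.mjs
      # 2. SteamCMD setup: install or reuse, then authenticate
      - run: ./scripts/setup-steamcmd.sh
      # 3. Release stage: generate VDFs and invoke SteamCMD
      - run: node scripts/run-steam-build.mjs
```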

This design offers several advantages:

  • It reuses existing GitHub Release artifacts and avoids rebuilding the same outputs.
  • It uses self-hosted runners for secure isolation.
  • It supports switching between preview mode and formal release branches.
  • It includes complete error handling and logging, which makes failures easier to diagnose.

Our workflow supports the following key parameters:

inputs:
  release: # Portable Version release tag
    description: 'Version tag to release (for example v1.0.0)'
    required: true
  steam_preview: # Whether to generate a preview build
    description: 'Whether to use preview mode'
    required: false
    default: 'false'
  steam_branch: # Steam branch to set live
    description: 'Target Steam branch'
    required: false
    default: 'preview'
  steam_description: # Build description override
    description: 'Build description'
    required: false

For security reasons, we use a self-hosted runner with the steam label:

runs-on:
  - self-hosted
  - Linux
  - X64
  - steam

This ensures that Steam releases run on a dedicated runner and keeps sensitive credentials safely isolated.

To prevent releases of the same version from interfering with each other, we configured concurrency control:

concurrency:
  group: portable-version-steam-${{ github.event.inputs.release }}
  cancel-in-progress: false

Notice that cancel-in-progress: false is set here because Steam releases can take a while, and we do not want a newly triggered run to cancel one that is already uploading.

The prepare-steam-release-input.mjs script is responsible for preparing the inputs required for release:

// Download the GitHub Release build manifest and artifact inventory
const buildManifest = await downloadBuildManifest(releaseTag);
const artifactInventory = await downloadArtifactInventory(releaseTag);

// Download compressed archives for each platform
for (const platform of ['linux-x64', 'win-x64', 'osx-universal']) {
  const artifactUrl = getArtifactUrl(artifactInventory, platform);
  await downloadArtifact(artifactUrl, platform);
}

// Extract into the Steam content directory structure
await extractToSteamContent(sources, contentRoot);
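The helper functions above belong to the project's scripts and are not shown in full. As an illustration, `getArtifactUrl` might look like this sketch — note that the inventory shape used here (an `artifacts` array with `platform` and `url` fields) is an assumption, since the real `{tag}.artifact-inventory.json` format is not shown:

```javascript
// Sketch of getArtifactUrl. NOTE: the inventory shape assumed here
// ({ artifacts: [{ platform, url }] }) is illustrative only; the real
// artifact-inventory file may be structured differently.
function getArtifactUrl(inventory, platform) {
  const entry = inventory.artifacts.find((a) => a.platform === platform);
  if (!entry) {
    throw new Error(`No artifact found for platform: ${platform}`);
  }
  return entry.url;
}
```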

Steam requires accounts to be protected with Steam Guard, so we implemented a shared-secret-based code generation algorithm:

function generateSteamGuardCode(sharedSecret, timestamp = Date.now()) {
  const secret = decodeSharedSecret(sharedSecret);
  const time = Math.floor(timestamp / 1000 / 30);
  const timeBuffer = Buffer.alloc(8);
  timeBuffer.writeBigUInt64BE(BigInt(time));

  // Use HMAC-SHA1 to generate a time-based one-time code
  const hash = crypto.createHmac('sha1', secret)
    .update(timeBuffer)
    .digest();

  // Convert it into a 5-character Steam Guard code
  const code = steamGuardCode(hash);
  return code;
}

This implementation is based on Steam Guard’s TOTP (Time-based One-Time Password) mechanism, generating a new verification code every 30 seconds.

VDF (Valve Data Format) is the configuration format used by Steam. We need to generate two types of VDF files:

Depot VDF is used to configure content for each platform:

function buildDepotVdf(depotId, contentRoot) {
  return [
    '"DepotBuildConfig"',
    '{',
    `  "DepotID" "${escapeVdf(depotId)}"`,
    `  "ContentRoot" "${escapeVdf(contentRoot)}"`,
    '  "FileMapping"',
    '  {',
    '    "LocalPath" "*"',
    '    "DepotPath" "."',
    '    "recursive" "1"',
    '  }',
    '}'
  ].join('\n');
}
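The `escapeVdf` helper used here is not shown in the article. A minimal version — an assumption about its behavior — escapes backslashes and double quotes so a value cannot terminate the surrounding VDF string early:

```javascript
// Minimal escapeVdf sketch: escape backslashes and double quotes so a value
// cannot break out of the VDF string syntax. The project's real helper may
// handle additional cases.
function escapeVdf(value) {
  return String(value).replace(/\\/g, '\\\\').replace(/"/g, '\\"');
}
```

For example, `escapeVdf('Build "v1.0"')` yields `Build \"v1.0\"`.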

App Build VDF is used to configure the entire application build:

function buildAppBuildVdf(appId, contentRoot, depotBuilds, description, setLive) {
  const vdf = [
    '"appbuild"',
    '{',
    `  "appid" "${appId}"`,
    `  "desc" "${escapeVdf(description)}"`,
    `  "contentroot" "${escapeVdf(contentRoot)}"`,
    '  "buildoutput" "build_output"',
    '  "depots"',
    '  {'
  ];

  for (const [depotId, depotVdfPath] of Object.entries(depotBuilds)) {
    vdf.push(`    "${depotId}" "${depotVdfPath}"`);
  }
  // Always close the depots block, regardless of setLive
  vdf.push('  }');

  if (setLive) {
    vdf.push(`  "setlive" "${setLive}"`);
  }

  vdf.push('}');
  return vdf.join('\n');
}

Finally, the upload is performed by invoking SteamCMD:

await runCommand(steamcmdPath, [
  '+login', steamUsername, steamPassword, steamGuardCode,
  '+run_app_build', appBuildPath,
  '+quit'
]);

This is the final step in the whole workflow. Once it succeeds, the release is done.

Steam uses the Depot system to manage content for different platforms. We support three main Depots:

| Platform | Depot Identifier | Architecture Support       |
| -------- | ---------------- | -------------------------- |
| Linux    | linux-x64        | x86_64                     |
| Windows  | win-x64          | x86_64                     |
| macOS    | osx-universal    | universal (x86_64, arm64)  |

Each Depot has its own content directory and VDF configuration file. This ensures that users on different platforms only download the content they actually need.

Step 1: Create a GitHub Release

First, create a GitHub Release in the portable-version repository that includes:

  • Compressed archives for each platform
  • Build manifest ({tag}.build-manifest.json)
  • Artifact inventory ({tag}.artifact-inventory.json)

Step 2: Trigger the Steam Release Workflow


Trigger the workflow manually through GitHub Actions and fill in the required parameters:

  • release: the version tag to publish, such as v1.0.0
  • steam_branch: the target branch, such as preview or public
  • steam_preview: whether to use preview mode

Step 3: Execute the Release Flow Automatically


The workflow automatically performs the following steps:

  1. Download and extract GitHub Release artifacts.
  2. Install or update SteamCMD.
  3. Generate Steam VDF configuration files.
  4. Authenticate with Steam Guard.
  5. Upload content to the Steam CDN.
  6. Set the specified branch live.

Once this sequence completes, the entire release path has run end to end.

Configure the following secrets in the GitHub repository settings:

| Secret Name            | Description                          |
| ---------------------- | ------------------------------------ |
| STEAM_USERNAME         | Steam account username               |
| STEAM_PASSWORD         | Steam account password               |
| STEAM_SHARED_SECRET    | Steam Guard shared secret (optional) |
| STEAM_GUARD_CODE       | Steam Guard code (optional)          |
| STEAM_APP_ID           | Steam application ID                 |
| STEAM_DEPOT_ID_LINUX   | Linux Depot ID                       |
| STEAM_DEPOT_ID_WINDOWS | Windows Depot ID                     |
| STEAM_DEPOT_ID_MACOS   | macOS Depot ID                       |

There is nothing especially unusual about these settings. You simply need all of the expected values in place.
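As an illustration of how these secrets flow into the release scripts, a loader might look like the sketch below. Only the variable names come from the secrets table; the function itself is an assumption, not the project's actual code:

```javascript
// Sketch: collect the release configuration from environment variables.
// Variable names follow the secrets table; the shape is illustrative only.
function loadSteamConfig(env = process.env) {
  for (const name of ['STEAM_USERNAME', 'STEAM_PASSWORD', 'STEAM_APP_ID']) {
    if (!env[name]) throw new Error(`Missing required secret: ${name}`);
  }
  return {
    username: env.STEAM_USERNAME,
    password: env.STEAM_PASSWORD,
    appId: env.STEAM_APP_ID,
    sharedSecret: env.STEAM_SHARED_SECRET ?? null, // optional
    depots: {
      'linux-x64': env.STEAM_DEPOT_ID_LINUX,
      'win-x64': env.STEAM_DEPOT_ID_WINDOWS,
      'osx-universal': env.STEAM_DEPOT_ID_MACOS,
    },
  };
}
```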

| Variable Name                  | Description                     | Default Value                            |
| ------------------------------ | ------------------------------- | ---------------------------------------- |
| PORTABLE_VERSION_STEAMCMD_ROOT | SteamCMD installation directory | ~/.local/share/portable-version/steamcmd |

On the first run, you need to enter the Steam Guard code manually. After that, it is recommended to configure the shared secret so the code can be generated automatically. This avoids manual intervention on every release.

SteamCMD saves the login token and can reuse it on later runs. You still need to keep an eye on token expiration, because re-authentication is required after the token expires.

Make sure the Steam content directory structure is correct:

steam-content/
├── linux-x64/       # Linux platform content
├── win-x64/         # Windows platform content
└── osx-universal/   # macOS universal binary content

Each directory should contain the complete application files for its corresponding platform.

Preview mode does not set any branch live, so it is suitable for testing and validation:

if [ "$STEAM_PREVIEW_INPUT" = 'true' ]; then
  cmd+=(--preview)
fi

This lets you upload to Steam first for verification, then switch to the formal branch after everything checks out.

The scripts include complete error handling and logging:

  • Validate that the GitHub Release exists.
  • Check required metadata files.
  • Ensure platform content is present.
  • Generate a GitHub Actions summary report.

This information is highly valuable for both debugging and auditing.

The workflow generates two kinds of artifacts:

  • portable-steam-release-preparation-{tag}: release preparation metadata
  • portable-steam-build-metadata-{tag}: Steam build metadata

These artifacts can be used for later auditing and debugging. A retention period of 30 days is a practical default.

In the HagiCode project, this automated release workflow has already run successfully across multiple versions. The entire path from GitHub Release to the Steam platform is fully automated and requires no manual intervention.

This significantly improved both release efficiency and reliability. In the past, manually publishing one version took more than 30 minutes. Now the entire process finishes in just a few minutes.

More importantly, the automated workflow reduces the chance of human error. Every release follows the same standardized process, which makes the results more predictable.

With the approach described in this article, we achieved:

  1. Full automation from GitHub Release to the Steam platform.
  2. Multi-platform Depot uploads.
  3. Secure authentication based on Steam Guard.
  4. Flexible switching between preview mode and formal release.
  5. Complete error handling and logging.

This solution is not only suitable for the HagiCode project, but can also serve as a reference for other projects planning to launch on Steam. If you are also considering Steam release automation, I hope this practical experience is useful to you.

Technology can feel complex or simple depending on the path you choose. The key is finding a workflow that fits your needs.

If this article helped you, feel free to star the HagiCode GitHub repository or visit the official website to learn more.

Thank you for reading. If you found this article useful, you are welcome to like, bookmark, and share it. This content was created with AI-assisted collaboration, and the final content was reviewed and confirmed by the author.

How to Use GitHub Actions + image-syncer for Automated Image Sync from Docker Hub to Azure ACR


This article explains how to use GitHub Actions and the image-syncer tool to automate image synchronization from Docker Hub to Azure Container Registry, solving the problem of slow Docker Hub access in mainland China and some Azure regions, while improving image availability and deployment efficiency in Azure environments.

The HagiCode project uses Docker images as its core runtime components, with the main images hosted on Docker Hub. As the project has evolved and Azure deployment needs have grown, we encountered the following pain points:

  • Slow image pulls, because access to Docker Hub is limited in mainland China and some Azure regions
  • Relying on a single image source creates a single point of failure risk
  • Using Azure Container Registry in Azure environments provides better network performance and integration experience

To solve these problems, we need to establish an automated image synchronization mechanism that regularly syncs images from Docker Hub to Azure ACR, ensuring users get faster image pull speeds and higher availability in Azure environments.

We are building HagiCode, an AI-driven coding assistant that makes development smarter, more convenient, and more enjoyable.

  • Smart: AI assistance throughout the entire process, from idea to code, boosting coding efficiency several times over.
  • Convenient: Multi-threaded concurrent operations make full use of resources and keep the development workflow smooth.
  • Fun: Gamification and an achievement system make coding less dull and more rewarding.

The project is evolving rapidly. If you are interested in technical writing, knowledge management, or AI-assisted development, welcome to check it out on GitHub.

When defining the solution, we compared three technical approaches.

Option 1: image-syncer (the approach we chose)

  • Incremental sync: only synchronizes changed image layers, significantly reducing network transfer
  • Resume support: synchronization can resume after network interruptions
  • Concurrency control: supports configurable concurrency to improve large image sync efficiency
  • Robust error handling: built-in retry mechanism for failures (3 times by default)
  • Lightweight deployment: single binary with no dependencies
  • Multi-registry support: compatible with Docker Hub, Azure ACR, Harbor, and more

Option 2: docker pull / docker push

  • Simple and easy to use: relies on familiar docker pull / docker push commands
  • No incremental sync support: each run requires pulling the full image content
  • Lower efficiency: large network transfer volume and longer execution time

Option 3: az acr import

  • Native integration: integrates well with Azure services
  • Higher complexity: requires Azure CLI authentication setup
  • Functional limitations: az acr import is relatively limited

Decision 1: Set the sync frequency to daily at 00:00 UTC

  • Balances image freshness with resource consumption
  • Avoids peak business hours and reduces impact on other operations
  • Docker Hub images are usually updated after daily builds

Decision 2: Synchronize all tags rather than a filtered subset

  • Maintains full consistency with Docker Hub
  • Provides flexible version choices for users
  • Simplifies sync logic by avoiding complex tag filtering rules

Decision 3: Store credentials in GitHub Secrets

  • Natively supported by GitHub Actions with strong security
  • Simple to configure and easy to manage and maintain
  • Supports repository-level access control

Risk 1: Credential leakage

  • Use GitHub Secrets for encrypted storage
  • Rotate ACR passwords regularly
  • Limit ACR account permissions to push-only
  • Monitor ACR access logs

Risk 2: Sync failures causing image inconsistency

  • image-syncer includes a built-in incremental sync mechanism
  • Automatic retry on failure (3 times by default)
  • Detailed error logs and failure notifications
  • Resume support

Risk 3: Long synchronization times

  • Incremental sync reduces network transfer
  • Configurable concurrency (10 in the current setup)
  • Monitor the number and size of synchronized images
  • Run synchronization during off-peak hours

We use an automated GitHub Actions + image-syncer solution to synchronize images from Docker Hub to Azure ACR.

  • Create or confirm an Azure Container Registry in Azure Portal
  • Create ACR access credentials (username and password)
  • Confirm access permissions for the Docker Hub image repository

Add the following secrets in the GitHub repository settings:

  • AZURE_ACR_USERNAME: Azure ACR username
  • AZURE_ACR_PASSWORD: Azure ACR password

Configure the workflow in .github/workflows/sync-docker-acr.yml:

  • Scheduled trigger: every day at 00:00 UTC
  • Manual trigger: supports workflow_dispatch
  • Extra trigger: run when the publish branch receives a push (for fast synchronization)

| Sequence | Participant                   | Action                                                   | Description                                                       |
| -------- | ----------------------------- | -------------------------------------------------------- | ----------------------------------------------------------------- |
| 1        | GitHub Actions                | Trigger workflow                                         | Triggered by schedule, manual run, or a push to the publish branch |
| 2        | GitHub Actions → image-syncer | Download and run the sync tool                           | Enter the actual sync phase                                        |
| 3        | image-syncer → Docker Hub     | Fetch image manifests and tag list                       | Read source repository metadata                                    |
| 4        | image-syncer → Azure ACR      | Fetch existing image information from the target repository | Determine the current target-side state                         |
| 5        | image-syncer                  | Compare source and target differences                    | Identify image layers that need to be synchronized                 |
| 6        | image-syncer → Docker Hub     | Pull changed image layers                                | Transfer only the content that needs updating                      |
| 7        | image-syncer → Azure ACR      | Push changed image layers                                | Complete incremental synchronization                               |
| 8        | image-syncer → GitHub Actions | Return synchronization statistics                        | Includes results, differences, and error information               |
| 9        | GitHub Actions                | Record logs and upload artifacts                         | Useful for follow-up auditing and troubleshooting                  |
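The diff step (row 5) can be pictured with a small sketch: given tag→digest maps for the source and target (how they are fetched is registry-specific and omitted here), only tags whose digests differ need transferring. This illustrates the idea, not image-syncer's actual code:

```javascript
// Sketch of the incremental-sync decision: compare content digests per tag
// and keep only tags that are new or changed on the target side.
function tagsNeedingSync(sourceDigests, targetDigests) {
  return Object.entries(sourceDigests)
    .filter(([tag, digest]) => targetDigests[tag] !== digest)
    .map(([tag]) => tag);
}
```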

Here is the actual workflow configuration in use (.github/workflows/sync-docker-acr.yml):

name: Sync Docker Image to Azure ACR

on:
  schedule:
    - cron: "0 0 * * *" # Every day at 00:00 UTC
  workflow_dispatch: # Manual trigger
  push:
    branches: [publish]

permissions:
  contents: read

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Download image-syncer
        run: |
          # Download the image-syncer binary
          wget https://github.com/AliyunContainerService/image-syncer/releases/download/v1.5.5/image-syncer-v1.5.5-linux-amd64.tar.gz
          tar -zxvf image-syncer-v1.5.5-linux-amd64.tar.gz
          chmod +x image-syncer

      - name: Create auth config
        run: |
          # Generate the authentication configuration file (YAML format)
          cat > auth.yaml <<EOF
          hagicode.azurecr.io:
            username: "${{ secrets.AZURE_ACR_USERNAME }}"
            password: "${{ secrets.AZURE_ACR_PASSWORD }}"
          EOF

      - name: Create images config
        run: |
          # Generate the image synchronization configuration file (YAML format)
          cat > images.yaml <<EOF
          docker.io/newbe36524/hagicode: hagicode.azurecr.io/hagicode
          EOF

      - name: Run image-syncer
        run: |
          # Run synchronization (using the newer --auth and --images parameters)
          ./image-syncer --auth=./auth.yaml --images=./images.yaml --proc=10 --retries=3

      - name: Upload logs
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: sync-logs
          path: image-syncer-*.log
          retention-days: 7

1. Trigger configuration

  • Scheduled trigger: cron: "0 0 * * *" - runs every day at 00:00 UTC
  • Manual trigger: workflow_dispatch - allows users to run it manually in the GitHub UI
  • Push trigger: push: branches: [publish] - triggered when the publish branch receives a push (for fast synchronization)

2. Authentication configuration (auth.yaml)

hagicode.azurecr.io:
  username: "${{ secrets.AZURE_ACR_USERNAME }}"
  password: "${{ secrets.AZURE_ACR_PASSWORD }}"

3. Image synchronization configuration (images.yaml)

docker.io/newbe36524/hagicode: hagicode.azurecr.io/hagicode

This configuration synchronizes all tags from docker.io/newbe36524/hagicode to hagicode.azurecr.io/hagicode.

Key image-syncer run parameters:

  • --auth=./auth.yaml: path to the authentication configuration file
  • --images=./images.yaml: path to the image synchronization configuration file
  • --proc=10: set concurrency to 10
  • --retries=3: retry failures 3 times

Configure the following in Settings → Secrets and variables → Actions in the GitHub repository:

| Secret Name        | Description        | Example Value                        | How to Get It                               |
| ------------------ | ------------------ | ------------------------------------ | ------------------------------------------- |
| AZURE_ACR_USERNAME | Azure ACR username | hagicode                             | Azure Portal → ACR → Access keys            |
| AZURE_ACR_PASSWORD | Azure ACR password | xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx | Azure Portal → ACR → Access keys → Password |

To trigger the workflow manually:

  1. Open the Actions tab of the GitHub repository
  2. Select the Sync Docker Image to Azure ACR workflow
  3. Click the Run workflow button
  4. Choose the branch and click Run workflow to confirm

To inspect a run:

  1. Click a specific workflow run record on the Actions page
  2. View the execution logs for each step
  3. Download the sync-logs file from the Artifacts section at the bottom of the page
To verify the synchronized images on the ACR side:

# Log in to Azure ACR
az acr login --name hagicode

# List images and their tags
az acr repository show-tags --name hagicode --repository hagicode --output table

Security recommendations:

  • Rotate Azure ACR passwords regularly (recommended every 90 days)
  • Use a dedicated ACR service account with push-only permissions
  • Monitor ACR access logs to detect abnormal access in time
  • Do not output credentials in logs
  • Do not commit credentials to the code repository

Performance tuning:

  • Adjust the --proc parameter: tune concurrency based on network bandwidth (recommended 5-20)
  • Monitor synchronization time: if it takes too long, consider reducing concurrency
  • Clean up logs regularly: set a reasonable retention-days value (7 days in the current setup)

Issue 1: Authentication failure

Error: failed to authenticate to hagicode.azurecr.io

Solution:

  1. Check whether GitHub Secrets are configured correctly
  2. Verify whether the Azure ACR password has expired
  3. Confirm whether the ACR service account permissions are correct

Issue 2: Network timeout

Error: timeout waiting for response

Solution:

  1. Check network connectivity
  2. Reduce concurrency (--proc parameter)
  3. Wait for the network to recover and trigger the workflow again

Issue 3: Partial tag sync failure

Warning: some tags failed to sync

Solution:

  1. Check the synchronization logs to identify failed tags
  2. Manually trigger the workflow to synchronize again
  3. Verify that the source image on Docker Hub is working properly

Ongoing monitoring:

  • Regularly check the Actions page to confirm workflow run status
  • Configure GitHub notifications to receive workflow failure alerts promptly
  • Monitor Azure ACR storage usage
  • Regularly verify tag consistency
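For the "verify tag consistency" item, once you have the tag lists from both registries (via the Docker Hub API and az acr repository show-tags, for example), the comparison itself is simple. This sketch is illustrative, not part of the project's tooling:

```javascript
// Sketch: report tags present on the source registry but absent from the
// target. Fetching the tag lists is registry-specific and omitted here.
function findMissingTags(sourceTags, targetTags) {
  const target = new Set(targetTags);
  return sourceTags.filter((tag) => !target.has(tag));
}
```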

Q1: How do I sync specific tags instead of all tags?


Modify the images.yaml configuration file:

# Sync only the latest and v1.0 tags
docker.io/newbe36524/hagicode:latest: hagicode.azurecr.io/hagicode:latest
docker.io/newbe36524/hagicode:v1.0: hagicode.azurecr.io/hagicode:v1.0

Q2: How do I sync multiple image repositories?


Add multiple lines in images.yaml:

docker.io/newbe36524/hagicode: hagicode.azurecr.io/hagicode
docker.io/newbe36524/another-image: hagicode.azurecr.io/another-image

Q3: How do I retry after a synchronization failure?

  • Automatic retry: image-syncer includes a built-in retry mechanism (3 times by default)
  • Manual retry: click Re-run all jobs on the GitHub Actions page

Q4: How do I view detailed synchronization progress?

  • View real-time logs on the Actions page
  • Download the sync-logs artifact to see the full log file
  • The log file includes the synchronization status and transfer speed for each tag

Q5: How long does a synchronization run take?

  • Initial full synchronization: typically takes 10-30 minutes depending on image size
  • Incremental synchronization: usually 2-5 minutes if image changes are small
  • Time depends on network bandwidth, image size, and concurrency settings

1. Add synchronization notifications

Add a notification step to the workflow:

- name: Notify on success
  if: success()
  run: |
    echo "Docker images synced successfully to Azure ACR"

2. Add tag filtering

Add tag filtering logic to the workflow:

- name: Filter tags
  run: |
    # Sync only tags that start with v
    echo "docker.io/newbe36524/hagicode:v* : hagicode.azurecr.io/hagicode:v*" > images.yaml

3. Add a synchronization statistics report

- name: Generate report
  if: always()
  run: |
    echo "## Sync Report" >> $GITHUB_STEP_SUMMARY
    echo "- Total tags: $(grep -c 'synced' image-syncer-*.log)" >> $GITHUB_STEP_SUMMARY
    echo "- Sync time: ${{ steps.sync.outputs.duration }}" >> $GITHUB_STEP_SUMMARY

With the method introduced in this article, we successfully implemented automated image synchronization from Docker Hub to Azure ACR. This solution uses the scheduled and manual trigger capabilities of GitHub Actions together with the incremental synchronization and error-handling features of image-syncer to ensure timely and consistent image synchronization.

We also discussed security best practices, performance optimization, troubleshooting, and other related topics to help users better manage and maintain this synchronization mechanism. We hope this article provides valuable reference material for developers who need to deploy Docker images in Azure environments.


Thank you for reading. If you found this article useful, please click the like button below 👍 so more people can discover it.

This content was created with AI-assisted collaboration, reviewed by me, and reflects my own views and position.