xahaud/.github/actions/xahau-ga-cache-restore/action.yml
Nicholas Dudfield 8b39d0915f feat: replace github actions cache with s3
Replaces actions/cache with custom S3-based caching to avoid upcoming
GitHub Actions cache eviction policy changes.

Changes:
- New S3 cache actions (xahau-ga-cache-restore/save)
- zstd compression (ccache: clang=323/gcc=609 MB, Conan: clang=1.1/gcc=1.9 GB)
- Immutability (first-write-wins)
- Bootstrap mode (creates empty dir if no cache)
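
For orientation, the restore action below expects every cache entry to be a single zstd-compressed tarball stored at s3://<bucket>/<key>-base.tar.zst. A save-side step consistent with that layout and with first-write-wins might look roughly like the sketch below; this is not the actual xahau-ga-cache-save implementation, and the temp-file path is illustrative.

- name: Save cache to S3 (sketch only)
  shell: bash
  run: |
    set -euo pipefail
    # S3_BUCKET, S3_REGION, CACHE_KEY and TARGET_PATH are assumed to be wired
    # up via env, exactly as in the restore step further down.
    S3_KEY="s3://${S3_BUCKET}/${CACHE_KEY}-base.tar.zst"
    # First-write-wins: an existing object is never overwritten
    if aws s3 ls "${S3_KEY}" --region "${S3_REGION}" >/dev/null 2>&1; then
      echo "Cache already exists for ${CACHE_KEY} - keeping the existing (immutable) entry"
      exit 0
    fi
    tar -cf - -C "${TARGET_PATH}" . | zstd -T0 -3 > /tmp/cache.tar.zst
    aws s3 cp /tmp/cache.tar.zst "${S3_KEY}" --region "${S3_REGION}" --no-progress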

Cache clearing tag:
- [ci-ga-clear-cache] - Clear all caches for this job
- [ci-ga-clear-cache:ccache] - Clear only ccache
- [ci-ga-clear-cache:conan] - Clear only conan

Configuration ordering fixes:
- ccache config applied AFTER cache restore (prevents stale cached config)
- Conan profile created AFTER cache restore (prevents stale cached profile)
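
In workflow terms, the ordering fix simply means the cache restore step runs before the steps that write ccache/Conan configuration, so a stale copy restored from the cache cannot shadow the freshly written one. Roughly (step names and the key layout are illustrative, and the required AWS credential inputs are elided):

- name: Restore ccache
  uses: ./.github/actions/xahau-ga-cache-restore
  with:
    path: ~/.ccache
    key: ccache-${{ runner.os }}-${{ matrix.compiler }}
    # ...plus the required aws-access-key-id / aws-secret-access-key inputs

- name: Configure ccache (after restore, so the config is always current)
  shell: bash
  run: |
    ccache -M 2G
    ccache --show-config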

ccache improvements:
- Single cache directory (~/.ccache)
- Wrapper toolchain (enables ccache without affecting Conan builds)
- Verbose build output (-v flag)
- Fixes #620
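
The wrapper toolchain amounts to layering ccache onto the compiler for the project's own CMake build only, so Conan's dependency builds keep their unwrapped compilers. A rough sketch of the idea - the file location and the exact wiring are illustrative, not lifted from the actual workflow:

- name: Write ccache wrapper toolchain (sketch only)
  shell: bash
  run: |
    # CMake launcher variables make the project build invoke "ccache <compiler>";
    # Conan never sees this file, so its package builds are unaffected.
    cat > "${HOME}/ccache-toolchain.cmake" <<'EOF'
    set(CMAKE_C_COMPILER_LAUNCHER ccache)
    set(CMAKE_CXX_COMPILER_LAUNCHER ccache)
    EOF

The project configure step would then point CMake at this file (or include it from the toolchain it already uses), which is what keeps ccache out of the Conan-managed builds.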

Conan improvements:
- Removed branch comparison logic for cache saves
- Cache keys don't include branch names, so the comparison was ineffective
- Fixes #618

Breaking changes:
- Workflows must pass AWS credentials (aws-access-key-id, aws-secret-access-key)
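
Because composite actions cannot read secrets.* themselves, the caller has to thread the credentials through explicitly. A minimal sketch of a calling step - the secret names, cache key layout, and path are assumptions, not taken from the actual workflows (the bucket and region fall back to the action's defaults):

- name: Restore Conan cache
  id: conan-cache
  uses: ./.github/actions/xahau-ga-cache-restore
  with:
    path: ~/.conan
    key: conan-${{ runner.os }}-${{ matrix.compiler }}-${{ hashFiles('**/conanfile.py') }}
    aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

- name: Report cache result
  shell: bash
  run: |
    echo "Exact hit: ${{ steps.conan-cache.outputs.cache-hit }}"
    echo "Matched key: ${{ steps.conan-cache.outputs.cache-matched-key }}"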

S3 setup:
- Bucket: xahaud-github-actions-cache-niq (us-east-1)
- Credentials already configured in GitHub secrets
2025-11-03 14:03:54 +07:00

name: 'Xahau Cache Restore (S3)'
bump: 1
description: 'Drop-in replacement for actions/cache/restore using S3 storage'
inputs:
  path:
    description: 'A list of files, directories, and wildcard patterns to cache (currently only single path supported)'
    required: true
  key:
    description: 'An explicit key for restoring the cache'
    required: true
  restore-keys:
    description: 'An ordered list of prefix-matched keys to use for restoring stale cache if no cache hit occurred for key'
    required: false
    default: ''
  s3-bucket:
    description: 'S3 bucket name for cache storage'
    required: false
    default: 'xahaud-github-actions-cache-niq'
  s3-region:
    description: 'S3 region'
    required: false
    default: 'us-east-1'
  fail-on-cache-miss:
    description: 'Fail the workflow if cache entry is not found'
    required: false
    default: 'false'
  lookup-only:
    description: 'Check if a cache entry exists for the given input(s) without downloading it'
    required: false
    default: 'false'
  # Note: Composite actions can't access secrets.* directly - must be passed from workflow
  aws-access-key-id:
    description: 'AWS Access Key ID for S3 access'
    required: true
  aws-secret-access-key:
    description: 'AWS Secret Access Key for S3 access'
    required: true
outputs:
  cache-hit:
    description: 'A boolean value to indicate an exact match was found for the primary key'
    value: ${{ steps.restore-cache.outputs.cache-hit }}
  cache-primary-key:
    description: 'The primary key that was requested (always the key input, even when a restore-keys fallback matched)'
    value: ${{ steps.restore-cache.outputs.cache-primary-key }}
  cache-matched-key:
    description: 'The key that was used to restore the cache (exact or prefix match)'
    value: ${{ steps.restore-cache.outputs.cache-matched-key }}
runs:
  using: 'composite'
  steps:
    - name: Restore cache from S3
      id: restore-cache
      shell: bash
      env:
        AWS_ACCESS_KEY_ID: ${{ inputs.aws-access-key-id }}
        AWS_SECRET_ACCESS_KEY: ${{ inputs.aws-secret-access-key }}
        S3_BUCKET: ${{ inputs.s3-bucket }}
        S3_REGION: ${{ inputs.s3-region }}
        CACHE_KEY: ${{ inputs.key }}
        RESTORE_KEYS: ${{ inputs.restore-keys }}
        TARGET_PATH: ${{ inputs.path }}
        FAIL_ON_MISS: ${{ inputs.fail-on-cache-miss }}
        LOOKUP_ONLY: ${{ inputs.lookup-only }}
      run: |
        set -euo pipefail
        echo "=========================================="
        echo "Xahau Cache Restore (S3)"
        echo "=========================================="
        echo "Target path: ${TARGET_PATH}"
        echo "Cache key: ${CACHE_KEY}"
        echo "S3 bucket: s3://${S3_BUCKET}"
        echo ""
        # Normalize target path (expand tilde)
        if [[ "${TARGET_PATH}" == ~* ]]; then
          TARGET_PATH="${HOME}${TARGET_PATH:1}"
        fi
        # Canonicalize path (Linux only - macOS realpath doesn't support -m)
        if [[ "$OSTYPE" == "linux-gnu"* ]]; then
          TARGET_PATH=$(realpath -m "${TARGET_PATH}")
        fi
        echo "Normalized target path: ${TARGET_PATH}"
        echo ""
        # Debug: Show commit message
        # (XAHAU_GA_COMMIT_MSG is not set in this step's env block; it is
        # expected to be exported as an environment variable by the calling workflow.)
        echo "=========================================="
        echo "DEBUG: Cache clear tag detection"
        echo "=========================================="
        echo "Raw commit message:"
        echo "${XAHAU_GA_COMMIT_MSG}"
        echo ""
        echo "Searching for: [ci-ga-clear-cache] or [ci-ga-clear-cache:*]"
        echo ""
        # Check for [ci-ga-clear-cache] tag in commit message (with optional search terms)
        # Examples:
        #   [ci-ga-clear-cache]           - Clear this job's cache
        #   [ci-ga-clear-cache:ccache]    - Clear only if key contains "ccache"
        #   [ci-ga-clear-cache:gcc Debug] - Clear only if key contains both "gcc" AND "Debug"
        # Extract search terms if present (e.g., "ccache" from "[ci-ga-clear-cache:ccache]")
        SEARCH_TERMS=$(echo "${XAHAU_GA_COMMIT_MSG}" | grep -o '\[ci-ga-clear-cache:[^]]*\]' | sed 's/\[ci-ga-clear-cache://;s/\]//' || echo "")
        SHOULD_CLEAR=false
        if [ -n "${SEARCH_TERMS}" ]; then
          # Search terms provided - check if THIS cache key matches ALL terms (AND logic)
          echo "🔍 [ci-ga-clear-cache:${SEARCH_TERMS}] detected"
          echo "Checking if cache key matches search terms..."
          echo " Cache key: ${CACHE_KEY}"
          echo " Search terms: ${SEARCH_TERMS}"
          echo ""
          MATCHES=true
          for term in ${SEARCH_TERMS}; do
            if ! echo "${CACHE_KEY}" | grep -q "${term}"; then
              MATCHES=false
              echo " ✗ Key does not contain '${term}'"
              break
            else
              echo " ✓ Key contains '${term}'"
            fi
          done
          if [ "${MATCHES}" = "true" ]; then
            echo ""
            echo "✅ Cache key matches all search terms - will clear cache"
            SHOULD_CLEAR=true
          else
            echo ""
            echo "⏭️ Cache key doesn't match search terms - skipping cache clear"
          fi
        elif echo "${XAHAU_GA_COMMIT_MSG}" | grep -q '\[ci-ga-clear-cache\]'; then
          # No search terms - always clear this job's cache
          echo "🗑️ [ci-ga-clear-cache] detected in commit message"
          echo "Clearing cache for key: ${CACHE_KEY}"
          SHOULD_CLEAR=true
        fi
        if [ "${SHOULD_CLEAR}" = "true" ]; then
          echo ""
          # Delete base layer
          S3_BASE_KEY="s3://${S3_BUCKET}/${CACHE_KEY}-base.tar.zst"
          if aws s3 ls "${S3_BASE_KEY}" --region "${S3_REGION}" >/dev/null 2>&1; then
            echo "Deleting base layer: ${S3_BASE_KEY}"
            aws s3 rm "${S3_BASE_KEY}" --region "${S3_REGION}" 2>/dev/null || true
            echo "✓ Base layer deleted"
          else
            echo " No base layer found to delete"
          fi
          echo ""
          echo "✅ Cache cleared successfully"
          echo "Build will proceed from scratch (bootstrap mode)"
          echo ""
        fi
        # Function to try restoring a cache key
        # Note: each candidate key is checked by appending -base.tar.zst and looking
        # for that exact object, so restore-keys behave as full keys here rather
        # than true prefixes as in actions/cache.
        try_restore_key() {
          local key=$1
          local s3_key="s3://${S3_BUCKET}/${key}-base.tar.zst"
          echo "Checking for key: ${key}"
          if aws s3 ls "${s3_key}" --region "${S3_REGION}" >/dev/null 2>&1; then
            echo "✓ Found cache: ${s3_key}"
            return 0
          else
            echo "✗ Not found: ${key}"
            return 1
          fi
        }
        # Try exact match first
        MATCHED_KEY=""
        EXACT_MATCH="false"
        if try_restore_key "${CACHE_KEY}"; then
          MATCHED_KEY="${CACHE_KEY}"
          EXACT_MATCH="true"
          echo ""
          echo "🎯 Exact cache hit for key: ${CACHE_KEY}"
        else
          # Try restore-keys (prefix matching)
          if [ -n "${RESTORE_KEYS}" ]; then
            echo ""
            echo "Primary key not found, trying restore-keys..."
            while IFS= read -r restore_key; do
              [ -z "${restore_key}" ] && continue
              restore_key=$(echo "${restore_key}" | xargs)
              if try_restore_key "${restore_key}"; then
                MATCHED_KEY="${restore_key}"
                EXACT_MATCH="false"
                echo ""
                echo "✓ Cache restored from fallback key: ${restore_key}"
                break
              fi
            done <<< "${RESTORE_KEYS}"
          fi
        fi
        # Check if we found anything
        if [ -z "${MATCHED_KEY}" ]; then
          echo ""
          echo "❌ No cache found for key: ${CACHE_KEY}"
          if [ "${FAIL_ON_MISS}" = "true" ]; then
            echo "fail-on-cache-miss is enabled, failing workflow"
            exit 1
          fi
          # Set outputs for cache miss
          echo "cache-hit=false" >> $GITHUB_OUTPUT
          echo "cache-primary-key=" >> $GITHUB_OUTPUT
          echo "cache-matched-key=" >> $GITHUB_OUTPUT
          # Create empty cache directory
          mkdir -p "${TARGET_PATH}"
          echo ""
          echo "=========================================="
          echo "Cache restore completed (bootstrap mode)"
          echo "Created empty cache directory: ${TARGET_PATH}"
          echo "=========================================="
          exit 0
        fi
        # If lookup-only, we're done
        if [ "${LOOKUP_ONLY}" = "true" ]; then
          echo "cache-hit=${EXACT_MATCH}" >> $GITHUB_OUTPUT
          echo "cache-primary-key=${CACHE_KEY}" >> $GITHUB_OUTPUT
          echo "cache-matched-key=${MATCHED_KEY}" >> $GITHUB_OUTPUT
          echo ""
          echo "=========================================="
          echo "Cache lookup completed (lookup-only mode)"
          echo "Cache exists: ${MATCHED_KEY}"
          echo "=========================================="
          exit 0
        fi
        # Download and extract cache
        S3_KEY="s3://${S3_BUCKET}/${MATCHED_KEY}-base.tar.zst"
        TEMP_TARBALL="/tmp/xahau-cache-restore-$$.tar.zst"
        echo ""
        echo "Downloading cache..."
        aws s3 cp "${S3_KEY}" "${TEMP_TARBALL}" --region "${S3_REGION}" --no-progress
        TARBALL_SIZE=$(du -h "${TEMP_TARBALL}" | cut -f1)
        echo "✓ Downloaded: ${TARBALL_SIZE}"
        # Create parent directory if needed
        mkdir -p "$(dirname "${TARGET_PATH}")"
        # Remove existing target if it exists
        if [ -e "${TARGET_PATH}" ]; then
          echo "Removing existing target: ${TARGET_PATH}"
          rm -rf "${TARGET_PATH}"
        fi
        # Create target directory and extract
        mkdir -p "${TARGET_PATH}"
        echo ""
        echo "Extracting cache..."
        zstd -d -c "${TEMP_TARBALL}" | tar -xf - -C "${TARGET_PATH}"
        echo "✓ Cache extracted to: ${TARGET_PATH}"
        # Cleanup
        rm -f "${TEMP_TARBALL}"
        # Set outputs
        echo "cache-hit=${EXACT_MATCH}" >> $GITHUB_OUTPUT
        echo "cache-primary-key=${CACHE_KEY}" >> $GITHUB_OUTPUT
        echo "cache-matched-key=${MATCHED_KEY}" >> $GITHUB_OUTPUT
        echo ""
        echo "=========================================="
        echo "Cache restore completed successfully"
        echo "Cache hit: ${EXACT_MATCH}"
        echo "Matched key: ${MATCHED_KEY}"
        echo "=========================================="