Compare commits

...

13 Commits

Author SHA1 Message Date
Bart
9de4bcac05 Clean up scripts 2025-12-23 10:08:44 -05:00
Bart
90234d1fd0 Improve readme, reduce number of configs to build 2025-12-23 10:04:43 -05:00
Bart
2543f2eb58 Exclude scripts and example code from Codecov 2025-12-22 20:23:57 -05:00
Bart
5c3eaa5101 Fix missing imports 2025-12-22 19:43:30 -05:00
Bart
6dfaeb11bc Skip Clang 20+ on ARM, reduce macOS and Windows builds, temporarily run all 2025-12-22 19:42:16 -05:00
Bart
01d57c9aa3 Update image SHA for Debian and RHEL, pass JSON array as string 2025-12-22 18:58:19 -05:00
Bart
c115b77970 Reusable workflows do not support choice 2025-12-22 18:48:30 -05:00
Bart
05a76895ad ci: Support more flexible strategy matrix generation 2025-12-22 16:52:45 -05:00
Bart
40198d9792 ci: Remove superfluous build directory creation (#6159)
This change modifies the build directory structure from `build/build/xxx` or `.build/build/xxx` to just `build/xxx`. Namely, the `conanfile.py` has the CMake generators build directory hardcoded to `build/generators`. We may as well leverage the top-level build directory without introducing another layer of directory nesting.
2025-12-22 16:30:23 -05:00
Bart
f059f0beda Set version to 3.2.0-b0 (#6153) 2025-12-17 18:21:01 -05:00
Mayukha Vadari
41c1be2bac refactor: remove Json::Object and related files/classes (#5894)
`Json::Object` and related objects are not used at all, so this change removes `include/xrpl/json/Object.h` and all downstream files. There are a number of minor downstream changes as well.

Full list of deleted classes and functions:
* `Json::Collections`
* `Json::Object`
* `Json::Array`
* `Json::WriterObject`
* `Json::setArray`
* `Json::addObject`
* `Json::appendArray`
* `Json::appendObject`

The last helper function, `copyFrom`, seemed a bit more complex and was actually used in a few places, so it was moved to `LedgerToJson.h` instead of deleting it.
2025-12-15 13:40:08 -05:00
Bart
f816ffa55f ci: Update shared actions (#6147)
The latest update to `cleanup-workspace`, `get-nproc`, and `prepare-runner` moved the action to the repository root directory, and also includes some ccache changes. In response, this change updates the various shared actions to the latest commit hash.
2025-12-12 19:47:34 +00:00
liuyueyangxmu
cf748702af chore: Fix some typos in comments (#6082) 2025-12-12 11:06:17 -05:00
52 changed files with 3114 additions and 1641 deletions

View File

@@ -32,7 +32,9 @@ parsers:
slack_app: false
ignore:
- "src/test/"
- "src/tests/"
- ".github/scripts/"
- "include/xrpl/beast/test/"
- "include/xrpl/beast/unit_test/"
- "src/test/"
- "src/tests/"
- "tests/"

View File

@@ -4,9 +4,6 @@ description: "Install Conan dependencies, optionally forcing a rebuild of all de
# Note that actions do not support 'type' and all inputs are strings, see
# https://docs.github.com/en/actions/reference/workflows-and-actions/metadata-syntax#inputs.
inputs:
build_dir:
description: "The directory where to build."
required: true
build_type:
description: 'The build type to use ("Debug", "Release").'
required: true
@@ -28,17 +25,13 @@ runs:
- name: Install Conan dependencies
shell: bash
env:
BUILD_DIR: ${{ inputs.build_dir }}
BUILD_NPROC: ${{ inputs.build_nproc }}
BUILD_OPTION: ${{ inputs.force_build == 'true' && '*' || 'missing' }}
BUILD_TYPE: ${{ inputs.build_type }}
LOG_VERBOSITY: ${{ inputs.log_verbosity }}
run: |
echo 'Installing dependencies.'
mkdir -p "${BUILD_DIR}"
cd "${BUILD_DIR}"
conan install \
--output-folder . \
--build="${BUILD_OPTION}" \
--options:host='&:tests=True' \
--options:host='&:xrpld=True' \
@@ -46,4 +39,4 @@ runs:
--conf:all tools.build:jobs=${BUILD_NPROC} \
--conf:all tools.build:verbosity="${LOG_VERBOSITY}" \
--conf:all tools.compilation:verbosity="${LOG_VERBOSITY}" \
..
.

View File

@@ -0,0 +1,118 @@
# Strategy Matrix
The scripts in this directory will generate a strategy matrix for GitHub Actions
CI, depending on the trigger that caused the workflow to run and the platform
specified.
There are several build, test, and publish settings that can be enabled for each
configuration. The settings are combined in a Cartesian product to generate the
full matrix, while filtering out any combinations not applicable to the trigger.
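The Cartesian-product-and-filter scheme described above can be sketched as follows. This is a minimal illustration with hypothetical setting names, not the actual spec objects used by the scripts:

```python
import itertools

# Hypothetical, simplified settings; the real scripts use richer spec objects
# and more dimensions (distro, compiler, build mode, and so on).
ARCHS = ["amd64", "arm64"]
BUILD_TYPES = ["debug", "release"]

# Subset of combinations enabled for PR commits; merges run everything.
COMMIT_SUBSET = {("amd64", "release")}

def generate(trigger: str) -> list[dict]:
    configs = []
    # Combine all settings in a Cartesian product, then filter out any
    # combinations not applicable to the trigger.
    for arch, build_type in itertools.product(ARCHS, BUILD_TYPES):
        if trigger == "commit" and (arch, build_type) not in COMMIT_SUBSET:
            continue
        configs.append({"arch": arch, "build_type": build_type})
    return configs

print(generate("commit"))  # [{'arch': 'amd64', 'build_type': 'release'}]
print(len(generate("merge")))  # 4
```

The filter runs per combination, so adding a dimension multiplies the candidate set but leaves the filtering logic unchanged.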
## Platforms
We support three platforms: Linux, macOS, and Windows.
### Linux
We support a variety of distributions (Debian, RHEL, and Ubuntu) and compilers
(GCC and Clang) on Linux. As there are so many combinations, we don't run them
all. Instead, we focus on a few key ones for PR commits and merges, while we run
most of them on a scheduled or ad hoc basis.
Some noteworthy configurations are:

- The official release build is GCC 14 on Debian Bullseye.
  - Although we generally enable assertions in release builds, we disable them
    for the official release build.
  - We publish .deb and .rpm packages for this build, as well as a Docker image.
  - For PR commits we also publish packages and images for testing purposes.
- Antithesis instrumentation is only supported on Clang 16+ on AMD64.
  - We publish a Docker image for this build, but no packages.
- Coverage reports are generated on Bookworm with GCC 15.
  - Coverage must be enabled for both commits (to show PR coverage) and merges
    (to show default branch coverage).
Note that we try to run pipelines equally across both AMD64 and ARM64, but in
some cases we cannot build on ARM64:

- All Clang 20+ builds on ARM64 are currently skipped due to a Boost build
  error.
- All RHEL builds on ARM64 are currently skipped due to a build failure that
  needs further investigation.
Also note that to create a multi-arch Docker image, we ideally build on both
AMD64 and ARM64, so both configs should be triggered by the same event.
However, as the script outputs individual configs, the workflow must be able to
run both builds separately and then merge the single-arch images afterward into
a multi-arch image.
### macOS
We support building on macOS, which uses the Apple Clang compiler and the ARM64
architecture. We use default settings for all builds, and don't publish any
packages or images.
### Windows
We also support building on Windows, which uses the MSVC compiler and the AMD64
architecture. While we could build on ARM64, we have not yet found a suitable
cloud machine to use as a GitHub runner. We use default settings for all builds,
and don't publish any packages or images.
## Triggers
We have four triggers that can cause the workflow to run:
- `commit`: A commit is pushed to a branch for which a pull request is open.
- `merge`: A pull request is merged.
- `label`: A label is added to a pull request.
- `schedule`: The workflow is run on a scheduled basis.
The `label` trigger is currently not used, but it is reserved for future use.
The `schedule` trigger is used to run the workflow each weekday, and is also
used for ad hoc testing via the `workflow_dispatch` event.
### Dependencies
The pipeline that is run for the `schedule` trigger will recompile and upload
all Conan packages to the remote for each configuration that is enabled. In
case any dependencies were added or updated in a recently merged PR, they will
then be available in the remote for the following pipeline runs. It is therefore
important that all configurations that are enabled for the `commit`, `merge`,
and `label` triggers are also enabled for the `schedule` trigger. We run
additional configurations in the `schedule` trigger that are not run for the
other triggers, to get extra confidence that the codebase can compile and run on
all supported platforms.
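The requirement above — every configuration enabled for `commit`, `merge`, or `label` must also be enabled for `schedule` — can be expressed as a small invariant check. This is a sketch with hypothetical config names, not the scripts' actual data model:

```python
# Sketch of the superset invariant: every config enabled for commit, merge, or
# label must also be enabled for schedule, so its dependencies get cached.
def check_schedule_superset(configs: dict[str, set[str]]) -> list[str]:
    """Return the names of configs that violate the invariant."""
    violations = []
    for name, triggers in configs.items():
        if triggers & {"commit", "merge", "label"} and "schedule" not in triggers:
            violations.append(name)
    return violations

configs = {
    "debian-gcc-15": {"commit", "schedule"},
    "rhel-clang": {"merge"},  # missing schedule: dependencies won't be cached
}
print(check_schedule_superset(configs))  # ['rhel-clang']
```

A check like this could run in CI to catch configurations that would miss the nightly dependency upload.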
#### Caveats
There is some nuance here in that certain options affect the compilation of the
dependencies, while others do not. This means that the same options need to be
enabled for the `schedule` trigger as for the other triggers to ensure any
dependency changes get cached in the Conan remote.
- Build mode (`unity`): Does not affect the dependencies.
- Build option (`coverage`, `voidstar`): Does not affect the dependencies.
- Build option (`sanitizer asan`, `sanitizer tsan`): Affects the dependencies.
- Build type (`debug`, `release`): Affects the dependencies.
- Build type (`publish`): Same effect as `release` on the dependencies.
- Test option (`reference fee`): Does not affect the dependencies.
- Publish option (`package`, `image`): Does not affect the dependencies.
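A minimal sketch of this distinction (option names are illustrative): only options that change how the dependencies are compiled require matching `schedule` coverage.

```python
# Whether each option affects dependency compilation, per the list above.
# Option names here are illustrative shorthand, not the scripts' identifiers.
AFFECTS_DEPENDENCIES = {
    "unity": False,
    "coverage": False,
    "voidstar": False,
    "sanitizer_asan": True,
    "sanitizer_tsan": True,
    "build_type": True,    # debug vs release
    "publish": True,       # same effect as release
    "reference_fee": False,
    "package": False,
    "image": False,
}

def needs_schedule_coverage(options: list[str]) -> bool:
    """True if any enabled option changes how dependencies are compiled."""
    return any(AFFECTS_DEPENDENCIES.get(opt, False) for opt in options)

print(needs_schedule_coverage(["unity", "coverage"]))  # False
print(needs_schedule_coverage(["sanitizer_tsan"]))     # True
```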
## Usage
Our GitHub CI pipeline uses the `generate.py` script to generate the matrix for
the current workflow invocation. Naturally, the script can be run locally to
generate the matrix for testing purposes, e.g.:
```bash
python3 generate.py --platform=linux --trigger=commit
```
If you want to pretty-print the output, you can pipe it to `jq` after stripping
off the `matrix=` prefix, e.g.:
```bash
python3 generate.py --platform=linux --trigger=commit | cut -d= -f2- | jq
```

View File

@@ -1,301 +1,211 @@
#!/usr/bin/env python3
import argparse
import dataclasses
import itertools
import json
from collections.abc import Iterator
from dataclasses import dataclass
from pathlib import Path

import linux
import macos
import windows
from helpers.defs import *
from helpers.enums import *
from helpers.funcs import *
from helpers.unique import *

THIS_DIR = Path(__file__).parent.resolve()
# The GitHub runner tags to use for the different architectures.
RUNNER_TAGS = {
    Arch.LINUX_AMD64: ["self-hosted", "Linux", "X64", "heavy"],
    Arch.LINUX_ARM64: ["self-hosted", "Linux", "ARM64", "heavy-arm64"],
    Arch.MACOS_ARM64: ["self-hosted", "macOS", "ARM64", "mac-runner-m1"],
    Arch.WINDOWS_AMD64: ["self-hosted", "Windows", "devbox"],
}


@dataclass
class Config:
    architecture: list[dict]
    os: list[dict]
    build_type: list[str]
    cmake_args: list[str]


def generate_configs(distros: list[Distro], trigger: Trigger) -> list[Config]:
    """Generate a strategy matrix for GitHub Actions CI.

    Args:
        distros: The distros to generate the matrix for.
        trigger: The trigger that caused the workflow to run.

    Returns:
        list[Config]: The generated configurations.

    Raises:
        ValueError: If any of the required fields are empty or invalid.
        TypeError: If any of the required fields are of the wrong type.
    """
    configs = []
    for distro in distros:
        for config in generate_config_for_distro(distro, trigger):
            configs.append(config)

    if not is_unique(configs):
        raise ValueError("configs must be a list of unique Config")

    return configs
"""
Generate a strategy matrix for GitHub Actions CI.
def generate_config_for_distro(distro: Distro, trigger: Trigger) -> Iterator[Config]:
"""Generate a strategy matrix for a specific distro.
On each PR commit we will build a selection of Debian, RHEL, Ubuntu, MacOS, and
Windows configurations, while upon merge into the develop, release, or master
branches, we will build all configurations, and test most of them.
Args:
distro: The distro to generate the matrix for.
trigger: The trigger that caused the workflow to run.
We will further set additional CMake arguments as follows:
- All builds will have the `tests`, `werr`, and `xrpld` options.
- All builds will have the `wextra` option except for GCC 12 and Clang 16.
- All release builds will have the `assert` option.
- Certain Debian Bookworm configurations will change the reference fee, enable
codecov, and enable voidstar in PRs.
"""
Yields:
Config: The next configuration to build.
Raises:
ValueError: If any of the required fields are empty or invalid.
TypeError: If any of the required fields are of the wrong type.
def generate_strategy_matrix(all: bool, config: Config) -> list:
    configurations = []
    for architecture, os, build_type, cmake_args in itertools.product(
        config.architecture, config.os, config.build_type, config.cmake_args
    ):
        # The default CMake target is 'all' for Linux and MacOS and 'install'
        # for Windows, but it can get overridden for certain configurations.
        cmake_target = "install" if os["distro_name"] == "windows" else "all"

        # We build and test all configurations by default, except for Windows in
        # Debug, because it is too slow, as well as when code coverage is
        # enabled as that mode already runs the tests.
        build_only = False
        if os["distro_name"] == "windows" and build_type == "Debug":
            build_only = True

        # Only generate a subset of configurations in PRs.
        if not all:
            # Debian:
            # - Bookworm using GCC 13: Release and Unity on linux/amd64, set
            #   the reference fee to 500.
            # - Bookworm using GCC 15: Debug and no Unity on linux/amd64, enable
            #   code coverage (which will be done below).
            # - Bookworm using Clang 16: Debug and no Unity on linux/arm64,
            #   enable voidstar.
            # - Bookworm using Clang 17: Release and no Unity on linux/amd64,
            #   set the reference fee to 1000.
            # - Bookworm using Clang 20: Debug and Unity on linux/amd64.
            if os["distro_name"] == "debian":
                skip = True
                if os["distro_version"] == "bookworm":
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-13"
                        and build_type == "Release"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        cmake_args = f"-DUNIT_TEST_REFERENCE_FEE=500 {cmake_args}"
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-15"
                        and build_type == "Debug"
                        and "-Dunity=OFF" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-16"
                        and build_type == "Debug"
                        and "-Dunity=OFF" in cmake_args
                        and architecture["platform"] == "linux/arm64"
                    ):
                        cmake_args = f"-Dvoidstar=ON {cmake_args}"
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-17"
                        and build_type == "Release"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        cmake_args = f"-DUNIT_TEST_REFERENCE_FEE=1000 {cmake_args}"
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-20"
                        and build_type == "Debug"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                if skip:
                    continue

            # RHEL:
            # - 9 using GCC 12: Debug and Unity on linux/amd64.
            # - 10 using Clang: Release and no Unity on linux/amd64.
            if os["distro_name"] == "rhel":
                skip = True
                if os["distro_version"] == "9":
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-12"
                        and build_type == "Debug"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                elif os["distro_version"] == "10":
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-any"
                        and build_type == "Release"
                        and "-Dunity=OFF" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                if skip:
                    continue

            # Ubuntu:
            # - Jammy using GCC 12: Debug and no Unity on linux/arm64.
            # - Noble using GCC 14: Release and Unity on linux/amd64.
            # - Noble using Clang 18: Debug and no Unity on linux/amd64.
            # - Noble using Clang 19: Release and Unity on linux/arm64.
            if os["distro_name"] == "ubuntu":
                skip = True
                if os["distro_version"] == "jammy":
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-12"
                        and build_type == "Debug"
                        and "-Dunity=OFF" in cmake_args
                        and architecture["platform"] == "linux/arm64"
                    ):
                        skip = False
                elif os["distro_version"] == "noble":
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-14"
                        and build_type == "Release"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-18"
                        and build_type == "Debug"
                        and "-Dunity=OFF" in cmake_args
                        and architecture["platform"] == "linux/amd64"
                    ):
                        skip = False
                    if (
                        f"{os['compiler_name']}-{os['compiler_version']}" == "clang-19"
                        and build_type == "Release"
                        and "-Dunity=ON" in cmake_args
                        and architecture["platform"] == "linux/arm64"
                    ):
                        skip = False
                if skip:
                    continue

            # MacOS:
            # - Debug and no Unity on macos/arm64.
            if os["distro_name"] == "macos" and not (
                build_type == "Debug"
                and "-Dunity=OFF" in cmake_args
                and architecture["platform"] == "macos/arm64"
            ):
                continue

            # Windows:
            # - Release and Unity on windows/amd64.
            if os["distro_name"] == "windows" and not (
                build_type == "Release"
                and "-Dunity=ON" in cmake_args
                and architecture["platform"] == "windows/amd64"
            ):
                continue

        # Additional CMake arguments.
        cmake_args = f"{cmake_args} -Dtests=ON -Dwerr=ON -Dxrpld=ON"
        if not f"{os['compiler_name']}-{os['compiler_version']}" in [
            "gcc-12",
            "clang-16",
        ]:
            cmake_args = f"{cmake_args} -Dwextra=ON"
        if build_type == "Release":
            cmake_args = f"{cmake_args} -Dassert=ON"
        # We skip all RHEL on arm64 due to a build failure that needs further
        # investigation.
        if os["distro_name"] == "rhel" and architecture["platform"] == "linux/arm64":
            continue

        # We skip all clang 20+ on arm64 due to Boost build error.
        if (
            f"{os['compiler_name']}-{os['compiler_version']}"
            in ["clang-20", "clang-21"]
            and architecture["platform"] == "linux/arm64"
        ):
            continue
    """
    for spec in distro.specs:
        if trigger not in spec.triggers:
            continue

        os_name = distro.os_name
        os_version = distro.os_version
        compiler_name = distro.compiler_name
        compiler_version = distro.compiler_version
        image_sha = distro.image_sha

        yield from generate_config_for_distro_spec(
            os_name,
            os_version,
            compiler_name,
            compiler_version,
            image_sha,
            spec,
            trigger,
        )
def generate_config_for_distro_spec(
    os_name: str,
    os_version: str,
    compiler_name: str,
    compiler_version: str,
    image_sha: str,
    spec: Spec,
    trigger: Trigger,
) -> Iterator[Config]:
    """Generate a strategy matrix for a specific distro and spec.

    Args:
        os_name: The OS name.
        os_version: The OS version.
        compiler_name: The compiler name.
        compiler_version: The compiler version.
        image_sha: The image SHA.
        spec: The spec to generate the matrix for.
        trigger: The trigger that caused the workflow to run.

    Yields:
        Config: The next configuration to build.
    """
    for trigger_, arch, build_mode, build_type in itertools.product(
        spec.triggers, spec.archs, spec.build_modes, spec.build_types
    ):
        if trigger_ != trigger:
            continue

        build_option = spec.build_option
        test_option = spec.test_option
        publish_option = spec.publish_option

        # Enable code coverage for Debian Bookworm using GCC 15 in Debug and no
        # Unity on linux/amd64
        if (
            f"{os['compiler_name']}-{os['compiler_version']}" == "gcc-15"
            and build_type == "Debug"
            and "-Dunity=OFF" in cmake_args
            and architecture["platform"] == "linux/amd64"
        ):
            cmake_args = f"-Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0 {cmake_args}"
        # Generate a unique name for the configuration, e.g. macos-arm64-debug
        # or debian-bookworm-gcc-12-amd64-release-unity.
        config_name = os["distro_name"]
        if (n := os["distro_version"]) != "":
            config_name += f"-{n}"
        if (n := os["compiler_name"]) != "":
            config_name += f"-{n}"
        if (n := os["compiler_version"]) != "":
            config_name += f"-{n}"
        config_name += (
            f"-{architecture['platform'][architecture['platform'].find('/') + 1 :]}"
        )
        config_name += f"-{build_type.lower()}"
        if "-Dunity=ON" in cmake_args:
            config_name += "-unity"

        # Add the configuration to the list, with the most unique fields first,
        # so that they are easier to identify in the GitHub Actions UI, as long
        # names get truncated.
        configurations.append(
            {
                "config_name": config_name,
                "cmake_args": cmake_args,
                "cmake_target": cmake_target,
                "build_only": build_only,
                "build_type": build_type,
                "os": os,
                "architecture": architecture,
            }
        )

    return configurations


        # Determine the configuration name.
        config_name = generate_config_name(
            os_name,
            os_version,
            compiler_name,
            compiler_version,
            arch,
            build_type,
            build_mode,
            build_option,
        )
        # Determine the CMake arguments.
        cmake_args = generate_cmake_args(
            compiler_name,
            compiler_version,
            build_type,
            build_mode,
            build_option,
            test_option,
        )

        # Determine the CMake target.
        cmake_target = generate_cmake_target(os_name, build_type)
def read_config(file: Path) -> Config:
    config = json.loads(file.read_text())
    if (
        config["architecture"] is None
        or config["os"] is None
        or config["build_type"] is None
        or config["cmake_args"] is None
    ):
        raise Exception("Invalid configuration file.")

    return Config(**config)


        # Determine whether to enable running tests, and to create a package
        # and/or image.
        enable_tests, enable_package, enable_image = generate_enable_options(
            os_name, build_type, publish_option
        )
        # Determine the image to run in, if applicable.
        image = generate_image_name(
            os_name,
            os_version,
            compiler_name,
            compiler_version,
            image_sha,
        )

        # Generate the configuration.
        yield Config(
            config_name=config_name,
            cmake_args=cmake_args,
            cmake_target=cmake_target,
            build_type=("Debug" if build_type == BuildType.DEBUG else "Release"),
            enable_tests=enable_tests,
            enable_package=enable_package,
            enable_image=enable_image,
            runs_on=RUNNER_TAGS[arch],
            image=image,
        )
if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "-a",
        "--all",
        help="Set to generate all configurations (generally used when merging a PR) or leave unset to generate a subset of configurations (generally used when committing to a PR).",
        action="store_true",
    )
    parser.add_argument(
        "-c",
        "--config",
        help="Path to the JSON file containing the strategy matrix configurations.",
        required=False,
        type=Path,
    )
    args = parser.parse_args()

    matrix = []
    if args.config is None or args.config == "":
        matrix += generate_strategy_matrix(
            args.all, read_config(THIS_DIR / "linux.json")
        )
        matrix += generate_strategy_matrix(
            args.all, read_config(THIS_DIR / "macos.json")
        )
        matrix += generate_strategy_matrix(
            args.all, read_config(THIS_DIR / "windows.json")
        )
    else:
        matrix += generate_strategy_matrix(args.all, read_config(args.config))

    # Generate the strategy matrix.
    print(f"matrix={json.dumps({'include': matrix})}")


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--platform",
        "-p",
        required=False,
        type=Platform,
        choices=list(Platform),
        help="The platform to run on.",
    )
    parser.add_argument(
        "--trigger",
        "-t",
        required=True,
        type=Trigger,
        choices=list(Trigger),
        help="The trigger that caused the workflow to run.",
    )
    args = parser.parse_args()

    # Collect the distros to generate configs for.
    distros = []
    if args.platform in [None, Platform.LINUX]:
        distros += linux.DEBIAN_DISTROS + linux.RHEL_DISTROS + linux.UBUNTU_DISTROS
    if args.platform in [None, Platform.MACOS]:
        distros += macos.DISTROS
    if args.platform in [None, Platform.WINDOWS]:
        distros += windows.DISTROS

    # Generate the configs.
    configs = generate_configs(distros, args.trigger)

    # Convert the configs into the format expected by GitHub Actions.
    include = []
    for config in configs:
        include.append(dataclasses.asdict(config))

    print(f"matrix={json.dumps({'include': include})}")

View File

@@ -0,0 +1,466 @@
import pytest
from generate import *
@pytest.fixture
def macos_distro():
return Distro(
os_name="macos",
specs=[
Spec(
archs=[Arch.MACOS_ARM64],
build_modes=[BuildMode.UNITY_OFF],
build_option=BuildOption.COVERAGE,
build_types=[BuildType.RELEASE],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
],
)
@pytest.fixture
def windows_distro():
return Distro(
os_name="windows",
specs=[
Spec(
archs=[Arch.WINDOWS_AMD64],
build_modes=[BuildMode.UNITY_ON],
build_option=BuildOption.SANITIZE_ASAN,
build_types=[BuildType.DEBUG],
publish_option=PublishOption.IMAGE_ONLY,
test_option=TestOption.REFERENCE_FEE_500,
triggers=[Trigger.COMMIT, Trigger.SCHEDULE],
)
],
)
@pytest.fixture
def linux_distro():
return Distro(
os_name="debian",
os_version="bookworm",
compiler_name="clang",
compiler_version="16",
image_sha="a1b2c3d4",
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_OFF],
build_option=BuildOption.SANITIZE_TSAN,
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.LABEL],
),
Spec(
archs=[Arch.LINUX_AMD64, Arch.LINUX_ARM64],
build_modes=[BuildMode.UNITY_OFF, BuildMode.UNITY_ON],
build_option=BuildOption.VOIDSTAR,
build_types=[BuildType.PUBLISH],
publish_option=PublishOption.PACKAGE_AND_IMAGE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT, Trigger.LABEL],
),
],
)
def test_macos_generate_config_for_distro_spec_matches_trigger(macos_distro):
trigger = Trigger.COMMIT
distro = macos_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[0],
trigger,
)
)
assert result == [
Config(
config_name="macos-coverage-release-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dassert=ON -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0",
cmake_target="all",
build_type="Release",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["self-hosted", "macOS", "ARM64", "mac-runner-m1"],
image=None,
)
]
def test_macos_generate_config_for_distro_spec_no_match_trigger(macos_distro):
trigger = Trigger.MERGE
distro = macos_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[0],
trigger,
)
)
assert result == []
def test_macos_generate_config_for_distro_matches_trigger(macos_distro):
trigger = Trigger.COMMIT
distro = macos_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == [
Config(
config_name="macos-coverage-release-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dassert=ON -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0",
cmake_target="all",
build_type="Release",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["self-hosted", "macOS", "ARM64", "mac-runner-m1"],
image=None,
)
]
def test_macos_generate_config_for_distro_no_match_trigger(macos_distro):
trigger = Trigger.MERGE
distro = macos_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == []
def test_windows_generate_config_for_distro_spec_matches_trigger(
windows_distro,
):
trigger = Trigger.COMMIT
distro = windows_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[0],
trigger,
)
)
assert result == [
Config(
config_name="windows-asan-debug-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dunity=ON -DUNIT_TEST_REFERENCE_FEE=500",
cmake_target="install",
build_type="Debug",
enable_tests=False,
enable_package=False,
enable_image=True,
runs_on=["self-hosted", "Windows", "devbox"],
image=None,
)
]
def test_windows_generate_config_for_distro_spec_no_match_trigger(
windows_distro,
):
trigger = Trigger.MERGE
distro = windows_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[0],
trigger,
)
)
assert result == []
def test_windows_generate_config_for_distro_matches_trigger(
windows_distro,
):
trigger = Trigger.COMMIT
distro = windows_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == [
Config(
config_name="windows-asan-debug-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dunity=ON -DUNIT_TEST_REFERENCE_FEE=500",
cmake_target="install",
build_type="Debug",
enable_tests=False,
enable_package=False,
enable_image=True,
runs_on=["self-hosted", "Windows", "devbox"],
image=None,
)
]
def test_windows_generate_config_for_distro_no_match_trigger(
windows_distro,
):
trigger = Trigger.MERGE
distro = windows_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == []
def test_linux_generate_config_for_distro_spec_matches_trigger(linux_distro):
trigger = Trigger.LABEL
distro = linux_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[1],
trigger,
)
)
assert result == [
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
]
def test_linux_generate_config_for_distro_spec_no_match_trigger(linux_distro):
trigger = Trigger.MERGE
distro = linux_distro
result = list(
generate_config_for_distro_spec(
distro.os_name,
distro.os_version,
distro.compiler_name,
distro.compiler_version,
distro.image_sha,
distro.specs[1],
trigger,
)
)
assert result == []
def test_linux_generate_config_for_distro_matches_trigger(linux_distro):
trigger = Trigger.LABEL
distro = linux_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == [
Config(
config_name="debian-bookworm-clang-16-tsan-debug-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
]
def test_linux_generate_config_for_distro_no_match_trigger(linux_distro):
trigger = Trigger.MERGE
distro = linux_distro
result = list(generate_config_for_distro(distro, trigger))
assert result == []
def test_generate_configs(macos_distro, windows_distro, linux_distro):
trigger = Trigger.COMMIT
distros = [macos_distro, windows_distro, linux_distro]
result = generate_configs(distros, trigger)
assert result == [
Config(
config_name="macos-coverage-release-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dassert=ON -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0",
cmake_target="all",
build_type="Release",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["self-hosted", "macOS", "ARM64", "mac-runner-m1"],
image=None,
),
Config(
config_name="windows-asan-debug-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dunity=ON -DUNIT_TEST_REFERENCE_FEE=500",
cmake_target="install",
build_type="Debug",
enable_tests=False,
enable_package=False,
enable_image=True,
runs_on=["self-hosted", "Windows", "devbox"],
image=None,
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-amd64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "X64", "heavy"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
Config(
config_name="debian-bookworm-clang-16-voidstar-publish-unity-arm64",
cmake_args="-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dunity=ON -Dvoidstar=ON",
cmake_target="install",
build_type="Release",
enable_tests=True,
enable_package=True,
enable_image=True,
runs_on=["self-hosted", "Linux", "ARM64", "heavy-arm64"],
image="ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4",
),
]
def test_generate_configs_raises_on_duplicate_configs(macos_distro):
trigger = Trigger.COMMIT
distros = [macos_distro, macos_distro]
with pytest.raises(ValueError):
generate_configs(distros, trigger)


@@ -0,0 +1,190 @@
from dataclasses import dataclass, field
from helpers.enums import *
from helpers.unique import *
@dataclass
class Config:
"""Represents a configuration to include in the strategy matrix.
Raises:
ValueError: If any of the required fields are empty or invalid.
TypeError: If any of the required fields are of the wrong type.
"""
config_name: str
cmake_args: str
cmake_target: str
build_type: str
enable_tests: bool
enable_package: bool
enable_image: bool
runs_on: list[str]
image: str | None = None
def __post_init__(self):
if not self.config_name:
raise ValueError("config_name cannot be empty")
if not isinstance(self.config_name, str):
raise TypeError("config_name must be a string")
if not self.cmake_args:
raise ValueError("cmake_args cannot be empty")
if not isinstance(self.cmake_args, str):
raise TypeError("cmake_args must be a string")
if not self.cmake_target:
raise ValueError("cmake_target cannot be empty")
if not isinstance(self.cmake_target, str):
raise TypeError("cmake_target must be a string")
if self.cmake_target not in ["all", "install"]:
raise ValueError("cmake_target must be 'all' or 'install'")
if not self.build_type:
raise ValueError("build_type cannot be empty")
if not isinstance(self.build_type, str):
raise TypeError("build_type must be a string")
if self.build_type not in ["Debug", "Release"]:
raise ValueError("build_type must be 'Debug' or 'Release'")
if not isinstance(self.enable_tests, bool):
raise TypeError("enable_tests must be a boolean")
if not isinstance(self.enable_package, bool):
raise TypeError("enable_package must be a boolean")
if not isinstance(self.enable_image, bool):
raise TypeError("enable_image must be a boolean")
if not self.runs_on:
raise ValueError("runs_on cannot be empty")
if not isinstance(self.runs_on, list):
raise TypeError("runs_on must be a list")
if not all(isinstance(runner, str) for runner in self.runs_on):
raise TypeError("runs_on must be a list of strings")
if not all(self.runs_on):
raise ValueError("runs_on must be a list of non-empty strings")
if len(self.runs_on) != len(set(self.runs_on)):
raise ValueError("runs_on must be a list of unique strings")
if self.image and not isinstance(self.image, str):
raise TypeError("image must be a string")
@dataclass
class Spec:
"""Represents a specification used by a configuration.
Raises:
ValueError: If any of the required fields are empty.
TypeError: If any of the required fields are of the wrong type.
"""
archs: list[Arch] = field(
default_factory=lambda: [Arch.LINUX_AMD64, Arch.LINUX_ARM64]
)
build_option: BuildOption = BuildOption.NONE
build_modes: list[BuildMode] = field(
default_factory=lambda: [BuildMode.UNITY_OFF, BuildMode.UNITY_ON]
)
build_types: list[BuildType] = field(
default_factory=lambda: [BuildType.DEBUG, BuildType.RELEASE]
)
publish_option: PublishOption = PublishOption.NONE
test_option: TestOption = TestOption.NONE
triggers: list[Trigger] = field(
default_factory=lambda: [Trigger.COMMIT, Trigger.MERGE, Trigger.SCHEDULE]
)
def __post_init__(self):
if not self.archs:
raise ValueError("archs cannot be empty")
if not isinstance(self.archs, list):
raise TypeError("archs must be a list")
if not all(isinstance(arch, Arch) for arch in self.archs):
raise TypeError("archs must be a list of Arch")
if len(self.archs) != len(set(self.archs)):
raise ValueError("archs must be a list of unique Arch")
if not isinstance(self.build_option, BuildOption):
raise TypeError("build_option must be a BuildOption")
if not self.build_modes:
raise ValueError("build_modes cannot be empty")
if not isinstance(self.build_modes, list):
raise TypeError("build_modes must be a list")
if not all(
isinstance(build_mode, BuildMode) for build_mode in self.build_modes
):
raise TypeError("build_modes must be a list of BuildMode")
if len(self.build_modes) != len(set(self.build_modes)):
raise ValueError("build_modes must be a list of unique BuildMode")
if not self.build_types:
raise ValueError("build_types cannot be empty")
if not isinstance(self.build_types, list):
raise TypeError("build_types must be a list")
if not all(
isinstance(build_type, BuildType) for build_type in self.build_types
):
raise TypeError("build_types must be a list of BuildType")
if len(self.build_types) != len(set(self.build_types)):
raise ValueError("build_types must be a list of unique BuildType")
if not isinstance(self.publish_option, PublishOption):
raise TypeError("publish_option must be a PublishOption")
if not isinstance(self.test_option, TestOption):
raise TypeError("test_option must be a TestOption")
if not self.triggers:
raise ValueError("triggers cannot be empty")
if not isinstance(self.triggers, list):
raise TypeError("triggers must be a list")
if not all(isinstance(trigger, Trigger) for trigger in self.triggers):
raise TypeError("triggers must be a list of Trigger")
if len(self.triggers) != len(set(self.triggers)):
raise ValueError("triggers must be a list of unique Trigger")
@dataclass
class Distro:
"""Represents a Linux, Windows or macOS distribution with specifications.
Raises:
ValueError: If any of the required fields are empty.
TypeError: If any of the required fields are of the wrong type.
"""
os_name: str
os_version: str = ""
compiler_name: str = ""
compiler_version: str = ""
image_sha: str = ""
specs: list[Spec] = field(default_factory=list)
def __post_init__(self):
if not self.os_name:
raise ValueError("os_name cannot be empty")
if not isinstance(self.os_name, str):
raise TypeError("os_name must be a string")
if self.os_version and not isinstance(self.os_version, str):
raise TypeError("os_version must be a string")
if self.compiler_name and not isinstance(self.compiler_name, str):
raise TypeError("compiler_name must be a string")
if self.compiler_version and not isinstance(self.compiler_version, str):
raise TypeError("compiler_version must be a string")
if self.image_sha and not isinstance(self.image_sha, str):
raise TypeError("image_sha must be a string")
if not self.specs:
raise ValueError("specs cannot be empty")
if not isinstance(self.specs, list):
raise TypeError("specs must be a list")
if not all(isinstance(spec, Spec) for spec in self.specs):
raise TypeError("specs must be a list of Spec")
if not is_unique(self.specs):
raise ValueError("specs must be a list of unique Spec")
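All three dataclasses above share the same validation idiom: a `dataclass` whose `__post_init__` raises `ValueError` for empty values and `TypeError` for wrong types. A minimal standalone sketch of the pattern (hypothetical `Example` class, not part of the module):

```python
from dataclasses import dataclass


@dataclass
class Example:
    name: str

    def __post_init__(self):
        # Same ordering as Config above: emptiness first, then type.
        if not self.name:
            raise ValueError("name cannot be empty")
        if not isinstance(self.name, str):
            raise TypeError("name must be a string")


try:
    Example(name="")
except ValueError as exc:
    print(exc)  # name cannot be empty
```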


@@ -0,0 +1,743 @@
import pytest
from helpers.defs import *
from helpers.enums import *
from helpers.funcs import *
def test_config_valid_none_image():
assert Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image=None,
)
def test_config_valid_empty_image():
assert Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="install",
build_type="Debug",
enable_tests=False,
enable_package=True,
enable_image=False,
runs_on=["label"],
image="",
)
def test_config_valid_with_image():
assert Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="install",
build_type="Release",
enable_tests=False,
enable_package=True,
enable_image=True,
runs_on=["label"],
image="image",
)
def test_config_raises_on_empty_config_name():
with pytest.raises(ValueError):
Config(
config_name="",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_config_name():
with pytest.raises(TypeError):
Config(
config_name=123,
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_empty_cmake_args():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_cmake_args():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args=123,
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_empty_cmake_target():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_invalid_cmake_target():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="invalid",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_cmake_target():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target=123,
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_empty_build_type():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_invalid_build_type():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="invalid",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_build_type():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type=123,
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_enable_tests():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=123,
enable_package=False,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_enable_package():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=123,
enable_image=False,
runs_on=["label"],
image="image",
)
def test_config_raises_on_wrong_enable_image():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=True,
enable_image=123,
runs_on=["label"],
image="image",
)
def test_config_raises_on_none_runs_on():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=None,
image="image",
)
def test_config_raises_on_empty_runs_on():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=[],
image="image",
)
def test_config_raises_on_invalid_runs_on():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=[""],
image="image",
)
def test_config_raises_on_wrong_runs_on():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=[123],
image="image",
)
def test_config_raises_on_duplicate_runs_on():
with pytest.raises(ValueError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label", "label"],
image="image",
)
def test_config_raises_on_wrong_image():
with pytest.raises(TypeError):
Config(
config_name="config",
cmake_args="-Doption=ON",
cmake_target="all",
build_type="Debug",
enable_tests=True,
enable_package=False,
enable_image=False,
runs_on=["label"],
image=123,
)
def test_spec_valid():
assert Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_none_archs():
with pytest.raises(ValueError):
Spec(
archs=None,
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_empty_archs():
with pytest.raises(ValueError):
Spec(
archs=[],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_archs():
with pytest.raises(TypeError):
Spec(
archs=[123],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_duplicate_archs():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64, Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_build_option():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=123,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_none_build_modes():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=None,
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_empty_build_modes():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_build_modes():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[123],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_none_build_types():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=None,
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_empty_build_types():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_build_types():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[123],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_duplicate_build_types():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG, BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_publish_option():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=123,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_wrong_test_option():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=123,
triggers=[Trigger.COMMIT],
)
def test_spec_raises_on_none_triggers():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=None,
)
def test_spec_raises_on_empty_triggers():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[],
)
def test_spec_raises_on_wrong_triggers():
with pytest.raises(TypeError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[123],
)
def test_spec_raises_on_duplicate_triggers():
with pytest.raises(ValueError):
Spec(
archs=[Arch.LINUX_AMD64],
build_option=BuildOption.NONE,
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.NONE,
test_option=TestOption.NONE,
triggers=[Trigger.COMMIT, Trigger.COMMIT],
)
def test_distro_valid_none_image_sha():
assert Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha=None,
specs=[Spec()], # This is valid due to the default values.
)
def test_distro_valid_empty_os_compiler_image_sha():
assert Distro(
os_name="os_name",
os_version="",
compiler_name="",
compiler_version="",
image_sha="",
specs=[Spec()],
)
def test_distro_valid_with_image():
assert Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_empty_os_name():
with pytest.raises(ValueError):
Distro(
os_name="",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_wrong_os_name():
with pytest.raises(TypeError):
Distro(
os_name=123,
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_wrong_os_version():
with pytest.raises(TypeError):
Distro(
os_name="os_name",
os_version=123,
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_wrong_compiler_name():
with pytest.raises(TypeError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name=123,
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_wrong_compiler_version():
with pytest.raises(TypeError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version=123,
image_sha="image_sha",
specs=[Spec()],
)
def test_distro_raises_on_wrong_image_sha():
with pytest.raises(TypeError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha=123,
specs=[Spec()],
)
def test_distro_raises_on_none_specs():
with pytest.raises(ValueError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=None,
)
def test_distro_raises_on_empty_specs():
with pytest.raises(ValueError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[],
)
def test_distro_raises_on_invalid_specs():
with pytest.raises(ValueError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec(triggers=[])],
)
def test_distro_raises_on_duplicate_specs():
with pytest.raises(ValueError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[Spec(), Spec()],
)
def test_distro_raises_on_wrong_specs():
with pytest.raises(TypeError):
Distro(
os_name="os_name",
os_version="os_version",
compiler_name="compiler_name",
compiler_version="compiler_version",
image_sha="image_sha",
specs=[123],
)


@@ -0,0 +1,75 @@
from enum import StrEnum, auto
class Arch(StrEnum):
"""Represents architectures to build for."""
LINUX_AMD64 = "linux/amd64"
LINUX_ARM64 = "linux/arm64"
MACOS_ARM64 = "macos/arm64"
WINDOWS_AMD64 = "windows/amd64"
class BuildMode(StrEnum):
"""Represents whether to perform a unity or non-unity build."""
UNITY_OFF = auto()
UNITY_ON = auto()
class BuildOption(StrEnum):
"""Represents build options to enable."""
NONE = auto()
COVERAGE = auto()
SANITIZE_ASAN = (
auto()
) # Address Sanitizer, also includes Undefined Behavior Sanitizer.
SANITIZE_TSAN = (
auto()
) # Thread Sanitizer, also includes Undefined Behavior Sanitizer.
VOIDSTAR = auto()
class BuildType(StrEnum):
"""Represents the build type to use."""
DEBUG = auto()
RELEASE = auto()
PUBLISH = auto() # Release build without assertions.
class PublishOption(StrEnum):
"""Represents whether to publish a package, an image, or both."""
NONE = auto()
PACKAGE_ONLY = auto()
IMAGE_ONLY = auto()
PACKAGE_AND_IMAGE = auto()
class TestOption(StrEnum):
"""Represents test options to enable, specifically the reference fee to use."""
__test__ = False # Tell pytest to not consider this as a test class.
NONE = "" # Use the default reference fee of 10.
REFERENCE_FEE_500 = "500"
REFERENCE_FEE_1000 = "1000"
class Platform(StrEnum):
"""Represents the platform to use."""
LINUX = "linux"
MACOS = "macos"
WINDOWS = "windows"
class Trigger(StrEnum):
"""Represents the trigger that caused the workflow to run."""
COMMIT = "commit"
LABEL = "label"
MERGE = "merge"
SCHEDULE = "schedule"


@@ -0,0 +1,235 @@
from helpers.defs import *
from helpers.enums import *
def generate_config_name(
os_name: str,
os_version: str | None,
compiler_name: str | None,
compiler_version: str | None,
arch: Arch,
build_type: BuildType,
build_mode: BuildMode,
build_option: BuildOption,
) -> str:
"""Create a configuration name based on the distro details and build
attributes.
The configuration name is used as the display name in the GitHub Actions
UI, and since GitHub truncates long names, we have to ensure the most
important information is at the beginning of the name.
Args:
os_name (str): The OS name.
os_version (str): The OS version.
compiler_name (str): The compiler name.
compiler_version (str): The compiler version.
arch (Arch): The architecture.
build_type (BuildType): The build type.
build_mode (BuildMode): The build mode.
build_option (BuildOption): The build option.
Returns:
str: The configuration name.
Raises:
ValueError: If the OS name is empty.
"""
if not os_name:
raise ValueError("os_name cannot be empty")
config_name = os_name
if os_version:
config_name += f"-{os_version}"
if compiler_name:
config_name += f"-{compiler_name}"
if compiler_version:
config_name += f"-{compiler_version}"
if build_option == BuildOption.COVERAGE:
config_name += "-coverage"
elif build_option == BuildOption.VOIDSTAR:
config_name += "-voidstar"
elif build_option == BuildOption.SANITIZE_ASAN:
config_name += "-asan"
elif build_option == BuildOption.SANITIZE_TSAN:
config_name += "-tsan"
if build_type == BuildType.DEBUG:
config_name += "-debug"
elif build_type == BuildType.RELEASE:
config_name += "-release"
elif build_type == BuildType.PUBLISH:
config_name += "-publish"
if build_mode == BuildMode.UNITY_ON:
config_name += "-unity"
config_name += f"-{arch.value.split('/')[1]}"
return config_name
def generate_cmake_args(
compiler_name: str | None,
compiler_version: str | None,
build_type: BuildType,
build_mode: BuildMode,
build_option: BuildOption,
test_option: TestOption,
) -> str:
"""Create the CMake arguments based on the build type and enabled build
options.
- All builds will have the `tests`, `werr`, and `xrpld` options.
- All builds will have the `wextra` option except for GCC 12 and Clang 16.
- All release builds will have the `assert` option.
- Set the unity option if specified.
- Set the coverage option if specified.
- Set the voidstar option if specified.
- Set the reference fee if specified.
Args:
compiler_name (str): The compiler name.
compiler_version (str): The compiler version.
build_type (BuildType): The build type.
build_mode (BuildMode): The build mode.
build_option (BuildOption): The build option.
test_option (TestOption): The test option.
Returns:
str: The CMake arguments.
"""
cmake_args = "-Dtests=ON -Dwerr=ON -Dxrpld=ON"
if f"{compiler_name}-{compiler_version}" not in [
"gcc-12",
"clang-16",
]:
cmake_args += " -Dwextra=ON"
if build_type == BuildType.RELEASE:
cmake_args += " -Dassert=ON"
if build_mode == BuildMode.UNITY_ON:
cmake_args += " -Dunity=ON"
if build_option == BuildOption.COVERAGE:
cmake_args += " -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0"
elif build_option == BuildOption.SANITIZE_ASAN:
pass # TODO: Add ASAN-UBSAN flags.
elif build_option == BuildOption.SANITIZE_TSAN:
pass # TODO: Add TSAN-UBSAN flags.
elif build_option == BuildOption.VOIDSTAR:
cmake_args += " -Dvoidstar=ON"
if test_option != TestOption.NONE:
cmake_args += f" -DUNIT_TEST_REFERENCE_FEE={test_option.value}"
return cmake_args
def generate_cmake_target(os_name: str, build_type: BuildType) -> str:
"""Create the CMake target based on the build type.
The `install` target is used for Windows and for publishing a package, while
the `all` target is used for all other configurations.
Args:
os_name (str): The OS name.
build_type (BuildType): The build type.
Returns:
str: The CMake target.
"""
if os_name == "windows" or build_type == BuildType.PUBLISH:
return "install"
return "all"
def generate_enable_options(
os_name: str,
build_type: BuildType,
publish_option: PublishOption,
) -> tuple[bool, bool, bool]:
"""Create the enable flags based on the OS name, build type, and publish
option.
We build and test all configurations by default, except for Windows in
Debug, because it is too slow.
Args:
os_name (str): The OS name.
build_type (BuildType): The build type.
publish_option (PublishOption): The publish option.
Returns:
tuple: A tuple containing the enable test, enable package, and enable image flags.
"""
enable_tests = not (os_name == "windows" and build_type == BuildType.DEBUG)
enable_package = publish_option in [
PublishOption.PACKAGE_ONLY,
PublishOption.PACKAGE_AND_IMAGE,
]
enable_image = publish_option in [
PublishOption.IMAGE_ONLY,
PublishOption.PACKAGE_AND_IMAGE,
]
return enable_tests, enable_package, enable_image
def generate_image_name(
os_name: str,
os_version: str,
compiler_name: str,
compiler_version: str,
image_sha: str,
) -> str | None:
"""Create the Docker image name based on the distro details.
Args:
os_name (str): The OS name.
os_version (str): The OS version.
compiler_name (str): The compiler name.
compiler_version (str): The compiler version.
image_sha (str): The image SHA.
Returns:
str: The Docker image name or None if not applicable.
Raises:
ValueError: If any of the arguments is empty for Linux.
"""
if os_name in ("windows", "macos"):
return None
if not os_name:
raise ValueError("os_name cannot be empty")
if not os_version:
raise ValueError("os_version cannot be empty")
if not compiler_name:
raise ValueError("compiler_name cannot be empty")
if not compiler_version:
raise ValueError("compiler_version cannot be empty")
if not image_sha:
raise ValueError("image_sha cannot be empty")
return f"ghcr.io/xrplf/ci/{os_name}-{os_version}:{compiler_name}-{compiler_version}-{image_sha}"
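For Linux distros the function above produces names in the same format as the image strings hardcoded in the tests. A standalone sketch of just the final f-string (sample argument values chosen to match those test fixtures):

```python
def image_name(os_name, os_version, compiler_name, compiler_version, image_sha):
    # Mirrors the final f-string in generate_image_name above.
    return f"ghcr.io/xrplf/ci/{os_name}-{os_version}:{compiler_name}-{compiler_version}-{image_sha}"


print(image_name("debian", "bookworm", "clang", "16", "a1b2c3d4"))
# ghcr.io/xrplf/ci/debian-bookworm:clang-16-a1b2c3d4
```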


@@ -0,0 +1,419 @@
import pytest
from helpers.enums import *
from helpers.funcs import *
def test_generate_config_name_a_b_c_d_debug_amd64():
assert (
generate_config_name(
"a",
"b",
"c",
"d",
Arch.LINUX_AMD64,
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
)
== "a-b-c-d-debug-amd64"
)
def test_generate_config_name_a_b_c_release_unity_arm64():
assert (
generate_config_name(
"a",
"b",
"c",
"",
Arch.LINUX_ARM64,
BuildType.RELEASE,
BuildMode.UNITY_ON,
BuildOption.NONE,
)
== "a-b-c-release-unity-arm64"
)
def test_generate_config_name_a_b_coverage_publish_amd64():
assert (
generate_config_name(
"a",
"b",
"",
"",
Arch.LINUX_AMD64,
BuildType.PUBLISH,
BuildMode.UNITY_OFF,
BuildOption.COVERAGE,
)
== "a-b-coverage-publish-amd64"
)
def test_generate_config_name_a_asan_debug_unity_arm64():
assert (
generate_config_name(
"a",
"",
"",
"",
Arch.LINUX_ARM64,
BuildType.DEBUG,
BuildMode.UNITY_ON,
BuildOption.SANITIZE_ASAN,
)
== "a-asan-debug-unity-arm64"
)
def test_generate_config_name_a_c_tsan_release_amd64():
assert (
generate_config_name(
"a",
"",
"c",
"",
Arch.LINUX_AMD64,
BuildType.RELEASE,
BuildMode.UNITY_OFF,
BuildOption.SANITIZE_TSAN,
)
== "a-c-tsan-release-amd64"
)
def test_generate_config_name_a_d_voidstar_debug_amd64():
assert (
generate_config_name(
"a",
"",
"",
"d",
Arch.LINUX_AMD64,
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.VOIDSTAR,
)
== "a-d-voidstar-debug-amd64"
)
def test_generate_config_name_raises_on_none_os_name():
with pytest.raises(ValueError):
generate_config_name(
None,
"b",
"c",
"d",
Arch.LINUX_AMD64,
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
)
def test_generate_config_name_raises_on_empty_os_name():
with pytest.raises(ValueError):
generate_config_name(
"",
"b",
"c",
"d",
Arch.LINUX_AMD64,
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
)
def test_generate_cmake_args_a_b_debug():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON"
)
def test_generate_cmake_args_gcc_12_no_wextra():
assert (
generate_cmake_args(
"gcc",
"12",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON"
)
def test_generate_cmake_args_clang_16_no_wextra():
assert (
generate_cmake_args(
"clang",
"16",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON"
)
def test_generate_cmake_args_a_b_release():
assert (
generate_cmake_args(
"a",
"b",
BuildType.RELEASE,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dassert=ON"
)
def test_generate_cmake_args_a_b_publish():
assert (
generate_cmake_args(
"a",
"b",
BuildType.PUBLISH,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON"
)
def test_generate_cmake_args_a_b_unity():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_ON,
BuildOption.NONE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dunity=ON"
)
def test_generate_cmake_args_a_b_coverage():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.COVERAGE,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0"
)
def test_generate_cmake_args_a_b_voidstar():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.VOIDSTAR,
TestOption.NONE,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dvoidstar=ON"
)
def test_generate_cmake_args_a_b_reference_fee_500():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.REFERENCE_FEE_500,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -DUNIT_TEST_REFERENCE_FEE=500"
)
def test_generate_cmake_args_a_b_reference_fee_1000():
assert (
generate_cmake_args(
"a",
"b",
BuildType.DEBUG,
BuildMode.UNITY_OFF,
BuildOption.NONE,
TestOption.REFERENCE_FEE_1000,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -DUNIT_TEST_REFERENCE_FEE=1000"
)
def test_generate_cmake_args_a_b_multiple():
assert (
generate_cmake_args(
"a",
"b",
BuildType.RELEASE,
BuildMode.UNITY_ON,
BuildOption.VOIDSTAR,
TestOption.REFERENCE_FEE_500,
)
== "-Dtests=ON -Dwerr=ON -Dxrpld=ON -Dwextra=ON -Dassert=ON -Dunity=ON -Dvoidstar=ON -DUNIT_TEST_REFERENCE_FEE=500"
)
def test_generate_cmake_target_linux_debug():
assert generate_cmake_target("linux", BuildType.DEBUG) == "all"
def test_generate_cmake_target_linux_release():
assert generate_cmake_target("linux", BuildType.RELEASE) == "all"
def test_generate_cmake_target_linux_publish():
assert generate_cmake_target("linux", BuildType.PUBLISH) == "install"
def test_generate_cmake_target_macos_debug():
assert generate_cmake_target("macos", BuildType.DEBUG) == "all"
def test_generate_cmake_target_macos_release():
assert generate_cmake_target("macos", BuildType.RELEASE) == "all"
def test_generate_cmake_target_macos_publish():
assert generate_cmake_target("macos", BuildType.PUBLISH) == "install"
def test_generate_cmake_target_windows_debug():
assert generate_cmake_target("windows", BuildType.DEBUG) == "install"
def test_generate_cmake_target_windows_release():
assert generate_cmake_target("windows", BuildType.RELEASE) == "install"
def test_generate_cmake_target_windows_publish():
assert generate_cmake_target("windows", BuildType.PUBLISH) == "install"
def test_generate_enable_options_linux_debug_no_publish():
assert generate_enable_options("linux", BuildType.DEBUG, PublishOption.NONE) == (
True,
False,
False,
)
def test_generate_enable_options_linux_release_package_only():
assert generate_enable_options(
"linux", BuildType.RELEASE, PublishOption.PACKAGE_ONLY
) == (True, True, False)
def test_generate_enable_options_linux_publish_image_only():
assert generate_enable_options(
"linux", BuildType.PUBLISH, PublishOption.IMAGE_ONLY
) == (True, False, True)
def test_generate_enable_options_macos_debug_package_only():
assert generate_enable_options(
"macos", BuildType.DEBUG, PublishOption.PACKAGE_ONLY
) == (True, True, False)
def test_generate_enable_options_macos_release_image_only():
assert generate_enable_options(
"macos", BuildType.RELEASE, PublishOption.IMAGE_ONLY
) == (True, False, True)
def test_generate_enable_options_macos_publish_package_and_image():
assert generate_enable_options(
"macos", BuildType.PUBLISH, PublishOption.PACKAGE_AND_IMAGE
) == (True, True, True)
def test_generate_enable_options_windows_debug_package_and_image():
assert generate_enable_options(
"windows", BuildType.DEBUG, PublishOption.PACKAGE_AND_IMAGE
) == (False, True, True)
def test_generate_enable_options_windows_release_no_publish():
assert generate_enable_options(
"windows", BuildType.RELEASE, PublishOption.NONE
) == (True, False, False)
def test_generate_enable_options_windows_publish_image_only():
assert generate_enable_options(
"windows", BuildType.PUBLISH, PublishOption.IMAGE_ONLY
) == (True, False, True)
def test_generate_image_name_linux():
assert generate_image_name("a", "b", "c", "d", "e") == "ghcr.io/xrplf/ci/a-b:c-d-e"
def test_generate_image_name_linux_raises_on_empty_os_name():
with pytest.raises(ValueError):
generate_image_name("", "b", "c", "d", "e")
def test_generate_image_name_linux_raises_on_empty_os_version():
with pytest.raises(ValueError):
generate_image_name("a", "", "c", "d", "e")
def test_generate_image_name_linux_raises_on_empty_compiler_name():
with pytest.raises(ValueError):
generate_image_name("a", "b", "", "d", "e")
def test_generate_image_name_linux_raises_on_empty_compiler_version():
with pytest.raises(ValueError):
generate_image_name("a", "b", "c", "", "e")
def test_generate_image_name_linux_raises_on_empty_image_sha():
with pytest.raises(ValueError):
generate_image_name("a", "b", "c", "d", "")
def test_generate_image_name_macos():
assert generate_image_name("macos", "", "", "", "") is None
def test_generate_image_name_macos_extra():
assert generate_image_name("macos", "value", "does", "not", "matter") is None
def test_generate_image_name_windows():
assert generate_image_name("windows", "", "", "", "") is None
def test_generate_image_name_windows_extra():
assert generate_image_name("windows", "value", "does", "not", "matter") is None


@@ -0,0 +1,30 @@
import json
from dataclasses import asdict, is_dataclass
from typing import Any
def is_unique(items: list[Any]) -> bool:
"""Check whether a list of dataclass objects contains only unique items.
As the items may not be hashable, we serialize them to JSON strings first and
then check that the number of distinct strings matches the number of items.
Args:
items: The list of dataclass objects to check.
Returns:
True if the list contains only unique items, False otherwise.
Raises:
TypeError: If any of the items is not a dataclass instance.
"""
seen = set()
for item in items:
# is_dataclass() also accepts the class itself, so exclude types.
if not is_dataclass(item) or isinstance(item, type):
raise TypeError("items must be a list of dataclass instances")
seen.add(json.dumps(asdict(item), sort_keys=True))
return len(seen) == len(items)


@@ -0,0 +1,40 @@
from dataclasses import dataclass
import pytest
from helpers.unique import *
@dataclass
class ExampleInt:
value: int
@dataclass
class ExampleList:
values: list[int]
def test_unique_int():
assert is_unique([ExampleInt(1), ExampleInt(2), ExampleInt(3)])
def test_not_unique_int():
assert not is_unique([ExampleInt(1), ExampleInt(2), ExampleInt(1)])
def test_unique_list():
assert is_unique(
[ExampleList([1, 2, 3]), ExampleList([4, 5, 6]), ExampleList([7, 8, 9])]
)
def test_not_unique_list():
assert not is_unique(
[ExampleList([1, 2, 3]), ExampleList([4, 5, 6]), ExampleList([1, 2, 3])]
)
def test_unique_raises_on_non_dataclass():
with pytest.raises(TypeError):
is_unique([1, 2, 3])
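The JSON round-trip in `is_unique` exists because ordinary dataclasses (with the default `eq=True`, `frozen=False`) have `__hash__` set to `None`, so they cannot be deduplicated in a set directly. A quick self-contained sketch (the `Example` class is invented for illustration):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class Example:
    values: list[int]

# A mutable dataclass cannot go into a set directly...
try:
    {Example([1, 2])}
except TypeError as e:
    print(f"unhashable: {e}")

# ...but its JSON serialization can, which is what is_unique relies on.
a, b = Example([1, 2]), Example([1, 2])
assert json.dumps(asdict(a)) == json.dumps(asdict(b))
```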


@@ -1,212 +0,0 @@
{
"architecture": [
{
"platform": "linux/amd64",
"runner": ["self-hosted", "Linux", "X64", "heavy"]
},
{
"platform": "linux/arm64",
"runner": ["self-hosted", "Linux", "ARM64", "heavy-arm64"]
}
],
"os": [
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "gcc",
"compiler_version": "15",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "16",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "17",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "18",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "19",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "bookworm",
"compiler_name": "clang",
"compiler_version": "20",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "trixie",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "trixie",
"compiler_name": "gcc",
"compiler_version": "15",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "trixie",
"compiler_name": "clang",
"compiler_version": "20",
"image_sha": "0525eae"
},
{
"distro_name": "debian",
"distro_version": "trixie",
"compiler_name": "clang",
"compiler_version": "21",
"image_sha": "0525eae"
},
{
"distro_name": "rhel",
"distro_version": "8",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "8",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "9",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "10",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "e1782cd"
},
{
"distro_name": "rhel",
"distro_version": "10",
"compiler_name": "clang",
"compiler_version": "any",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "jammy",
"compiler_name": "gcc",
"compiler_version": "12",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "gcc",
"compiler_version": "13",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "gcc",
"compiler_version": "14",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "16",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "17",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "18",
"image_sha": "e1782cd"
},
{
"distro_name": "ubuntu",
"distro_version": "noble",
"compiler_name": "clang",
"compiler_version": "19",
"image_sha": "e1782cd"
}
],
"build_type": ["Debug", "Release"],
"cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}

.github/scripts/strategy-matrix/linux.py vendored Executable file

@@ -0,0 +1,385 @@
from helpers.defs import *
from helpers.enums import *
# The default CI image SHAs, specified per distro group. They can be overridden
# for individual distros, which is useful when debugging with a locally built
# CI image. See https://github.com/XRPLF/ci for the images.
DEBIAN_SHA = "sha-ca4517d"
RHEL_SHA = "sha-ca4517d"
UBUNTU_SHA = "sha-84afd81"
# We only build a selection of configurations for the various triggers to reduce
# pipeline runtime. Across all three operating systems we aim to cover all GCC
# and Clang versions, while not duplicating configurations too much. See also
# the README for more details.
# The Debian distros to build configurations for.
#
# We have the following distros available:
# - Debian Bullseye: GCC 12-15
# - Debian Bookworm: GCC 13-15, Clang 16-20
# - Debian Trixie: GCC 14-15, Clang 20-21
DEBIAN_DISTROS = [
Distro(
os_name="debian",
os_version="bullseye",
compiler_name="gcc",
compiler_version="14",
image_sha=DEBIAN_SHA,
specs=[
Spec(
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
publish_option=PublishOption.PACKAGE_ONLY,
triggers=[Trigger.COMMIT, Trigger.LABEL],
),
Spec(
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.PUBLISH],
publish_option=PublishOption.PACKAGE_AND_IMAGE,
triggers=[Trigger.MERGE],
),
Spec(
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bullseye",
compiler_name="gcc",
compiler_version="15",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
build_modes=[BuildMode.UNITY_ON],
build_option=BuildOption.COVERAGE,
build_types=[BuildType.DEBUG],
triggers=[Trigger.COMMIT, Trigger.MERGE],
),
Spec(
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bookworm",
compiler_name="gcc",
compiler_version="15",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bookworm",
compiler_name="clang",
compiler_version="16",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_OFF],
build_option=BuildOption.VOIDSTAR,
build_types=[BuildType.DEBUG],
publish_option=PublishOption.IMAGE_ONLY,
triggers=[Trigger.COMMIT],
),
Spec(
archs=[Arch.LINUX_ARM64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.RELEASE],
triggers=[Trigger.MERGE],
),
Spec(
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bookworm",
compiler_name="clang",
compiler_version="17",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bookworm",
compiler_name="clang",
compiler_version="18",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="bookworm",
compiler_name="clang",
compiler_version="19",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="trixie",
compiler_name="gcc",
compiler_version="15",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="debian",
os_version="trixie",
compiler_name="clang",
compiler_version="21",
image_sha=DEBIAN_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
triggers=[Trigger.MERGE],
),
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
]
# The RHEL distros to build configurations for.
#
# We have the following distros available:
# - RHEL 8: GCC 14, Clang "any"
# - RHEL 9: GCC 12-14, Clang "any"
# - RHEL 10: GCC 14, Clang "any"
RHEL_DISTROS = [
Distro(
os_name="rhel",
os_version="8",
compiler_name="gcc",
compiler_version="14",
image_sha=RHEL_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="rhel",
os_version="8",
compiler_name="clang",
compiler_version="any",
image_sha=RHEL_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="rhel",
os_version="9",
compiler_name="gcc",
compiler_version="12",
image_sha=RHEL_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.DEBUG],
triggers=[Trigger.COMMIT],
),
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.RELEASE],
triggers=[Trigger.MERGE],
),
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="rhel",
os_version="9",
compiler_name="gcc",
compiler_version="13",
image_sha=RHEL_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="rhel",
os_version="10",
compiler_name="clang",
compiler_version="any",
image_sha=RHEL_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
]
# The Ubuntu distros to build configurations for.
#
# We have the following distros available:
# - Ubuntu Jammy (22.04): GCC 12
# - Ubuntu Noble (24.04): GCC 13-14, Clang 16-20
UBUNTU_DISTROS = [
Distro(
os_name="ubuntu",
os_version="jammy",
compiler_name="gcc",
compiler_version="12",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="gcc",
compiler_version="13",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.RELEASE],
triggers=[Trigger.MERGE],
),
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="gcc",
compiler_version="14",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="clang",
compiler_version="17",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
triggers=[Trigger.MERGE],
),
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="clang",
compiler_version="18",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="clang",
compiler_version="19",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
Distro(
os_name="ubuntu",
os_version="noble",
compiler_name="clang",
compiler_version="20",
image_sha=UBUNTU_SHA,
specs=[
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.DEBUG],
triggers=[Trigger.COMMIT],
),
Spec(
archs=[Arch.LINUX_AMD64],
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.RELEASE],
triggers=[Trigger.MERGE],
),
Spec(
archs=[Arch.LINUX_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
]
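`Distro` and `Spec` come from `helpers.defs`, which is not included in this diff. The following is a hypothetical, simplified model (every name below except the field names used above is invented) of how a generator might expand each distro's specs into concrete matrix entries for a given trigger:

```python
from dataclasses import dataclass, field
from enum import Enum

class Trigger(Enum):
    COMMIT = "commit"
    MERGE = "merge"
    SCHEDULE = "schedule"

@dataclass
class Spec:
    triggers: list[Trigger]
    # Unset axes default to "all values", as the schedule-only specs suggest.
    archs: list[str] = field(default_factory=lambda: ["amd64", "arm64"])
    build_types: list[str] = field(default_factory=lambda: ["Debug", "Release"])

@dataclass
class Distro:
    os_name: str
    os_version: str
    specs: list[Spec] = field(default_factory=list)

def expand(distros: list[Distro], trigger: Trigger) -> list[dict]:
    # Cross-product the axes of every spec that matches the trigger.
    entries = []
    for d in distros:
        for spec in d.specs:
            if trigger not in spec.triggers:
                continue
            for arch in spec.archs:
                for bt in spec.build_types:
                    entries.append({
                        "os": f"{d.os_name}-{d.os_version}",
                        "arch": arch,
                        "build_type": bt,
                    })
    return entries

distros = [
    Distro("debian", "bookworm", specs=[
        Spec(triggers=[Trigger.COMMIT], archs=["amd64"], build_types=["Debug"]),
        Spec(triggers=[Trigger.SCHEDULE]),
    ]),
]
print(len(expand(distros, Trigger.COMMIT)))    # 1
print(len(expand(distros, Trigger.SCHEDULE)))  # 4
```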


@@ -1,22 +0,0 @@
{
"architecture": [
{
"platform": "macos/arm64",
"runner": ["self-hosted", "macOS", "ARM64", "mac-runner-m1"]
}
],
"os": [
{
"distro_name": "macos",
"distro_version": "",
"compiler_name": "",
"compiler_version": "",
"image_sha": ""
}
],
"build_type": ["Debug", "Release"],
"cmake_args": [
"-Dunity=OFF -DCMAKE_POLICY_VERSION_MINIMUM=3.5",
"-Dunity=ON -DCMAKE_POLICY_VERSION_MINIMUM=3.5"
]
}

.github/scripts/strategy-matrix/macos.py vendored Executable file

@@ -0,0 +1,20 @@
from helpers.defs import *
from helpers.enums import *
DISTROS = [
Distro(
os_name="macos",
specs=[
Spec(
archs=[Arch.MACOS_ARM64],
build_modes=[BuildMode.UNITY_OFF],
build_types=[BuildType.DEBUG],
triggers=[Trigger.COMMIT, Trigger.MERGE],
),
Spec(
archs=[Arch.MACOS_ARM64],
triggers=[Trigger.SCHEDULE],
),
],
),
]


@@ -1,19 +0,0 @@
{
"architecture": [
{
"platform": "windows/amd64",
"runner": ["self-hosted", "Windows", "devbox"]
}
],
"os": [
{
"distro_name": "windows",
"distro_version": "",
"compiler_name": "",
"compiler_version": "",
"image_sha": ""
}
],
"build_type": ["Debug", "Release"],
"cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}

.github/scripts/strategy-matrix/windows.py vendored Executable file

@@ -0,0 +1,20 @@
from helpers.defs import *
from helpers.enums import *
DISTROS = [
Distro(
os_name="windows",
specs=[
Spec(
archs=[Arch.WINDOWS_AMD64],
build_modes=[BuildMode.UNITY_ON],
build_types=[BuildType.RELEASE],
triggers=[Trigger.COMMIT, Trigger.MERGE],
),
Spec(
archs=[Arch.WINDOWS_AMD64],
triggers=[Trigger.SCHEDULE],
),
],
),
]


@@ -112,9 +112,10 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [linux, macos, windows]
platform: [linux, macos, windows]
with:
os: ${{ matrix.os }}
platform: ${{ matrix.platform }}
trigger: commit
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}


@@ -66,9 +66,10 @@ jobs:
strategy:
fail-fast: ${{ github.event_name == 'merge_group' }}
matrix:
os: [linux, macos, windows]
platform: [linux, macos, windows]
with:
os: ${{ matrix.os }}
strategy_matrix: ${{ github.event_name == 'schedule' && 'all' || 'minimal' }}
platform: ${{ matrix.platform }}
# The workflow dispatch event uses the same trigger as the schedule event.
trigger: ${{ github.event_name == 'push' && 'merge' || 'schedule' }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}


@@ -22,7 +22,7 @@ defaults:
shell: bash
env:
BUILD_DIR: .build
BUILD_DIR: build
NPROC_SUBTRACT: 2
jobs:
@@ -36,7 +36,7 @@ jobs:
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Get number of processors
uses: XRPLF/actions/.github/actions/get-nproc@046b1620f6bfd6cd0985dc82c3df02786801fe0a
uses: XRPLF/actions/get-nproc@2ece4ec6ab7de266859a6f053571425b2bd684b6
id: nproc
with:
subtract: ${{ env.NPROC_SUBTRACT }}


@@ -3,16 +3,6 @@ name: Build and test configuration
on:
workflow_call:
inputs:
build_dir:
description: "The directory where to build."
required: true
type: string
build_only:
description: 'Whether to only build or to build and test the code ("true", "false").'
required: true
type: boolean
build_type:
description: 'The build type to use ("Debug", "Release").'
type: string
@@ -29,6 +19,21 @@ on:
type: string
required: true
enable_tests:
description: "Whether to run the tests."
required: true
type: boolean
enable_package:
description: "Whether to publish a package."
required: true
type: boolean
enable_image:
description: "Whether to publish an image."
required: true
type: boolean
runs_on:
description: Runner to run the job on as a JSON string
required: true
@@ -59,6 +64,11 @@ defaults:
run:
shell: bash
env:
# Conan installs the generators in the build/generators directory, see the
# layout() method in conanfile.py. We then run CMake from the build directory.
BUILD_DIR: build
jobs:
build-and-test:
name: ${{ inputs.config_name }}
@@ -71,13 +81,13 @@ jobs:
steps:
- name: Cleanup workspace (macOS and Windows)
if: ${{ runner.os == 'macOS' || runner.os == 'Windows' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@01b244d2718865d427b499822fbd3f15e7197fcc
uses: XRPLF/actions/cleanup-workspace@2ece4ec6ab7de266859a6f053571425b2bd684b6
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@99685816bb60a95a66852f212f382580e180df3a
uses: XRPLF/actions/prepare-runner@2ece4ec6ab7de266859a6f053571425b2bd684b6
with:
disable_ccache: false
@@ -85,7 +95,7 @@ jobs:
uses: ./.github/actions/print-env
- name: Get number of processors
uses: XRPLF/actions/.github/actions/get-nproc@046b1620f6bfd6cd0985dc82c3df02786801fe0a
uses: XRPLF/actions/get-nproc@2ece4ec6ab7de266859a6f053571425b2bd684b6
id: nproc
with:
subtract: ${{ inputs.nproc_subtract }}
@@ -96,7 +106,6 @@ jobs:
- name: Build dependencies
uses: ./.github/actions/build-deps
with:
build_dir: ${{ inputs.build_dir }}
build_nproc: ${{ steps.nproc.outputs.nproc }}
build_type: ${{ inputs.build_type }}
# Set the verbosity to "quiet" for Windows to avoid an excessive
@@ -104,7 +113,7 @@ jobs:
log_verbosity: ${{ runner.os == 'Windows' && 'quiet' || 'verbose' }}
- name: Configure CMake
working-directory: ${{ inputs.build_dir }}
working-directory: ${{ env.BUILD_DIR }}
env:
BUILD_TYPE: ${{ inputs.build_type }}
CMAKE_ARGS: ${{ inputs.cmake_args }}
@@ -117,7 +126,7 @@ jobs:
..
- name: Build the binary
working-directory: ${{ inputs.build_dir }}
working-directory: ${{ env.BUILD_DIR }}
env:
BUILD_NPROC: ${{ steps.nproc.outputs.nproc }}
BUILD_TYPE: ${{ inputs.build_type }}
@@ -132,8 +141,6 @@ jobs:
- name: Upload the binary (Linux)
if: ${{ github.repository_owner == 'XRPLF' && runner.os == 'Linux' }}
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
env:
BUILD_DIR: ${{ inputs.build_dir }}
with:
name: xrpld-${{ inputs.config_name }}
path: ${{ env.BUILD_DIR }}/xrpld
@@ -142,7 +149,7 @@ jobs:
- name: Check linking (Linux)
if: ${{ runner.os == 'Linux' }}
working-directory: ${{ inputs.build_dir }}
working-directory: ${{ env.BUILD_DIR }}
run: |
ldd ./xrpld
if [ "$(ldd ./xrpld | grep -E '(libstdc\+\+|libgcc)' | wc -l)" -eq 0 ]; then
@@ -154,13 +161,13 @@ jobs:
- name: Verify presence of instrumentation (Linux)
if: ${{ runner.os == 'Linux' && env.ENABLED_VOIDSTAR == 'true' }}
working-directory: ${{ inputs.build_dir }}
working-directory: ${{ env.BUILD_DIR }}
run: |
./xrpld --version | grep libvoidstar
- name: Run the separate tests
if: ${{ !inputs.build_only }}
working-directory: ${{ inputs.build_dir }}
if: ${{ inputs.enable_tests }}
working-directory: ${{ env.BUILD_DIR }}
# Windows locks some of the build files while running tests, and parallel jobs can collide
env:
BUILD_TYPE: ${{ inputs.build_type }}
@@ -172,15 +179,15 @@ jobs:
-j "${PARALLELISM}"
- name: Run the embedded tests
if: ${{ !inputs.build_only }}
working-directory: ${{ runner.os == 'Windows' && format('{0}/{1}', inputs.build_dir, inputs.build_type) || inputs.build_dir }}
if: ${{ inputs.enable_tests }}
working-directory: ${{ runner.os == 'Windows' && format('{0}/{1}', env.BUILD_DIR, inputs.build_type) || env.BUILD_DIR }}
env:
BUILD_NPROC: ${{ steps.nproc.outputs.nproc }}
run: |
./xrpld --unittest --unittest-jobs "${BUILD_NPROC}"
- name: Debug failure (Linux)
if: ${{ failure() && runner.os == 'Linux' && !inputs.build_only }}
if: ${{ (failure() || cancelled()) && runner.os == 'Linux' && inputs.enable_tests }}
run: |
echo "IPv4 local port range:"
cat /proc/sys/net/ipv4/ip_local_port_range
@@ -188,8 +195,8 @@ jobs:
netstat -an
- name: Prepare coverage report
if: ${{ !inputs.build_only && env.ENABLED_COVERAGE == 'true' }}
working-directory: ${{ inputs.build_dir }}
if: ${{ github.repository_owner == 'XRPLF' && env.ENABLED_COVERAGE == 'true' }}
working-directory: ${{ env.BUILD_DIR }}
env:
BUILD_NPROC: ${{ steps.nproc.outputs.nproc }}
BUILD_TYPE: ${{ inputs.build_type }}
@@ -201,13 +208,13 @@ jobs:
--target coverage
- name: Upload coverage report
if: ${{ github.repository_owner == 'XRPLF' && !inputs.build_only && env.ENABLED_COVERAGE == 'true' }}
if: ${{ github.repository_owner == 'XRPLF' && env.ENABLED_COVERAGE == 'true' }}
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
with:
disable_search: true
disable_telem: true
fail_ci_if_error: true
files: ${{ inputs.build_dir }}/coverage.xml
files: ${{ env.BUILD_DIR }}/coverage.xml
plugins: noop
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true


@@ -8,21 +8,14 @@ name: Build and test
on:
workflow_call:
inputs:
build_dir:
description: "The directory where to build."
platform:
description: "The platform to generate the strategy matrix for ('linux', 'macos', 'windows'). If not provided all platforms are used."
required: false
type: string
default: ".build"
os:
description: 'The operating system to use for the build ("linux", "macos", "windows").'
trigger:
description: "The trigger that caused the workflow to run ('commit', 'label', 'merge', 'schedule')."
required: true
type: string
strategy_matrix:
# TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
required: false
type: string
default: "minimal"
secrets:
CODECOV_TOKEN:
description: "The Codecov token to use for uploading coverage reports."
@@ -33,8 +26,8 @@ jobs:
generate-matrix:
uses: ./.github/workflows/reusable-strategy-matrix.yml
with:
os: ${{ inputs.os }}
strategy_matrix: ${{ inputs.strategy_matrix }}
platform: ${{ inputs.platform }}
trigger: ${{ inputs.trigger }}
# Build and test the binary for each configuration.
build-test-config:
@@ -46,13 +39,14 @@ jobs:
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
with:
build_dir: ${{ inputs.build_dir }}
build_only: ${{ matrix.build_only }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_target: ${{ matrix.cmake_target }}
runs_on: ${{ toJSON(matrix.architecture.runner) }}
image: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || '' }}
enable_tests: ${{ matrix.enable_tests }}
enable_package: ${{ matrix.enable_package }}
enable_image: ${{ matrix.enable_image }}
runs_on: ${{ toJson(matrix.runs_on) }}
image: ${{ matrix.image }}
config_name: ${{ matrix.config_name }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}


@@ -3,16 +3,14 @@ name: Generate strategy matrix
on:
workflow_call:
inputs:
os:
description: 'The operating system to use for the build ("linux", "macos", "windows").'
platform:
description: "The platform to generate the strategy matrix for ('linux', 'macos', 'windows'). If not provided all platforms are used."
required: false
type: string
strategy_matrix:
# TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
required: false
trigger:
description: "The trigger that caused the workflow to run ('commit', 'label', 'merge', 'schedule')."
required: true
type: string
default: "minimal"
outputs:
matrix:
description: "The generated strategy matrix."
@@ -40,6 +38,6 @@ jobs:
working-directory: .github/scripts/strategy-matrix
id: generate
env:
GENERATE_CONFIG: ${{ inputs.os != '' && format('--config={0}.json', inputs.os) || '' }}
GENERATE_OPTION: ${{ inputs.strategy_matrix == 'all' && '--all' || '' }}
run: ./generate.py ${GENERATE_OPTION} ${GENERATE_CONFIG} >> "${GITHUB_OUTPUT}"
PLATFORM: ${{ inputs.platform != '' && format('--platform={0}', inputs.platform) || '' }}
TRIGGER: ${{ format('--trigger={0}', inputs.trigger) }}
run: ./generate.py ${PLATFORM} ${TRIGGER} >> "${GITHUB_OUTPUT}"


@@ -19,17 +19,17 @@ on:
branches: [develop]
paths:
# This allows testing changes to the upload workflow in a PR
- .github/workflows/upload-conan-deps.yml
- ".github/workflows/upload-conan-deps.yml"
push:
branches: [develop]
paths:
- .github/workflows/upload-conan-deps.yml
- .github/workflows/reusable-strategy-matrix.yml
- .github/actions/build-deps/action.yml
- .github/actions/setup-conan/action.yml
- ".github/workflows/upload-conan-deps.yml"
- ".github/workflows/reusable-strategy-matrix.yml"
- ".github/actions/build-deps/action.yml"
- ".github/actions/setup-conan/action.yml"
- ".github/scripts/strategy-matrix/**"
- conanfile.py
- conan.lock
- "conanfile.py"
- "conan.lock"
env:
CONAN_REMOTE_NAME: xrplf
@@ -49,7 +49,8 @@ jobs:
generate-matrix:
uses: ./.github/workflows/reusable-strategy-matrix.yml
with:
strategy_matrix: ${{ github.event_name == 'pull_request' && 'minimal' || 'all' }}
# The workflow dispatch event uses the same trigger as the schedule event.
trigger: ${{ github.event_name == 'pull_request' && 'commit' || (github.event_name == 'push' && 'merge' || 'schedule') }}
# Build and upload the dependencies for each configuration.
run-upload-conan-deps:
@@ -59,18 +60,18 @@ jobs:
fail-fast: false
matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
max-parallel: 10
runs-on: ${{ matrix.architecture.runner }}
container: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || null }}
runs-on: ${{ matrix.runs_on }}
container: ${{ matrix.image }}
steps:
- name: Cleanup workspace (macOS and Windows)
if: ${{ runner.os == 'macOS' || runner.os == 'Windows' }}
uses: XRPLF/actions/.github/actions/cleanup-workspace@01b244d2718865d427b499822fbd3f15e7197fcc
uses: XRPLF/actions/cleanup-workspace@2ece4ec6ab7de266859a6f053571425b2bd684b6
- name: Checkout repository
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Prepare runner
uses: XRPLF/actions/.github/actions/prepare-runner@99685816bb60a95a66852f212f382580e180df3a
uses: XRPLF/actions/prepare-runner@2ece4ec6ab7de266859a6f053571425b2bd684b6
with:
disable_ccache: false
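With this change each matrix entry carries `runs_on` and `image` directly, instead of the workflow assembling the container name inline with `format(...)`. A sketch of how the generator might precompute such an entry (field names follow the old expression; the actual output shape of `generate.py` is an assumption):

```python
def make_entry(distro_name, distro_version, compiler_name,
               compiler_version, image_sha, runner):
    # Precompute what the workflow previously built inline, so jobs can
    # reference matrix.runs_on and matrix.image directly. Hypothetical shape.
    image = (
        f"ghcr.io/xrplf/ci/{distro_name}-{distro_version}:"
        f"{compiler_name}-{compiler_version}-sha-{image_sha}"
    )
    return {"runs_on": runner, "image": image}
```

Moving this into the generator keeps the workflow YAML free of the long `format(...)` expression and lets non-Linux entries simply set `image` to null.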
@@ -78,7 +79,7 @@ jobs:
uses: ./.github/actions/print-env
- name: Get number of processors
uses: XRPLF/actions/.github/actions/get-nproc@046b1620f6bfd6cd0985dc82c3df02786801fe0a
uses: XRPLF/actions/get-nproc@2ece4ec6ab7de266859a6f053571425b2bd684b6
id: nproc
with:
subtract: ${{ env.NPROC_SUBTRACT }}
@@ -92,7 +93,6 @@ jobs:
- name: Build dependencies
uses: ./.github/actions/build-deps
with:
build_dir: .build
build_nproc: ${{ steps.nproc.outputs.nproc }}
build_type: ${{ matrix.build_type }}
force_build: ${{ github.event_name == 'schedule' || github.event.inputs.force_source_build == 'true' }}
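The nested `&&`/`||` expression that sets `trigger` above acts as a chained ternary. Its event-to-trigger mapping is equivalent to this sketch (`workflow_dispatch` falls through to `schedule`, as the comment in the diff notes):

```python
def trigger_for(event_name: str) -> str:
    # pull_request -> commit, push -> merge, anything else
    # (schedule, workflow_dispatch) -> schedule.
    if event_name == "pull_request":
        return "commit"
    if event_name == "push":
        return "merge"
    return "schedule"
```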

.gitignore vendored

@@ -19,6 +19,7 @@ Release/
/tmp/
CMakeSettings.json
CMakeUserPresets.json
__pycache__
# Coverage files.
*.gcno


@@ -1,445 +0,0 @@
#ifndef XRPL_JSON_OBJECT_H_INCLUDED
#define XRPL_JSON_OBJECT_H_INCLUDED
#include <xrpl/json/Writer.h>
#include <memory>
namespace Json {
/**
Collection is a base class for Array and Object, classes which provide the
facade of JSON collections for the O(1) JSON writer, while still using no
heap memory and only a very small amount of stack.
From http://json.org, JSON has two types of collection: array, and object.
Everything else is a *scalar* - a number, a string, a boolean, the special
value null, or a legacy Json::Value.
Collections must write JSON "as-it-goes" in order to get the strong
performance guarantees. This puts restrictions upon API users:
1. Only one collection can be open for change at any one time.
This condition is enforced automatically and a std::logic_error thrown if
it is violated.
2. A tag may only be used once in an Object.
Some objects have many tags, so this condition might be a little
expensive. Enforcement of this condition is turned on in debug builds and
a std::logic_error is thrown when the tag is added for a second time.
Code samples:
Writer writer;
// An empty object.
{
Object::Root (writer);
}
// Outputs {}
// An object with one scalar value.
{
Object::Root root (writer);
root["hello"] = "world";
}
// Outputs {"hello":"world"}
// Same, using chaining.
{
Object::Root (writer)["hello"] = "world";
}
// Output is the same.
// Add several scalars, with chaining.
{
Object::Root (writer)
.set ("hello", "world")
.set ("flag", false)
.set ("x", 42);
}
// Outputs {"hello":"world","flag":false,"x":42}
// Add an array.
{
Object::Root root (writer);
{
auto array = root.setArray ("hands");
array.append ("left");
array.append ("right");
}
}
// Outputs {"hands":["left", "right"]}
// Same, using chaining.
{
Object::Root (writer)
.setArray ("hands")
.append ("left")
.append ("right");
}
// Output is the same.
// Add an object.
{
Object::Root root (writer);
{
auto object = root.setObject ("hands");
object["left"] = false;
object["right"] = true;
}
}
// Outputs {"hands":{"left":false,"right":true}}
// Same, using chaining.
{
Object::Root (writer)
.setObject ("hands")
.set ("left", false)
.set ("right", true);
}
// Outputs {"hands":{"left":false,"right":true}}
Typical ways to make mistakes and get a std::logic_error:
Writer writer;
Object::Root root (writer);
// Repeat a tag.
{
root ["hello"] = "world";
root ["hello"] = "there"; // THROWS! in a debug build.
}
// Open a subcollection, then set something else.
{
auto object = root.setObject ("foo");
root ["hello"] = "world"; // THROWS!
}
// Open two subcollections at a time.
{
auto object = root.setObject ("foo");
auto array = root.setArray ("bar"); // THROWS!!
}
For more examples, check the unit tests.
*/
class Collection
{
public:
Collection(Collection&& c) noexcept;
Collection&
operator=(Collection&& c) noexcept;
Collection() = delete;
~Collection();
protected:
// A null parent means "no parent at all".
// Writers cannot be null.
Collection(Collection* parent, Writer*);
void
checkWritable(std::string const& label);
Collection* parent_;
Writer* writer_;
bool enabled_;
};
class Array;
//------------------------------------------------------------------------------
/** Represents a JSON object being written to a Writer. */
class Object : protected Collection
{
public:
/** Object::Root is the only Collection that has a public constructor. */
class Root;
/** Set a scalar value in the Object for a key.
A JSON scalar is a single value - a number, string, boolean, nullptr or
a Json::Value.
`set()` throws an exception if this object is disabled (which means that
one of its children is enabled).
In a debug build, `set()` also throws an exception if the key has
already been set() before.
An operator[] is provided to allow writing `object["key"] = scalar;`.
*/
template <typename Scalar>
void
set(std::string const& key, Scalar const&);
void
set(std::string const& key, Json::Value const&);
// Detail class and method used to implement operator[].
class Proxy;
Proxy
operator[](std::string const& key);
Proxy
operator[](Json::StaticString const& key);
/** Make a new Object at a key and return it.
This Object is disabled until that sub-object is destroyed.
Throws an exception if this Object was already disabled.
*/
Object
setObject(std::string const& key);
/** Make a new Array at a key and return it.
This Object is disabled until that sub-array is destroyed.
Throws an exception if this Object was already disabled.
*/
Array
setArray(std::string const& key);
protected:
friend class Array;
Object(Collection* parent, Writer* w) : Collection(parent, w)
{
}
};
class Object::Root : public Object
{
public:
/** Each Object::Root must be constructed with its own unique Writer. */
Root(Writer&);
};
//------------------------------------------------------------------------------
/** Represents a JSON array being written to a Writer. */
class Array : private Collection
{
public:
/** Append a scalar to the Array.
Throws an exception if this array is disabled (which means that one of
its sub-collections is enabled).
*/
template <typename Scalar>
void
append(Scalar const&);
/**
Appends a Json::Value to an array.
Throws an exception if this Array was disabled.
*/
void
append(Json::Value const&);
/** Append a new Object and return it.
This Array is disabled until that sub-object is destroyed.
Throws an exception if this Array was disabled.
*/
Object
appendObject();
/** Append a new Array and return it.
This Array is disabled until that sub-array is destroyed.
Throws an exception if this Array was already disabled.
*/
Array
appendArray();
protected:
friend class Object;
Array(Collection* parent, Writer* w) : Collection(parent, w)
{
}
};
//------------------------------------------------------------------------------
// Generic accessor functions to allow Json::Value and Collection to
// interoperate.
/** Add a new subarray at a named key in a Json object. */
Json::Value&
setArray(Json::Value&, Json::StaticString const& key);
/** Add a new subarray at a named key in a Json object. */
Array
setArray(Object&, Json::StaticString const& key);
/** Add a new subobject at a named key in a Json object. */
Json::Value&
addObject(Json::Value&, Json::StaticString const& key);
/** Add a new subobject at a named key in a Json object. */
Object
addObject(Object&, Json::StaticString const& key);
/** Append a new subarray to a Json array. */
Json::Value&
appendArray(Json::Value&);
/** Append a new subarray to a Json array. */
Array
appendArray(Array&);
/** Append a new subobject to a Json object. */
Json::Value&
appendObject(Json::Value&);
/** Append a new subobject to a Json object. */
Object
appendObject(Array&);
/** Copy all the keys and values from one object into another. */
void
copyFrom(Json::Value& to, Json::Value const& from);
/** Copy all the keys and values from one object into another. */
void
copyFrom(Object& to, Json::Value const& from);
/** An Object that contains its own Writer. */
class WriterObject
{
public:
WriterObject(Output const& output)
: writer_(std::make_unique<Writer>(output))
, object_(std::make_unique<Object::Root>(*writer_))
{
}
WriterObject(WriterObject&& other) = default;
Object*
operator->()
{
return object_.get();
}
Object&
operator*()
{
return *object_;
}
private:
std::unique_ptr<Writer> writer_;
std::unique_ptr<Object::Root> object_;
};
WriterObject
stringWriterObject(std::string&);
//------------------------------------------------------------------------------
// Implementation details.
// Detail class for Object::operator[].
class Object::Proxy
{
private:
Object& object_;
std::string const key_;
public:
Proxy(Object& object, std::string const& key);
template <class T>
void
operator=(T const& t)
{
object_.set(key_, t);
// Note: This function shouldn't return *this, because it's a trap.
//
// In Json::Value, foo[jss::key] returns a reference to a
// mutable Json::Value contained _inside_ foo. But in the case of
// Json::Object, where we write once only, there isn't any such
// reference that can be returned. Returning *this would return an
// object "a level higher" than in Json::Value, leading to obscure bugs,
// particularly in generic code.
}
};
//------------------------------------------------------------------------------
template <typename Scalar>
void
Array::append(Scalar const& value)
{
checkWritable("append");
if (writer_)
writer_->append(value);
}
template <typename Scalar>
void
Object::set(std::string const& key, Scalar const& value)
{
checkWritable("set");
if (writer_)
writer_->set(key, value);
}
inline Json::Value&
setArray(Json::Value& json, Json::StaticString const& key)
{
return (json[key] = Json::arrayValue);
}
inline Array
setArray(Object& json, Json::StaticString const& key)
{
return json.setArray(std::string(key));
}
inline Json::Value&
addObject(Json::Value& json, Json::StaticString const& key)
{
return (json[key] = Json::objectValue);
}
inline Object
addObject(Object& object, Json::StaticString const& key)
{
return object.setObject(std::string(key));
}
inline Json::Value&
appendArray(Json::Value& json)
{
return json.append(Json::arrayValue);
}
inline Array
appendArray(Array& json)
{
return json.appendArray();
}
inline Json::Value&
appendObject(Json::Value& json)
{
return json.append(Json::objectValue);
}
inline Object
appendObject(Array& json)
{
return json.appendObject();
}
} // namespace Json
#endif


@@ -58,14 +58,14 @@ static_assert(apiMaximumSupportedVersion >= apiMinimumSupportedVersion);
static_assert(apiBetaVersion >= apiMaximumSupportedVersion);
static_assert(apiMaximumValidVersion >= apiMaximumSupportedVersion);
template <class JsonObject>
void
setVersion(JsonObject& parent, unsigned int apiVersion, bool betaEnabled)
inline void
setVersion(Json::Value& parent, unsigned int apiVersion, bool betaEnabled)
{
XRPL_ASSERT(
apiVersion != apiInvalidVersion,
"xrpl::RPC::setVersion : input is valid");
auto& retObj = addObject(parent, jss::version);
auto& retObj = parent[jss::version] = Json::objectValue;
if (apiVersion == apiVersionIfUnspecified)
{


@@ -209,33 +209,11 @@ get_error_info(error_code_i code);
/** Add or update the json update to reflect the error code. */
/** @{ */
template <class JsonValue>
void
inject_error(error_code_i code, JsonValue& json)
{
ErrorInfo const& info(get_error_info(code));
json[jss::error] = info.token;
json[jss::error_code] = info.code;
json[jss::error_message] = info.message;
}
inject_error(error_code_i code, Json::Value& json);
template <class JsonValue>
void
inject_error(int code, JsonValue& json)
{
inject_error(error_code_i(code), json);
}
template <class JsonValue>
void
inject_error(error_code_i code, std::string const& message, JsonValue& json)
{
ErrorInfo const& info(get_error_info(code));
json[jss::error] = info.token;
json[jss::error_code] = info.code;
json[jss::error_message] = message;
}
inject_error(error_code_i code, std::string const& message, Json::Value& json);
/** @} */
/** Returns a new json object that reflects the error code. */


@@ -9,7 +9,7 @@ namespace xrpl {
bool
isRpcError(Json::Value jvResult);
Json::Value
rpcError(int iError);
rpcError(error_code_i iError);
} // namespace xrpl


@@ -1,233 +0,0 @@
#include <xrpl/basics/contract.h>
#include <xrpl/beast/utility/instrumentation.h>
#include <xrpl/json/Object.h>
#include <xrpl/json/Output.h>
#include <xrpl/json/Writer.h>
#include <xrpl/json/json_value.h>
#include <stdexcept>
#include <utility>
namespace Json {
Collection::Collection(Collection* parent, Writer* writer)
: parent_(parent), writer_(writer), enabled_(true)
{
checkWritable("Collection::Collection()");
if (parent_)
{
check(parent_->enabled_, "Parent not enabled in constructor");
parent_->enabled_ = false;
}
}
Collection::~Collection()
{
if (writer_)
writer_->finish();
if (parent_)
parent_->enabled_ = true;
}
Collection&
Collection::operator=(Collection&& that) noexcept
{
parent_ = that.parent_;
writer_ = that.writer_;
enabled_ = that.enabled_;
that.parent_ = nullptr;
that.writer_ = nullptr;
that.enabled_ = false;
return *this;
}
Collection::Collection(Collection&& that) noexcept
{
*this = std::move(that);
}
void
Collection::checkWritable(std::string const& label)
{
if (!enabled_)
xrpl::Throw<std::logic_error>(label + ": not enabled");
if (!writer_)
xrpl::Throw<std::logic_error>(label + ": not writable");
}
//------------------------------------------------------------------------------
Object::Root::Root(Writer& w) : Object(nullptr, &w)
{
writer_->startRoot(Writer::object);
}
Object
Object::setObject(std::string const& key)
{
checkWritable("Object::setObject");
if (writer_)
writer_->startSet(Writer::object, key);
return Object(this, writer_);
}
Array
Object::setArray(std::string const& key)
{
checkWritable("Object::setArray");
if (writer_)
writer_->startSet(Writer::array, key);
return Array(this, writer_);
}
//------------------------------------------------------------------------------
Object
Array::appendObject()
{
checkWritable("Array::appendObject");
if (writer_)
writer_->startAppend(Writer::object);
return Object(this, writer_);
}
Array
Array::appendArray()
{
checkWritable("Array::makeArray");
if (writer_)
writer_->startAppend(Writer::array);
return Array(this, writer_);
}
//------------------------------------------------------------------------------
Object::Proxy::Proxy(Object& object, std::string const& key)
: object_(object), key_(key)
{
}
Object::Proxy
Object::operator[](std::string const& key)
{
return Proxy(*this, key);
}
Object::Proxy
Object::operator[](Json::StaticString const& key)
{
return Proxy(*this, std::string(key));
}
//------------------------------------------------------------------------------
void
Array::append(Json::Value const& v)
{
auto t = v.type();
switch (t)
{
case Json::nullValue:
return append(nullptr);
case Json::intValue:
return append(v.asInt());
case Json::uintValue:
return append(v.asUInt());
case Json::realValue:
return append(v.asDouble());
case Json::stringValue:
return append(v.asString());
case Json::booleanValue:
return append(v.asBool());
case Json::objectValue: {
auto object = appendObject();
copyFrom(object, v);
return;
}
case Json::arrayValue: {
auto array = appendArray();
for (auto& item : v)
array.append(item);
return;
}
}
UNREACHABLE("Json::Array::append : invalid type"); // LCOV_EXCL_LINE
}
void
Object::set(std::string const& k, Json::Value const& v)
{
auto t = v.type();
switch (t)
{
case Json::nullValue:
return set(k, nullptr);
case Json::intValue:
return set(k, v.asInt());
case Json::uintValue:
return set(k, v.asUInt());
case Json::realValue:
return set(k, v.asDouble());
case Json::stringValue:
return set(k, v.asString());
case Json::booleanValue:
return set(k, v.asBool());
case Json::objectValue: {
auto object = setObject(k);
copyFrom(object, v);
return;
}
case Json::arrayValue: {
auto array = setArray(k);
for (auto& item : v)
array.append(item);
return;
}
}
UNREACHABLE("Json::Object::set : invalid type"); // LCOV_EXCL_LINE
}
//------------------------------------------------------------------------------
namespace {
template <class Object>
void
doCopyFrom(Object& to, Json::Value const& from)
{
XRPL_ASSERT(from.isObjectOrNull(), "Json::doCopyFrom : valid input type");
auto members = from.getMemberNames();
for (auto& m : members)
to[m] = from[m];
}
} // namespace
void
copyFrom(Json::Value& to, Json::Value const& from)
{
if (!to) // Short circuit this very common case.
to = from;
else
doCopyFrom(to, from);
}
void
copyFrom(Object& to, Json::Value const& from)
{
doCopyFrom(to, from);
}
WriterObject
stringWriterObject(std::string& s)
{
return WriterObject(stringOutput(s));
}
} // namespace Json


@@ -17,7 +17,7 @@ namespace BuildInfo {
// and follow the format described at http://semver.org/
//------------------------------------------------------------------------------
// clang-format off
char const* const versionString = "3.1.0-b0"
char const* const versionString = "3.2.0-b0"
// clang-format on
#if defined(DEBUG) || defined(SANITIZER)


@@ -160,6 +160,24 @@ constexpr ErrorInfo unknownError;
//------------------------------------------------------------------------------
void
inject_error(error_code_i code, Json::Value& json)
{
ErrorInfo const& info(get_error_info(code));
json[jss::error] = info.token;
json[jss::error_code] = info.code;
json[jss::error_message] = info.message;
}
void
inject_error(error_code_i code, std::string const& message, Json::Value& json)
{
ErrorInfo const& info(get_error_info(code));
json[jss::error] = info.token;
json[jss::error_code] = info.code;
json[jss::error_message] = message;
}
ErrorInfo const&
get_error_info(error_code_i code)
{


@@ -9,7 +9,7 @@ struct RPCErr;
// VFALCO NOTE Deprecated function
Json::Value
rpcError(int iError)
rpcError(error_code_i iError)
{
Json::Value jvResult(Json::objectValue);
RPC::inject_error(iError, jvResult);


@@ -303,7 +303,7 @@ public:
// offers unfunded.
// b. Carol's remaining 800 offers are consumed as unfunded.
// c. 199 of alice's XRP(1) to USD(3) offers are consumed.
// A book step is allowed to consume a maxium of 1000 offers
// A book step is allowed to consume a maximum of 1000 offers
// at a given quality, and that limit is now reached.
// d. Now the strand is dry, even though there are still funded
// XRP(1) to USD(3) offers available.
@@ -384,7 +384,7 @@ public:
// offers unfunded.
// b. Carol's remaining 800 offers are consumed as unfunded.
// c. 199 of alice's XRP(1) to USD(3) offers are consumed.
// A book step is allowed to consume a maxium of 1000 offers
// A book step is allowed to consume a maximum of 1000 offers
// at a given quality, and that limit is now reached.
// d. Now the strand is dry, even though there are still funded
// XRP(1) to USD(3) offers available. Bob has spent 400 EUR and


@@ -1298,7 +1298,7 @@ public:
testNegativeBalance(FeatureBitset features)
{
// This test creates an offer test for negative balance
// with transfer fees and miniscule funds.
// with transfer fees and minuscule funds.
testcase("Negative Balance");
using namespace jtx;


@@ -254,7 +254,7 @@ class RCLValidations_test : public beast::unit_test::suite
BEAST_EXPECT(trie.branchSupport(ledg_258) == 4);
// Move three of the s258 ledgers to s259, which splits the trie
// due to the 256 ancestory limit
// due to the 256 ancestry limit
BEAST_EXPECT(trie.remove(ledg_258, 3));
trie.insert(ledg_259, 3);
trie.getPreferred(1);
@@ -275,7 +275,7 @@ class RCLValidations_test : public beast::unit_test::suite
// then verify the remove call works
// past bug: remove had assumed the first child of a node in the trie
// which matches is the *only* child in the trie which matches.
// This is **NOT** true with the limited 256 ledger ancestory
// This is **NOT** true with the limited 256 ledger ancestry
// quirk of RCLValidation and prevents deleting the old support
// for ledger 257


@@ -3821,7 +3821,7 @@ public:
return result;
};
testcase("straightfoward positive case");
testcase("straightforward positive case");
{
// Queue up some transactions at a too-low fee.
auto aliceSeq = env.seq(alice);


@@ -18,14 +18,14 @@ namespace csf {
- Comparison : T a, b; bool res = a < b
- Addition: T a, b; T c = a + b;
- Multiplication : T a, std::size_t b; T c = a * b;
- Divison: T a; std::size_t b; T c = a/b;
- Division: T a; std::size_t b; T c = a/b;
*/
template <class T, class Compare = std::less<T>>
class Histogram
{
// TODO: Consider logarithimic bins around expected median if this becomes
// TODO: Consider logarithmic bins around expected median if this becomes
// unscaleable
std::map<T, std::size_t, Compare> counts_;
std::size_t samples = 0;


@@ -31,7 +31,7 @@ struct Rate
/** Submits transactions to a specified peer
Submits successive transactions beginning at start, then spaced according
to succesive calls of distribution(), until stop.
to successive calls of distribution(), until stop.
@tparam Distribution is a `UniformRandomBitGenerator` from the STL that
is used by random distributions to generate random samples


@@ -1,239 +0,0 @@
#include <test/json/TestOutputSuite.h>
#include <xrpl/beast/unit_test.h>
#include <xrpl/json/Object.h>
namespace Json {
class JsonObject_test : public xrpl::test::TestOutputSuite
{
void
setup(std::string const& testName)
{
testcase(testName);
output_.clear();
}
std::unique_ptr<WriterObject> writerObject_;
Object&
makeRoot()
{
writerObject_ =
std::make_unique<WriterObject>(stringWriterObject(output_));
return **writerObject_;
}
void
expectResult(std::string const& expected)
{
writerObject_.reset();
TestOutputSuite::expectResult(expected);
}
public:
void
testTrivial()
{
setup("trivial");
{
auto& root = makeRoot();
(void)root;
}
expectResult("{}");
}
void
testSimple()
{
setup("simple");
{
auto& root = makeRoot();
root["hello"] = "world";
root["skidoo"] = 23;
root["awake"] = false;
root["temperature"] = 98.6;
}
expectResult(
"{\"hello\":\"world\","
"\"skidoo\":23,"
"\"awake\":false,"
"\"temperature\":98.6}");
}
void
testOneSub()
{
setup("oneSub");
{
auto& root = makeRoot();
root.setArray("ar");
}
expectResult("{\"ar\":[]}");
}
void
testSubs()
{
setup("subs");
{
auto& root = makeRoot();
{
// Add an array with three entries.
auto array = root.setArray("ar");
array.append(23);
array.append(false);
array.append(23.5);
}
{
// Add an object with one entry.
auto obj = root.setObject("obj");
obj["hello"] = "world";
}
{
// Add another object with two entries.
Json::Value value;
value["h"] = "w";
value["f"] = false;
root["obj2"] = value;
}
}
// Json::Value has an unstable order...
auto case1 =
"{\"ar\":[23,false,23.5],"
"\"obj\":{\"hello\":\"world\"},"
"\"obj2\":{\"h\":\"w\",\"f\":false}}";
auto case2 =
"{\"ar\":[23,false,23.5],"
"\"obj\":{\"hello\":\"world\"},"
"\"obj2\":{\"f\":false,\"h\":\"w\"}}";
writerObject_.reset();
BEAST_EXPECT(output_ == case1 || output_ == case2);
}
void
testSubsShort()
{
setup("subsShort");
{
auto& root = makeRoot();
{
// Add an array with three entries.
auto array = root.setArray("ar");
array.append(23);
array.append(false);
array.append(23.5);
}
// Add an object with one entry.
root.setObject("obj")["hello"] = "world";
{
// Add another object with two entries.
auto object = root.setObject("obj2");
object.set("h", "w");
object.set("f", false);
}
}
expectResult(
"{\"ar\":[23,false,23.5],"
"\"obj\":{\"hello\":\"world\"},"
"\"obj2\":{\"h\":\"w\",\"f\":false}}");
}
void
testFailureObject()
{
{
setup("object failure assign");
auto& root = makeRoot();
auto obj = root.setObject("o1");
expectException([&]() { root["fail"] = "complete"; });
}
{
setup("object failure object");
auto& root = makeRoot();
auto obj = root.setObject("o1");
expectException([&]() { root.setObject("o2"); });
}
{
setup("object failure Array");
auto& root = makeRoot();
auto obj = root.setArray("o1");
expectException([&]() { root.setArray("o2"); });
}
}
void
testFailureArray()
{
{
setup("array failure append");
auto& root = makeRoot();
auto array = root.setArray("array");
auto subarray = array.appendArray();
auto fail = [&]() { array.append("fail"); };
expectException(fail);
}
{
setup("array failure appendArray");
auto& root = makeRoot();
auto array = root.setArray("array");
auto subarray = array.appendArray();
auto fail = [&]() { array.appendArray(); };
expectException(fail);
}
{
setup("array failure appendObject");
auto& root = makeRoot();
auto array = root.setArray("array");
auto subarray = array.appendArray();
auto fail = [&]() { array.appendObject(); };
expectException(fail);
}
}
void
testKeyFailure()
{
setup("repeating keys");
auto& root = makeRoot();
root.set("foo", "bar");
root.set("baz", 0);
// setting key again throws in !NDEBUG builds
auto set_again = [&]() { root.set("foo", "bar"); };
#ifdef NDEBUG
set_again();
pass();
#else
expectException(set_again);
#endif
}
void
run() override
{
testTrivial();
testSimple();
testOneSub();
testSubs();
testSubsShort();
testFailureObject();
testFailureArray();
testKeyFailure();
}
};
BEAST_DEFINE_TESTSUITE(JsonObject, json, xrpl);
} // namespace Json


@@ -3,7 +3,6 @@
#include <xrpld/rpc/RPCCall.h>
#include <xrpl/basics/contract.h>
#include <xrpl/json/Object.h>
#include <xrpl/protocol/ErrorCodes.h>
#include <xrpl/protocol/HashPrefix.h>
#include <xrpl/protocol/Indexes.h>
@@ -83,7 +82,7 @@ cmdToJSONRPC(
// If paramsObj is not empty, put it in a [params] array.
if (paramsObj.begin() != paramsObj.end())
{
auto& paramsArray = Json::setArray(jv, jss::params);
auto& paramsArray = jv[jss::params] = Json::arrayValue;
paramsArray.append(paramsObj);
}
if (paramsObj.isMember(jss::jsonrpc))


@@ -7,7 +7,6 @@
#include <xrpld/rpc/Context.h>
#include <xrpl/basics/chrono.h>
#include <xrpl/json/Object.h>
#include <xrpl/protocol/serialize.h>
namespace xrpl {
@@ -42,10 +41,9 @@ struct LedgerFill
std::optional<NetClock::time_point> closeTime;
};
/** Given a Ledger and options, fill a Json::Object or Json::Value with a
/** Given a Ledger and options, fill a Json::Value with a
description of the ledger.
*/
void
addJson(Json::Value&, LedgerFill const&);
@@ -53,6 +51,10 @@ addJson(Json::Value&, LedgerFill const&);
Json::Value
getJson(LedgerFill const&);
/** Copy all the keys and values from one object into another. */
void
copyFrom(Json::Value& to, Json::Value const& from);
} // namespace xrpl
#endif


@@ -32,10 +32,9 @@ isBinary(LedgerFill const& fill)
return fill.options & LedgerFill::binary;
}
template <class Object>
void
fillJson(
Object& json,
Json::Value& json,
bool closed,
LedgerHeader const& info,
bool bFull,
@@ -78,9 +77,8 @@ fillJson(
}
}
template <class Object>
void
fillJsonBinary(Object& json, bool closed, LedgerHeader const& info)
fillJsonBinary(Json::Value& json, bool closed, LedgerHeader const& info)
{
if (!closed)
json[jss::closed] = false;
@@ -207,11 +205,10 @@ fillJsonTx(
return txJson;
}
template <class Object>
void
fillJsonTx(Object& json, LedgerFill const& fill)
fillJsonTx(Json::Value& json, LedgerFill const& fill)
{
auto&& txns = setArray(json, jss::transactions);
auto& txns = json[jss::transactions] = Json::arrayValue;
auto bBinary = isBinary(fill);
auto bExpanded = isExpanded(fill);
@@ -238,12 +235,11 @@ fillJsonTx(Object& json, LedgerFill const& fill)
}
}
template <class Object>
void
fillJsonState(Object& json, LedgerFill const& fill)
fillJsonState(Json::Value& json, LedgerFill const& fill)
{
auto& ledger = fill.ledger;
auto&& array = Json::setArray(json, jss::accountState);
auto& array = json[jss::accountState] = Json::arrayValue;
auto expanded = isExpanded(fill);
auto binary = isBinary(fill);
@@ -251,7 +247,7 @@ fillJsonState(Object& json, LedgerFill const& fill)
{
if (binary)
{
auto&& obj = appendObject(array);
auto& obj = array.append(Json::objectValue);
obj[jss::hash] = to_string(sle->key());
obj[jss::tx_blob] = serializeHex(*sle);
}
@@ -262,17 +258,16 @@ fillJsonState(Object& json, LedgerFill const& fill)
}
}
template <class Object>
void
fillJsonQueue(Object& json, LedgerFill const& fill)
fillJsonQueue(Json::Value& json, LedgerFill const& fill)
{
auto&& queueData = Json::setArray(json, jss::queue_data);
auto& queueData = json[jss::queue_data] = Json::arrayValue;
auto bBinary = isBinary(fill);
auto bExpanded = isExpanded(fill);
for (auto const& tx : fill.txQueue)
{
auto&& txJson = appendObject(queueData);
auto& txJson = queueData.append(Json::objectValue);
txJson[jss::fee_level] = to_string(tx.feeLevel);
if (tx.lastValid)
txJson[jss::LastLedgerSequence] = *tx.lastValid;
@@ -297,9 +292,8 @@ fillJsonQueue(Object& json, LedgerFill const& fill)
}
}
template <class Object>
void
fillJson(Object& json, LedgerFill const& fill)
fillJson(Json::Value& json, LedgerFill const& fill)
{
// TODO: what happens if bBinary and bExtracted are both set?
// Is there a way to report this back?
@@ -327,7 +321,7 @@ fillJson(Object& json, LedgerFill const& fill)
void
addJson(Json::Value& json, LedgerFill const& fill)
{
auto&& object = Json::addObject(json, jss::ledger);
auto& object = json[jss::ledger] = Json::objectValue;
fillJson(object, fill);
if ((fill.options & LedgerFill::dumpQueue) && !fill.txQueue.empty())
@@ -342,4 +336,20 @@ getJson(LedgerFill const& fill)
return json;
}
void
copyFrom(Json::Value& to, Json::Value const& from)
{
if (!to) // Short circuit this very common case.
to = from;
else
{
// TODO: figure out if there is a way to remove this clause
// or check that it does/needs to do a deep copy
XRPL_ASSERT(from.isObjectOrNull(), "copyFrom : invalid input type");
auto const members = from.getMemberNames();
for (auto const& m : members)
to[m] = from[m];
}
}
} // namespace xrpl
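The semantics of the relocated `copyFrom` helper (a shallow, top-level key copy with a short-circuit when the target is still empty) can be sketched in Python as an analogue, not the C++ implementation itself:

```python
def copy_from(to: dict, from_: dict) -> dict:
    # Mirrors the C++ copyFrom above: empty target takes the source
    # wholesale; otherwise copy key by key at the top level only.
    if not to:
        to.update(from_)
    else:
        for key in from_:
            to[key] = from_[key]
    return to
```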


@@ -94,9 +94,8 @@ public:
/** Apply the Status to a JsonObject
*/
template <class Object>
void
inject(Object& object) const
inject(Json::Value& object) const
{
if (auto ec = toErrorCode())
{


@@ -277,7 +277,7 @@ checkPayment(
app);
if (pf.findPaths(app.config().PATH_SEARCH_OLD))
{
// 4 is the maxium paths
// 4 is the maximum paths
pf.computePathRanks(4);
STPath fullLiquidityPath;
STPathSet paths;


@@ -3,8 +3,6 @@
#include <xrpld/app/main/Application.h>
#include <xrpl/json/Object.h>
namespace xrpl {
Json::Value


@@ -75,6 +75,43 @@ LedgerHandler::check()
return Status::OK;
}
+void
+LedgerHandler::writeResult(Json::Value& value)
+{
+    if (ledger_)
+    {
+        copyFrom(value, result_);
+        addJson(value, {*ledger_, &context_, options_, queueTxs_});
+    }
+    else
+    {
+        auto& master = context_.app.getLedgerMaster();
+        {
+            auto& closed = value[jss::closed] = Json::objectValue;
+            addJson(closed, {*master.getClosedLedger(), &context_, 0});
+        }
+        {
+            auto& open = value[jss::open] = Json::objectValue;
+            addJson(open, {*master.getCurrentLedger(), &context_, 0});
+        }
+    }
+
+    Json::Value warnings{Json::arrayValue};
+    if (context_.params.isMember(jss::type))
+    {
+        Json::Value& w = warnings.append(Json::objectValue);
+        w[jss::id] = warnRPC_FIELDS_DEPRECATED;
+        w[jss::message] =
+            "Some fields from your request are deprecated. Please check the "
+            "documentation at "
+            "https://xrpl.org/docs/references/http-websocket-apis/ "
+            "and update your request. Field `type` is deprecated.";
+    }
+    if (warnings.size())
+        value[jss::warnings] = std::move(warnings);
+}
} // namespace RPC
std::pair<org::xrpl::rpc::v1::GetLedgerResponse, grpc::Status>


@@ -9,7 +9,6 @@
#include <xrpld/rpc/Status.h>
#include <xrpld/rpc/detail/Handler.h>
-#include <xrpl/json/Object.h>
#include <xrpl/ledger/ReadView.h>
#include <xrpl/protocol/ApiVersion.h>
#include <xrpl/protocol/jss.h>
@@ -37,9 +36,8 @@ public:
Status
check();
-    template <class Object>
     void
-    writeResult(Object&);
+    writeResult(Json::Value&);
static constexpr char name[] = "ledger";
@@ -59,49 +57,6 @@ private:
int options_ = 0;
};
-////////////////////////////////////////////////////////////////////////////////
-////////////////////////////////////////////////////////////////////////////////
-//
-// Implementation.
-
-template <class Object>
-void
-LedgerHandler::writeResult(Object& value)
-{
-    if (ledger_)
-    {
-        Json::copyFrom(value, result_);
-        addJson(value, {*ledger_, &context_, options_, queueTxs_});
-    }
-    else
-    {
-        auto& master = context_.app.getLedgerMaster();
-        {
-            auto&& closed = Json::addObject(value, jss::closed);
-            addJson(closed, {*master.getClosedLedger(), &context_, 0});
-        }
-        {
-            auto&& open = Json::addObject(value, jss::open);
-            addJson(open, {*master.getCurrentLedger(), &context_, 0});
-        }
-    }
-
-    Json::Value warnings{Json::arrayValue};
-    if (context_.params.isMember(jss::type))
-    {
-        Json::Value& w = warnings.append(Json::objectValue);
-        w[jss::id] = warnRPC_FIELDS_DEPRECATED;
-        w[jss::message] =
-            "Some fields from your request are deprecated. Please check the "
-            "documentation at "
-            "https://xrpl.org/docs/references/http-websocket-apis/ "
-            "and update your request. Field `type` is deprecated.";
-    }
-    if (warnings.size())
-        value[jss::warnings] = std::move(warnings);
-}
} // namespace RPC
} // namespace xrpl


@@ -20,9 +20,8 @@ public:
return Status::OK;
}
-    template <class Object>
     void
-    writeResult(Object& obj)
+    writeResult(Json::Value& obj)
{
setVersion(obj, apiVersion_, betaEnabled_);
}