Compare commits


24 Commits

Author SHA1 Message Date
Olek
6a6fed5dce More hostfunctions (#5451)
* Bug fixes:
- Fix bugs found during schedule table tests
- Add more tests
- Add parameters passing for runEscrowWasm function

* Add new host-functions
  - fix wamr logging
  - add runtime passing through HF
  - fix runEscrowWasm interface

* Improve logs

* Fix logging bug

* Set 4k limit for update_data HF

* allHF wasm module fixes
2025-05-30 19:01:27 -04:00
Mayukha Vadari
1f8aece8cd feat: add a GasUsed parameter to the metadata (#5456) 2025-05-29 16:36:55 -04:00
Mayukha Vadari
6c6f8cd4f9 Merge remote-tracking branch 'upstream/develop' into develop3 2025-05-29 13:05:11 -04:00
Mayukha Vadari
fb1311e013 uncomment???? 2025-05-28 14:00:50 -04:00
Mayukha Vadari
ce31acf030 debug comments 2025-05-28 13:48:38 -04:00
Mayukha Vadari
31ad5ac63b Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop3 2025-05-27 18:29:41 -04:00
Mayukha Vadari
1ede0bdec4 fix: fix fixtures (#5445) 2025-05-23 17:37:14 -04:00
Mayukha Vadari
aef32ead2c better WASM logging to match rippled (#5395)
* basic logging

* pass in Journal

* log level based on journal level

* clean up

* attempt at adding WAMR logging properly

* improve logline

* maybe_unused

* fix

* fix

* fix segfault

* add test
2025-05-23 10:31:02 -04:00
Mayukha Vadari
5b43ec7f73 refactor: switch function name from ready to finish (#5430) 2025-05-20 16:12:19 -04:00
Olek
1e9ff88a00 Fix CI build issues
* Mac build fix
* Windows build fix
* Windows instruction counter fix
2025-05-08 12:39:37 -04:00
Mayukha Vadari
bb9bb5f5c5 Merge branch 'ripple/smart-escrow' into develop2 2025-05-01 18:44:06 -04:00
Mayukha Vadari
c533abd8b6 Update size and compute cap defaults (#5417) 2025-05-01 18:41:51 -04:00
Olek
bb9bc764bc Switch to WAMR (#5416)
* Switch to WAMR
2025-05-01 18:02:06 -04:00
Mayukha Vadari
b4b53a6cb7 Merge branch 'ripple/smart-escrow' into develop2 2025-04-29 15:25:54 -04:00
Mayukha Vadari
9c0204906c fix reference fee tests 2025-04-29 15:25:00 -04:00
Mayukha Vadari
4670b373c1 try to fix tests 2025-04-29 14:10:27 -04:00
Mayukha Vadari
f03b5883bd More host functions (#5411)
* getNFT

* escrow keylet

* account keylet

* credential keylet

* oracle keylet

* hook everything in

* fix stuff
2025-04-29 12:39:12 -04:00
Mayukha Vadari
f8b2fe4dd5 fix imports 2025-04-28 17:43:15 -04:00
Mayukha Vadari
be4a0c9c2b Merge remote-tracking branch 'upstream/ripple/smart-escrow' into develop2 2025-04-28 17:14:28 -04:00
Mayukha Vadari
f37d52d8e9 Set up fees for WASM processing (#5393)
* set up fields

* throw error if allowance is too high

* votable gas price

* fix comments

* hook everything together

* make test less flaky (hopefully)

* fix other tests

* fix some tests

* fix tests

* clean up

* add more tests

* uncomment other tests

* respond to comments

* fix build

* respond to comments
2025-04-24 08:47:13 -04:00
Mayukha Vadari
177cdaf550 Connect votable gas limit into VM (#5360)
* [WIP] add gas limit

* [WIP] host function escrow tests

* finish test

* uncomment out tests
2025-03-25 10:55:33 -04:00
pwang200
1573a443b7 smart escrow devnet 1 host functions (#5353)
* devnet 1 host functions

* clang-format

* fix build issues
2025-03-24 17:07:17 -04:00
Mayukha Vadari
911c0466c0 Merge develop into ripple/smart-escrow (#5357)
* Set version to 2.4.0

* refactor: Remove unused and add missing includes (#5293)

The codebase is filled with includes that are unused and can thus be removed. At the same time, files often do not include all of the headers that contain the definitions they use. This change uses clang-format and clang-tidy to clean up the includes, with minor manual intervention to ensure the code compiles on all platforms.

* refactor: Calculate numFeatures automatically (#5324)

Manually updating numFeatures is an annoying process that is easily forgotten and leads to frequent merge conflicts. This change takes advantage of the `XRPL_FEATURE` and `XRPL_FIX` macros, and adds a new `XRPL_RETIRE` macro, to set `numFeatures` automatically.
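
As a sketch of the technique (the macro names come from the commit message; the expansion and the features.macro include shown here are illustrative assumptions, not the exact rippled code), each feature macro can expand to +1 so the compiler tallies the count:

    // Illustrative sketch: every feature macro expands to "+1", so the
    // compiler computes numFeatures directly from the feature list.
    #include <cstddef>

    #define XRPL_FEATURE(name, supported, vote) +1
    #define XRPL_FIX(name, supported, vote) +1
    #define XRPL_RETIRE(name) +1  // retired amendments still count toward the total

    static constexpr std::size_t numFeatures = 0
    #include "features.macro"  // one XRPL_FEATURE/XRPL_FIX/XRPL_RETIRE entry per amendment
        ;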

* refactor: Improve ordering of headers with clang-format (#5343)

Removes all manual header groupings from source and header files by leveraging clang-format options.

* Rename "deadlock" to "stall" in `LoadManager` (#5341)

What the LoadManager class does is stall detection, which is not the same as deadlock detection. In the condition of severe CPU starvation, LoadManager will currently intentionally crash rippled reporting `LogicError: Deadlock detected`. This error message is misleading as the condition being detected is not a deadlock. This change fixes and refactors the code in response.

* Adds hub.xrpl-commons.org as a new Bootstrap Cluster (#5263)

* fix: Error message for ledger_entry rpc (#5344)

Changes the error to `malformedAddress` for `permissioned_domain` in the `ledger_entry` rpc, when the account is not a string. This change makes it more clear to a user what is wrong with their request.
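
The check described amounts to something like the following (hypothetical sketch; `jvResult` and the `jss` field names are assumptions about the handler's surroundings):

    // Return "malformedAddress" when the account inside permissioned_domain
    // is not a JSON string, instead of a generic error.
    if (!params[jss::permissioned_domain][jss::account].isString())
    {
        jvResult[jss::error] = "malformedAddress";
        return jvResult;
    }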

* fix: Handle invalid marker parameter in grpc call (#5317)

The `end_marker` is used to limit the range of ledger entries to fetch. If `end_marker` is less than `marker`, a crash can occur. This change adds an additional check.
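
The added guard is a range sanity check of roughly this shape (a sketch with illustrative names, not the exact gRPC handler code):

    // Reject an inverted range up front instead of crashing while iterating.
    if (request.has_end_marker() && request.end_marker() < request.marker())
        return grpc::Status(
            grpc::StatusCode::INVALID_ARGUMENT,
            "end_marker must not precede marker");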

* fix: trust line RPC no ripple flag (#5345)

The trust line RPC's `no_ripple` flag gets set based on the `lsfDefaultRipple` flag, which is not a flag of the trust line but of the account root. `lsfDefaultRipple` provides no insight into whether this particular trust line has the `lsfLowNoRipple` or `lsfHighNoRipple` flag set, so it should not be used here at all. This change simplifies the logic.
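
In spirit, the simplified logic reads the per-line flag for the requesting account's side directly (a sketch; the function shape is an assumption, while the lsf* constants are rippled's ledger flags):

    #include <cstdint>

    // Sketch: derive no_ripple from the trust line's own flags.
    // lsfLowNoRipple / lsfHighNoRipple are flags of the line itself;
    // lsfDefaultRipple lives on the account root and is irrelevant here.
    bool
    noRippleFlag(std::uint32_t lineFlags, bool viewLowest)
    {
        return (lineFlags & (viewLowest ? lsfLowNoRipple : lsfHighNoRipple)) != 0;
    }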

* refactor: Updates Conan dependencies: RocksDB (#5335)

Updates RocksDB to version 9.7.3, the latest version supported in Conan 1.x. A patch for 9.7.4 that fixes a memory leak is included.

* fix: Remove null pointer deref, just do abort (#5338)

This change removes the existing undefined behavior from `LogicError`, so we can be certain that there will be always a stacktrace.

De-referencing a null pointer is an old trick to generate `SIGSEGV`, which would typically also create a stacktrace. However, it is undefined behaviour, and compilers are free to do something else. A more robust way to create a stacktrace while crashing the program is to use `std::abort`, which we have also used in this location for a long time. Combining the two does not guarantee the expected behaviour: with certain compiler versions, the null-pointer dereference followed by `std::abort` does not cause an immediate crash. We have observed the stacktrace being wiped instead and the thread left in an indeterminate state, with the stacktrace then created without any useful information.
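
A minimal sketch of the resulting pattern (function name illustrative):

    #include <cstdlib>

    [[noreturn]] void
    failFast() noexcept
    {
        // A null-pointer write is undefined behaviour and, under some
        // compilers, does not crash where expected; std::abort() raises
        // SIGABRT deterministically and leaves a usable stacktrace.
        std::abort();
    }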

* chore: Add PR number to payload (#5310)

This PR adds one more payload field to the libXRPL compatibility check workflow - the PR number itself.

* chore: Update link to ripple-binary-codec (#5355)

The link to ripple-binary-codec's definitions.json appears to be outdated. The updated link is also documented here: https://xrpl.org/docs/references/protocol/binary-format#definitions-file

* Prevent consensus from getting stuck in the establish phase (#5277)

- Detects if the consensus process is "stalled". If it is, then we can declare a 
  consensus and end successfully even if we do not have 80% agreement on
  our proposal.
  - "Stalled" is defined as:
    - We have a close time consensus
    - Each disputed transaction is individually stalled:
      - It has been in the final "stuck" 95% requirement for at least 2
        (avMIN_ROUNDS) "inner rounds" of phaseEstablish,
      - and either all of the other trusted proposers or this validator, if proposing,
        have had the same vote(s) for at least 4 (avSTALLED_ROUNDS) "inner
        rounds", and at least 80% of the validators (including this one, if
        appropriate) agree about the vote (whether yes or no).
- If we have been in the establish phase for more than 10x the previous
  consensus establish phase's time, then consensus is considered "expired",
  and we will leave the round, which sends a partial validation (indicating
  that the node is moving on without validating). Two restrictions avoid
  prematurely exiting, or having an extended exit in extreme situations.
  - The 10x time is clamped to be within a range of 15s
    (ledgerMAX_CONSENSUS) to 120s (ledgerABANDON_CONSENSUS).
  - If consensus has not had an opportunity to walk through all avalanche
    states (defined as not going through 8 "inner rounds" of phaseEstablish),
    then ConsensusState::Expired is treated as ConsensusState::No.
- When enough nodes leave the round, any remaining nodes will see they've
  fallen behind, and move on, too, generally before hitting the timeout. Any
  validations or partial validations sent during this time will help the
  consensus process bring the nodes back together.
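
Condensed into code, the stall and expiration tests described above look roughly like this (a sketch built only from the constants and thresholds named in this message; the surrounding types and names are assumptions):

    #include <algorithm>
    #include <chrono>

    // Thresholds cited above.
    constexpr int avMIN_ROUNDS = 2;      // rounds spent at the final 95% requirement
    constexpr int avSTALLED_ROUNDS = 4;  // rounds with no proposer changing its vote

    struct DisputeStats
    {
        int roundsAtFinalThreshold;  // "inner rounds" of phaseEstablish at 95%
        int roundsUnchanged;         // "inner rounds" with unchanged votes
        double agreement;            // fraction of validators agreeing (yes or no)
    };

    // One disputed transaction counts as stalled when it meets the three
    // conditions above; the round is stalled when close time has consensus
    // and every dispute is individually stalled.
    bool
    isDisputeStalled(DisputeStats const& d)
    {
        return d.roundsAtFinalThreshold >= avMIN_ROUNDS &&
            d.roundsUnchanged >= avSTALLED_ROUNDS &&
            d.agreement >= 0.80;
    }

    // Expiration: 10x the previous establish phase, clamped to [15s, 120s].
    std::chrono::milliseconds
    expireAfter(std::chrono::milliseconds prevEstablishTime)
    {
        using namespace std::chrono_literals;
        return std::clamp<std::chrono::milliseconds>(
            10 * prevEstablishTime, 15s, 120s);
    }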

---------

Co-authored-by: Michael Legleux <mlegleux@ripple.com>
Co-authored-by: Bart <bthomee@users.noreply.github.com>
Co-authored-by: Ed Hennis <ed@ripple.com>
Co-authored-by: Bronek Kozicki <brok@incorrekt.com>
Co-authored-by: Darius Tumas <Tokeiito@users.noreply.github.com>
Co-authored-by: Sergey Kuznetsov <skuznetsov@ripple.com>
Co-authored-by: cyan317 <120398799+cindyyan317@users.noreply.github.com>
Co-authored-by: Vlad <129996061+vvysokikh1@users.noreply.github.com>
Co-authored-by: Alex Kremer <akremer@ripple.com>
2025-03-20 16:47:14 -04:00
Mayukha Vadari
b6a95f9970 PoC Smart Escrows (#5340)
* wasmedge in unittest

* add WasmVM.h and cpp

* accountID comparison (vector<u8>) working

* json decode tx and ledger object with two buffers working

* wasm return a buffer working

* add a failure test case to P2P3

* host function return ledger sqn

* instruction gas and host function gas

* basics

* add scaffold

* add amendment check

* working PoC

* get test working

* fix clang-format

* prototype #2

* p2p3

* [WIP] P4

* P5

* add calculateBaseFee

* add FinishFunction preflight checks (+ tests)

* additional reserve for sfFinishFunction

* higher fees for EscrowFinish

* rename amendment to SmartEscrow

* make fee voting changes, add basic tests

* clean up

* clean up

* clean up

* more cleanup

* add subscribe tests

* add more tests

* undo formatting

* undo formatting

* remove bad comment

* more debugging statements

* fix clang-format

* fix rebase issues

* fix more rebase issues

* more rebase fixes

* add source code for wasm

* respond to comments

* add const

---------

Co-authored-by: Peng Wang <pwang200@gmail.com>
2025-03-20 14:08:06 -04:00
402 changed files with 32601 additions and 20644 deletions


@@ -1,5 +1,5 @@
 ---
 Language: Cpp
 AccessModifierOffset: -4
 AlignAfterOpenBracket: AlwaysBreak
 AlignConsecutiveAssignments: false
@@ -19,52 +19,52 @@ AlwaysBreakTemplateDeclarations: true
 BinPackArguments: false
 BinPackParameters: false
 BraceWrapping:
   AfterClass: true
   AfterControlStatement: true
   AfterEnum: false
   AfterFunction: true
   AfterNamespace: false
   AfterObjCDeclaration: true
   AfterStruct: true
   AfterUnion: true
   BeforeCatch: true
   BeforeElse: true
   IndentBraces: false
 BreakBeforeBinaryOperators: false
 BreakBeforeBraces: Custom
 BreakBeforeTernaryOperators: true
 BreakConstructorInitializersBeforeComma: true
 ColumnLimit: 80
-CommentPragmas: '^ IWYU pragma:'
+CommentPragmas: "^ IWYU pragma:"
 ConstructorInitializerAllOnOneLineOrOnePerLine: true
 ConstructorInitializerIndentWidth: 4
 ContinuationIndentWidth: 4
 Cpp11BracedListStyle: true
 DerivePointerAlignment: false
 DisableFormat: false
 ExperimentalAutoDetectBinPacking: false
-ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
+ForEachMacros: [Q_FOREACH, BOOST_FOREACH]
 IncludeBlocks: Regroup
 IncludeCategories:
-  - Regex: '^<(test)/'
-    Priority: 0
-  - Regex: '^<(xrpld)/'
-    Priority: 1
-  - Regex: '^<(xrpl)/'
-    Priority: 2
-  - Regex: '^<(boost)/'
-    Priority: 3
-  - Regex: '^.*/'
-    Priority: 4
-  - Regex: '^.*\.h'
-    Priority: 5
-  - Regex: '.*'
-    Priority: 6
-IncludeIsMainRegex: '$'
+  - Regex: "^<(test)/"
+    Priority: 0
+  - Regex: "^<(xrpld)/"
+    Priority: 1
+  - Regex: "^<(xrpl)/"
+    Priority: 2
+  - Regex: "^<(boost)/"
+    Priority: 3
+  - Regex: "^.*/"
+    Priority: 4
+  - Regex: '^.*\.h'
+    Priority: 5
+  - Regex: ".*"
+    Priority: 6
+IncludeIsMainRegex: "$"
 IndentCaseLabels: true
 IndentFunctionDeclarationAfterType: false
 IndentRequiresClause: true
 IndentWidth: 4
 IndentWrappedFunctionNames: false
 KeepEmptyLinesAtTheStartOfBlocks: false
 MaxEmptyLinesToKeep: 1
@@ -78,20 +78,20 @@ PenaltyBreakString: 1000
 PenaltyExcessCharacter: 1000000
 PenaltyReturnTypeOnItsOwnLine: 200
 PointerAlignment: Left
 ReflowComments: true
 RequiresClausePosition: OwnLine
 SortIncludes: true
 SpaceAfterCStyleCast: false
 SpaceBeforeAssignmentOperators: true
 SpaceBeforeParens: ControlStatements
 SpaceInEmptyParentheses: false
 SpacesBeforeTrailingComments: 2
 SpacesInAngles: false
 SpacesInContainerLiterals: true
 SpacesInCStyleCastParentheses: false
 SpacesInParentheses: false
 SpacesInSquareBrackets: false
 Standard: Cpp11
 TabWidth: 8
 UseTab: Never
 QualifierAlignment: Right


@@ -1,76 +0,0 @@
name: Build and Test (Linux and MacOS)
inputs:
build_dir:
description: 'The directory where to build.'
required: true
type: string
build_type:
description: 'The build type to use.'
required: true
type: string
cmake_args:
description: 'Additional arguments to pass to CMake.'
required: false
type: string
cmake_generator:
description: 'The CMake generator to use for the build.'
required: true
type: string
cmake_target:
description: 'The CMake target to build.'
required: true
type: string
os:
description: 'A string representing which operating system is used.'
required: true
type: choice
options:
- Linux
- MacOS
- Windows
# Install the Conan profiles and log into the specified remote. We first remove
# the remote if it already exists, which can occur on self-hosted runners where
# the workspace is not cleaned up between runs.
runs:
using: composite
steps:
- name: Configure CMake
shell: bash
working-directory: ${{ inputs.build_dir }}
run: |
cmake \
${{ inputs.cmake_generator && format('-G "{0}"', inputs.cmake_generator) || '' }} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.build_type }} \
-Dtests=TRUE \
-Dxrpld=TRUE \
${{ inputs.cmake_args }} \
..
- name: Build the binary
shell: bash
working-directory: ${{ inputs.build_dir }}
run: |
cmake --build . \
--config ${{ inputs.build_type }} \
--parallel $(nproc) \
--target ${{ inputs.cmake_target }}
- name: Check linking
if: inputs.os == 'Linux'
shell: bash
working-directory: ${{ inputs.build_dir }}
run: |
ldd ./rippled
if [ "$(ldd ./rippled | grep -E '(libstdc\+\+|libgcc)' | wc -l)" -eq 0 ]; then
echo 'The binary is statically linked.'
else
echo 'The binary is dynamically linked.'
exit 1
fi
- name: Test the binary
shell: bash
working-directory: ${{ inputs.build_dir }}/${{ inputs.os == 'Windows' && inputs.build_type || '' }}
run: |
./rippled --unittest --unittest-jobs $(nproc)
ctest -j $(nproc) --output-on-failure

.github/actions/build/action.yml (new file, 34 lines)

@@ -0,0 +1,34 @@
name: build
inputs:
generator:
default: null
configuration:
required: true
cmake-args:
default: null
cmake-target:
default: all
# An implicit input is the environment variable `build_dir`.
runs:
using: composite
steps:
- name: configure
shell: bash
run: |
cd ${build_dir}
cmake \
${{ inputs.generator && format('-G "{0}"', inputs.generator) || '' }} \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${{ inputs.configuration }} \
-Dtests=TRUE \
-Dxrpld=TRUE \
${{ inputs.cmake-args }} \
..
- name: build
shell: bash
run: |
cmake \
--build ${build_dir} \
--config ${{ inputs.configuration }} \
--parallel ${NUM_PROCESSORS:-$(nproc)} \
--target ${{ inputs.cmake-target }}


@@ -1,51 +0,0 @@
name: Configure Conan
inputs:
conan_global_conf:
description: 'The contents of the global Conan configuration file.'
required: true
type: string
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
type: string
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
type: string
# Install the Conan profiles and log into the specified remote. We first remove
# the remote if it already exists, which can occur on self-hosted runners where
# the workspace is not cleaned up between runs.
runs:
using: composite
steps:
- name: Install Conan profile
shell: bash
run: |
echo "${{ inputs.conan_global_conf }}" >> $(conan config home)/global.conf
conan config install conan/profiles/default -tf $(conan config home)/profiles/
echo "Installed Conan profile:"
conan profile show
- name: Add Conan remote
shell: bash
run: |
if conan remote list | grep -q '${{ inputs.conan_remote_name }}'; then
conan remote remove ${{ inputs.conan_remote_name }}
echo "Removed Conan remote '${{ inputs.conan_remote_name }}'."
fi
conan remote add --index 0 ${{ inputs.conan_remote_name }} ${{ inputs.conan_remote_url }}
echo "Added new conan remote '${{ inputs.conan_remote_name }}' at ${{ inputs.conan_remote_url }}."
- name: Log into Conan remote
shell: bash
run: |
conan remote login ${{ inputs.conan_remote_name }} ${{ inputs.conan_remote_username }} --password "${{ inputs.conan_remote_password }}"
conan remote list-users

.github/actions/dependencies/action.yml (new file, 58 lines)

@@ -0,0 +1,58 @@
name: dependencies
inputs:
configuration:
required: true
# An implicit input is the environment variable `build_dir`.
runs:
using: composite
steps:
- name: unlock Conan
shell: bash
run: conan remove --locks
- name: export custom recipes
shell: bash
run: |
conan config set general.revisions_enabled=1
conan export external/snappy snappy/1.1.10@
conan export external/rocksdb rocksdb/9.7.3@
conan export external/soci soci/4.0.3@
conan export external/nudb nudb/2.0.8@
conan export -k external/wamr wamr/2.2.0@
- name: add Ripple Conan remote
shell: bash
run: |
conan remote list
conan remote remove ripple || true
# Do not quote the URL. An empty string will be accepted (with
# a non-fatal warning), but a missing argument will not.
conan remote add ripple ${{ env.CONAN_URL }} --insert 0
- name: try to authenticate to Ripple Conan remote
id: remote
shell: bash
run: |
# `conan user` implicitly uses the environment variables
# CONAN_LOGIN_USERNAME_<REMOTE> and CONAN_PASSWORD_<REMOTE>.
# https://docs.conan.io/1/reference/commands/misc/user.html#using-environment-variables
# https://docs.conan.io/1/reference/env_vars.html#conan-login-username-conan-login-username-remote-name
# https://docs.conan.io/1/reference/env_vars.html#conan-password-conan-password-remote-name
echo outcome=$(conan user --remote ripple --password >&2 \
&& echo success || echo failure) | tee ${GITHUB_OUTPUT}
- name: list missing binaries
id: binaries
shell: bash
# Print the list of dependencies that would need to be built locally.
# A non-empty list means we have "failed" to cache binaries remotely.
run: |
echo missing=$(conan info . --build missing --settings build_type=${{ inputs.configuration }} --json 2>/dev/null | grep '^\[') | tee ${GITHUB_OUTPUT}
- name: install dependencies
shell: bash
run: |
mkdir ${build_dir}
cd ${build_dir}
conan install \
--output-folder . \
--build missing \
--options tests=True \
--options xrpld=True \
--settings build_type=${{ inputs.configuration }} \
..


@@ -1,37 +0,0 @@
name: Install-dependencies
inputs:
build_dir:
description: 'The directory where to build.'
required: true
type: string
build_type:
description: 'The build type to use.'
required: true
type: string
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
# Install the Conan profiles and log into the specified remote. We first remove
# the remote if it already exists, which can occur on self-hosted runners where
# the workspace is not cleaned up between runs.
runs:
using: composite
steps:
- name: Install Conan dependencies
shell: bash
run: |
mkdir -p ${{ inputs.build_dir }}
cd ${{ inputs.build_dir }}
conan install \
--output-folder . \
--build * \
--options:host "&:tests=True" \
--options:host "&:xrpld=True" \
--settings:all build_type=${{ inputs.build_type }} \
..
- name: Upload Conan dependencies
shell: bash
run: conan upload '*' --confirm --remote ${{ inputs.conan_remote_name }}


@@ -1,225 +0,0 @@
# This workflow builds and tests the binary on various Debian configurations.
name: Debian
on:
workflow_call:
inputs:
build_dir:
description: 'The directory where to build.'
required: false
type: string
default: '.build'
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
secrets:
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-debian
cancel-in-progress: true
defaults:
run:
shell: bash
env:
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs. The verbosity is set to verbose to
# provide more information during the build process.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# GitHub does not allow us to specify a reusable matrix strategy, so to avoid
# duplication, we define it here using environment variables and create the
# matrix in the first job. The matrix defined below should be kept in sync
# with https://github.com/XRPLF/ci/blob/main/.github/workflows/debian.yml.
STRATEGY_MATRIX_ARCHITECTURE: >-
[
{
"platform": "linux/amd64",
"runner": ["self-hosted", "Linux", "X64", "devbox"]
},
{
"platform": "linux/arm64",
"runner": ["self-hosted", "Linux", "ARM64", "devbox"]
}
]
STRATEGY_MATRIX_OS: >-
[
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "gcc",
"compiler_version": "12"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "gcc",
"compiler_version": "13"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "gcc",
"compiler_version": "14"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "clang",
"compiler_version": "16"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "clang",
"compiler_version": "17"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "clang",
"compiler_version": "18"
},
{
"distro": "debian",
"release": "bookworm",
"compiler_name": "clang",
"compiler_version": "19"
}
]
STRATEGY_MATRIX_BUILD_TYPE: >-
[
"Debug",
"Release"
]
STRATEGY_MATRIX_CMAKE_ARGS: >-
[
"-DUnity=OFF",
"-DUnity=ON"
]
# STRATEGY_MATRIX_ARCHITECTURE: >-
# [
# {
# "platform": "linux/amd64",
# "runner": ["self-hosted", "Linux", "X64"]
# }
# ]
# STRATEGY_MATRIX_OS: >-
# [
# {
# "distro": "debian",
# "release": "bookworm",
# "compiler_name": "gcc",
# "compiler_version": "12"
# }
# ]
# STRATEGY_MATRIX_BUILD_TYPE: >-
# [
# "Release"
# ]
# STRATEGY_MATRIX_CMAKE_ARGS: >-
# [
# "-DUnity=ON"
# ]
jobs:
# Generate the strategy matrix and expose environment variables to be used by
# following jobs. Exposing env vars this way is needed as they cannot be
# directly passed as inputs to reusable workflows (although they can be passed
# as inputs to actions).
generate-outputs:
runs-on: ubuntu-latest
steps:
- name: Generate outputs
id: generate
run: |
echo "strategy_matrix_architecture=$(jq -c <<< '${{ env.STRATEGY_MATRIX_ARCHITECTURE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_os=$(jq -c <<< '${{ env.STRATEGY_MATRIX_OS }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_build_type=$(jq -c <<< '${{ env.STRATEGY_MATRIX_BUILD_TYPE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_cmake_args=$(jq -c <<< '${{ env.STRATEGY_MATRIX_CMAKE_ARGS }}')" >> "$GITHUB_OUTPUT"
outputs:
conan_global_conf: ${{ env.CONAN_GLOBAL_CONF }}
strategy_matrix_architecture: ${{ steps.generate.outputs.strategy_matrix_architecture }}
strategy_matrix_os: ${{ steps.generate.outputs.strategy_matrix_os }}
strategy_matrix_build_type: ${{ steps.generate.outputs.strategy_matrix_build_type }}
strategy_matrix_cmake_args: ${{ steps.generate.outputs.strategy_matrix_cmake_args }}
# Install and cache the dependencies, and then build and test the binary using
# various configurations.
build-test:
needs:
- generate-outputs
runs-on: ${{ matrix.architecture.runner }}
container: ghcr.io/xrplf/ci/${{ matrix.os.distro }}-${{ matrix.os.release }}:${{ matrix.os.compiler_name }}-${{ matrix.os.compiler_version }}
strategy:
fail-fast: false
max-parallel: 4
matrix:
architecture: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_architecture) }}
os: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_os) }}
build_type: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_build_type) }}
cmake_args: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_cmake_args) }}
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
shell: bash
run: |
echo "Checking path"
echo ${PATH} | tr ':' '\n'
echo "Checking environment variables."
env | sort
- name: Check versions
shell: bash
run: |
echo "Checking CMake version."
cmake --version
echo "Checking compiler version."
${CC} --version
echo "Checking Conan version."
conan --version
echo "Checking Ninja version."
ninja --version
- name: Configure Conan
uses: ./.github/actions/configure-conan
with:
conan_global_conf: ${{ inputs.conan_global_conf }}
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
conan_remote_username: ${{ secrets.conan_remote_username }}
conan_remote_password: ${{ secrets.conan_remote_password }}
- name: Install dependencies
uses: ./.github/actions/install-dependencies
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
conan_remote_name: ${{ inputs.conan_remote_name }}
- name: Build and test the binary
uses: ./.github/actions/build-test
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_generator: 'Ninja'
cmake_target: 'all'
os: 'Linux'


@@ -1,160 +0,0 @@
# This workflow builds and tests the binary on various MacOS configurations.
name: MacOS
on:
workflow_call:
inputs:
build_dir:
description: 'The directory where to build.'
required: false
type: string
default: '.build'
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
secrets:
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-macos
cancel-in-progress: true
defaults:
run:
shell: bash
env:
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs. The verbosity is set to verbose to
# provide more information during the build process.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# GitHub does not allow us to specify a reusable matrix strategy, so to avoid
# duplication, we define it here using environment variables and create the
# matrix in the first job.
STRATEGY_MATRIX_ARCHITECTURE: >-
[
{
"runner": ["self-hosted", "macOS", "ARM64", "devbox"]
}
]
STRATEGY_MATRIX_BUILD_TYPE: >-
[
"Debug",
"Release"
]
STRATEGY_MATRIX_CMAKE_ARGS: >-
[
"-DCMAKE_POLICY_VERSION_MINIMUM=3.5 -DUnity=OFF",
"-DCMAKE_POLICY_VERSION_MINIMUM=3.5 -DUnity=ON"
]
# STRATEGY_MATRIX_ARCHITECTURE: >-
# [
# {
# "runner": ["self-hosted", "macOS", "ARM64", "mac-runner-m1"]
# }
# ]
# STRATEGY_MATRIX_BUILD_TYPE: >-
# [
# "Release"
# ]
# STRATEGY_MATRIX_CMAKE_ARGS: >-
# [
# "-DCMAKE_POLICY_VERSION_MINIMUM=3.5 -DUnity=ON"
# ]
jobs:
# Generate the strategy matrix and expose environment variables to be used by
# following jobs. Exposing env vars this way is needed as they cannot be
# directly passed as inputs to reusable workflows (although they can be passed
# as inputs to actions).
generate-outputs:
runs-on: ubuntu-latest
steps:
- name: Generate outputs
id: generate
run: |
echo "strategy_matrix_architecture=$(jq -c <<< '${{ env.STRATEGY_MATRIX_ARCHITECTURE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_build_type=$(jq -c <<< '${{ env.STRATEGY_MATRIX_BUILD_TYPE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_cmake_args=$(jq -c <<< '${{ env.STRATEGY_MATRIX_CMAKE_ARGS }}')" >> "$GITHUB_OUTPUT"
outputs:
conan_global_conf: ${{ env.CONAN_GLOBAL_CONF }}
strategy_matrix_architecture: ${{ steps.generate.outputs.strategy_matrix_architecture }}
strategy_matrix_build_type: ${{ steps.generate.outputs.strategy_matrix_build_type }}
strategy_matrix_cmake_args: ${{ steps.generate.outputs.strategy_matrix_cmake_args }}
# Install and cache the dependencies, and then build and test the binary using
# various configurations.
build-test:
needs:
- generate-outputs
runs-on: ${{ matrix.architecture.runner }}
strategy:
fail-fast: false
max-parallel: 4
matrix:
architecture: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_architecture) }}
build_type: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_build_type) }}
cmake_args: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_cmake_args) }}
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
shell: bash
run: |
echo "Checking path"
echo ${PATH} | tr ':' '\n'
echo "Checking environment variables."
env | sort
- name: Check versions
shell: bash
run: |
echo "Checking CMake version."
cmake --version
echo "Checking compiler version."
clang --version
echo "Checking Conan version."
conan --version
echo "Checking Ninja version."
ninja --version
- name: Configure Conan
uses: ./.github/actions/configure-conan
with:
conan_global_conf: ${{ inputs.conan_global_conf }}
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
conan_remote_username: ${{ secrets.conan_remote_username }}
conan_remote_password: ${{ secrets.conan_remote_password }}
- name: Install dependencies
uses: ./.github/actions/install-dependencies
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
conan_remote_name: ${{ inputs.conan_remote_name }}
- name: Build and test the binary
uses: ./.github/actions/build-test
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_generator: 'Ninja'
cmake_target: 'all'
os: 'MacOS'


@@ -1,202 +0,0 @@
# This workflow builds and tests the binary on various Red Hat Enterprise Linux
# configurations.
name: RHEL
on:
workflow_call:
inputs:
build_dir:
description: 'The directory where to build.'
required: false
type: string
default: '.build'
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
secrets:
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-rhel
cancel-in-progress: true
defaults:
run:
shell: bash
env:
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs. The verbosity is set to verbose to
# provide more information during the build process.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# GitHub does not allow us to specify a reusable matrix strategy, so to avoid
# duplication, we define it here using environment variables and create the
# matrix in the first job. The matrix defined below should be kept in sync
# with https://github.com/XRPLF/ci/blob/main/.github/workflows/rhel.yml.
STRATEGY_MATRIX_ARCHITECTURE: >-
[
{
"platform": "linux/amd64",
"runner": ["self-hosted", "Linux", "X64", "devbox"]
},
{
"platform": "linux/arm64",
"runner": ["self-hosted", "Linux", "ARM64", "devbox"]
}
]
STRATEGY_MATRIX_OS: >-
[
{
"distro": "rhel",
"release": "9.6",
"compiler_name": "gcc",
"compiler_version": "13"
},
{
"distro": "rhel",
"release": "9.6",
"compiler_name": "gcc",
"compiler_version": "14"
},
{
"distro": "rhel",
"release": "9.6",
"compiler_name": "clang",
"compiler_version": "any"
}
]
STRATEGY_MATRIX_BUILD_TYPE: >-
[
"Debug",
"Release"
]
STRATEGY_MATRIX_CMAKE_ARGS: >-
[
"-DUnity=OFF",
"-DUnity=ON"
]
# STRATEGY_MATRIX_ARCHITECTURE: >-
# [
# {
# "platform": "linux/amd64",
# "runner": ["self-hosted", "Linux", "X64"]
# }
# ]
# STRATEGY_MATRIX_OS: >-
# [
# {
# "distro": "rhel",
# "release": "9.6",
# "compiler_name": "gcc",
# "compiler_version": "13"
# }
# ]
# STRATEGY_MATRIX_BUILD_TYPE: >-
# [
# "Release"
# ]
# STRATEGY_MATRIX_CMAKE_ARGS: >-
# [
# "-DUnity=ON"
# ]
jobs:
# Generate the strategy matrix and expose environment variables to be used by
# following jobs. Exposing env vars this way is needed as they cannot be
# directly passed as inputs to reusable workflows (although they can be passed
# as inputs to actions).
generate-outputs:
runs-on: ubuntu-latest
steps:
- name: Generate outputs
id: generate
run: |
echo "strategy_matrix_architecture=$(jq -c <<< '${{ env.STRATEGY_MATRIX_ARCHITECTURE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_os=$(jq -c <<< '${{ env.STRATEGY_MATRIX_OS }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_build_type=$(jq -c <<< '${{ env.STRATEGY_MATRIX_BUILD_TYPE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_cmake_args=$(jq -c <<< '${{ env.STRATEGY_MATRIX_CMAKE_ARGS }}')" >> "$GITHUB_OUTPUT"
outputs:
conan_global_conf: ${{ env.CONAN_GLOBAL_CONF }}
strategy_matrix_architecture: ${{ steps.generate.outputs.strategy_matrix_architecture }}
strategy_matrix_os: ${{ steps.generate.outputs.strategy_matrix_os }}
strategy_matrix_build_type: ${{ steps.generate.outputs.strategy_matrix_build_type }}
strategy_matrix_cmake_args: ${{ steps.generate.outputs.strategy_matrix_cmake_args }}
# Install and cache the dependencies, and then build and test the binary using
# various configurations.
build-test:
needs:
- generate-outputs
runs-on: ${{ matrix.architecture.runner }}
container: ghcr.io/xrplf/ci/${{ matrix.os.distro }}-${{ matrix.os.release }}:${{ matrix.os.compiler_name }}-${{ matrix.os.compiler_version }}
strategy:
fail-fast: false
max-parallel: 4
matrix:
architecture: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_architecture) }}
os: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_os) }}
build_type: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_build_type) }}
cmake_args: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_cmake_args) }}
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
shell: bash
run: |
echo "Checking path"
echo ${PATH} | tr ':' '\n'
echo "Checking environment variables."
env | sort
- name: Check versions
shell: bash
run: |
echo "Checking CMake version."
cmake --version
echo "Checking compiler version."
${CC} --version
echo "Checking Conan version."
conan --version
echo "Checking Ninja version."
ninja --version
- name: Configure Conan
uses: ./.github/actions/configure-conan
with:
conan_global_conf: ${{ inputs.conan_global_conf }}
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
conan_remote_username: ${{ secrets.conan_remote_username }}
conan_remote_password: ${{ secrets.conan_remote_password }}
- name: Install dependencies
uses: ./.github/actions/install-dependencies
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
conan_remote_name: ${{ inputs.conan_remote_name }}
- name: Build and test the binary
uses: ./.github/actions/build-test
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_generator: 'Ninja'
cmake_target: 'all'
os: 'Linux'


@@ -1,225 +0,0 @@
# This workflow builds and tests the binary on various Ubuntu configurations.
name: Ubuntu
on:
workflow_call:
inputs:
build_dir:
description: 'The directory where to build.'
required: false
type: string
default: '.build'
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
secrets:
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-ubuntu
cancel-in-progress: true
defaults:
run:
shell: bash
env:
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs. The verbosity is set to verbose to
# provide more information during the build process.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# GitHub does not allow us to specify a reusable matrix strategy, so to avoid
# duplication, we define it here using environment variables and create the
# matrix in the first job. The matrix defined below should be kept in sync
# with https://github.com/XRPLF/ci/blob/main/.github/workflows/ubuntu.yml.
STRATEGY_MATRIX_ARCHITECTURE: >-
[
{
"platform": "linux/amd64",
"runner": ["self-hosted", "Linux", "X64", "devbox"]
},
{
"platform": "linux/arm64",
"runner": ["self-hosted", "Linux", "ARM64", "devbox"]
}
]
STRATEGY_MATRIX_OS: >-
[
{
"distro": "ubuntu",
"release": "jammy",
"compiler_name": "gcc",
"compiler_version": "12"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "gcc",
"compiler_version": "13"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "gcc",
"compiler_version": "14"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "clang",
"compiler_version": "16"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "clang",
"compiler_version": "17"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "clang",
"compiler_version": "18"
},
{
"distro": "ubuntu",
"release": "noble",
"compiler_name": "clang",
"compiler_version": "19"
}
]
STRATEGY_MATRIX_BUILD_TYPE: >-
[
"Debug",
"Release"
]
STRATEGY_MATRIX_CMAKE_ARGS: >-
[
"-DUnity=OFF",
"-DUnity=ON"
]
# STRATEGY_MATRIX_ARCHITECTURE: >-
# [
# {
# "platform": "linux/amd64",
# "runner": ["self-hosted", "Linux", "X64"]
# }
# ]
# STRATEGY_MATRIX_OS: >-
# [
# {
# "distro": "ubuntu",
# "release": "jammy",
# "compiler_name": "gcc",
# "compiler_version": "12"
# }
# ]
# STRATEGY_MATRIX_BUILD_TYPE: >-
# [
# "Release"
# ]
# STRATEGY_MATRIX_CMAKE_ARGS: >-
# [
# "-DUnity=ON"
# ]
jobs:
# Generate the strategy matrix and expose environment variables to be used by
# following jobs. Exposing env vars this way is needed as they cannot be
# directly passed as inputs to reusable workflows (although they can be passed
# as inputs to actions).
generate-outputs:
runs-on: ubuntu-latest
steps:
- name: Generate outputs
id: generate
run: |
echo "strategy_matrix_architecture=$(jq -c <<< '${{ env.STRATEGY_MATRIX_ARCHITECTURE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_os=$(jq -c <<< '${{ env.STRATEGY_MATRIX_OS }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_build_type=$(jq -c <<< '${{ env.STRATEGY_MATRIX_BUILD_TYPE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_cmake_args=$(jq -c <<< '${{ env.STRATEGY_MATRIX_CMAKE_ARGS }}')" >> "$GITHUB_OUTPUT"
outputs:
conan_global_conf: ${{ env.CONAN_GLOBAL_CONF }}
strategy_matrix_architecture: ${{ steps.generate.outputs.strategy_matrix_architecture }}
strategy_matrix_os: ${{ steps.generate.outputs.strategy_matrix_os }}
strategy_matrix_build_type: ${{ steps.generate.outputs.strategy_matrix_build_type }}
strategy_matrix_cmake_args: ${{ steps.generate.outputs.strategy_matrix_cmake_args }}
# Install and cache the dependencies, and then build and test the binary using
# various configurations.
build-test:
needs:
- generate-outputs
runs-on: ${{ matrix.architecture.runner }}
container: ghcr.io/xrplf/ci/${{ matrix.os.distro }}-${{ matrix.os.release }}:${{ matrix.os.compiler_name }}-${{ matrix.os.compiler_version }}
strategy:
fail-fast: false
max-parallel: 4
matrix:
architecture: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_architecture) }}
os: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_os) }}
build_type: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_build_type) }}
cmake_args: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_cmake_args) }}
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
shell: bash
run: |
echo "Checking path"
echo ${PATH} | tr ':' '\n'
echo "Checking environment variables."
env | sort
- name: Check versions
shell: bash
run: |
echo "Checking CMake version."
cmake --version
echo "Checking compiler version."
${CC} --version
echo "Checking Conan version."
conan --version
echo "Checking Ninja version."
ninja --version
- name: Configure Conan
uses: ./.github/actions/configure-conan
with:
conan_global_conf: ${{ inputs.conan_global_conf }}
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
conan_remote_username: ${{ secrets.conan_remote_username }}
conan_remote_password: ${{ secrets.conan_remote_password }}
- name: Install dependencies
uses: ./.github/actions/install-dependencies
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
conan_remote_name: ${{ inputs.conan_remote_name }}
- name: Build and test the binary
uses: ./.github/actions/build-test
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_generator: 'Ninja'
cmake_target: 'all'
os: 'Linux'


@@ -1,159 +0,0 @@
# This workflow builds and tests the binary on various Windows configurations.
name: Windows
on:
workflow_call:
inputs:
build_dir:
description: 'The directory where to build.'
required: false
type: string
default: '.build'
conan_remote_name:
description: 'The name of the Conan remote to use.'
required: true
type: string
conan_remote_url:
description: 'The URL of the Conan remote to use.'
required: true
type: string
secrets:
conan_remote_username:
description: 'The username for logging into the Conan remote.'
required: true
conan_remote_password:
description: 'The password for logging into the Conan remote.'
required: true
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-windows
cancel-in-progress: true
defaults:
run:
shell: bash
env:
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs. The verbosity is set to verbose to
# provide more information during the build process.
CONAN_GLOBAL_CONF: |
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ (os.cpu_count() * 4/5) | int }}
tools.build:verbosity=verbose
tools.compilation:verbosity=verbose
# GitHub does not allow us to specify a reusable matrix strategy, so to avoid
# duplication, we define it here using environment variables and create the
# matrix in the first job.
STRATEGY_MATRIX_ARCHITECTURE: >-
[
{
"runner": ["self-hosted", "Windows", "devbox"]
}
]
STRATEGY_MATRIX_BUILD_TYPE: >-
[
"Debug",
"Release"
]
STRATEGY_MATRIX_CMAKE_ARGS: >-
[
"-DUnity=OFF",
"-DUnity=ON"
]
# STRATEGY_MATRIX_ARCHITECTURE: >-
# [
# {
# "runner": "windows-latest"
# }
# ]
# STRATEGY_MATRIX_BUILD_TYPE: >-
# [
# "Debug"
# ]
# STRATEGY_MATRIX_CMAKE_ARGS: >-
# [
# "-DUnity=ON"
# ]
jobs:
# Generate the strategy matrix and expose environment variables to be used by
# following jobs. Exposing env vars this way is needed as they cannot be
# directly passed as inputs to reusable workflows (although they can be passed
# as inputs to actions).
generate-outputs:
runs-on: ubuntu-latest
steps:
- name: Generate outputs
id: generate
run: |
echo "strategy_matrix_architecture=$(jq -c <<< '${{ env.STRATEGY_MATRIX_ARCHITECTURE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_build_type=$(jq -c <<< '${{ env.STRATEGY_MATRIX_BUILD_TYPE }}')" >> "$GITHUB_OUTPUT"
echo "strategy_matrix_cmake_args=$(jq -c <<< '${{ env.STRATEGY_MATRIX_CMAKE_ARGS }}')" >> "$GITHUB_OUTPUT"
outputs:
conan_global_conf: ${{ env.CONAN_GLOBAL_CONF }}
strategy_matrix_architecture: ${{ steps.generate.outputs.strategy_matrix_architecture }}
strategy_matrix_build_type: ${{ steps.generate.outputs.strategy_matrix_build_type }}
strategy_matrix_cmake_args: ${{ steps.generate.outputs.strategy_matrix_cmake_args }}
# Install and cache the dependencies, and then build and test the binary using
# various configurations.
build-test:
needs:
- generate-outputs
runs-on: ${{ matrix.architecture.runner }}
strategy:
fail-fast: false
max-parallel: 4
matrix:
architecture: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_architecture) }}
build_type: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_build_type) }}
cmake_args: ${{ fromJson(needs.generate-outputs.outputs.strategy_matrix_cmake_args) }}
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
shell: pwsh
run: |
echo "Checking path"
$env:PATH -split ';' | Sort-Object
echo "Checking environment variables."
- name: choose Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065
with:
python-version: 3.13
- name: install Conan
run: pip install wheel conan
- name: Check versions
shell: bash
run: |
echo "Checking CMake version."
cmake --version
echo "Checking Conan version."
conan --version
- name: Configure Conan
uses: ./.github/actions/configure-conan
with:
conan_global_conf: ${{ inputs.conan_global_conf }}
conan_remote_name: ${{ inputs.conan_remote_name }}
conan_remote_url: ${{ inputs.conan_remote_url }}
conan_remote_username: ${{ secrets.conan_remote_username }}
conan_remote_password: ${{ secrets.conan_remote_password }}
- name: Install dependencies
uses: ./.github/actions/install-dependencies
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
conan_remote_name: ${{ inputs.conan_remote_name }}
- name: Build and test the binary
uses: ./.github/actions/build-test
with:
build_dir: ${{ inputs.build_dir }}
build_type: ${{ matrix.build_type }}
cmake_args: ${{ matrix.cmake_args }}
cmake_generator: 'Visual Studio 17 2022'
cmake_target: 'install'
os: 'Windows'


@@ -1,57 +0,0 @@
# This workflow checks if the code is formatted according to the .clang-format
# rules.
name: Clang Format
# This workflow can only be triggered by other workflows.
on: workflow_call
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-clang-format
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
clang-format:
runs-on: ubuntu-latest
container: ghcr.io/xrplf/ci/tools-rippled-clang-format
steps:
# The $GITHUB_WORKSPACE and ${{ github.workspace }} might not point to the
# same directory for jobs running in containers. The actions/checkout step
# is *supposed* to checkout into $GITHUB_WORKSPACE and then add it to
# safe.directory (see instructions at https://github.com/actions/checkout)
# but that is apparently not happening for some container images. We
# therefore preemptively add both directories to safe.directory. See also
# https://github.com/actions/runner/issues/2058 for more details.
- name: Configure git safe.directory
run: |
git config --global --add safe.directory $GITHUB_WORKSPACE
git config --global --add safe.directory ${{ github.workspace }}
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check version
run: clang-format --version
- name: Format code
run: find include src tests -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format -i {} +
- name: Check for differences
env:
MESSAGE: |
One or more files did not conform to the formatting specified in
.clang-format. Maybe you did not run 'git-clang-format' or
'clang-format' before committing, or your version of clang-format
has an incompatibility with the one used here (see the "Check
version" step above).
Run 'git-clang-format --extensions cpp,h,hpp,ipp develop' in your
repo, and then commit and push the changes.
run: |
DIFF=$(git status --porcelain)
if [ -n "${DIFF}" ]; then
# Print the files that changed to give the contributor a hint about
# what to expect when running git-clang-format on their own machine.
git status
echo "${MESSAGE}"
exit 1
fi


@@ -1,46 +0,0 @@
# This workflow checks if the dependencies between the modules are correctly
# indexed.
name: Levelization
# This workflow can only be triggered by other workflows.
on: workflow_call
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-levelization
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
levelization:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check levelization
run: Builds/levelization/levelization.sh
- name: Check for differences
env:
MESSAGE: |
The dependency relationships between the modules in rippled have
changed, which may be an improvement or a regression.
A rule of thumb is that if your changes caused something to be
removed from loops.txt, it's probably an improvement, while if
something was added, it's probably a regression.
Run './Builds/levelization/levelization.sh' in your repo, and then
commit and push the changes. See Builds/levelization/README.md for
more info.
run: |
DIFF=$(git status --porcelain)
if [ -n "${DIFF}" ]; then
# Print the differences to give the contributor a hint about what to
# expect when running levelization on their own machine.
git diff
echo "${MESSAGE}"
exit 1
fi

.github/workflows/clang-format.yml (new file, 63 lines)

@@ -0,0 +1,63 @@
name: clang-format
on:
push:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
jobs:
check:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
runs-on: ubuntu-24.04
env:
CLANG_VERSION: 18
steps:
- uses: actions/checkout@v4
- name: Install clang-format
run: |
codename=$( lsb_release --codename --short )
sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
EOF
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
sudo apt-get update
sudo apt-get install clang-format-${CLANG_VERSION}
- name: Format first-party sources
run: find include src tests -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format-${CLANG_VERSION} -i {} +
- name: Check for differences
id: assert
run: |
set -o pipefail
git diff --exit-code | tee "clang-format.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: clang-format.patch
if-no-files-found: ignore
path: clang-format.patch
- name: What happened?
if: failure() && steps.assert.outcome == 'failure'
env:
PREAMBLE: |
If you are reading this, you are looking at a failed Github Actions
job. That means you pushed one or more files that did not conform
to the formatting specified in .clang-format. That may be because
you neglected to run 'git clang-format' or 'clang-format' before
committing, or that your version of clang-format has an
incompatibility with the one on this
machine, which is:
SUGGESTION: |
To fix it, you can do one of two things:
1. Download and apply the patch generated as an artifact of this
job to your repo, commit, and push.
2. Run 'git-clang-format --extensions cpp,h,hpp,ipp develop'
in your repo, commit, and push.
run: |
echo "${PREAMBLE}"
clang-format-${CLANG_VERSION} --version
echo "${SUGGESTION}"
exit 1


@@ -1,53 +0,0 @@
name: Documentation
# TODO: Use `workflow_run` to trigger this workflow after checks have completed.
# This can only be done if the `checks` workflow already exists on the default
# branch (i.e. `develop`), so we cannot do this yet.
# See https://docs.github.com/en/actions/reference/workflows-and-actions/events-that-trigger-workflows#workflow_run.
on:
push:
branches:
- develop
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
doxygen:
runs-on: ubuntu-latest
permissions:
contents: write
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Check configuration
run: |
echo "Checking path"
echo ${PATH} | tr ':' '\n'
echo "Checking environment variables."
env | sort
- name: Check versions
run: |
echo "Checking CMake version."
cmake --version
echo "Checking Doxygen version."
doxygen --version
- name: Build documentation
run: |
mkdir build
cd build
cmake -Donly_docs=TRUE ..
cmake --build . --target docs --parallel $(nproc)
- name: Publish documentation
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: build/docs/html

.github/workflows/doxygen.yml (new file, 37 lines)

@@ -0,0 +1,37 @@
name: Build and publish Doxygen documentation
# To test this workflow, push your changes to your fork's `develop` branch.
on:
push:
branches:
- develop
- doxygen
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
documentation:
runs-on: ubuntu-latest
permissions:
contents: write
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
steps:
- name: checkout
uses: actions/checkout@v4
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
cmake --version
doxygen --version
env | sort
- name: build
run: |
mkdir build
cd build
cmake -Donly_docs=TRUE ..
cmake --build . --target docs --parallel $(nproc)
- name: publish
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: build/docs/html

.github/workflows/levelization.yml (new file, 53 lines)

@@ -0,0 +1,53 @@
name: levelization
on:
push:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
jobs:
check:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
runs-on: ubuntu-latest
env:
CLANG_VERSION: 10
steps:
- uses: actions/checkout@v4
- name: Check levelization
run: Builds/levelization/levelization.sh
- name: Check for differences
id: assert
run: |
set -o pipefail
git diff --exit-code | tee "levelization.patch"
- name: Upload patch
if: failure() && steps.assert.outcome == 'failure'
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: levelization.patch
if-no-files-found: ignore
path: levelization.patch
- name: What happened?
if: failure() && steps.assert.outcome == 'failure'
env:
MESSAGE: |
If you are reading this, you are looking at a failed Github
Actions job. That means you changed the dependency relationships
between the modules in rippled. That may be an improvement or a
regression. This check doesn't judge.
A rule of thumb, though, is that if your changes caused
something to be removed from loops.txt, that's probably an
improvement. If something was added, it's probably a regression.
To fix it, you can do one of two things:
1. Download and apply the patch generated as an artifact of this
job to your repo, commit, and push.
2. Run './Builds/levelization/levelization.sh' in your repo,
commit, and push.
See Builds/levelization/README.md for more info.
run: |
echo "${MESSAGE}"
exit 1
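
If this check fails, the message above offers two fixes. A minimal sketch of both, assuming the `levelization.patch` artifact has been downloaded into the repository root (artifact name and script path taken from the workflow above):

```
# Option 1: apply the patch artifact produced by the failed job
git apply levelization.patch

# Option 2: regenerate the results locally with the same script the job runs
./Builds/levelization/levelization.sh

# Either way, commit and push the updated results
git commit -am "Update levelization results"
git push
```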


@@ -1,13 +1,13 @@
name: Check libXRPL compatibility with Clio
env:
CONAN_URL: https://conan.ripplex.io
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
on:
pull_request:
paths:
- 'src/libxrpl/protocol/BuildInfo.cpp'
- 'check-libxrpl.yml'
- '.github/workflows/libxrpl.yml'
types: [opened, reopened, synchronize, ready_for_review]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -33,15 +33,18 @@ jobs:
repo-token: ${{ secrets.GITHUB_TOKEN }}
wait-interval: 10
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
uses: actions/checkout@v4
- name: Generate channel
id: channel
shell: bash
run: |
echo channel="clio/pr_${{ github.event.pull_request.number }}" | tee ${GITHUB_OUTPUT}
- name: Export new package
shell: bash
run: |
conan export . ${{ steps.channel.outputs.channel }}
- name: Add Ripple Conan remote
shell: bash
run: |
conan remote list
conan remote remove ripple || true
@@ -49,11 +52,13 @@ jobs:
conan remote add ripple ${{ env.CONAN_URL }} --insert 0
- name: Parse new version
id: version
shell: bash
run: |
echo version="$(cat src/libxrpl/protocol/BuildInfo.cpp | grep "versionString =" \
| awk -F '"' '{print $2}')" | tee ${GITHUB_OUTPUT}
- name: Try to authenticate to Ripple Conan remote
id: remote
shell: bash
run: |
# `conan user` implicitly uses the environment variables CONAN_LOGIN_USERNAME_<REMOTE> and CONAN_PASSWORD_<REMOTE>.
# https://docs.conan.io/1/reference/commands/misc/user.html#using-environment-variables
@@ -64,6 +69,7 @@ jobs:
- name: Upload new package
id: upload
if: (steps.remote.outputs.outcome == 'success')
shell: bash
run: |
echo "conan upload version ${{ steps.version.outputs.version }} on channel ${{ steps.channel.outputs.channel }}"
echo outcome=$(conan upload xrpl/${{ steps.version.outputs.version }}@${{ steps.channel.outputs.channel }} --remote ripple --confirm >&2 \
@@ -77,6 +83,7 @@ jobs:
steps:
- name: Notify Clio about new version
if: (needs.publish.outputs.outcome == 'success')
shell: bash
run: |
gh api --method POST -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" \
/repos/xrplf/clio/dispatches -f "event_type=check_libxrpl" \

.github/workflows/macos.yml (new file)

@@ -0,0 +1,99 @@
name: macos
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows, instrumentation)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- 'ci/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
test:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
matrix:
platform:
- macos
generator:
- Ninja
configuration:
- Release
runs-on: [self-hosted, macOS]
env:
# The `build` action requires these variables.
build_dir: .build
NUM_PROCESSORS: 12
steps:
- name: checkout
uses: actions/checkout@v4
- name: install Conan
run: |
brew install conan@1
echo '/opt/homebrew/opt/conan@1/bin' >> $GITHUB_PATH
- name: install Ninja
if: matrix.generator == 'Ninja'
run: brew install ninja
- name: install python
run: |
if which python > /dev/null 2>&1; then
echo "Python executable exists"
else
brew install python@3.13
ln -s /opt/homebrew/bin/python3 /opt/homebrew/bin/python
fi
- name: install cmake
run: |
if which cmake > /dev/null 2>&1; then
echo "cmake executable exists"
else
brew install cmake
fi
- name: install nproc
run: |
brew install coreutils
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
python --version
conan --version
cmake --version
nproc --version
echo -n "nproc returns: "
nproc
system_profiler SPHardwareDataType
sysctl -n hw.logicalcpu
clang --version
- name: configure Conan
run : |
conan profile new default --detect || true
conan profile update settings.compiler.cppstd=20 default
- name: build dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: ${{ matrix.generator }}
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: test
run: |
n=$(nproc)
echo "Using $n test jobs"
${build_dir}/rippled --unittest --unittest-jobs $n


@@ -1,90 +0,0 @@
# This workflow runs all workflows to check, build and test the project on
# various Linux flavors, as well as macOS and Windows.
name: Main
# This workflow is triggered on every push to the repository, including pull
# requests. However, it will not run if the pull request is a draft unless it
# has the 'DraftRunCI' label. As GitHub Actions does not support such `if`
# conditions here, the individual jobs will check the pull request state and
# labels to determine whether to run or not. The workflows called by the jobs
# may also have their own conditions to partially or completely skip execution
# as needed.
on: push
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
# check-clang-format:
# if: github.event.pull_request.draft == false || contains(github.event.pull_request.labels.*.name, 'DraftRunCI')
# uses: ./.github/workflows/check-clang-format.yml
#
# check-levelization:
# if: github.event.pull_request.draft == false || contains(github.event.pull_request.labels.*.name, 'DraftRunCI')
# uses: ./.github/workflows/check-levelization.yml
debian:
# needs:
# - check-clang-format
# - check-levelization
uses: ./.github/workflows/build-debian.yml
with:
conan_remote_name: ${{ vars.CONAN_REMOTE_NAME }}
conan_remote_url: ${{ vars.CONAN_REMOTE_URL }}
secrets:
conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}
# rhel:
## needs:
## - check-clang-format
## - check-levelization
# uses: ./.github/workflows/build-rhel.yml
# with:
# conan_remote_name: ${{ vars.CONAN_REMOTE_NAME }}
# conan_remote_url: ${{ vars.CONAN_REMOTE_URL }}
# secrets:
# conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
# conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}
ubuntu:
# needs:
# - check-clang-format
# - check-levelization
uses: ./.github/workflows/build-ubuntu.yml
with:
conan_remote_name: ${{ vars.CONAN_REMOTE_NAME }}
conan_remote_url: ${{ vars.CONAN_REMOTE_URL }}
secrets:
conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}
macos:
# needs:
# - check-clang-format
# - check-levelization
uses: ./.github/workflows/build-macos.yml
with:
conan_remote_name: ${{ vars.CONAN_REMOTE_NAME }}
conan_remote_url: ${{ vars.CONAN_REMOTE_URL }}
secrets:
conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}
windows:
# needs:
# - check-clang-format
# - check-levelization
uses: ./.github/workflows/build-windows.yml
with:
conan_remote_name: ${{ vars.CONAN_REMOTE_NAME }}
conan_remote_url: ${{ vars.CONAN_REMOTE_URL }}
secrets:
conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}


@@ -1,66 +1,60 @@
name: Check for missing commits
name: missing-commits
# TODO: Use `workflow_run` to trigger this workflow after checks have completed.
# This can only be done if the `checks` workflow already exists on the default
# branch (i.e. `develop`), so we cannot do this yet.
# See https://docs.github.com/en/actions/reference/workflows-and-actions/events-that-trigger-workflows#workflow_run.
on:
push:
branches:
# Only check that the branches are up to date when updating the
# relevant branches.
- develop
- release
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
defaults:
run:
shell: bash
jobs:
check:
runs-on: ubuntu-latest
up_to_date:
runs-on: ubuntu-24.04
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for missing commits
id: commits
env:
MESSAGE: |
SUGGESTION: |
If you are reading this, then the commits indicated above are missing
from the "develop" and/or "release" branch. Do a reverse-merge as soon
as possible. See CONTRIBUTING.md for instructions.
If you are reading this, then the commits indicated above are
missing from "develop" and/or "release". Do a reverse-merge
as soon as possible. See CONTRIBUTING.md for instructions.
run: |
set -o pipefail
# Branches are ordered by how "canonical" they are. Every commit in one
# branch should be in all the branches behind it.
order=(master release develop)
# Branches ordered by how "canonical" they are. Every commit in
# one branch should be in all the branches behind it
order=( master release develop )
branches=()
for branch in "${order[@]}"; do
# Check that the branches exist so that this job will work on forked
# repos, which don't necessarily have master and release branches.
echo "Checking if ${branch} exists."
for branch in "${order[@]}"
do
# Check that the branches exist so that this job will work on
# forked repos, which don't necessarily have master and
# release branches.
if git ls-remote --exit-code --heads origin \
refs/heads/${branch} > /dev/null; then
branches+=(origin/${branch})
refs/heads/${branch} > /dev/null
then
branches+=( origin/${branch} )
fi
done
prior=()
for branch in "${branches[@]}"; do
if [[ ${#prior[@]} -ne 0 ]]; then
echo "Checking ${prior[@]} for commits missing from ${branch}."
for branch in "${branches[@]}"
do
if [[ ${#prior[@]} -ne 0 ]]
then
echo "Checking ${prior[@]} for commits missing from ${branch}"
git log --oneline --no-merges "${prior[@]}" \
^$branch | tee -a "missing-commits.txt"
echo
fi
prior+=("${branch}")
prior+=( "${branch}" )
done
if [[ $(cat missing-commits.txt | wc -l) -ne 0 ]]; then
echo "${MESSAGE}"
if [[ $( cat missing-commits.txt | wc -l ) -ne 0 ]]
then
echo "${SUGGESTION}"
exit 1
fi
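
To reproduce this check locally without waiting for CI, a minimal sketch of the same logic (branch order as defined above; any output lines are the missing commits):

```
git fetch origin master release develop
# Commits on master that are missing from release
git log --oneline --no-merges origin/master ^origin/release
# Commits on master or release that are missing from develop
git log --oneline --no-merges origin/master origin/release ^origin/develop
```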

.github/workflows/nix.yml (new file)

@@ -0,0 +1,444 @@
name: nix
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- "ci/**"
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
# This workflow has multiple job matrixes.
# They can be considered phases because most of the matrices ("test",
# "coverage", "conan", ) depend on the first ("dependencies").
#
# The first phase has a job in the matrix for each combination of
# variables that affects dependency ABI:
# platform, compiler, and configuration.
# It creates a GitHub artifact holding the Conan profile,
# and builds and caches binaries for all the dependencies.
# If an Artifactory remote is configured, they are cached there.
# If not, they are added to the GitHub artifact.
# GitHub's "cache" action has a size limit (10 GB) that is too small
# to hold the binaries if they are built locally.
# We must use the "{upload,download}-artifact" actions instead.
#
# The remaining phases have a job in the matrix for each test
# configuration. They install dependency binaries from the cache,
# whichever was used, and build and test rippled.
#
# "instrumentation" is independent, but is included here because it also
# builds on linux in the same "on:" conditions.
jobs:
dependencies:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
- clang
configuration:
- Debug
- Release
include:
- compiler: gcc
profile:
version: 11
cc: /usr/bin/gcc
cxx: /usr/bin/g++
- compiler: clang
profile:
version: 14
cc: /usr/bin/clang-14
cxx: /usr/bin/clang++-14
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
steps:
- name: upgrade conan
run: |
pip install --upgrade "conan<2"
- name: checkout
uses: actions/checkout@v4
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
lsb_release -a || true
${{ matrix.profile.cc }} --version
conan --version
cmake --version
env | sort
- name: configure Conan
run: |
conan profile new default --detect
conan profile update settings.compiler.cppstd=20 default
conan profile update settings.compiler=${{ matrix.compiler }} default
conan profile update settings.compiler.version=${{ matrix.profile.version }} default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile update env.CC=${{ matrix.profile.cc }} default
conan profile update env.CXX=${{ matrix.profile.cxx }} default
conan profile update conf.tools.build:compiler_executables='{"c": "${{ matrix.profile.cc }}", "cpp": "${{ matrix.profile.cxx }}"}' default
- name: archive profile
# Create this archive before dependencies are added to the local cache.
run: tar -czf conan.tar -C ~/.conan .
- name: build dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
with:
configuration: ${{ matrix.configuration }}
- name: upload archive
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
path: conan.tar
if-no-files-found: error
test:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
- clang
configuration:
- Debug
- Release
cmake-args:
-
- "-Dunity=ON"
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
steps:
- name: upgrade conan
run: |
pip install --upgrade "conan<2"
- name: download cache
uses: actions/download-artifact@v4
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ~/.conan
tar -xzf conan.tar -C ~/.conan
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@v4
- name: dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: test
run: |
${build_dir}/rippled --unittest --unittest-jobs $(nproc)
reference-fee-test:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
configuration:
- Debug
cmake-args:
- "-DUNIT_TEST_REFERENCE_FEE=200"
- "-DUNIT_TEST_REFERENCE_FEE=1000"
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
steps:
- name: upgrade conan
run: |
pip install --upgrade "conan<2"
- name: download cache
uses: actions/download-artifact@v4
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ~/.conan
tar -xzf conan.tar -C ~/.conan
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@v4
- name: dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: "-Dassert=TRUE -Dwerr=TRUE ${{ matrix.cmake-args }}"
- name: test
run: |
${build_dir}/rippled --unittest --unittest-jobs $(nproc)
coverage:
strategy:
fail-fast: false
matrix:
platform:
- linux
compiler:
- gcc
configuration:
- Debug
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
steps:
- name: upgrade conan
run: |
pip install --upgrade "conan<2"
- name: download cache
uses: actions/download-artifact@v4
with:
name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
- name: extract cache
run: |
mkdir -p ~/.conan
tar -xzf conan.tar -C ~/.conan
- name: install gcovr
run: pip install "gcovr>=7,<9"
- name: check environment
run: |
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
gcovr --version
env | sort
ls ~/.conan
- name: checkout
uses: actions/checkout@v4
- name: dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
with:
configuration: ${{ matrix.configuration }}
- name: build
uses: ./.github/actions/build
with:
generator: Ninja
configuration: ${{ matrix.configuration }}
cmake-args: >-
-Dassert=TRUE
-Dwerr=TRUE
-Dcoverage=ON
-Dcoverage_format=xml
-DCODE_COVERAGE_VERBOSE=ON
-DCMAKE_CXX_FLAGS="-O0"
-DCMAKE_C_FLAGS="-O0"
cmake-target: coverage
- name: move coverage report
shell: bash
run: |
mv "${build_dir}/coverage.xml" ./
- name: archive coverage report
uses: actions/upload-artifact@v4
with:
name: coverage.xml
path: coverage.xml
retention-days: 30
- name: upload coverage report
uses: wandalen/wretry.action@v1.4.10
with:
action: codecov/codecov-action@v4.5.0
with: |
files: coverage.xml
fail_ci_if_error: true
disable_search: true
verbose: true
plugin: noop
token: ${{ secrets.CODECOV_TOKEN }}
attempt_limit: 5
attempt_delay: 210000 # in milliseconds
conan:
needs: dependencies
runs-on: [self-hosted, heavy]
container: ghcr.io/xrplf/rippled-build-ubuntu:aaf5e3e
env:
build_dir: .build
configuration: Release
steps:
- name: upgrade conan
run: |
pip install --upgrade "conan<2"
- name: download cache
uses: actions/download-artifact@v4
with:
name: linux-gcc-${{ env.configuration }}
- name: extract cache
run: |
mkdir -p ~/.conan
tar -xzf conan.tar -C ~/.conan
- name: check environment
run: |
env | sort
echo ${PATH} | tr ':' '\n'
conan --version
cmake --version
- name: checkout
uses: actions/checkout@v4
- name: dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
with:
configuration: ${{ env.configuration }}
- name: export
run: |
version=$(conan inspect --raw version .)
reference="xrpl/${version}@local/test"
conan remove -f ${reference} || true
conan export . local/test
echo "reference=${reference}" >> "${GITHUB_ENV}"
- name: build
run: |
cd tests/conan
mkdir ${build_dir}
cd ${build_dir}
conan install .. --output-folder . \
--require-override ${reference} --build missing
cmake .. \
-DCMAKE_TOOLCHAIN_FILE:FILEPATH=./build/${configuration}/generators/conan_toolchain.cmake \
-DCMAKE_BUILD_TYPE=${configuration}
cmake --build .
./example | grep '^[[:digit:]]\+\.[[:digit:]]\+\.[[:digit:]]\+'
# NOTE: we are not using the dependencies built above because they lag
# behind the compiler versions needed here. Instrumentation requires
# clang version 16 or later.
instrumentation-build:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
env:
CLANG_RELEASE: 16
strategy:
fail-fast: false
runs-on: [self-hosted, heavy]
container: debian:bookworm
steps:
- name: install prerequisites
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update
apt-get install --yes --no-install-recommends \
clang-${CLANG_RELEASE} clang++-${CLANG_RELEASE} \
python3-pip python-is-python3 make cmake git wget
apt-get clean
update-alternatives --install \
/usr/bin/clang clang /usr/bin/clang-${CLANG_RELEASE} 100 \
--slave /usr/bin/clang++ clang++ /usr/bin/clang++-${CLANG_RELEASE}
update-alternatives --auto clang
pip install --no-cache --break-system-packages "conan<2"
- name: checkout
uses: actions/checkout@v4
- name: prepare environment
run: |
mkdir ${GITHUB_WORKSPACE}/.build
echo "SOURCE_DIR=$GITHUB_WORKSPACE" >> $GITHUB_ENV
echo "BUILD_DIR=$GITHUB_WORKSPACE/.build" >> $GITHUB_ENV
echo "CC=/usr/bin/clang" >> $GITHUB_ENV
echo "CXX=/usr/bin/clang++" >> $GITHUB_ENV
- name: configure Conan
run: |
conan profile new --detect default
conan profile update settings.compiler=clang default
conan profile update settings.compiler.version=${CLANG_RELEASE} default
conan profile update settings.compiler.libcxx=libstdc++11 default
conan profile update settings.compiler.cppstd=20 default
conan profile update options.rocksdb=False default
conan profile update \
'conf.tools.build:compiler_executables={"c": "/usr/bin/clang", "cpp": "/usr/bin/clang++"}' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
conan export external/snappy snappy/1.1.10@
conan export external/soci soci/4.0.3@
conan export -k external/wamr wamr/2.2.0@
- name: build dependencies
run: |
cd ${BUILD_DIR}
conan install ${SOURCE_DIR} \
--output-folder ${BUILD_DIR} \
--install-folder ${BUILD_DIR} \
--build missing \
--settings build_type=Debug
- name: build with instrumentation
run: |
cd ${BUILD_DIR}
cmake -S ${SOURCE_DIR} -B ${BUILD_DIR} \
-Dvoidstar=ON \
-Dtests=ON \
-Dxrpld=ON \
-DCMAKE_BUILD_TYPE=Debug \
-DSECP256K1_BUILD_BENCHMARK=OFF \
-DSECP256K1_BUILD_TESTS=OFF \
-DSECP256K1_BUILD_EXHAUSTIVE_TESTS=OFF \
-DCMAKE_TOOLCHAIN_FILE=${BUILD_DIR}/build/generators/conan_toolchain.cmake
cmake --build . --parallel $(nproc)
- name: verify instrumentation enabled
run: |
cd ${BUILD_DIR}
./rippled --version | grep libvoidstar
- name: run unit tests
run: |
cd ${BUILD_DIR}
./rippled -u --unittest-jobs $(( $(nproc)/4 ))

.github/workflows/windows.yml (new file)

@@ -0,0 +1,99 @@
name: windows
on:
pull_request:
types: [opened, reopened, synchronize, ready_for_review]
push:
# If the branches list is ever changed, be sure to change it on all
# build/test jobs (nix, macos, windows, instrumentation)
branches:
# Always build the package branches
- develop
- release
- master
# Branches that opt-in to running
- 'ci/**'
# https://docs.github.com/en/actions/using-jobs/using-concurrency
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
test:
if: ${{ github.event_name == 'push' || github.event.pull_request.draft != true || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
strategy:
fail-fast: false
matrix:
version:
- generator: Visual Studio 17 2022
runs-on: windows-2022
configuration:
- type: Release
tests: true
- type: Debug
# Skip running unit tests on debug builds, because they
# take an unreasonable amount of time
tests: false
runtime: d
runs-on: ${{ matrix.version.runs-on }}
env:
build_dir: .build
steps:
- name: checkout
uses: actions/checkout@v4
- name: choose Python
uses: actions/setup-python@v5
with:
python-version: 3.9
- name: learn Python cache directory
id: pip-cache
shell: bash
run: |
python -m pip install --upgrade pip
echo "dir=$(pip cache dir)" | tee ${GITHUB_OUTPUT}
- name: restore Python cache directory
uses: actions/cache@v4
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-${{ hashFiles('.github/workflows/windows.yml') }}
- name: install Conan
run: pip install wheel 'conan<2'
- name: check environment
run: |
dir env:
$env:PATH -split ';'
python --version
conan --version
cmake --version
- name: configure Conan
shell: bash
run: |
conan profile new default --detect
conan profile update settings.compiler.cppstd=20 default
conan profile update \
settings.compiler.runtime=MT${{ matrix.configuration.runtime }} \
default
- name: build dependencies
uses: ./.github/actions/dependencies
env:
CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
with:
configuration: ${{ matrix.configuration.type }}
- name: build
uses: ./.github/actions/build
with:
generator: '${{ matrix.version.generator }}'
configuration: ${{ matrix.configuration.type }}
# Hard code for now. Move to the matrix if varied options are needed
cmake-args: '-Dassert=TRUE -Dwerr=TRUE -Dreporting=OFF -Dunity=ON'
cmake-target: install
- name: test
shell: bash
if: ${{ matrix.configuration.tests }}
run: |
${build_dir}/${{ matrix.configuration.type }}/rippled --unittest \
--unittest-jobs $(nproc)


@@ -1,6 +1,6 @@
# .pre-commit-config.yaml
repos:
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v18.1.8
rev: v18.1.3
hooks:
- id: clang-format
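
For contributors who have not used this configuration before, a short sketch of the standard `pre-commit` workflow (generic pre-commit CLI usage, not specific to this repository):

```
pip install pre-commit          # install the tool itself
pre-commit install              # register the git hook in this clone
pre-commit run --all-files      # run clang-format across the whole tree
```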


@@ -167,18 +167,54 @@ It does not explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
```
# Conan 1.x
conan export external/snappy snappy/1.1.10@
# Conan 2.x
conan export --version 1.1.10 external/snappy
```
Export our [Conan recipe for RocksDB](./external/rocksdb).
It does not override paths to dependencies when building with Visual Studio.
```
# Conan 1.x
conan export external/rocksdb rocksdb/9.7.3@
# Conan 2.x
conan export --version 9.7.3 external/rocksdb
```
Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
```
# Conan 1.x
conan export external/soci soci/4.0.3@
# Conan 2.x
conan export --version 4.0.3 external/soci
```
Export our [Conan recipe for NuDB](./external/nudb).
It fixes some source files to add missing `#include`s.
```
# Conan 1.x
conan export external/nudb nudb/2.0.8@
# Conan 2.x
conan export --version 2.0.8 external/nudb
```
Export our [Conan recipe for WAMR](./external/wamr).
It adds metering and exposes some internal structures.
```
# Conan 1.x
conan export external/wamr wamr/2.2.0@
# Conan 2.x
conan export --version 2.2.0 external/wamr
```
### Build and Test
1. Create a build directory and move into it.
@@ -370,13 +406,18 @@ and can be helpful for detecting `#include` omissions.
## Troubleshooting
### Conan
After any updates or changes to dependencies, you may need to do the following:
1. Remove your build directory.
2. Remove the Conan cache: `conan remove "*" -c`
3. Re-run [conan install](#build-and-test).
2. Remove the Conan cache:
```
rm -rf ~/.conan/data
```
3. Re-run [conan install](#build-and-test).
### 'protobuf/port_def.inc' file not found
@@ -394,6 +435,54 @@ For example, if you want to build Debug:
1. For conan install, pass `--settings build_type=Debug`
2. For cmake, pass `-DCMAKE_BUILD_TYPE=Debug`
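
Putting both settings together, a minimal sketch of a Debug configure-and-build, assuming the build-directory layout used elsewhere in this guide (Conan 1.x syntax, with the toolchain file generated under `build/generators`):

```
# From an empty build directory inside the source tree
conan install .. --output-folder . --build missing --settings build_type=Debug
cmake .. \
  -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Debug
cmake --build .
```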
### no std::result_of
If your compiler version is recent enough to have removed `std::result_of` as
part of C++20, e.g. Apple Clang 15.0, then you might need to add a preprocessor
definition to your build.
```
conan profile update 'options.boost:extra_b2_flags="define=BOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'conf.tools.build:cflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
```
### call to 'async_teardown' is ambiguous
If you are compiling with an early version of Clang 16, then you might hit
a [regression][6] when compiling C++20 that manifests as an [error in a Boost
header][7]. You can workaround it by adding this preprocessor definition:
```
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
```
### recompile with -fPIC
If you get a linker error suggesting that you recompile Boost with
position-independent code, such as:
```
/usr/bin/ld.gold: error: /home/username/.conan/data/boost/1.77.0/_/_/package/.../lib/libboost_container.a(alloc_lib.o):
requires unsupported dynamic reloc 11; recompile with -fPIC
```
Conan most likely downloaded a bad binary distribution of the dependency.
This seems to be a [bug][1] in Conan just for Boost 1.77.0 compiled with GCC
for Linux. The solution is to build the dependency locally by passing
`--build boost` when calling `conan install`.
```
conan install --build boost ...
```
## Add a Dependency
If you want to experiment with a new package, follow these steps:


@@ -72,15 +72,15 @@ It generates many files of [results](results):
desired as described above. In a perfect repo, this file will be
empty.
This file is committed to the repo, and is used by the [levelization
Github workflow](../../.github/workflows/check-levelization.yml) to validate
Github workflow](../../.github/workflows/levelization.yml) to validate
that nothing changed.
* [`ordering.txt`](results/ordering.txt): A list showing relationships
between modules where there are no loops as they actually exist, as
opposed to how they are desired as described above.
This file is committed to the repo, and is used by the [levelization
Github workflow](../../.github/workflows/check-levelization.yml) to validate
Github workflow](../../.github/workflows/levelization.yml) to validate
that nothing changed.
* [`levelization.yml`](../../.github/workflows/check-levelization.yml)
* [`levelization.yml`](../../.github/workflows/levelization.yml)
Github Actions workflow to test that levelization loops haven't
changed. Unfortunately, if changes are detected, it can't tell if
they are improvements or not, so if you have resolved any issues or

View File

@@ -132,7 +132,6 @@ test.shamap > xrpl.protocol
test.toplevel > test.csf
test.toplevel > xrpl.json
test.unit_test > xrpl.basics
tests.libxrpl > xrpl.basics
xrpl.json > xrpl.basics
xrpl.protocol > xrpl.basics
xrpl.protocol > xrpl.json


@@ -90,11 +90,6 @@ set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
set(SECP256K1_INSTALL TRUE)
set(SECP256K1_BUILD_BENCHMARK FALSE)
set(SECP256K1_BUILD_TESTS FALSE)
set(SECP256K1_BUILD_EXHAUSTIVE_TESTS FALSE)
set(SECP256K1_BUILD_CTIME_TESTS FALSE)
set(SECP256K1_BUILD_EXAMPLES FALSE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
@@ -120,6 +115,7 @@ endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
find_package(xxHash REQUIRED)
find_package(wamr REQUIRED)
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
@@ -149,8 +145,3 @@ set(PROJECT_EXPORT_SET RippleExports)
include(RippledCore)
include(RippledInstall)
include(RippledValidatorKeys)
if(tests)
include(CTest)
add_subdirectory(src/tests/libxrpl)
endif()

RELEASENOTES.md (new file; diff suppressed because it is too large)


@@ -83,7 +83,7 @@ To report a qualifying bug, please send a detailed report to:
|Long Key ID | `0xCD49A0AFC57929BE` |
|Fingerprint | `24E6 3B02 37E0 FA9C 5E96 8974 CD49 A0AF C579 29BE` |
The full PGP key for this address, which is also available on several key servers (e.g. on [keyserver.ubuntu.com](https://keyserver.ubuntu.com)), is:
The full PGP key for this address, which is also available on several key servers (e.g. on [keys.gnupg.net](https://keys.gnupg.net)), is:
```
-----BEGIN PGP PUBLIC KEY BLOCK-----
mQINBFUwGHYBEAC0wpGpBPkd8W1UdQjg9+cEFzeIEJRaoZoeuJD8mofwI5Ejnjdt
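
As a convenience, a hedged sketch for fetching and checking the key (key ID and fingerprint from the table above; any public keyserver should work):

```
gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys CD49A0AFC57929BE
gpg --fingerprint CD49A0AFC57929BE   # compare against the fingerprint above
```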


@@ -1249,6 +1249,39 @@
# Example:
# owner_reserve = 2000000 # 2 XRP
#
# extension_compute_limit = <gas>
#
# The extension compute limit is the maximum amount of gas that can be
# consumed by a single transaction. The gas limit is used to prevent
# transactions from consuming too many resources.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# extension_compute_limit = 1000000 # 1 million gas
#
# extension_size_limit = <bytes>
#
# The extension size limit is the maximum size of a WASM extension in
# bytes. The size limit is used to prevent extensions from consuming
# too many resources.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# extension_size_limit = 100000 # 100 kb
#
# gas_price = <number>
#
# The gas price is the conversion between WASM gas and its price in drops.
#
# If this parameter is unspecified, rippled will use an internal
# default. Don't change this without understanding the consequences.
#
# Example:
# gas_price = 1000000 # 1 drop per gas
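#
# For illustration only (assuming, per the example above, that a value of
# 1000000 corresponds to a rate of 1 drop per unit of gas): with that
# setting, a transaction consuming 50000 gas would incur roughly 50000
# drops in WASM fees.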
#-------------------------------------------------------------------------------
#
# 9. Misc Settings


@@ -90,15 +90,28 @@ if (MSVC)
-errorreport:none
-machine:X64)
else ()
# HACK : because these need to come first, before any warning demotion
string (APPEND CMAKE_CXX_FLAGS " -Wall -Wdeprecated")
if (wextra)
string (APPEND CMAKE_CXX_FLAGS " -Wextra -Wno-unused-parameter")
endif ()
# not MSVC
target_compile_options (common
INTERFACE
-Wall
-Wdeprecated
$<$<BOOL:${wextra}>:-Wextra -Wno-unused-parameter>
$<$<BOOL:${werr}>:-Werror>
-fstack-protector
$<$<COMPILE_LANGUAGE:CXX>:
-frtti
-Wnon-virtual-dtor
>
-Wno-sign-compare
-Wno-unused-but-set-variable
-Wno-char-subscripts
-Wno-format
-Wno-unused-local-typedefs
-fstack-protector
$<$<BOOL:${is_gcc}>:
-Wno-unused-but-set-variable
-Wno-deprecated
>
$<$<NOT:$<CONFIG:Debug>>:-fno-strict-aliasing>
# tweak gcc optimization for debug
$<$<AND:$<BOOL:${is_gcc}>,$<CONFIG:Debug>>:-O0>


@@ -65,6 +65,7 @@ target_link_libraries(xrpl.imports.main
xrpl.libpb
xxHash::xxhash
$<$<BOOL:${voidstar}>:antithesis-sdk-cpp>
wamr::wamr
)
include(add_module)


@@ -2,6 +2,16 @@
convenience variables and sanity checks
#]===================================================================]
include(ProcessorCount)
if (NOT ep_procs)
ProcessorCount(ep_procs)
if (ep_procs GREATER 1)
# never use more than half of cores for EP builds
math (EXPR ep_procs "${ep_procs} / 2")
message (STATUS "Using ${ep_procs} cores for ExternalProject builds.")
endif ()
endif ()
get_property(is_multiconfig GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
set (CMAKE_CONFIGURATION_TYPES "Debug;Release" CACHE STRING "" FORCE)


@@ -18,7 +18,7 @@ if(tests)
endif()
endif()
option(unity "Creates a build using UNITY support in cmake." OFF)
option(unity "Creates a build using UNITY support in cmake. This is the default" ON)
if(unity)
if(NOT is_ci)
set(CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")


@@ -2,6 +2,7 @@ find_package(Boost 1.82 REQUIRED
COMPONENTS
chrono
container
context
coroutine
date_time
filesystem
@@ -23,7 +24,7 @@ endif()
target_link_libraries(ripple_boost
INTERFACE
Boost::headers
Boost::boost
Boost::chrono
Boost::container
Boost::coroutine


@@ -1,41 +0,0 @@
include(isolate_headers)
function(xrpl_add_test name)
set(target ${PROJECT_NAME}.test.${name})
file(GLOB_RECURSE sources CONFIGURE_DEPENDS
"${CMAKE_CURRENT_SOURCE_DIR}/${name}/*.cpp"
"${CMAKE_CURRENT_SOURCE_DIR}/${name}.cpp"
)
add_executable(${target} EXCLUDE_FROM_ALL ${ARGN} ${sources})
isolate_headers(
${target}
"${CMAKE_SOURCE_DIR}"
"${CMAKE_SOURCE_DIR}/tests/${name}"
PRIVATE
)
# Make sure the test isn't optimized away in unity builds
set_target_properties(${target} PROPERTIES
UNITY_BUILD_MODE GROUP
UNITY_BUILD_BATCH_SIZE 0) # Adjust as needed
add_test(NAME ${target} COMMAND ${target})
set_tests_properties(
${target} PROPERTIES
FIXTURES_REQUIRED ${target}_fixture
)
add_test(
NAME ${target}.build
COMMAND
${CMAKE_COMMAND}
--build ${CMAKE_BINARY_DIR}
--config $<CONFIG>
--target ${target}
)
set_tests_properties(${target}.build PROPERTIES
FIXTURES_SETUP ${target}_fixture
)
endfunction()


@@ -1,34 +0,0 @@
{% set os = detect_api.detect_os() %}
{% set arch = detect_api.detect_arch() %}
{% set compiler, version, compiler_exe = detect_api.detect_default_compiler() %}
{% set compiler_version = version %}
{% if os == "Linux" %}
{% set compiler_version = detect_api.default_compiler_version(compiler, version) %}
{% endif %}
[settings]
os={{ os }}
arch={{ arch }}
build_type=Debug
compiler={{compiler}}
compiler.version={{ compiler_version }}
compiler.cppstd=20
{% if os == "Windows" %}
compiler.runtime=static
{% else %}
compiler.libcxx={{detect_api.detect_libcxx(compiler, version, compiler_exe)}}
{% endif %}
[conf]
{% if compiler == "clang" and compiler_version >= 19 %}
tools.build:cxxflags=['-Wno-missing-template-arg-list-after-template-kw']
{% endif %}
{% if compiler == "apple-clang" and compiler_version >= 17 %}
tools.build:cxxflags=['-Wno-missing-template-arg-list-after-template-kw']
{% endif %}
{% if compiler == "gcc" and compiler_version < 13 %}
tools.build:cxxflags=['-Wno-restrict']
{% endif %}
[tool_requires]
!cmake/*: cmake/[>=3 <4]


@@ -1,7 +1,8 @@
from conan import ConanFile, __version__ as conan_version
from conan import ConanFile
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
import re
class Xrpl(ConanFile):
name = 'xrpl'
@@ -24,20 +25,19 @@ class Xrpl(ConanFile):
}
requires = [
'date/3.0.3',
'grpc/1.50.1',
'libarchive/3.8.1',
'nudb/2.0.9',
'openssl/1.1.1w',
'libarchive/3.7.6',
'nudb/2.0.8',
'openssl/1.1.1v',
'soci/4.0.3',
'xxhash/0.8.2',
'zlib/1.3.1',
]
test_requires = [
'doctest/2.4.11',
'wamr/2.2.0',
]
tool_requires = [
'protobuf/3.21.12',
'protobuf/3.21.9',
]
default_options = {
@@ -89,31 +89,26 @@ class Xrpl(ConanFile):
}
def set_version(self):
if self.version is None:
path = f'{self.recipe_folder}/src/libxrpl/protocol/BuildInfo.cpp'
regex = r'versionString\s?=\s?\"(.*)\"'
with open(path, encoding='utf-8') as file:
matches = (re.search(regex, line) for line in file)
match = next(m for m in matches if m)
self.version = match.group(1)
path = f'{self.recipe_folder}/src/libxrpl/protocol/BuildInfo.cpp'
regex = r'versionString\s?=\s?\"(.*)\"'
with open(path, 'r') as file:
matches = (re.search(regex, line) for line in file)
match = next(m for m in matches if m)
self.version = match.group(1)
def configure(self):
if self.settings.compiler == 'apple-clang':
self.options['boost'].visibility = 'global'
def requirements(self):
# Conan 2 requires transitive headers to be specified
transitive_headers_opt = {'transitive_headers': True} if conan_version.split('.')[0] == '2' else {}
self.requires('boost/1.86.0', force=True, **transitive_headers_opt)
self.requires('date/3.0.4', **transitive_headers_opt)
self.requires('boost/1.83.0', force=True)
self.requires('lz4/1.10.0', force=True)
self.requires('protobuf/3.21.12', force=True)
self.requires('sqlite3/3.49.1', force=True)
self.requires('protobuf/3.21.9', force=True)
self.requires('sqlite3/3.47.0', force=True)
if self.options.jemalloc:
self.requires('jemalloc/5.3.0')
if self.options.rocksdb:
self.requires('rocksdb/10.0.1')
self.requires('xxhash/0.8.3', **transitive_headers_opt)
self.requires('rocksdb/9.7.3')
exports_sources = (
'CMakeLists.txt',
@@ -132,6 +127,7 @@ class Xrpl(ConanFile):
self.folders.generators = 'build/generators'
generators = 'CMakeDeps'
def generate(self):
tc = CMakeToolchain(self)
tc.variables['tests'] = self.options.tests
@@ -168,17 +164,7 @@ class Xrpl(ConanFile):
# `include/`, not `include/ripple/proto/`.
libxrpl.includedirs = ['include', 'include/ripple/proto']
libxrpl.requires = [
'boost::headers',
'boost::chrono',
'boost::container',
'boost::coroutine',
'boost::date_time',
'boost::filesystem',
'boost::json',
'boost::program_options',
'boost::regex',
'boost::system',
'boost::thread',
'boost::boost',
'date::date',
'grpc::grpc++',
'libarchive::libarchive',


@@ -1,4 +1,4 @@
cmake_minimum_required(VERSION 3.18)
cmake_minimum_required(VERSION 3.25)
# Note, version set explicitly by rippled project
project(antithesis-sdk-cpp VERSION 0.4.4 LANGUAGES CXX)


@@ -17,9 +17,6 @@ add_library(ed25519 STATIC
)
add_library(ed25519::ed25519 ALIAS ed25519)
target_link_libraries(ed25519 PUBLIC OpenSSL::SSL)
if(NOT MSVC)
target_compile_options(ed25519 PRIVATE -Wno-implicit-fallthrough)
endif()
include(GNUInstallDirs)

external/nudb/conandata.yml (new file)

@@ -0,0 +1,10 @@
sources:
"2.0.8":
url: "https://github.com/CPPAlliance/NuDB/archive/2.0.8.tar.gz"
sha256: "9b71903d8ba111cd893ab064b9a8b6ac4124ed8bd6b4f67250205bc43c7f13a8"
patches:
"2.0.8":
- patch_file: "patches/2.0.8-0001-add-include-stdexcept-for-msvc.patch"
patch_description: "Fix build for MSVC by including stdexcept"
patch_type: "portability"
patch_source: "https://github.com/cppalliance/NuDB/pull/100/files"

external/nudb/conanfile.py (new file)

@@ -0,0 +1,72 @@
import os
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get
from conan.tools.layout import basic_layout
required_conan_version = ">=1.52.0"
class NudbConan(ConanFile):
name = "nudb"
description = "A fast key/value insert-only database for SSD drives in C++11"
license = "BSL-1.0"
url = "https://github.com/conan-io/conan-center-index"
homepage = "https://github.com/CPPAlliance/NuDB"
topics = ("header-only", "KVS", "insert-only")
package_type = "header-library"
settings = "os", "arch", "compiler", "build_type"
no_copy_source = True
@property
def _min_cppstd(self):
return 11
def export_sources(self):
export_conandata_patches(self)
def layout(self):
basic_layout(self, src_folder="src")
def requirements(self):
self.requires("boost/1.83.0")
def package_id(self):
self.info.clear()
def validate(self):
if self.settings.compiler.cppstd:
check_min_cppstd(self, self._min_cppstd)
def source(self):
get(self, **self.conan_data["sources"][self.version], strip_root=True)
def build(self):
apply_conandata_patches(self)
def package(self):
copy(self, "LICENSE*",
dst=os.path.join(self.package_folder, "licenses"),
src=self.source_folder)
copy(self, "*",
dst=os.path.join(self.package_folder, "include"),
src=os.path.join(self.source_folder, "include"))
def package_info(self):
self.cpp_info.bindirs = []
self.cpp_info.libdirs = []
self.cpp_info.set_property("cmake_target_name", "NuDB")
self.cpp_info.set_property("cmake_target_aliases", ["NuDB::nudb"])
self.cpp_info.set_property("cmake_find_mode", "both")
self.cpp_info.components["core"].set_property("cmake_target_name", "nudb")
self.cpp_info.components["core"].names["cmake_find_package"] = "nudb"
self.cpp_info.components["core"].names["cmake_find_package_multi"] = "nudb"
self.cpp_info.components["core"].requires = ["boost::thread", "boost::system"]
# TODO: to remove in conan v2 once cmake_find_package_* generators removed
self.cpp_info.names["cmake_find_package"] = "NuDB"
self.cpp_info.names["cmake_find_package_multi"] = "NuDB"


@@ -0,0 +1,24 @@
diff --git a/include/nudb/detail/stream.hpp b/include/nudb/detail/stream.hpp
index 6c07bf1..e0ce8ed 100644
--- a/include/nudb/detail/stream.hpp
+++ b/include/nudb/detail/stream.hpp
@@ -14,6 +14,7 @@
#include <cstdint>
#include <cstring>
#include <memory>
+#include <stdexcept>
namespace nudb {
namespace detail {
diff --git a/include/nudb/impl/context.ipp b/include/nudb/impl/context.ipp
index beb7058..ffde0b3 100644
--- a/include/nudb/impl/context.ipp
+++ b/include/nudb/impl/context.ipp
@@ -9,6 +9,7 @@
#define NUDB_IMPL_CONTEXT_IPP
#include <nudb/detail/store_base.hpp>
+#include <stdexcept>
namespace nudb {

external/rocksdb/conandata.yml (new file)

@@ -0,0 +1,12 @@
sources:
"9.7.3":
url: "https://github.com/facebook/rocksdb/archive/refs/tags/v9.7.3.tar.gz"
sha256: "acfabb989cbfb5b5c4d23214819b059638193ec33dad2d88373c46448d16d38b"
patches:
"9.7.3":
- patch_file: "patches/9.x.x-0001-exclude-thirdparty.patch"
patch_description: "Do not include thirdparty.inc"
patch_type: "portability"
- patch_file: "patches/9.7.3-0001-memory-leak.patch"
patch_description: "Fix a leak of obsolete blob files left open until DB::Close()"
patch_type: "portability"

external/rocksdb/conanfile.py (new file)

@@ -0,0 +1,235 @@
import os
import glob
import shutil
from conan import ConanFile
from conan.errors import ConanInvalidConfiguration
from conan.tools.build import check_min_cppstd
from conan.tools.cmake import CMake, CMakeDeps, CMakeToolchain, cmake_layout
from conan.tools.files import apply_conandata_patches, collect_libs, copy, export_conandata_patches, get, rm, rmdir
from conan.tools.microsoft import check_min_vs, is_msvc, is_msvc_static_runtime
from conan.tools.scm import Version
required_conan_version = ">=1.53.0"
class RocksDBConan(ConanFile):
name = "rocksdb"
description = "A library that provides an embeddable, persistent key-value store for fast storage"
license = ("GPL-2.0-only", "Apache-2.0")
url = "https://github.com/conan-io/conan-center-index"
homepage = "https://github.com/facebook/rocksdb"
topics = ("database", "leveldb", "facebook", "key-value")
package_type = "library"
settings = "os", "arch", "compiler", "build_type"
options = {
"shared": [True, False],
"fPIC": [True, False],
"lite": [True, False],
"with_gflags": [True, False],
"with_snappy": [True, False],
"with_lz4": [True, False],
"with_zlib": [True, False],
"with_zstd": [True, False],
"with_tbb": [True, False],
"with_jemalloc": [True, False],
"enable_sse": [False, "sse42", "avx2"],
"use_rtti": [True, False],
}
default_options = {
"shared": False,
"fPIC": True,
"lite": False,
"with_snappy": False,
"with_lz4": False,
"with_zlib": False,
"with_zstd": False,
"with_gflags": False,
"with_tbb": False,
"with_jemalloc": False,
"enable_sse": False,
"use_rtti": False,
}
@property
def _min_cppstd(self):
return "11" if Version(self.version) < "8.8.1" else "17"
@property
def _compilers_minimum_version(self):
return {} if self._min_cppstd == "11" else {
"apple-clang": "10",
"clang": "7",
"gcc": "7",
"msvc": "191",
"Visual Studio": "15",
}
def export_sources(self):
export_conandata_patches(self)
def config_options(self):
if self.settings.os == "Windows":
del self.options.fPIC
if self.settings.arch != "x86_64":
del self.options.with_tbb
if self.settings.build_type == "Debug":
self.options.use_rtti = True # Rtti are used in asserts for debug mode...
def configure(self):
if self.options.shared:
self.options.rm_safe("fPIC")
def layout(self):
cmake_layout(self, src_folder="src")
def requirements(self):
if self.options.with_gflags:
self.requires("gflags/2.2.2")
if self.options.with_snappy:
self.requires("snappy/1.1.10")
if self.options.with_lz4:
self.requires("lz4/1.10.0")
if self.options.with_zlib:
self.requires("zlib/[>=1.2.11 <2]")
if self.options.with_zstd:
self.requires("zstd/1.5.6")
if self.options.get_safe("with_tbb"):
self.requires("onetbb/2021.12.0")
if self.options.with_jemalloc:
self.requires("jemalloc/5.3.0")
def validate(self):
if self.settings.compiler.get_safe("cppstd"):
check_min_cppstd(self, self._min_cppstd)
minimum_version = self._compilers_minimum_version.get(str(self.settings.compiler), False)
if minimum_version and Version(self.settings.compiler.version) < minimum_version:
raise ConanInvalidConfiguration(
f"{self.ref} requires C++{self._min_cppstd}, which your compiler does not support."
)
if self.settings.arch not in ["x86_64", "ppc64le", "ppc64", "mips64", "armv8"]:
raise ConanInvalidConfiguration("Rocksdb requires 64 bits")
check_min_vs(self, "191")
if self.version == "6.20.3" and \
self.settings.os == "Linux" and \
self.settings.compiler == "gcc" and \
Version(self.settings.compiler.version) < "5":
raise ConanInvalidConfiguration("Rocksdb 6.20.3 is not compilable with gcc <5.") # See https://github.com/facebook/rocksdb/issues/3522
def source(self):
get(self, **self.conan_data["sources"][self.version], strip_root=True)
def generate(self):
tc = CMakeToolchain(self)
tc.variables["FAIL_ON_WARNINGS"] = False
tc.variables["WITH_TESTS"] = False
tc.variables["WITH_TOOLS"] = False
tc.variables["WITH_CORE_TOOLS"] = False
tc.variables["WITH_BENCHMARK_TOOLS"] = False
tc.variables["WITH_FOLLY_DISTRIBUTED_MUTEX"] = False
if is_msvc(self):
tc.variables["WITH_MD_LIBRARY"] = not is_msvc_static_runtime(self)
tc.variables["ROCKSDB_INSTALL_ON_WINDOWS"] = self.settings.os == "Windows"
tc.variables["ROCKSDB_LITE"] = self.options.lite
tc.variables["WITH_GFLAGS"] = self.options.with_gflags
tc.variables["WITH_SNAPPY"] = self.options.with_snappy
tc.variables["WITH_LZ4"] = self.options.with_lz4
tc.variables["WITH_ZLIB"] = self.options.with_zlib
tc.variables["WITH_ZSTD"] = self.options.with_zstd
tc.variables["WITH_TBB"] = self.options.get_safe("with_tbb", False)
tc.variables["WITH_JEMALLOC"] = self.options.with_jemalloc
tc.variables["ROCKSDB_BUILD_SHARED"] = self.options.shared
tc.variables["ROCKSDB_LIBRARY_EXPORTS"] = self.settings.os == "Windows" and self.options.shared
tc.variables["ROCKSDB_DLL" ] = self.settings.os == "Windows" and self.options.shared
tc.variables["USE_RTTI"] = self.options.use_rtti
if not bool(self.options.enable_sse):
tc.variables["PORTABLE"] = True
tc.variables["FORCE_SSE42"] = False
elif self.options.enable_sse == "sse42":
tc.variables["PORTABLE"] = True
tc.variables["FORCE_SSE42"] = True
elif self.options.enable_sse == "avx2":
tc.variables["PORTABLE"] = False
tc.variables["FORCE_SSE42"] = False
# not available yet in CCI
tc.variables["WITH_NUMA"] = False
tc.generate()
deps = CMakeDeps(self)
if self.options.with_jemalloc:
deps.set_property("jemalloc", "cmake_file_name", "JeMalloc")
deps.set_property("jemalloc", "cmake_target_name", "JeMalloc::JeMalloc")
if self.options.with_zstd:
deps.set_property("zstd", "cmake_target_name", "zstd::zstd")
deps.generate()
def build(self):
apply_conandata_patches(self)
cmake = CMake(self)
cmake.configure()
cmake.build()
def _remove_static_libraries(self):
rm(self, "rocksdb.lib", os.path.join(self.package_folder, "lib"))
for lib in glob.glob(os.path.join(self.package_folder, "lib", "*.a")):
if not lib.endswith(".dll.a"):
os.remove(lib)
def _remove_cpp_headers(self):
for path in glob.glob(os.path.join(self.package_folder, "include", "rocksdb", "*")):
if path != os.path.join(self.package_folder, "include", "rocksdb", "c.h"):
if os.path.isfile(path):
os.remove(path)
else:
shutil.rmtree(path)
def package(self):
copy(self, "COPYING", src=self.source_folder, dst=os.path.join(self.package_folder, "licenses"))
copy(self, "LICENSE*", src=self.source_folder, dst=os.path.join(self.package_folder, "licenses"))
cmake = CMake(self)
cmake.install()
if self.options.shared:
self._remove_static_libraries()
self._remove_cpp_headers() # Force stable ABI for shared libraries
rmdir(self, os.path.join(self.package_folder, "lib", "cmake"))
rmdir(self, os.path.join(self.package_folder, "lib", "pkgconfig"))
def package_info(self):
cmake_target = "rocksdb-shared" if self.options.shared else "rocksdb"
self.cpp_info.set_property("cmake_file_name", "RocksDB")
self.cpp_info.set_property("cmake_target_name", f"RocksDB::{cmake_target}")
# TODO: back to global scope in conan v2 once cmake_find_package* generators removed
self.cpp_info.components["librocksdb"].libs = collect_libs(self)
if self.settings.os == "Windows":
self.cpp_info.components["librocksdb"].system_libs = ["shlwapi", "rpcrt4"]
if self.options.shared:
self.cpp_info.components["librocksdb"].defines = ["ROCKSDB_DLL"]
elif self.settings.os in ["Linux", "FreeBSD"]:
self.cpp_info.components["librocksdb"].system_libs = ["pthread", "m"]
if self.options.lite:
self.cpp_info.components["librocksdb"].defines.append("ROCKSDB_LITE")
# TODO: to remove in conan v2 once cmake_find_package* generators removed
self.cpp_info.names["cmake_find_package"] = "RocksDB"
self.cpp_info.names["cmake_find_package_multi"] = "RocksDB"
self.cpp_info.components["librocksdb"].names["cmake_find_package"] = cmake_target
self.cpp_info.components["librocksdb"].names["cmake_find_package_multi"] = cmake_target
self.cpp_info.components["librocksdb"].set_property("cmake_target_name", f"RocksDB::{cmake_target}")
if self.options.with_gflags:
self.cpp_info.components["librocksdb"].requires.append("gflags::gflags")
if self.options.with_snappy:
self.cpp_info.components["librocksdb"].requires.append("snappy::snappy")
if self.options.with_lz4:
self.cpp_info.components["librocksdb"].requires.append("lz4::lz4")
if self.options.with_zlib:
self.cpp_info.components["librocksdb"].requires.append("zlib::zlib")
if self.options.with_zstd:
self.cpp_info.components["librocksdb"].requires.append("zstd::zstd")
if self.options.get_safe("with_tbb"):
self.cpp_info.components["librocksdb"].requires.append("onetbb::onetbb")
if self.options.with_jemalloc:
self.cpp_info.components["librocksdb"].requires.append("jemalloc::jemalloc")


@@ -0,0 +1,319 @@
diff --git a/HISTORY.md b/HISTORY.md
index 36d472229..05ad1a202 100644
--- a/HISTORY.md
+++ b/HISTORY.md
@@ -1,6 +1,10 @@
# Rocksdb Change Log
> NOTE: Entries for next release do not go here. Follow instructions in `unreleased_history/README.txt`
+## 9.7.4 (10/31/2024)
+### Bug Fixes
+* Fix a leak of obsolete blob files left open until DB::Close(). This bug was introduced in version 9.4.0.
+
## 9.7.3 (10/16/2024)
### Behavior Changes
* OPTIONS file to be loaded by remote worker is now preserved so that it does not get purged by the primary host. A similar technique as how we are preserving new SST files from getting purged is used for this. min_options_file_numbers_ is tracked like pending_outputs_ is tracked.
diff --git a/db/blob/blob_file_cache.cc b/db/blob/blob_file_cache.cc
index 5f340aadf..1b9faa238 100644
--- a/db/blob/blob_file_cache.cc
+++ b/db/blob/blob_file_cache.cc
@@ -42,6 +42,7 @@ Status BlobFileCache::GetBlobFileReader(
assert(blob_file_reader);
assert(blob_file_reader->IsEmpty());
+ // NOTE: sharing same Cache with table_cache
const Slice key = GetSliceForKey(&blob_file_number);
assert(cache_);
@@ -98,4 +99,13 @@ Status BlobFileCache::GetBlobFileReader(
return Status::OK();
}
+void BlobFileCache::Evict(uint64_t blob_file_number) {
+ // NOTE: sharing same Cache with table_cache
+ const Slice key = GetSliceForKey(&blob_file_number);
+
+ assert(cache_);
+
+ cache_.get()->Erase(key);
+}
+
} // namespace ROCKSDB_NAMESPACE
diff --git a/db/blob/blob_file_cache.h b/db/blob/blob_file_cache.h
index 740e67ada..6858d012b 100644
--- a/db/blob/blob_file_cache.h
+++ b/db/blob/blob_file_cache.h
@@ -36,6 +36,15 @@ class BlobFileCache {
uint64_t blob_file_number,
CacheHandleGuard<BlobFileReader>* blob_file_reader);
+ // Called when a blob file is obsolete to ensure it is removed from the cache
+ // to avoid effectively leaking the open file and associated memory
+ void Evict(uint64_t blob_file_number);
+
+ // Used to identify cache entries for blob files (not normally useful)
+ static const Cache::CacheItemHelper* GetHelper() {
+ return CacheInterface::GetBasicHelper();
+ }
+
private:
using CacheInterface =
BasicTypedCacheInterface<BlobFileReader, CacheEntryRole::kMisc>;
diff --git a/db/column_family.h b/db/column_family.h
index e4b7adde8..86637736a 100644
--- a/db/column_family.h
+++ b/db/column_family.h
@@ -401,6 +401,7 @@ class ColumnFamilyData {
SequenceNumber earliest_seq);
TableCache* table_cache() const { return table_cache_.get(); }
+ BlobFileCache* blob_file_cache() const { return blob_file_cache_.get(); }
BlobSource* blob_source() const { return blob_source_.get(); }
// See documentation in compaction_picker.h
diff --git a/db/db_impl/db_impl.cc b/db/db_impl/db_impl.cc
index 261593423..06573ac2e 100644
--- a/db/db_impl/db_impl.cc
+++ b/db/db_impl/db_impl.cc
@@ -659,8 +659,9 @@ Status DBImpl::CloseHelper() {
// We need to release them before the block cache is destroyed. The block
// cache may be destroyed inside versions_.reset(), when column family data
// list is destroyed, so leaving handles in table cache after
- // versions_.reset() may cause issues.
- // Here we clean all unreferenced handles in table cache.
+ // versions_.reset() may cause issues. Here we clean all unreferenced handles
+ // in table cache, and (for certain builds/conditions) assert that no obsolete
+ // files are hanging around unreferenced (leak) in the table/blob file cache.
// Now we assume all user queries have finished, so only version set itself
// can possibly hold the blocks from block cache. After releasing unreferenced
// handles here, only handles held by version set left and inside
@@ -668,6 +669,9 @@ Status DBImpl::CloseHelper() {
// time a handle is released, we erase it from the cache too. By doing that,
// we can guarantee that after versions_.reset(), table cache is empty
// so the cache can be safely destroyed.
+#ifndef NDEBUG
+ TEST_VerifyNoObsoleteFilesCached(/*db_mutex_already_held=*/true);
+#endif // !NDEBUG
table_cache_->EraseUnRefEntries();
for (auto& txn_entry : recovered_transactions_) {
@@ -3227,6 +3231,8 @@ Status DBImpl::MultiGetImpl(
s = Status::Aborted();
break;
}
+ // This could be a long-running operation
+ ROCKSDB_THREAD_YIELD_HOOK();
}
// Post processing (decrement reference counts and record statistics)
diff --git a/db/db_impl/db_impl.h b/db/db_impl/db_impl.h
index 5e4fa310b..ccc0abfa7 100644
--- a/db/db_impl/db_impl.h
+++ b/db/db_impl/db_impl.h
@@ -1241,9 +1241,14 @@ class DBImpl : public DB {
static Status TEST_ValidateOptions(const DBOptions& db_options) {
return ValidateOptions(db_options);
}
-
#endif // NDEBUG
+ // In certain configurations, verify that the table/blob file cache only
+ // contains entries for live files, to check for effective leaks of open
+ // files. This can only be called when purging of obsolete files has
+ // "settled," such as during parts of DB Close().
+ void TEST_VerifyNoObsoleteFilesCached(bool db_mutex_already_held) const;
+
// persist stats to column family "_persistent_stats"
void PersistStats();
diff --git a/db/db_impl/db_impl_debug.cc b/db/db_impl/db_impl_debug.cc
index 790a50d7a..67f5b4aaf 100644
--- a/db/db_impl/db_impl_debug.cc
+++ b/db/db_impl/db_impl_debug.cc
@@ -9,6 +9,7 @@
#ifndef NDEBUG
+#include "db/blob/blob_file_cache.h"
#include "db/column_family.h"
#include "db/db_impl/db_impl.h"
#include "db/error_handler.h"
@@ -328,5 +329,49 @@ size_t DBImpl::TEST_EstimateInMemoryStatsHistorySize() const {
InstrumentedMutexLock l(&const_cast<DBImpl*>(this)->stats_history_mutex_);
return EstimateInMemoryStatsHistorySize();
}
+
+void DBImpl::TEST_VerifyNoObsoleteFilesCached(
+ bool db_mutex_already_held) const {
+ // This check is somewhat expensive and obscure to make a part of every
+ // unit test in every build variety. Thus, we only enable it for ASAN builds.
+ if (!kMustFreeHeapAllocations) {
+ return;
+ }
+
+ std::optional<InstrumentedMutexLock> l;
+ if (db_mutex_already_held) {
+ mutex_.AssertHeld();
+ } else {
+ l.emplace(&mutex_);
+ }
+
+ std::vector<uint64_t> live_files;
+ for (auto cfd : *versions_->GetColumnFamilySet()) {
+ if (cfd->IsDropped()) {
+ continue;
+ }
+ // Sneakily add both SST and blob files to the same list
+ cfd->current()->AddLiveFiles(&live_files, &live_files);
+ }
+ std::sort(live_files.begin(), live_files.end());
+
+ auto fn = [&live_files](const Slice& key, Cache::ObjectPtr, size_t,
+ const Cache::CacheItemHelper* helper) {
+ if (helper != BlobFileCache::GetHelper()) {
+ // Skip non-blob files for now
+ // FIXME: diagnose and fix the leaks of obsolete SST files revealed in
+ // unit tests.
+ return;
+ }
+ // See TableCache and BlobFileCache
+ assert(key.size() == sizeof(uint64_t));
+ uint64_t file_number;
+ GetUnaligned(reinterpret_cast<const uint64_t*>(key.data()), &file_number);
+ // Assert file is in sorted live_files
+ assert(
+ std::binary_search(live_files.begin(), live_files.end(), file_number));
+ };
+ table_cache_->ApplyToAllEntries(fn, {});
+}
} // namespace ROCKSDB_NAMESPACE
#endif // NDEBUG
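
The verification above boils down to: collect the live SST/blob file numbers, sort once, then binary-search every cached file number. A self-contained sketch of the same check (hypothetical names, same algorithm):

#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

void VerifyNoObsoleteFilesCached(std::vector<uint64_t> live_files,
                                 std::vector<uint64_t> const& cached_files) {
    std::sort(live_files.begin(), live_files.end());
    for (uint64_t file_number : cached_files) {
        // A cache entry for a file that is no longer live is an effective
        // leak of an open file, which is what the ASAN-only check catches.
        assert(std::binary_search(
            live_files.begin(), live_files.end(), file_number));
    }
}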
diff --git a/db/db_iter.cc b/db/db_iter.cc
index e02586377..bf4749eb9 100644
--- a/db/db_iter.cc
+++ b/db/db_iter.cc
@@ -540,6 +540,8 @@ bool DBIter::FindNextUserEntryInternal(bool skipping_saved_key,
} else {
iter_.Next();
}
+ // This could be a long-running operation due to tombstones, etc.
+ ROCKSDB_THREAD_YIELD_HOOK();
} while (iter_.Valid());
valid_ = false;
diff --git a/db/table_cache.cc b/db/table_cache.cc
index 71fc29c32..8a5be75e8 100644
--- a/db/table_cache.cc
+++ b/db/table_cache.cc
@@ -164,6 +164,7 @@ Status TableCache::GetTableReader(
}
Cache::Handle* TableCache::Lookup(Cache* cache, uint64_t file_number) {
+ // NOTE: sharing same Cache with BlobFileCache
Slice key = GetSliceForFileNumber(&file_number);
return cache->Lookup(key);
}
@@ -179,6 +180,7 @@ Status TableCache::FindTable(
size_t max_file_size_for_l0_meta_pin, Temperature file_temperature) {
PERF_TIMER_GUARD_WITH_CLOCK(find_table_nanos, ioptions_.clock);
uint64_t number = file_meta.fd.GetNumber();
+ // NOTE: sharing same Cache with BlobFileCache
Slice key = GetSliceForFileNumber(&number);
*handle = cache_.Lookup(key);
TEST_SYNC_POINT_CALLBACK("TableCache::FindTable:0",
diff --git a/db/version_builder.cc b/db/version_builder.cc
index ed8ab8214..c98f53f42 100644
--- a/db/version_builder.cc
+++ b/db/version_builder.cc
@@ -24,6 +24,7 @@
#include <vector>
#include "cache/cache_reservation_manager.h"
+#include "db/blob/blob_file_cache.h"
#include "db/blob/blob_file_meta.h"
#include "db/dbformat.h"
#include "db/internal_stats.h"
@@ -744,12 +745,9 @@ class VersionBuilder::Rep {
return Status::Corruption("VersionBuilder", oss.str());
}
- // Note: we use C++11 for now but in C++14, this could be done in a more
- // elegant way using generalized lambda capture.
- VersionSet* const vs = version_set_;
- const ImmutableCFOptions* const ioptions = ioptions_;
-
- auto deleter = [vs, ioptions](SharedBlobFileMetaData* shared_meta) {
+ auto deleter = [vs = version_set_, ioptions = ioptions_,
+ bc = cfd_ ? cfd_->blob_file_cache()
+ : nullptr](SharedBlobFileMetaData* shared_meta) {
if (vs) {
assert(ioptions);
assert(!ioptions->cf_paths.empty());
@@ -758,6 +756,9 @@ class VersionBuilder::Rep {
vs->AddObsoleteBlobFile(shared_meta->GetBlobFileNumber(),
ioptions->cf_paths.front().path);
}
+ if (bc) {
+ bc->Evict(shared_meta->GetBlobFileNumber());
+ }
delete shared_meta;
};
@@ -766,7 +767,7 @@ class VersionBuilder::Rep {
blob_file_number, blob_file_addition.GetTotalBlobCount(),
blob_file_addition.GetTotalBlobBytes(),
blob_file_addition.GetChecksumMethod(),
- blob_file_addition.GetChecksumValue(), deleter);
+ blob_file_addition.GetChecksumValue(), std::move(deleter));
mutable_blob_file_metas_.emplace(
blob_file_number, MutableBlobFileMetaData(std::move(shared_meta)));
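
The deleter change above is the C++14 generalized lambda capture the old comment asked for: the blob file cache pointer is computed in the init-capture, and eviction happens in the deleter itself. A reduced sketch of the same shared_ptr-with-custom-deleter pattern (stub types, not the RocksDB ones):

#include <cstdint>
#include <memory>

struct SharedMetaStub { uint64_t blob_file_number = 0; };
struct CacheStub { void Evict(uint64_t) { /* erase the cache entry */ } };

std::shared_ptr<SharedMetaStub> MakeMeta(CacheStub* maybe_cache) {
    // Init-capture binds the (possibly null) cache directly, replacing the
    // pre-C++14 workaround of copying locals before the lambda.
    auto deleter = [bc = maybe_cache](SharedMetaStub* meta) {
        if (bc)
            bc->Evict(meta->blob_file_number);
        delete meta;
    };
    return std::shared_ptr<SharedMetaStub>(new SharedMetaStub,
                                           std::move(deleter));
}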
diff --git a/db/version_set.h b/db/version_set.h
index 9336782b1..024f869e7 100644
--- a/db/version_set.h
+++ b/db/version_set.h
@@ -1514,7 +1514,6 @@ class VersionSet {
void GetLiveFilesMetaData(std::vector<LiveFileMetaData>* metadata);
void AddObsoleteBlobFile(uint64_t blob_file_number, std::string path) {
- // TODO: Erase file from BlobFileCache?
obsolete_blob_files_.emplace_back(blob_file_number, std::move(path));
}
diff --git a/include/rocksdb/version.h b/include/rocksdb/version.h
index 2a19796b8..0afa2cab1 100644
--- a/include/rocksdb/version.h
+++ b/include/rocksdb/version.h
@@ -13,7 +13,7 @@
// minor or major version number planned for release.
#define ROCKSDB_MAJOR 9
#define ROCKSDB_MINOR 7
-#define ROCKSDB_PATCH 3
+#define ROCKSDB_PATCH 4
// Do not use these. We made the mistake of declaring macros starting with
// double underscore. Now we have to live with our choice. We'll deprecate these
diff --git a/port/port.h b/port/port.h
index 13aa56d47..141716e5b 100644
--- a/port/port.h
+++ b/port/port.h
@@ -19,3 +19,19 @@
#elif defined(OS_WIN)
#include "port/win/port_win.h"
#endif
+
+#ifdef OS_LINUX
+// A temporary hook into long-running RocksDB threads to support modifying their
+// priority etc. This should become a public API hook once the requirements
+// are better understood.
+extern "C" void RocksDbThreadYield() __attribute__((__weak__));
+#define ROCKSDB_THREAD_YIELD_HOOK() \
+ { \
+ if (RocksDbThreadYield) { \
+ RocksDbThreadYield(); \
+ } \
+ }
+#else
+#define ROCKSDB_THREAD_YIELD_HOOK() \
+ {}
+#endif
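
Since RocksDbThreadYield is a weak symbol, the hook costs only a null check unless the embedding process defines it. A Linux embedder that wants long-running MultiGet/iterator loops to yield could provide, for example:

#include <thread>

// Strong definition overrides the weak declaration in port/port.h; this is
// one possible embedder policy, not part of RocksDB itself.
extern "C" void RocksDbThreadYield() {
    std::this_thread::yield();  // or adjust priority, record stats, etc.
}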


@@ -0,0 +1,30 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 93b884d..b715cb6 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -106,14 +106,9 @@ endif()
include(CMakeDependentOption)
if(MSVC)
- option(WITH_GFLAGS "build with GFlags" OFF)
option(WITH_XPRESS "build with windows built in compression" OFF)
- option(ROCKSDB_SKIP_THIRDPARTY "skip thirdparty.inc" OFF)
-
- if(NOT ROCKSDB_SKIP_THIRDPARTY)
- include(${CMAKE_CURRENT_SOURCE_DIR}/thirdparty.inc)
- endif()
-else()
+endif()
+if(TRUE)
if(CMAKE_SYSTEM_NAME MATCHES "FreeBSD" AND NOT CMAKE_SYSTEM_NAME MATCHES "kFreeBSD")
# FreeBSD has jemalloc as default malloc
# but it does not have all the jemalloc files in include/...
@@ -126,7 +121,7 @@ else()
endif()
endif()
- if(MINGW)
+ if(MSVC OR MINGW)
option(WITH_GFLAGS "build with GFlags" OFF)
else()
option(WITH_GFLAGS "build with GFlags" ON)

external/snappy/conandata.yml (new vendored file, 40 lines)

@@ -0,0 +1,40 @@
sources:
  "1.1.10":
    url: "https://github.com/google/snappy/archive/1.1.10.tar.gz"
    sha256: "49d831bffcc5f3d01482340fe5af59852ca2fe76c3e05df0e67203ebbe0f1d90"
  "1.1.9":
    url: "https://github.com/google/snappy/archive/1.1.9.tar.gz"
    sha256: "75c1fbb3d618dd3a0483bff0e26d0a92b495bbe5059c8b4f1c962b478b6e06e7"
  "1.1.8":
    url: "https://github.com/google/snappy/archive/1.1.8.tar.gz"
    sha256: "16b677f07832a612b0836178db7f374e414f94657c138e6993cbfc5dcc58651f"
  "1.1.7":
    url: "https://github.com/google/snappy/archive/1.1.7.tar.gz"
    sha256: "3dfa02e873ff51a11ee02b9ca391807f0c8ea0529a4924afa645fbf97163f9d4"
patches:
  "1.1.10":
    - patch_file: "patches/1.1.10-0001-fix-inlining-failure.patch"
      patch_description: "disable inlining for compilation error"
      patch_type: "portability"
    - patch_file: "patches/1.1.9-0002-no-Werror.patch"
      patch_description: "disable 'warning as error' options"
      patch_type: "portability"
    - patch_file: "patches/1.1.10-0003-fix-clobber-list-older-llvm.patch"
      patch_description: "disable inline asm on apple-clang"
      patch_type: "portability"
    - patch_file: "patches/1.1.9-0004-rtti-by-default.patch"
      patch_description: "remove 'disable rtti'"
      patch_type: "conan"
  "1.1.9":
    - patch_file: "patches/1.1.9-0001-fix-inlining-failure.patch"
      patch_description: "disable inlining for compilation error"
      patch_type: "portability"
    - patch_file: "patches/1.1.9-0002-no-Werror.patch"
      patch_description: "disable 'warning as error' options"
      patch_type: "portability"
    - patch_file: "patches/1.1.9-0003-fix-clobber-list-older-llvm.patch"
      patch_description: "disable inline asm on apple-clang"
      patch_type: "portability"
    - patch_file: "patches/1.1.9-0004-rtti-by-default.patch"
      patch_description: "remove 'disable rtti'"
      patch_type: "conan"

external/snappy/conanfile.py (new vendored file, 89 lines)

@@ -0,0 +1,89 @@
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get, rmdir
from conan.tools.scm import Version
import os

required_conan_version = ">=1.54.0"


class SnappyConan(ConanFile):
    name = "snappy"
    description = "A fast compressor/decompressor"
    topics = ("google", "compressor", "decompressor")
    url = "https://github.com/conan-io/conan-center-index"
    homepage = "https://github.com/google/snappy"
    license = "BSD-3-Clause"
    package_type = "library"
    settings = "os", "arch", "compiler", "build_type"
    options = {
        "shared": [True, False],
        "fPIC": [True, False],
    }
    default_options = {
        "shared": False,
        "fPIC": True,
    }

    def export_sources(self):
        export_conandata_patches(self)

    def config_options(self):
        if self.settings.os == 'Windows':
            del self.options.fPIC

    def configure(self):
        if self.options.shared:
            self.options.rm_safe("fPIC")

    def layout(self):
        cmake_layout(self, src_folder="src")

    def validate(self):
        if self.settings.compiler.get_safe("cppstd"):
            check_min_cppstd(self, 11)

    def source(self):
        get(self, **self.conan_data["sources"][self.version], strip_root=True)

    def generate(self):
        tc = CMakeToolchain(self)
        tc.variables["SNAPPY_BUILD_TESTS"] = False
        if Version(self.version) >= "1.1.8":
            tc.variables["SNAPPY_FUZZING_BUILD"] = False
            tc.variables["SNAPPY_REQUIRE_AVX"] = False
            tc.variables["SNAPPY_REQUIRE_AVX2"] = False
            tc.variables["SNAPPY_INSTALL"] = True
        if Version(self.version) >= "1.1.9":
            tc.variables["SNAPPY_BUILD_BENCHMARKS"] = False
        tc.generate()

    def build(self):
        apply_conandata_patches(self)
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        copy(self, "COPYING", src=self.source_folder, dst=os.path.join(self.package_folder, "licenses"))
        cmake = CMake(self)
        cmake.install()
        rmdir(self, os.path.join(self.package_folder, "lib", "cmake"))

    def package_info(self):
        self.cpp_info.set_property("cmake_file_name", "Snappy")
        self.cpp_info.set_property("cmake_target_name", "Snappy::snappy")
        # TODO: back to global scope in conan v2 once cmake_find_package* generators removed
        self.cpp_info.components["snappylib"].libs = ["snappy"]
        if not self.options.shared:
            if self.settings.os in ["Linux", "FreeBSD"]:
                self.cpp_info.components["snappylib"].system_libs.append("m")

        # TODO: to remove in conan v2 once cmake_find_package* generators removed
        self.cpp_info.names["cmake_find_package"] = "Snappy"
        self.cpp_info.names["cmake_find_package_multi"] = "Snappy"
        self.cpp_info.components["snappylib"].names["cmake_find_package"] = "snappy"
        self.cpp_info.components["snappylib"].names["cmake_find_package_multi"] = "snappy"
        self.cpp_info.components["snappylib"].set_property("cmake_target_name", "Snappy::snappy")


@@ -0,0 +1,13 @@
diff --git a/snappy-stubs-internal.h b/snappy-stubs-internal.h
index 1548ed7..3b4a9f3 100644
--- a/snappy-stubs-internal.h
+++ b/snappy-stubs-internal.h
@@ -100,7 +100,7 @@
// Inlining hints.
#if HAVE_ATTRIBUTE_ALWAYS_INLINE
-#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE __attribute__((always_inline))
+#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#else
#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#endif // HAVE_ATTRIBUTE_ALWAYS_INLINE


@@ -0,0 +1,13 @@
diff --git a/snappy.cc b/snappy.cc
index d414718..e4efb59 100644
--- a/snappy.cc
+++ b/snappy.cc
@@ -1132,7 +1132,7 @@ inline size_t AdvanceToNextTagX86Optimized(const uint8_t** ip_p, size_t* tag) {
size_t literal_len = *tag >> 2;
size_t tag_type = *tag;
bool is_literal;
-#if defined(__GCC_ASM_FLAG_OUTPUTS__) && defined(__x86_64__)
+#if defined(__GCC_ASM_FLAG_OUTPUTS__) && defined(__x86_64__) && ( (!defined(__clang__) && !defined(__APPLE__)) || (!defined(__APPLE__) && defined(__clang__) && (__clang_major__ >= 9)) || (defined(__APPLE__) && defined(__clang__) && (__clang_major__ > 11)) )
// TODO clang misses the fact that the (c & 3) already correctly
// sets the zero flag.
asm("and $3, %k[tag_type]\n\t"


@@ -0,0 +1,14 @@
Fixes the following error:
error: inlining failed in call to always_inline size_t snappy::AdvanceToNextTag(const uint8_t**, size_t*): function body can be overwritten at link time
--- snappy-stubs-internal.h
+++ snappy-stubs-internal.h
@@ -100,7 +100,7 @@
// Inlining hints.
#ifdef HAVE_ATTRIBUTE_ALWAYS_INLINE
-#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE __attribute__((always_inline))
+#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#else
#define SNAPPY_ATTRIBUTE_ALWAYS_INLINE
#endif


@@ -0,0 +1,12 @@
--- CMakeLists.txt
+++ CMakeLists.txt
@@ -69,7 +69,7 @@
- # Use -Werror for clang only.
+if(0)
if(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
if(NOT CMAKE_CXX_FLAGS MATCHES "-Werror")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Werror")
endif(NOT CMAKE_CXX_FLAGS MATCHES "-Werror")
endif(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
-
+endif()


@@ -0,0 +1,12 @@
asm clobbers do not work for clang < 9 and apple-clang < 11 (found by SpaceIm)
--- snappy.cc
+++ snappy.cc
@@ -1026,7 +1026,7 @@
size_t literal_len = *tag >> 2;
size_t tag_type = *tag;
bool is_literal;
-#if defined(__GNUC__) && defined(__x86_64__)
+#if defined(__GNUC__) && defined(__x86_64__) && ( (!defined(__clang__) && !defined(__APPLE__)) || (!defined(__APPLE__) && defined(__clang__) && (__clang_major__ >= 9)) || (defined(__APPLE__) && defined(__clang__) && (__clang_major__ > 11)) )
// TODO clang misses the fact that the (c & 3) already correctly
// sets the zero flag.
asm("and $3, %k[tag_type]\n\t"


@@ -0,0 +1,20 @@
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -53,8 +53,6 @@ if(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
add_definitions(-D_HAS_EXCEPTIONS=0)
# Disable RTTI.
- string(REGEX REPLACE "/GR" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /GR-")
else(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
# Use -Wall for clang and gcc.
if(NOT CMAKE_CXX_FLAGS MATCHES "-Wall")
@@ -78,8 +76,6 @@ endif()
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fno-exceptions")
# Disable RTTI.
- string(REGEX REPLACE "-frtti" "" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
- set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fno-rtti")
endif(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
# BUILD_SHARED_LIBS is a standard CMake variable, but we declare it here to make

external/soci/conandata.yml (new vendored file, 12 lines)

@@ -0,0 +1,12 @@
sources:
  "4.0.3":
    url: "https://github.com/SOCI/soci/archive/v4.0.3.tar.gz"
    sha256: "4b1ff9c8545c5d802fbe06ee6cd2886630e5c03bf740e269bb625b45cf934928"
patches:
  "4.0.3":
    - patch_file: "patches/0001-Remove-hardcoded-INSTALL_NAME_DIR-for-relocatable-li.patch"
      patch_description: "Generate relocatable libraries on MacOS"
      patch_type: "portability"
    - patch_file: "patches/0002-Fix-soci_backend.patch"
      patch_description: "Fix variable names for dependencies"
      patch_type: "conan"

external/soci/conanfile.py (new vendored file, 212 lines)

@@ -0,0 +1,212 @@
from conan import ConanFile
from conan.tools.build import check_min_cppstd
from conan.tools.cmake import CMake, CMakeDeps, CMakeToolchain, cmake_layout
from conan.tools.files import apply_conandata_patches, copy, export_conandata_patches, get, rmdir
from conan.tools.microsoft import is_msvc
from conan.tools.scm import Version
from conan.errors import ConanInvalidConfiguration
import os

required_conan_version = ">=1.55.0"


class SociConan(ConanFile):
    name = "soci"
    homepage = "https://github.com/SOCI/soci"
    url = "https://github.com/conan-io/conan-center-index"
    description = "The C++ Database Access Library "
    topics = ("mysql", "odbc", "postgresql", "sqlite3")
    license = "BSL-1.0"
    settings = "os", "arch", "compiler", "build_type"
    options = {
        "shared": [True, False],
        "fPIC": [True, False],
        "empty": [True, False],
        "with_sqlite3": [True, False],
        "with_db2": [True, False],
        "with_odbc": [True, False],
        "with_oracle": [True, False],
        "with_firebird": [True, False],
        "with_mysql": [True, False],
        "with_postgresql": [True, False],
        "with_boost": [True, False],
    }
    default_options = {
        "shared": False,
        "fPIC": True,
        "empty": False,
        "with_sqlite3": False,
        "with_db2": False,
        "with_odbc": False,
        "with_oracle": False,
        "with_firebird": False,
        "with_mysql": False,
        "with_postgresql": False,
        "with_boost": False,
    }

    def export_sources(self):
        export_conandata_patches(self)

    def layout(self):
        cmake_layout(self, src_folder="src")

    def config_options(self):
        if self.settings.os == "Windows":
            self.options.rm_safe("fPIC")

    def configure(self):
        if self.options.shared:
            self.options.rm_safe("fPIC")

    def requirements(self):
        if self.options.with_sqlite3:
            self.requires("sqlite3/3.47.0")
        if self.options.with_odbc and self.settings.os != "Windows":
            self.requires("odbc/2.3.11")
        if self.options.with_mysql:
            self.requires("libmysqlclient/8.1.0")
        if self.options.with_postgresql:
            self.requires("libpq/15.5")
        if self.options.with_boost:
            self.requires("boost/1.83.0")

    @property
    def _minimum_compilers_version(self):
        return {
            "Visual Studio": "14",
            "gcc": "4.8",
            "clang": "3.8",
            "apple-clang": "8.0"
        }

    def validate(self):
        if self.settings.compiler.get_safe("cppstd"):
            check_min_cppstd(self, 11)

        compiler = str(self.settings.compiler)
        compiler_version = Version(self.settings.compiler.version.value)
        if compiler not in self._minimum_compilers_version:
            self.output.warning("{} recipe lacks information about the {} compiler support.".format(self.name, self.settings.compiler))
        elif compiler_version < self._minimum_compilers_version[compiler]:
            raise ConanInvalidConfiguration("{} requires a {} version >= {}".format(self.name, compiler, compiler_version))

        prefix = "Dependencies for"
        message = "not configured in this conan package."
        if self.options.with_db2:
            # self.requires("db2/0.0.0") # TODO add support for db2
            raise ConanInvalidConfiguration("{} DB2 {} ".format(prefix, message))
        if self.options.with_oracle:
            # self.requires("oracle_db/0.0.0") # TODO add support for oracle
            raise ConanInvalidConfiguration("{} ORACLE {} ".format(prefix, message))
        if self.options.with_firebird:
            # self.requires("firebird/0.0.0") # TODO add support for firebird
            raise ConanInvalidConfiguration("{} firebird {} ".format(prefix, message))

    def source(self):
        get(self, **self.conan_data["sources"][self.version], strip_root=True)

    def generate(self):
        tc = CMakeToolchain(self)
        tc.variables["SOCI_SHARED"] = self.options.shared
        tc.variables["SOCI_STATIC"] = not self.options.shared
        tc.variables["SOCI_TESTS"] = False
        tc.variables["SOCI_CXX11"] = True
        tc.variables["SOCI_EMPTY"] = self.options.empty
        tc.variables["WITH_SQLITE3"] = self.options.with_sqlite3
        tc.variables["WITH_DB2"] = self.options.with_db2
        tc.variables["WITH_ODBC"] = self.options.with_odbc
        tc.variables["WITH_ORACLE"] = self.options.with_oracle
        tc.variables["WITH_FIREBIRD"] = self.options.with_firebird
        tc.variables["WITH_MYSQL"] = self.options.with_mysql
        tc.variables["WITH_POSTGRESQL"] = self.options.with_postgresql
        tc.variables["WITH_BOOST"] = self.options.with_boost
        tc.generate()

        deps = CMakeDeps(self)
        deps.generate()

    def build(self):
        apply_conandata_patches(self)
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        copy(self, "LICENSE_1_0.txt", dst=os.path.join(self.package_folder, "licenses"), src=self.source_folder)
        cmake = CMake(self)
        cmake.install()
        rmdir(self, os.path.join(self.package_folder, "lib", "cmake"))

    def package_info(self):
        self.cpp_info.set_property("cmake_file_name", "SOCI")

        target_suffix = "" if self.options.shared else "_static"
        lib_prefix = "lib" if is_msvc(self) and not self.options.shared else ""
        version = Version(self.version)
        lib_suffix = "_{}_{}".format(version.major, version.minor) if self.settings.os == "Windows" else ""

        # soci_core
        self.cpp_info.components["soci_core"].set_property("cmake_target_name", "SOCI::soci_core{}".format(target_suffix))
        self.cpp_info.components["soci_core"].libs = ["{}soci_core{}".format(lib_prefix, lib_suffix)]
        if self.options.with_boost:
            self.cpp_info.components["soci_core"].requires.append("boost::boost")

        # soci_empty
        if self.options.empty:
            self.cpp_info.components["soci_empty"].set_property("cmake_target_name", "SOCI::soci_empty{}".format(target_suffix))
            self.cpp_info.components["soci_empty"].libs = ["{}soci_empty{}".format(lib_prefix, lib_suffix)]
            self.cpp_info.components["soci_empty"].requires = ["soci_core"]

        # soci_sqlite3
        if self.options.with_sqlite3:
            self.cpp_info.components["soci_sqlite3"].set_property("cmake_target_name", "SOCI::soci_sqlite3{}".format(target_suffix))
            self.cpp_info.components["soci_sqlite3"].libs = ["{}soci_sqlite3{}".format(lib_prefix, lib_suffix)]
            self.cpp_info.components["soci_sqlite3"].requires = ["soci_core", "sqlite3::sqlite3"]

        # soci_odbc
        if self.options.with_odbc:
            self.cpp_info.components["soci_odbc"].set_property("cmake_target_name", "SOCI::soci_odbc{}".format(target_suffix))
            self.cpp_info.components["soci_odbc"].libs = ["{}soci_odbc{}".format(lib_prefix, lib_suffix)]
            self.cpp_info.components["soci_odbc"].requires = ["soci_core"]
            if self.settings.os == "Windows":
                self.cpp_info.components["soci_odbc"].system_libs.append("odbc32")
            else:
                self.cpp_info.components["soci_odbc"].requires.append("odbc::odbc")

        # soci_mysql
        if self.options.with_mysql:
            self.cpp_info.components["soci_mysql"].set_property("cmake_target_name", "SOCI::soci_mysql{}".format(target_suffix))
            self.cpp_info.components["soci_mysql"].libs = ["{}soci_mysql{}".format(lib_prefix, lib_suffix)]
            self.cpp_info.components["soci_mysql"].requires = ["soci_core", "libmysqlclient::libmysqlclient"]

        # soci_postgresql
        if self.options.with_postgresql:
            self.cpp_info.components["soci_postgresql"].set_property("cmake_target_name", "SOCI::soci_postgresql{}".format(target_suffix))
            self.cpp_info.components["soci_postgresql"].libs = ["{}soci_postgresql{}".format(lib_prefix, lib_suffix)]
            self.cpp_info.components["soci_postgresql"].requires = ["soci_core", "libpq::libpq"]

        # TODO: to remove in conan v2 once cmake_find_package* generators removed
        self.cpp_info.names["cmake_find_package"] = "SOCI"
        self.cpp_info.names["cmake_find_package_multi"] = "SOCI"
        self.cpp_info.components["soci_core"].names["cmake_find_package"] = "soci_core{}".format(target_suffix)
        self.cpp_info.components["soci_core"].names["cmake_find_package_multi"] = "soci_core{}".format(target_suffix)
        if self.options.empty:
            self.cpp_info.components["soci_empty"].names["cmake_find_package"] = "soci_empty{}".format(target_suffix)
            self.cpp_info.components["soci_empty"].names["cmake_find_package_multi"] = "soci_empty{}".format(target_suffix)
        if self.options.with_sqlite3:
            self.cpp_info.components["soci_sqlite3"].names["cmake_find_package"] = "soci_sqlite3{}".format(target_suffix)
            self.cpp_info.components["soci_sqlite3"].names["cmake_find_package_multi"] = "soci_sqlite3{}".format(target_suffix)
        if self.options.with_odbc:
            self.cpp_info.components["soci_odbc"].names["cmake_find_package"] = "soci_odbc{}".format(target_suffix)
            self.cpp_info.components["soci_odbc"].names["cmake_find_package_multi"] = "soci_odbc{}".format(target_suffix)
        if self.options.with_mysql:
            self.cpp_info.components["soci_mysql"].names["cmake_find_package"] = "soci_mysql{}".format(target_suffix)
            self.cpp_info.components["soci_mysql"].names["cmake_find_package_multi"] = "soci_mysql{}".format(target_suffix)
        if self.options.with_postgresql:
            self.cpp_info.components["soci_postgresql"].names["cmake_find_package"] = "soci_postgresql{}".format(target_suffix)
            self.cpp_info.components["soci_postgresql"].names["cmake_find_package_multi"] = "soci_postgresql{}".format(target_suffix)


@@ -0,0 +1,39 @@
From d491bf7b5040d314ffd0c6310ba01f78ff44c85e Mon Sep 17 00:00:00 2001
From: Rasmus Thomsen <rasmus.thomsen@dampsoft.de>
Date: Fri, 14 Apr 2023 09:16:29 +0200
Subject: [PATCH] Remove hardcoded INSTALL_NAME_DIR for relocatable libraries
on MacOS
---
cmake/SociBackend.cmake | 2 +-
src/core/CMakeLists.txt | 1 -
2 files changed, 1 insertion(+), 2 deletions(-)
diff --git a/cmake/SociBackend.cmake b/cmake/SociBackend.cmake
index 5d4ef0df..39fe1f77 100644
--- a/cmake/SociBackend.cmake
+++ b/cmake/SociBackend.cmake
@@ -171,7 +171,7 @@ macro(soci_backend NAME)
set_target_properties(${THIS_BACKEND_TARGET}
PROPERTIES
SOVERSION ${${PROJECT_NAME}_SOVERSION}
- INSTALL_NAME_DIR ${CMAKE_INSTALL_PREFIX}/lib)
+ )
if(APPLE)
set_target_properties(${THIS_BACKEND_TARGET}
diff --git a/src/core/CMakeLists.txt b/src/core/CMakeLists.txt
index 3e7deeae..f9eae564 100644
--- a/src/core/CMakeLists.txt
+++ b/src/core/CMakeLists.txt
@@ -59,7 +59,6 @@ if (SOCI_SHARED)
PROPERTIES
VERSION ${SOCI_VERSION}
SOVERSION ${SOCI_SOVERSION}
- INSTALL_NAME_DIR ${CMAKE_INSTALL_PREFIX}/lib
CLEAN_DIRECT_OUTPUT 1)
endif()
--
2.25.1


@@ -0,0 +1,24 @@
diff --git a/cmake/SociBackend.cmake b/cmake/SociBackend.cmake
index 0a664667..3fa2ed95 100644
--- a/cmake/SociBackend.cmake
+++ b/cmake/SociBackend.cmake
@@ -31,14 +31,13 @@ macro(soci_backend_deps_found NAME DEPS SUCCESS)
if(NOT DEPEND_FOUND)
list(APPEND DEPS_NOT_FOUND ${dep})
else()
- string(TOUPPER "${dep}" DEPU)
- if( ${DEPU}_INCLUDE_DIR )
- list(APPEND DEPS_INCLUDE_DIRS ${${DEPU}_INCLUDE_DIR})
+ if( ${dep}_INCLUDE_DIR )
+ list(APPEND DEPS_INCLUDE_DIRS ${${dep}_INCLUDE_DIR})
endif()
- if( ${DEPU}_INCLUDE_DIRS )
- list(APPEND DEPS_INCLUDE_DIRS ${${DEPU}_INCLUDE_DIRS})
+ if( ${dep}_INCLUDE_DIRS )
+ list(APPEND DEPS_INCLUDE_DIRS ${${dep}_INCLUDE_DIRS})
endif()
- list(APPEND DEPS_LIBRARIES ${${DEPU}_LIBRARIES})
+ list(APPEND DEPS_LIBRARIES ${${dep}_LIBRARIES})
endif()
endforeach()

external/wamr/conandata.yml (new vendored file, 6 lines)

@@ -0,0 +1,6 @@
patches:
  2.2.0:
    - patch_description: add metering to iwasm interpreter
      patch_file: patches/ripp_metering.patch
      patch_type: conan

external/wamr/conanfile.py (new vendored file, 91 lines)

@@ -0,0 +1,91 @@
from conans import ConanFile, tools
from conan.tools.cmake import CMake, CMakeToolchain, CMakeDeps, cmake_layout
from conan.tools.files import (
    apply_conandata_patches,
    export_conandata_patches,
    # get,
)
# import os

required_conan_version = ">=1.55.0"


class WamrConan(ConanFile):
    name = "wamr"
    version = "2.2.0"
    license = "Apache License v2.0"
    url = "https://github.com/bytecodealliance/wasm-micro-runtime.git"
    description = "Webassembly micro runtime"
    package_type = "library"
    settings = "os", "compiler", "build_type", "arch"
    options = {"shared": [True, False], "fPIC": [True, False]}
    default_options = {"shared": False, "fPIC": True}
    generators = "CMakeToolchain", "CMakeDeps"
    # requires = [("llvm/20.1.1@")]

    def export_sources(self):
        export_conandata_patches(self)

    # def build_requirements(self):
    #     self.tool_requires("llvm/20.1.1")

    def config_options(self):
        if self.settings.os == "Windows":
            del self.options.fPIC

    def layout(self):
        cmake_layout(self, src_folder="src")

    def source(self):
        git = tools.Git()
        git.clone(
            "https://github.com/bytecodealliance/wasm-micro-runtime.git",
            "c883fafead005e87ad3122b05409886f507c1cb0",
            shallow=True,
        )
        # get(self, **self.conan_data["sources"][self.version], strip_root=True)

    def generate(self):
        tc = CMakeToolchain(self)
        tc.variables["WAMR_BUILD_INTERP"] = 1
        tc.variables["WAMR_BUILD_FAST_INTERP"] = 1
        tc.variables["WAMR_BUILD_INSTRUCTION_METERING"] = 1
        tc.variables["WAMR_BUILD_AOT"] = 0
        tc.variables["WAMR_BUILD_JIT"] = 0
        tc.variables["WAMR_BUILD_FAST_JIT"] = 0
        tc.variables["WAMR_DISABLE_HW_BOUND_CHECK"] = 1
        tc.variables["WAMR_DISABLE_STACK_HW_BOUND_CHECK"] = 1
        tc.variables["WAMR_BH_LOG"] = "wamr_log_to_rippled"
        # tc.variables["WAMR_BUILD_FAST_JIT"] = 0 if self.settings.os == "Windows" else 1
        # ll_dep = self.dependencies["llvm"]
        # self.output.info(f"-----------package_folder: {type(ll_dep.__dict__)}")
        # tc.variables["LLVM_DIR"] = os.path.join(ll_dep.package_folder, "lib", "cmake", "llvm")
        tc.generate()
        # This generates "foo-config.cmake" and "bar-config.cmake" in self.generators_folder
        deps = CMakeDeps(self)
        deps.generate()

    def build(self):
        apply_conandata_patches(self)
        cmake = CMake(self)
        cmake.verbose = True
        cmake.configure()
        cmake.build()
        # self.run(f'echo {self.source_folder}')
        # Explicit way:
        # self.run('cmake %s/hello %s' % (self.source_folder, cmake.command_line))
        # self.run("cmake --build . %s" % cmake.build_config)

    def package(self):
        cmake = CMake(self)
        cmake.verbose = True
        cmake.install()

    def package_info(self):
        self.cpp_info.libs = ["iwasm"]
        self.cpp_info.names["cmake_find_package"] = "wamr"
        self.cpp_info.names["cmake_find_package_multi"] = "wamr"


@@ -0,0 +1,470 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 551991f8..5f48a0b8 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -1,7 +1,7 @@
# Copyright (C) 2019 Intel Corporation. All rights reserved.
# SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
-cmake_minimum_required (VERSION 3.14)
+cmake_minimum_required (VERSION 3.20)
option(BUILD_SHARED_LIBS "Build using shared libraries" OFF)
diff --git a/build-scripts/config_common.cmake b/build-scripts/config_common.cmake
index 1cb50235..bd103022 100644
--- a/build-scripts/config_common.cmake
+++ b/build-scripts/config_common.cmake
@@ -669,6 +669,10 @@ if (WAMR_BUILD_AOT_VALIDATOR EQUAL 1)
message (" AOT validator enabled")
add_definitions (-DWASM_ENABLE_AOT_VALIDATOR=1)
endif ()
+if (WAMR_BUILD_INSTRUCTION_METERING EQUAL 1)
+ message (" Instruction metering enabled")
+ add_definitions (-DWASM_ENABLE_INSTRUCTION_METERING=1)
+endif ()
########################################
# Show Phase4 Wasm proposals status.
diff --git a/core/config.h b/core/config.h
index cb1189c9..a4e1499e 100644
--- a/core/config.h
+++ b/core/config.h
@@ -716,4 +716,8 @@ unless used elsewhere */
#define WASM_ENABLE_AOT_VALIDATOR 0
#endif
+#ifndef WASM_ENABLE_INSTRUCTION_METERING
+#define WASM_ENABLE_INSTRUCTION_METERING 0
+#endif
+
#endif /* end of _CONFIG_H_ */
diff --git a/core/iwasm/common/wasm_c_api.c b/core/iwasm/common/wasm_c_api.c
index 269ec577..bc6fd01b 100644
--- a/core/iwasm/common/wasm_c_api.c
+++ b/core/iwasm/common/wasm_c_api.c
@@ -5389,3 +5389,8 @@ wasm_instance_get_wasm_func_exec_time(const wasm_instance_t *instance,
return -1.0;
#endif
}
+
+wasm_exec_env_t wasm_instance_exec_env(const wasm_instance_t*instance)
+{
+ return wasm_runtime_get_exec_env_singleton(instance->inst_comm_rt);
+}
diff --git a/core/iwasm/common/wasm_exec_env.c b/core/iwasm/common/wasm_exec_env.c
index e33fd9f3..d1ff9c41 100644
--- a/core/iwasm/common/wasm_exec_env.c
+++ b/core/iwasm/common/wasm_exec_env.c
@@ -85,6 +85,12 @@ wasm_exec_env_create_internal(struct WASMModuleInstanceCommon *module_inst,
wasm_runtime_dump_exec_env_mem_consumption(exec_env);
#endif
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ exec_env->instructions_to_execute = -1;
+ for(int i = 0; i < 256; ++i)
+ exec_env->instructions_schedule[i] = 1;
+#endif
+
return exec_env;
#ifdef OS_ENABLE_HW_BOUND_CHECK
diff --git a/core/iwasm/common/wasm_exec_env.h b/core/iwasm/common/wasm_exec_env.h
index ce0c1fa7..2713a092 100644
--- a/core/iwasm/common/wasm_exec_env.h
+++ b/core/iwasm/common/wasm_exec_env.h
@@ -87,6 +87,12 @@ typedef struct WASMExecEnv {
uint8 *bottom;
} wasm_stack;
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ /* instructions to execute */
+ int64 instructions_to_execute;
+ int64 instructions_schedule[256];
+#endif
+
#if WASM_ENABLE_FAST_JIT != 0
/**
* Cache for
diff --git a/core/iwasm/common/wasm_runtime_common.c b/core/iwasm/common/wasm_runtime_common.c
index d33c0272..21bfdf29 100644
--- a/core/iwasm/common/wasm_runtime_common.c
+++ b/core/iwasm/common/wasm_runtime_common.c
@@ -2285,6 +2285,31 @@ wasm_runtime_access_exce_check_guard_page()
}
#endif
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+
+void
+wasm_runtime_set_instruction_count_limit(WASMExecEnv *exec_env,
+ int64 instructions_to_execute)
+{
+ exec_env->instructions_to_execute = instructions_to_execute;
+}
+
+int64
+wasm_runtime_get_instruction_count_limit(WASMExecEnv *exec_env)
+{
+ return exec_env->instructions_to_execute;
+}
+
+void
+wasm_runtime_set_instruction_schedule(WASMExecEnv *exec_env,
+ int64 const *instructions_schedule)
+{
+ for(int i = 0; i < 256; ++i)
+ exec_env->instructions_schedule[i] = instructions_schedule[i];
+}
+
+#endif
+
WASMFuncType *
wasm_runtime_get_function_type(const WASMFunctionInstanceCommon *function,
uint32 module_type)
@@ -7792,13 +7817,14 @@ wasm_runtime_get_module_name(wasm_module_t module)
bool
wasm_runtime_detect_native_stack_overflow(WASMExecEnv *exec_env)
{
+#if WASM_DISABLE_STACK_HW_BOUND_CHECK == 0
uint8 *boundary = exec_env->native_stack_boundary;
RECORD_STACK_USAGE(exec_env, (uint8 *)&boundary);
if (boundary == NULL) {
/* the platform doesn't support os_thread_get_stack_boundary */
return true;
}
-#if defined(OS_ENABLE_HW_BOUND_CHECK) && WASM_DISABLE_STACK_HW_BOUND_CHECK == 0
+#if defined(OS_ENABLE_HW_BOUND_CHECK)
uint32 page_size = os_getpagesize();
uint32 guard_page_count = STACK_OVERFLOW_CHECK_GUARD_PAGE_COUNT;
boundary = boundary + page_size * guard_page_count;
@@ -7808,6 +7834,7 @@ wasm_runtime_detect_native_stack_overflow(WASMExecEnv *exec_env)
"native stack overflow");
return false;
}
+#endif
return true;
}
@@ -7830,7 +7857,7 @@ wasm_runtime_detect_native_stack_overflow_size(WASMExecEnv *exec_env,
boundary = boundary - WASM_STACK_GUARD_SIZE + requested_size;
if ((uint8 *)&boundary < boundary) {
wasm_runtime_set_exception(wasm_runtime_get_module_inst(exec_env),
- "native stack overflow");
+ "native s stack overflow");
return false;
}
return true;
diff --git a/core/iwasm/common/wasm_runtime_common.h b/core/iwasm/common/wasm_runtime_common.h
index 8ac032bf..5ca5d489 100644
--- a/core/iwasm/common/wasm_runtime_common.h
+++ b/core/iwasm/common/wasm_runtime_common.h
@@ -791,9 +791,25 @@ WASM_RUNTIME_API_EXTERN void
wasm_runtime_set_native_stack_boundary(WASMExecEnv *exec_env,
uint8 *native_stack_boundary);
-#if WASM_CONFIGURABLE_BOUNDS_CHECKS != 0
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+
/* See wasm_export.h for description */
WASM_RUNTIME_API_EXTERN void
+wasm_runtime_set_instruction_count_limit(WASMExecEnv *exec_env,
+ int64 instructions_to_execute);
+WASM_RUNTIME_API_EXTERN int64
+wasm_runtime_get_instruction_count_limit(WASMExecEnv *exec_env);
+
+WASM_RUNTIME_API_EXTERN void
+wasm_runtime_set_instruction_schedule(WASMExecEnv *exec_env,
+ int64 const *instructions_schedule);
+
+#endif
+
+#if WASM_CONFIGURABLE_BOUNDS_CHECKS != 0
+/* See wasm_export.h for description */
+WASM_RUNTIME_API_EXTERN
+void
wasm_runtime_set_bounds_checks(WASMModuleInstanceCommon *module_inst,
bool enable);
diff --git a/core/iwasm/include/wasm_c_api.h b/core/iwasm/include/wasm_c_api.h
index 241a0eec..9eb0dde1 100644
--- a/core/iwasm/include/wasm_c_api.h
+++ b/core/iwasm/include/wasm_c_api.h
@@ -19,8 +19,10 @@
#if defined(_MSC_BUILD)
#if defined(COMPILING_WASM_RUNTIME_API)
#define WASM_API_EXTERN __declspec(dllexport)
-#else
+#elif defined(_DLL)
#define WASM_API_EXTERN __declspec(dllimport)
+#else
+#define WASM_API_EXTERN
#endif
#else
#define WASM_API_EXTERN
@@ -701,6 +703,11 @@ WASM_API_EXTERN double wasm_instance_sum_wasm_exec_time(const wasm_instance_t*);
// func_name. If the function is not found, return 0.
WASM_API_EXTERN double wasm_instance_get_wasm_func_exec_time(const wasm_instance_t*, const char *);
+struct WASMExecEnv;
+typedef struct WASMExecEnv *wasm_exec_env_t;
+
+WASM_API_EXTERN wasm_exec_env_t wasm_instance_exec_env(const wasm_instance_t*);
+
///////////////////////////////////////////////////////////////////////////////
// Convenience
diff --git a/core/iwasm/include/wasm_export.h b/core/iwasm/include/wasm_export.h
index b73a0364..3fd0949f 100644
--- a/core/iwasm/include/wasm_export.h
+++ b/core/iwasm/include/wasm_export.h
@@ -20,8 +20,10 @@
#if defined(_MSC_BUILD)
#if defined(COMPILING_WASM_RUNTIME_API)
#define WASM_RUNTIME_API_EXTERN __declspec(dllexport)
-#else
+#elif defined(_DLL)
#define WASM_RUNTIME_API_EXTERN __declspec(dllimport)
+#else
+#define WASM_RUNTIME_API_EXTERN
#endif
#elif defined(__GNUC__) || defined(__clang__)
#define WASM_RUNTIME_API_EXTERN __attribute__((visibility("default")))
@@ -1821,6 +1823,27 @@ WASM_RUNTIME_API_EXTERN void
wasm_runtime_set_native_stack_boundary(wasm_exec_env_t exec_env,
uint8_t *native_stack_boundary);
+/**
+ * Set the instruction count limit to the execution environment.
+ * By default the instruction count limit is -1, which means no limit.
+ * However, if the instruction count limit is set to a positive value,
+ * the execution will be terminated when the instruction count reaches
+ * the limit.
+ *
+ * @param exec_env the execution environment
+ * @param instruction_count the instruction count limit
+ */
+WASM_RUNTIME_API_EXTERN void
+wasm_runtime_set_instruction_count_limit(wasm_exec_env_t exec_env,
+ int64_t instruction_count);
+
+WASM_RUNTIME_API_EXTERN int64_t
+wasm_runtime_get_instruction_count_limit(wasm_exec_env_t exec_env);
+
+WASM_RUNTIME_API_EXTERN void
+wasm_runtime_set_instruction_schedule(wasm_exec_env_t exec_env,
+ int64_t const *instructions_schedule);
+
/**
* Dump runtime memory consumption, including:
* Exec env memory consumption
diff --git a/core/iwasm/interpreter/wasm_interp_classic.c b/core/iwasm/interpreter/wasm_interp_classic.c
index 41ac4c72..bd8f714a 100644
--- a/core/iwasm/interpreter/wasm_interp_classic.c
+++ b/core/iwasm/interpreter/wasm_interp_classic.c
@@ -1516,10 +1516,13 @@ wasm_interp_call_func_import(WASMModuleInstance *module_inst,
} \
os_mutex_unlock(&exec_env->wait_lock); \
} \
+ CHECK_INSTRUCTION_LIMIT(); \
goto *handle_table[*frame_ip++]; \
} while (0)
#else
-#define HANDLE_OP_END() FETCH_OPCODE_AND_DISPATCH()
+#define HANDLE_OP_END() \
+ CHECK_INSTRUCTION_LIMIT(); \
+ FETCH_OPCODE_AND_DISPATCH()
#endif
#else /* else of WASM_ENABLE_LABELS_AS_VALUES */
@@ -1542,9 +1545,12 @@ wasm_interp_call_func_import(WASMModuleInstance *module_inst,
} \
os_mutex_unlock(&exec_env->wait_lock); \
} \
+ CHECK_INSTRUCTION_LIMIT(); \
continue;
#else
-#define HANDLE_OP_END() continue
+#define HANDLE_OP_END() \
+ CHECK_INSTRUCTION_LIMIT(); \
+ continue;
#endif
#endif /* end of WASM_ENABLE_LABELS_AS_VALUES */
@@ -1562,6 +1568,20 @@ get_global_addr(uint8 *global_data, WASMGlobalInstance *global)
#endif
}
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+#define CHECK_INSTRUCTION_LIMIT() \
+ if (instructions_to_execute >= 0) \
+ { \
+ instructions_to_execute -= instructions_schedule[opcode];\
+ if (instructions_to_execute < 0) { \
+ wasm_set_exception(module, "instruction limit exceeded"); \
+ goto got_exception; \
+ } \
+ }
+#else
+#define CHECK_INSTRUCTION_LIMIT() (void)0
+#endif
+
static void
wasm_interp_call_func_bytecode(WASMModuleInstance *module,
WASMExecEnv *exec_env,
@@ -1605,6 +1625,17 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
uint32 local_idx, local_offset, global_idx;
uint8 local_type, *global_addr;
uint32 cache_index, type_index, param_cell_num, cell_num;
+
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ int64 instructions_to_execute = -1;
+ int64 const *instructions_schedule = NULL;
+ if(exec_env)
+ {
+ instructions_to_execute = exec_env->instructions_to_execute;
+ instructions_schedule = exec_env->instructions_schedule;
+ }
+#endif
+
#if WASM_ENABLE_EXCE_HANDLING != 0
int32_t exception_tag_index;
#endif
@@ -6859,6 +6890,11 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
FREE_FRAME(exec_env, frame);
wasm_exec_env_set_cur_frame(exec_env, prev_frame);
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ if(exec_env)
+ exec_env->instructions_to_execute = instructions_to_execute;
+#endif
+
if (!prev_frame->ip) {
/* Called from native. */
return;
@@ -6899,6 +6935,12 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
}
#endif
SYNC_ALL_TO_FRAME();
+
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ if(exec_env)
+ exec_env->instructions_to_execute = instructions_to_execute;
+#endif
+
return;
#if WASM_ENABLE_LABELS_AS_VALUES == 0
diff --git a/core/iwasm/interpreter/wasm_interp_fast.c b/core/iwasm/interpreter/wasm_interp_fast.c
index f33ad60e..9cbf2010 100644
--- a/core/iwasm/interpreter/wasm_interp_fast.c
+++ b/core/iwasm/interpreter/wasm_interp_fast.c
@@ -105,6 +105,19 @@ typedef float64 CellType_F64;
goto unaligned_atomic; \
} while (0)
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+#define CHECK_INSTRUCTION_LIMIT() \
+ if (instructions_to_execute >= 0) { \
+ instructions_to_execute -= instructions_schedule[opcode]; \
+ if (instructions_to_execute < 0) { \
+ wasm_set_exception(module, "instruction limit exceeded"); \
+ goto got_exception; \
+ } \
+ }
+#else
+#define CHECK_INSTRUCTION_LIMIT() (void)0
+#endif
+
static inline uint32
rotl32(uint32 n, uint32 c)
{
@@ -1466,12 +1479,14 @@ wasm_interp_dump_op_count()
} while (0)
#endif
#endif /* end of WASM_CPU_SUPPORTS_UNALIGNED_ADDR_ACCESS */
-#define HANDLE_OP_END() FETCH_OPCODE_AND_DISPATCH()
+#define HANDLE_OP_END() CHECK_INSTRUCTION_LIMIT(); FETCH_OPCODE_AND_DISPATCH()
#else /* else of WASM_ENABLE_LABELS_AS_VALUES */
#define HANDLE_OP(opcode) case opcode:
-#define HANDLE_OP_END() continue
+#define HANDLE_OP_END() \
+ CHECK_INSTRUCTION_LIMIT(); \
+ continue
#endif /* end of WASM_ENABLE_LABELS_AS_VALUES */
@@ -1538,6 +1553,16 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
uint8 *maddr = NULL;
uint32 local_idx, local_offset, global_idx;
uint8 opcode = 0, local_type, *global_addr;
+
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ int64 instructions_to_execute = -1;
+ int64 const *instructions_schedule = NULL;
+ if (exec_env) {
+ instructions_to_execute = exec_env->instructions_to_execute;
+ instructions_schedule = exec_env->instructions_schedule;
+ }
+#endif
+
#if !defined(OS_ENABLE_HW_BOUND_CHECK) \
|| WASM_CPU_SUPPORTS_UNALIGNED_ADDR_ACCESS == 0
#if WASM_CONFIGURABLE_BOUNDS_CHECKS != 0
@@ -7761,6 +7786,11 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
FREE_FRAME(exec_env, frame);
wasm_exec_env_set_cur_frame(exec_env, (WASMRuntimeFrame *)prev_frame);
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ if (exec_env)
+ exec_env->instructions_to_execute = instructions_to_execute;
+#endif
+
if (!prev_frame->ip)
/* Called from native. */
return;
@@ -7789,6 +7819,10 @@ wasm_interp_call_func_bytecode(WASMModuleInstance *module,
got_exception:
SYNC_ALL_TO_FRAME();
+#if WASM_ENABLE_INSTRUCTION_METERING != 0
+ if (exec_env)
+ exec_env->instructions_to_execute = instructions_to_execute;
+#endif
return;
#if WASM_ENABLE_LABELS_AS_VALUES == 0
diff --git a/core/shared/platform/include/platform_wasi_types.h b/core/shared/platform/include/platform_wasi_types.h
index ac1a95ea..e23b500e 100644
--- a/core/shared/platform/include/platform_wasi_types.h
+++ b/core/shared/platform/include/platform_wasi_types.h
@@ -36,7 +36,11 @@ extern "C" {
#if WASM_ENABLE_UVWASI != 0 || WASM_ENABLE_LIBC_WASI == 0
#define assert_wasi_layout(expr, message) /* nothing */
#else
-#define assert_wasi_layout(expr, message) _Static_assert(expr, message)
+ #ifndef _MSC_VER
+ #define assert_wasi_layout(expr, message) _Static_assert(expr, message)
+ #else
+ #define assert_wasi_layout(expr, message) static_assert(expr, message)
+ #endif
#endif
assert_wasi_layout(_Alignof(int8_t) == 1, "non-wasi data layout");
diff --git a/doc/build_wamr.md b/doc/build_wamr.md
index 6425450b..94dd9628 100644
--- a/doc/build_wamr.md
+++ b/doc/build_wamr.md
@@ -327,6 +327,10 @@ And the wasm app can calls below APIs to allocate/free memory from/to the shared
- **WAMR_BUILD_SHRUNK_MEMORY**=1/0, default to enable if not set
> Note: When enabled, this feature will reduce memory usage by decreasing the size of the linear memory, particularly when the `memory.grow` opcode is not used and memory usage is somewhat predictable.
+## **Instruction metering**
+- **WAMR_BUILD_INSTRUCTION_METERING**=1/0, default to disable if not set
+> Note: Enabling this feature allows limiting the number of instructions a wasm module instance can execute. Use the `wasm_runtime_set_instruction_count_limit(...)` API before calling `wasm_runtime_call_*(...)` APIs to enforce this limit.
+
## **Combination of configurations:**
We can combine the configurations. For example, if we want to disable interpreter, enable AOT and WASI, we can run command:

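Putting the patch's pieces together, an embedder sets a per-opcode cost schedule and a budget on the exec env before invoking a function; the interpreter then traps with "instruction limit exceeded" once the budget is exhausted. A usage sketch built only from the APIs declared above (module loading and error handling omitted):

#include "wasm_export.h"

void run_metered(wasm_module_inst_t inst, wasm_function_inst_t func) {
    wasm_exec_env_t env = wasm_runtime_get_exec_env_singleton(inst);

    int64_t schedule[256];
    for (int i = 0; i < 256; ++i)
        schedule[i] = 1;  // uniform cost; real schedules weight opcodes
    wasm_runtime_set_instruction_schedule(env, schedule);
    wasm_runtime_set_instruction_count_limit(env, 100000);

    if (wasm_runtime_call_wasm(env, func, 0, NULL)) {
        // The remaining budget tells the caller how much was consumed.
        int64_t used = 100000 - wasm_runtime_get_instruction_count_limit(env);
        (void)used;
    }
}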

@@ -25,7 +25,6 @@
#include <cstdint>
#include <cstring>
#include <memory>
namespace ripple {


@@ -22,18 +22,8 @@
#include <xrpl/basics/contract.h>
- #if defined(__clang__)
- #pragma clang diagnostic push
- #pragma clang diagnostic ignored "-Wdeprecated"
- #pragma clang diagnostic ignored "-Wdeprecated-declarations"
- #endif
#include <boost/outcome.hpp>
- #if defined(__clang__)
- #pragma clang diagnostic pop
- #endif
#include <stdexcept>
namespace ripple {


@@ -26,7 +26,6 @@
#include <boost/beast/core/string.hpp>
#include <boost/filesystem.hpp>
#include <fstream>
#include <map>
#include <memory>
#include <mutex>


@@ -29,6 +29,7 @@
#include <array>
#include <cstdint>
#include <optional>
#include <sstream>
#include <string>
namespace ripple {


@@ -20,6 +20,7 @@
#ifndef RIPPLE_ALGORITHM_H_INCLUDED
#define RIPPLE_ALGORITHM_H_INCLUDED
#include <iterator>
#include <utility>
namespace ripple {


@@ -24,8 +24,12 @@
#include <xrpl/beast/hash/xxhasher.h>
#include <cstdint>
#include <functional>
#include <mutex>
#include <random>
#include <type_traits>
#include <unordered_map>
#include <unordered_set>
#include <utility>
namespace ripple {


@@ -23,6 +23,7 @@
#include <cstdint>
#include <limits>
#include <optional>
#include <utility>
namespace ripple {
auto constexpr muldiv_max = std::numeric_limits<std::uint64_t>::max();


@@ -24,8 +24,10 @@
#include <boost/operators.hpp>
#include <functional>
#include <iostream>
#include <type_traits>
#include <utility>
namespace ripple {


@@ -20,6 +20,9 @@
#ifndef BEAST_CHRONO_ABSTRACT_CLOCK_H_INCLUDED
#define BEAST_CHRONO_ABSTRACT_CLOCK_H_INCLUDED
#include <chrono>
#include <string>
namespace beast {
/** Abstract interface to a clock.


@@ -23,8 +23,6 @@
#include <xrpl/beast/clock/abstract_clock.h>
#include <xrpl/beast/utility/instrumentation.h>
#include <chrono>
namespace beast {
/** Manual clock implementation.


@@ -22,7 +22,6 @@
#include <xrpl/beast/container/aged_container.h>
#include <chrono>
#include <type_traits>
namespace beast {


@@ -20,6 +20,8 @@
#ifndef BEAST_CONTAINER_DETAIL_AGED_ASSOCIATIVE_CONTAINER_H_INCLUDED
#define BEAST_CONTAINER_DETAIL_AGED_ASSOCIATIVE_CONTAINER_H_INCLUDED
#include <type_traits>
namespace beast {
namespace detail {


@@ -33,6 +33,7 @@
#include <algorithm>
#include <functional>
#include <initializer_list>
#include <iterator>
#include <memory>
#include <type_traits>
#include <utility>


@@ -3257,6 +3257,7 @@ operator==(aged_unordered_container<
{
if (size() != other.size())
return false;
using EqRng = std::pair<const_iterator, const_iterator>;
for (auto iter(cbegin()), last(cend()); iter != last;)
{
auto const& k(extract(*iter));


@@ -29,9 +29,11 @@
#include <charconv>
#include <cstdlib>
#include <iterator>
#include <limits>
#include <string>
#include <type_traits>
#include <typeinfo>
#include <utility>
namespace beast {


@@ -24,36 +24,14 @@
#include <boost/container/flat_set.hpp>
#include <boost/endian/conversion.hpp>
- /*
- Workaround for overzealous clang warning, which trips on libstdc++ headers
- In file included from
- /usr/lib/gcc/x86_64-linux-gnu/12/../../../../include/c++/12/bits/stl_algo.h:61:
- /usr/lib/gcc/x86_64-linux-gnu/12/../../../../include/c++/12/bits/stl_tempbuf.h:263:8:
- error: 'get_temporary_buffer<std::pair<ripple::Quality, const
- std::vector<std::unique_ptr<ripple::Step>> *>>' is deprecated
- [-Werror,-Wdeprecated-declarations] 263 |
- std::get_temporary_buffer<value_type>(_M_original_len));
- ^
- */
- #if defined(__clang__)
- #pragma clang diagnostic push
- #pragma clang diagnostic ignored "-Wdeprecated"
- #pragma clang diagnostic ignored "-Wdeprecated-declarations"
- #endif
- #include <functional>
- #if defined(__clang__)
- #pragma clang diagnostic pop
- #endif
#include <array>
#include <chrono>
#include <cstdint>
#include <cstring>
#include <functional>
#include <map>
#include <memory>
#include <set>
#include <string>
#include <system_error>
#include <tuple>


@@ -29,7 +29,11 @@
#include <boost/asio/ip/address.hpp>
#include <boost/functional/hash.hpp>
#include <cstdint>
#include <ios>
#include <sstream>
#include <string>
#include <typeinfo>
//------------------------------------------------------------------------------


@@ -24,6 +24,12 @@
#include <boost/asio/ip/address_v4.hpp>
#include <cstdint>
#include <functional>
#include <ios>
#include <string>
#include <utility>
namespace beast {
namespace IP {


@@ -24,6 +24,12 @@
#include <boost/asio/ip/address_v6.hpp>
#include <cstdint>
#include <functional>
#include <ios>
#include <string>
#include <utility>
namespace beast {
namespace IP {


@@ -25,6 +25,7 @@
#include <xrpl/beast/net/IPAddress.h>
#include <cstdint>
#include <ios>
#include <optional>
#include <string>


@@ -28,8 +28,10 @@
#include <algorithm>
#include <cctype>
#include <cstdint>
#include <iterator>
#include <string>
#include <utility>
#include <vector>
namespace beast {


@@ -13,6 +13,7 @@
#include <boost/optional.hpp>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>


@@ -16,6 +16,7 @@
#include <algorithm>
#include <chrono>
#include <functional>
#include <iomanip>
#include <iostream>
#include <sstream>


@@ -13,6 +13,7 @@
#include <boost/assert.hpp>
#include <mutex>
#include <ostream>
#include <string>
namespace beast {


@@ -31,28 +31,36 @@ namespace beast {
template <class Generator>
void
- rngfill(void* const buffer, std::size_t const bytes, Generator& g)
+ rngfill(void* buffer, std::size_t bytes, Generator& g)
{
using result_type = typename Generator::result_type;
- constexpr std::size_t result_size = sizeof(result_type);
- std::uint8_t* const buffer_start = static_cast<std::uint8_t*>(buffer);
- std::size_t const complete_iterations = bytes / result_size;
- std::size_t const bytes_remaining = bytes % result_size;
- for (std::size_t count = 0; count < complete_iterations; ++count)
+ while (bytes >= sizeof(result_type))
{
- result_type const v = g();
- std::size_t const offset = count * result_size;
- std::memcpy(buffer_start + offset, &v, result_size);
+ auto const v = g();
+ std::memcpy(buffer, &v, sizeof(v));
+ buffer = reinterpret_cast<std::uint8_t*>(buffer) + sizeof(v);
+ bytes -= sizeof(v);
}
- if (bytes_remaining > 0)
+ XRPL_ASSERT(
+ bytes < sizeof(result_type), "beast::rngfill(void*) : maximum bytes");
+ #ifdef __GNUC__
+ // gcc 11.1 (falsely) warns about an array-bounds overflow in release mode.
+ #pragma GCC diagnostic push
+ #pragma GCC diagnostic ignored "-Warray-bounds"
+ #endif
+ if (bytes > 0)
{
- result_type const v = g();
- std::size_t const offset = complete_iterations * result_size;
- std::memcpy(buffer_start + offset, &v, bytes_remaining);
+ auto const v = g();
+ std::memcpy(buffer, &v, bytes);
}
+ #ifdef __GNUC__
+ #pragma GCC diagnostic pop
+ #endif
}
template <

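For reference, rngfill's contract is unchanged by the rewrite: it fills an arbitrary byte buffer from any uniform random bit generator, handling a tail shorter than the generator's result_type. A small usage sketch (the header path is assumed here):

#include <xrpl/beast/utility/rngfill.h>  // assumed location of rngfill

#include <array>
#include <cstdint>
#include <random>

int main() {
    std::mt19937_64 g(12345);
    std::array<std::uint8_t, 37> buf;  // 4 full 8-byte words + a 5-byte tail
    beast::rngfill(buf.data(), buf.size(), g);
}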

@@ -26,6 +26,7 @@
#include <cstring>
#include <map>
#include <string>
#include <utility>
#include <vector>
/** \brief JSON (JavaScript Object Notation).


@@ -29,6 +29,7 @@
#include <xrpl/protocol/json_get_or_throw.h>
#include <cstddef>
#include <mutex>
#include <optional>
#include <string>


@@ -20,6 +20,7 @@
#ifndef RIPPLE_PROTOCOL_APIVERSION_H_INCLUDED
#define RIPPLE_PROTOCOL_APIVERSION_H_INCLUDED
#include <functional>
#include <type_traits>
#include <utility>


@@ -21,7 +21,6 @@
#define RIPPLE_PROTOCOL_BOOK_H_INCLUDED
#include <xrpl/basics/CountedObject.h>
#include <xrpl/basics/base_uint.h>
#include <xrpl/protocol/Issue.h>
#include <boost/utility/base_from_member.hpp>
@@ -37,17 +36,12 @@ class Book final : public CountedObject<Book>
public:
Issue in;
Issue out;
- std::optional<uint256> domain;
Book()
{
}
- Book(
- Issue const& in_,
- Issue const& out_,
- std::optional<uint256> const& domain_)
- : in(in_), out(out_), domain(domain_)
+ Book(Issue const& in_, Issue const& out_) : in(in_), out(out_)
{
}
};
@@ -67,8 +61,6 @@ hash_append(Hasher& h, Book const& b)
{
using beast::hash_append;
hash_append(h, b.in, b.out);
- if (b.domain)
- hash_append(h, *(b.domain));
}
Book
@@ -79,8 +71,7 @@ reversed(Book const& book);
[[nodiscard]] inline constexpr bool
operator==(Book const& lhs, Book const& rhs)
{
- return (lhs.in == rhs.in) && (lhs.out == rhs.out) &&
- (lhs.domain == rhs.domain);
+ return (lhs.in == rhs.in) && (lhs.out == rhs.out);
}
/** @} */
@@ -91,18 +82,7 @@ operator<=>(Book const& lhs, Book const& rhs)
{
if (auto const c{lhs.in <=> rhs.in}; c != 0)
return c;
- if (auto const c{lhs.out <=> rhs.out}; c != 0)
- return c;
- // Manually compare optionals
- if (lhs.domain && rhs.domain)
- return *lhs.domain <=> *rhs.domain; // Compare values if both exist
- if (!lhs.domain && rhs.domain)
- return std::weak_ordering::less; // Empty is considered less
- if (lhs.domain && !rhs.domain)
- return std::weak_ordering::greater; // Non-empty is greater
- return std::weak_ordering::equivalent; // Both are empty
+ return lhs.out <=> rhs.out;
}
/** @} */
@@ -146,11 +126,9 @@ template <>
struct hash<ripple::Book>
{
private:
- using issue_hasher = std::hash<ripple::Issue>;
- using uint256_hasher = ripple::uint256::hasher;
+ using hasher = std::hash<ripple::Issue>;
- issue_hasher m_issue_hasher;
- uint256_hasher m_uint256_hasher;
+ hasher m_hasher;
public:
hash() = default;
@@ -161,12 +139,8 @@ public:
value_type
operator()(argument_type const& value) const
{
- value_type result(m_issue_hasher(value.in));
- boost::hash_combine(result, m_issue_hasher(value.out));
- if (value.domain)
- boost::hash_combine(result, m_uint256_hasher(*value.domain));
+ value_type result(m_hasher(value.in));
+ boost::hash_combine(result, m_hasher(value.out));
return result;
}
};


@@ -154,10 +154,7 @@ enum error_code_i {
// Simulate
rpcTX_SIGNED = 96,
- // Pathfinding
- rpcDOMAIN_MALFORMED = 97,
- rpcLAST = rpcDOMAIN_MALFORMED // rpcLAST should always equal the last code.
+ rpcLAST = rpcTX_SIGNED // rpcLAST should always equal the last code.
};
/** Codes returned in the `warnings` array of certain RPC commands.
@@ -169,8 +166,6 @@ enum warning_code_i {
warnRPC_AMENDMENT_BLOCKED = 1002,
warnRPC_EXPIRED_VALIDATOR_LIST = 1003,
// unused = 1004
warnRPC_FIELDS_DEPRECATED = 2004, // rippled needs to maintain
// compatibility with Clio on this code.
};
//------------------------------------------------------------------------------


@@ -24,6 +24,7 @@
#include <boost/container/flat_map.hpp>
#include <array>
#include <bitset>
#include <map>
#include <optional>
@@ -53,18 +54,6 @@
* then change the macro parameter in features.macro to
* `VoteBehavior::DefaultYes`. The communication process is beyond
* the scope of these instructions.
- * 5) If a supported feature (`Supported::yes`) was _ever_ in a released
- * version, it can never be changed back to `Supported::no`, because
- * it _may_ still become enabled at any time. This would cause newer
- * versions of `rippled` to become amendment blocked.
- * Instead, to prevent newer versions from voting on the feature, use
- * `VoteBehavior::Obsolete`. Obsolete features can not be voted for
- * by any versions of `rippled` built with that setting, but will still
- * work correctly if they get enabled. If a feature remains obsolete
- * for long enough that _all_ clients that could vote for it are
- * amendment blocked, the feature can be removed from the code
- * as if it was unsupported.
- *
*
* When a feature has been enabled for several years, the conditional code


@@ -27,9 +27,14 @@
#include <boost/multiprecision/cpp_int.hpp>
#include <boost/operators.hpp>
#include <cmath>
#include <ios>
#include <iosfwd>
#include <limits>
#include <optional>
#include <sstream>
#include <string>
#include <utility>
namespace ripple {
@@ -446,8 +451,8 @@ mulDivU(Source1 value, Dest mul, Source2 div)
}
- using namespace boost::multiprecision;
- uint128_t product;
+ using uint128 = boost::multiprecision::uint128_t;
+ uint128 product;
product = multiply(
product,
static_cast<std::uint64_t>(value.value()),


@@ -24,6 +24,8 @@
namespace ripple {
+ constexpr std::uint32_t MICRO_DROPS_PER_DROP{1'000'000};
/** Reflects the fee settings for a particular ledger.
The fees are always the same for any transactions applied
@@ -34,6 +36,10 @@ struct Fees
XRPAmount base{0}; // Reference tx cost (drops)
XRPAmount reserve{0}; // Reserve base (drops)
XRPAmount increment{0}; // Reserve increment (drops)
+ std::uint32_t extensionComputeLimit{
+ 0}; // Extension compute limit (instructions)
+ std::uint32_t extensionSizeLimit{0}; // Extension size limit (bytes)
+ std::uint32_t gasPrice{0}; // price of WASM gas (micro-drops)
explicit Fees() = default;
Fees(Fees const&) = default;

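The micro-drop unit keeps gas prices below one drop representable as integers; the charging code itself is not shown in this diff, but the conversion the fields imply is gas times gasPrice, rounded down to drops. An illustrative (assumed) calculation:

#include <cstdint>

constexpr std::uint32_t MICRO_DROPS_PER_DROP{1'000'000};

// Hypothetical helper: converts a gas total priced in micro-drops to drops.
std::uint64_t gasToDrops(std::uint64_t gasUsed, std::uint32_t gasPriceMicroDrops) {
    // e.g. 500'000 gas at 10 micro-drops/gas = 5'000'000 micro-drops = 5 drops
    return gasUsed * gasPriceMicroDrops / MICRO_DROPS_PER_DROP;
}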

@@ -98,12 +98,6 @@ public:
static IOUAmount
minPositiveAmount();
- friend std::ostream&
- operator<<(std::ostream& os, IOUAmount const& x)
- {
- return os << to_string(x);
- }
};
inline IOUAmount::IOUAmount(beast::Zero)


@@ -32,7 +32,6 @@
#include <xrpl/protocol/jss.h>
#include <cstdint>
#include <set>
namespace ripple {
@@ -231,6 +230,12 @@ page(Keylet const& root, std::uint64_t index = 0) noexcept
Keylet
escrow(AccountID const& src, std::uint32_t seq) noexcept;
+ inline Keylet
+ escrow(uint256 const& key) noexcept
+ {
+ return {ltESCROW, key};
+ }
/** A PaymentChannel */
Keylet
payChan(AccountID const& src, AccountID const& dst, std::uint32_t seq) noexcept;


@@ -24,6 +24,9 @@
#include <xrpl/json/json_value.h>
#include <xrpl/protocol/UintTypes.h>
#include <functional>
#include <type_traits>
namespace ripple {
/** A currency issued by an account.


@@ -145,15 +145,13 @@ enum LedgerSpecificFlags {
0x10000000, // True, reject new paychans
lsfDisallowIncomingTrustline =
0x20000000, // True, reject new trustlines (only if no issued assets)
- lsfAllowTrustLineLocking =
- 0x40000000, // True, enable trustline locking
+ // 0x40000000 is available
lsfAllowTrustLineClawback =
0x80000000, // True, enable clawback
// ltOFFER
lsfPassive = 0x00010000,
lsfSell = 0x00020000, // True, offer was placed as a sell.
- lsfHybrid = 0x00040000, // True, offer is hybrid.
// ltRIPPLE_STATE
lsfLowReserve = 0x00010000, // True, if entry counts toward reserve.

Some files were not shown because too many files have changed in this diff.