Mirror of https://github.com/Xahau/xahaud.git, synced 2025-12-01 07:55:49 +00:00

Compare commits: sync-2.3.0 ... nd-snap-20 (25 commits)
Commits:

0f6aad948b
e061823561
7763434646
afad05b526
9a3723b1dc
22b6ea961e
974249380a
8362318d25
508bdd5b33
1269803aa6
ae46394788
d7bfff2bef
717464a2d8
f9d6346e6d
c968fdfb1a
7a9d48d53c
849d447a20
ee27049687
60dec74baf
9abea13649
810e15319c
d593f3bef5
1233694b6c
a1d42b7380
f6d2bf819d
@@ -45,11 +45,9 @@ DisableFormat: false
ExperimentalAutoDetectBinPacking: false
ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
IncludeCategories:
  - Regex: '^<(test)/'
  - Regex: '^<(BeastConfig)'
    Priority: 0
  - Regex: '^<(xrpld)/'
    Priority: 1
  - Regex: '^<(xrpl)/'
  - Regex: '^<(ripple)/'
    Priority: 2
  - Regex: '^<(boost)/'
    Priority: 3
@@ -58,7 +56,6 @@ IncludeCategories:
IncludeIsMainRegex: '$'
IndentCaseLabels: true
IndentFunctionDeclarationAfterType: false
IndentRequiresClause: true
IndentWidth: 4
IndentWrappedFunctionNames: false
KeepEmptyLinesAtTheStartOfBlocks: false
@@ -74,7 +71,6 @@ PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 200
PointerAlignment: Left
ReflowComments: true
RequiresClausePosition: OwnLine
SortIncludes: true
SpaceAfterCStyleCast: false
SpaceBeforeAssignmentOperators: true
@@ -2,7 +2,3 @@
# To use it by default in git blame:
# git config blame.ignoreRevsFile .git-blame-ignore-revs
50760c693510894ca368e90369b0cc2dabfd07f3
e2384885f5f630c8f0ffe4bf21a169b433a16858
241b9ddde9e11beb7480600fd5ed90e1ef109b21
760f16f56835663d9286bd29294d074de26a7ba6
0eebe6a5f4246fced516d52b83ec4e7f47373edd

12  .githooks/pre-commit  (new executable file)
@@ -0,0 +1,12 @@
#!/bin/bash

# Pre-commit hook that runs the suspicious patterns check on staged files

# Get the repository's root directory
repo_root=$(git rev-parse --show-toplevel)

# Run the suspicious patterns script in pre-commit mode
"$repo_root/suspicious_patterns.sh" --pre-commit

# Exit with the same code as the script
exit $?

4  .githooks/setup.sh  (new file)
@@ -0,0 +1,4 @@
#!/bin/bash

echo "Configuring git to use .githooks directory..."
git config core.hooksPath .githooks

4  .github/actions/xahau-ga-build/action.yml  (vendored)
@@ -88,8 +88,6 @@ runs:
          $CCACHE_ARGS \
          -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
          -DCMAKE_BUILD_TYPE=${{ inputs.configuration }}
          -Dtests=TRUE \
          -Dxrpld=TRUE

    - name: Build project
      shell: bash
@@ -107,4 +105,4 @@ runs:
      uses: actions/cache/save@v4
      with:
        path: ~/.ccache
        key: ${{ steps.ccache-restore.outputs.cache-primary-key }}
        key: ${{ steps.ccache-restore.outputs.cache-primary-key }}
@@ -59,7 +59,7 @@ runs:
    - name: Export custom recipes
      shell: bash
      run: |
        conan export external/snappy snappy/1.1.10@
        conan export external/snappy snappy/1.1.9@
        conan export external/soci soci/4.0.3@

    - name: Install dependencies
@@ -74,8 +74,6 @@ runs:
          --output-folder . \
          --build missing \
          --settings build_type=${{ inputs.configuration }} \
          -Dtests=TRUE \
          -Dxrpld=TRUE \
          ..

    - name: Save Conan cache
@@ -85,4 +83,4 @@ runs:
        path: |
          ~/.conan
          ~/.conan2
        key: ${{ steps.cache-restore-conan.outputs.cache-primary-key }}
        key: ${{ steps.cache-restore-conan.outputs.cache-primary-key }}

7  .github/pull_request_template.md  (vendored)
@@ -33,7 +33,6 @@ Please check [x] relevant options, delete irrelevant ones.
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Refactor (non-breaking change that only restructures code)
- [ ] Performance (increase or change in throughput and/or latency)
- [ ] Tests (you added tests for code that already exists, or your new feature included in this PR)
- [ ] Documentation update
- [ ] Chore (no impact to binary, e.g. `.gitignore`, formatting, dropping support for older tooling)
@@ -59,12 +58,6 @@ Please check [x] relevant options, delete irrelevant ones.
## Before / After
If relevant, use this section for an English description of the change at a technical level.
If this change affects an API, examples should be included here.

For performance-impacting changes, please provide these details:
1. Is this a new feature, bug fix, or improvement to existing functionality?
2. What behavior/functionality does the change impact?
3. In what processing can the impact be measured? Be as specific as possible - e.g. RPC client call, payment transaction that involves LOB, AMM, caching, DB operations, etc.
4. Does this change affect concurrent processing - e.g. does it involve acquiring locks, multi-threaded processing, or async processing?
-->

<!--

20  .github/workflows/checkpatterns.yml  (vendored)
@@ -1,20 +0,0 @@
name: checkpatterns

on: [push, pull_request]

jobs:
  checkpatterns:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Check for suspicious patterns
        run: |
          if [ -f "suspicious_patterns.sh" ]; then
            bash suspicious_patterns.sh
          else
            echo "Warning: suspicious_patterns.sh not found, skipping check"
            # Still exit with success for compatibility with dependent jobs
            exit 0
          fi

43  .github/workflows/clang-format.yml  (vendored)
@@ -4,23 +4,36 @@ on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-24.04
    runs-on: ubuntu-22.04
    env:
      CLANG_VERSION: 18
      CLANG_VERSION: 10
    steps:
      - uses: actions/checkout@v4
      - name: Install clang-format
      - uses: actions/checkout@v3
      # - name: Install clang-format
      #   run: |
      #     codename=$( lsb_release --codename --short )
      #     sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
      #     deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
      #     deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
      #     EOF
      #     wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
      #     sudo apt-get update -y
      #     sudo apt-get install -y clang-format-${CLANG_VERSION}

      # Temporary fix until this commit is merged
      # https://github.com/XRPLF/rippled/commit/552377c76f55b403a1c876df873a23d780fcc81c
      - name: Download and install clang-format
        run: |
          codename=$( lsb_release --codename --short )
          sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
          deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
          deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
          EOF
          wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
          sudo apt-get update
          sudo apt-get install clang-format-${CLANG_VERSION}
      - name: Format first-party sources
        run: find include src -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) -not -path "src/magic/magic_enum.h" -exec clang-format-${CLANG_VERSION} -i {} +
          sudo apt-get update -y
          sudo apt-get install -y libtinfo5
          curl -LO https://github.com/llvm/llvm-project/releases/download/llvmorg-10.0.1/clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04.tar.xz
          tar -xf clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04.tar.xz
          sudo mv clang+llvm-10.0.1-x86_64-linux-gnu-ubuntu-16.04 /opt/clang-10
          sudo ln -s /opt/clang-10/bin/clang-format /usr/local/bin/clang-format-10
      - name: Format src/ripple
        run: find src/ripple -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 clang-format-${CLANG_VERSION} -i
      - name: Format src/test
        run: find src/test -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 clang-format-${CLANG_VERSION} -i
      - name: Check for differences
        id: assert
        run: |
@@ -50,7 +63,7 @@ jobs:
          To fix it, you can do one of two things:
          1. Download and apply the patch generated as an artifact of this
             job to your repo, commit, and push.
          2. Run 'git-clang-format --extensions cpp,h,hpp,ipp develop'
          2. Run 'git-clang-format --extensions c,cpp,h,cxx,ipp dev'
             in your repo, commit, and push.
        run: |
          echo "${PREAMBLE}"

3  .github/workflows/xahau-ga-macos.yml  (vendored)
@@ -91,7 +91,6 @@ jobs:
        run: |
          conan profile new default --detect || true # Ignore error if profile exists
          conan profile update settings.compiler.cppstd=20 default
          conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default

      - name: Install dependencies
        uses: ./.github/actions/xahau-ga-dependencies
@@ -114,4 +113,4 @@ jobs:

      - name: Test
        run: |
          ${{ env.build_dir }}/rippled --unittest --unittest-jobs $(nproc)
          ${{ env.build_dir }}/rippled --unittest --unittest-jobs $(nproc)

4  .gitignore  (vendored)
@@ -116,7 +116,3 @@ CMakeUserPresets.json
bld.rippled/

generated
.vscode

# Suggested in-tree build directory
/.build/
@@ -1,6 +1,6 @@
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: v18.1.3
    rev: v10.0.1
    hooks:
      - id: clang-format

193  API-CHANGELOG.md  (deleted)
@@ -1,193 +0,0 @@
# API Changelog

This changelog is intended to list all updates to the [public API methods](https://xrpl.org/public-api-methods.html).

For info about how [API versioning](https://xrpl.org/request-formatting.html#api-versioning) works, including examples, please view the [XLS-22d spec](https://github.com/XRPLF/XRPL-Standards/discussions/54). For details about the implementation of API versioning, view the [implementation PR](https://github.com/XRPLF/rippled/pull/3155). API versioning ensures existing integrations and users continue to receive existing behavior, while those that request a higher API version will experience new behavior.

The API version controls the API behavior you see. This includes what properties you see in responses, what parameters you're permitted to send in requests, and so on. You specify the API version in each of your requests. When a breaking change is introduced to the `rippled` API, a new version is released. To avoid breaking your code, you should set (or increase) your version when you're ready to upgrade.

For a log of breaking changes, see the **API Version [number]** headings. In general, breaking changes are associated with a particular API Version number. For non-breaking changes, scroll to the **XRP Ledger version [x.y.z]** headings. Non-breaking changes are associated with a particular XRP Ledger (`rippled`) release.

## API Version 2

API version 2 is available in `rippled` version 2.0.0 and later. To use this API, clients specify `"api_version" : 2` in each request.

#### Removed methods

In API version 2, the following deprecated methods are no longer available: (https://github.com/XRPLF/rippled/pull/4759)

- `tx_history` - Instead, use other methods such as `account_tx` or `ledger` with the `transactions` field set to `true`.
- `ledger_header` - Instead, use the `ledger` method.

#### Modifications to JSON transaction element in V2

In API version 2, JSON elements for transaction output have been changed and made consistent for all methods which output transactions. (https://github.com/XRPLF/rippled/pull/4775)
This helps to unify the JSON serialization format of transactions. (https://github.com/XRPLF/clio/issues/722, https://github.com/XRPLF/rippled/issues/4727)

- JSON transaction element is named `tx_json`
- Binary transaction element is named `tx_blob`
- JSON transaction metadata element is named `meta`
- Binary transaction metadata element is named `meta_blob`

Additionally, these elements are now consistently available next to `tx_json` (i.e. sibling elements), where possible:

- `hash` - Transaction ID. This data was stored inside transaction output in API version 1, but in API version 2 is a sibling element.
- `ledger_index` - Ledger index (only set on validated ledgers)
- `ledger_hash` - Ledger hash (only set on closed or validated ledgers)
- `close_time_iso` - Ledger close time expressed in ISO 8601 time format (only set on validated ledgers)
- `validated` - Bool element set to `true` if the transaction is in a validated ledger, otherwise `false`

This change affects the following methods:

- `tx` - Transaction data moved into element `tx_json` (was inline inside `result`) or, if binary output was requested, moved from `tx` to `tx_blob`. Renamed binary transaction metadata element (if it was requested) from `meta` to `meta_blob`. Changed location of `hash` and added new elements
- `account_tx` - Renamed transaction element from `tx` to `tx_json`. Renamed binary transaction metadata element (if it was requested) from `meta` to `meta_blob`. Changed location of `hash` and added new elements
- `transaction_entry` - Renamed transaction metadata element from `metadata` to `meta`. Changed location of `hash` and added new elements
- `subscribe` - Renamed transaction element from `transaction` to `tx_json`. Changed location of `hash` and added new elements
- `sign`, `sign_for`, `submit` and `submit_multisigned` - Changed location of `hash` element.

#### Modification to `Payment` transaction JSON schema

When reading Payments, the `Amount` field should generally **not** be used. Instead, use [delivered_amount](https://xrpl.org/partial-payments.html#the-delivered_amount-field) to see the amount that the Payment delivered. To clarify its meaning, the `Amount` field is being renamed to `DeliverMax`. (https://github.com/XRPLF/rippled/pull/4733)

- In `Payment` transaction type, JSON RPC field `Amount` is renamed to `DeliverMax`. To enable smooth client transition, `Amount` is still handled, as described below (see the sketch after this list): (https://github.com/XRPLF/rippled/pull/4733)
  - On JSON RPC input (e.g. `submit_multisigned` etc. methods), `Amount` is recognized as an alias to `DeliverMax` for both API version 1 and version 2 clients.
  - On JSON RPC input, submitting both `Amount` and `DeliverMax` fields is allowed _only_ if they are identical; otherwise such input is rejected with `rpcINVALID_PARAMS` error.
  - On JSON RPC output (e.g. `subscribe`, `account_tx` etc. methods), `DeliverMax` is present in both API version 1 and version 2.
  - On JSON RPC output, `Amount` is only present in API version 1 and _not_ in version 2.
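
A minimal sketch of the input-side aliasing rule above, written against rippled's `Json::Value` (jsoncpp) type. The helper name `normalizeDeliverMax` and the error-reporting shape are illustrative assumptions, not the actual rippled implementation:

```cpp
#include <json/value.h>

#include <optional>
#include <string>

// Hypothetical helper: enforce the Amount/DeliverMax input rules.
// Returns an error string on invalid input; on success, leaves txJson
// holding a single canonical DeliverMax field.
std::optional<std::string>
normalizeDeliverMax(Json::Value& txJson)
{
    bool const hasAmount = txJson.isMember("Amount");
    bool const hasDeliverMax = txJson.isMember("DeliverMax");

    // Both fields may be submitted only if they are identical.
    if (hasAmount && hasDeliverMax && txJson["Amount"] != txJson["DeliverMax"])
        return "rpcINVALID_PARAMS: Amount and DeliverMax differ";

    // Amount is accepted as an alias for DeliverMax (API v1 and v2).
    if (hasAmount && !hasDeliverMax)
        txJson["DeliverMax"] = txJson["Amount"];

    txJson.removeMember("Amount");
    return std::nullopt;
}
```

On output the direction reverses: a v1 response would also carry `Amount`, while a v2 response omits it entirely.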

#### Modifications to account_info response

- `signer_lists` is returned in the root of the response. (In API version 1, it was nested under `account_data`.) (https://github.com/XRPLF/rippled/pull/3770)
- When using an invalid `signer_lists` value, the API now returns an "invalidParams" error. (https://github.com/XRPLF/rippled/pull/4585)
  - (`signer_lists` must be a boolean. In API version 1, strings were accepted and may return a normal response - i.e. as if `signer_lists` were `true`.)

#### Modifications to [account_tx](https://xrpl.org/account_tx.html#account_tx) response

- Using `ledger_index_min`, `ledger_index_max`, and `ledger_index` returns `invalidParams` because if you use `ledger_index_min` or `ledger_index_max`, then it does not make sense to also specify `ledger_index`. In API version 1, no error was returned. (https://github.com/XRPLF/rippled/pull/4571)
  - The same applies for `ledger_index_min`, `ledger_index_max`, and `ledger_hash`. (https://github.com/XRPLF/rippled/issues/4545#issuecomment-1565065579)
- Using a `ledger_index_min` or `ledger_index_max` beyond the range of ledgers that the server has:
  - returns `lgrIdxMalformed` in API version 2. Previously, in API version 1, no error was returned. (https://github.com/XRPLF/rippled/issues/4288)
- Attempting to use a non-boolean value (such as a string) for the `binary` or `forward` parameters returns `invalidParams` (`rpcINVALID_PARAMS`). Previously, in API version 1, no error was returned. (https://github.com/XRPLF/rippled/pull/4620)

#### Modifications to [noripple_check](https://xrpl.org/noripple_check.html#noripple_check) response

- Attempting to use a non-boolean value (such as a string) for the `transactions` parameter returns `invalidParams` (`rpcINVALID_PARAMS`). Previously, in API version 1, no error was returned. (https://github.com/XRPLF/rippled/pull/4620)

## API Version 1

This version is supported by all `rippled` versions. For WebSocket and HTTP JSON-RPC requests, it is currently the default API version used when no `api_version` is specified.

The [commandline](https://xrpl.org/docs/references/http-websocket-apis/api-conventions/request-formatting/#commandline-format) always uses the latest API version. The command line is intended for ad-hoc usage by humans, not programs or automated scripts. The command line is not meant for use in production code.

### Inconsistency: server_info - network_id

The `network_id` field was added in the `server_info` response in version 1.5.0 (2019), but it is not returned in [reporting mode](https://xrpl.org/rippled-server-modes.html#reporting-mode). However, use of reporting mode is now discouraged, in favor of using [Clio](https://github.com/XRPLF/clio) instead.

## XRP Ledger server version 2.2.0

The following is a non-breaking addition to the API.

- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))

### Breaking change in 2.3

- `book_changes`: If the requested ledger version is not available on this node, a `ledgerNotFound` error is returned and the node does not attempt to acquire the ledger from the p2p network (as with other non-admin RPCs).

Admins can still attempt to retrieve old ledgers with the `ledger_request` RPC.

### Addition in 2.3

- `book_changes`: Returns a `validated` field in its response, which was missing in prior versions.

The following additions are non-breaking (because they are purely additive).

- `server_definitions`: A new RPC that generates a `definitions.json`-like output that can be used in XRPL libraries.
- In `Payment` transactions, `DeliverMax` has been added. This is a replacement for the `Amount` field, which should not be used. Typically, the `delivered_amount` (in transaction metadata) should be used. To ease the transition, `DeliverMax` is present regardless of API version, since adding a field is non-breaking.
- API version 2 has been moved from beta to supported, meaning that it is generally available (regardless of the `beta_rpc_api` setting).

## XRP Ledger server version 1.12.0

[Version 1.12.0](https://github.com/XRPLF/rippled/releases/tag/1.12.0) was released on Sep 6, 2023. The following additions are non-breaking (because they are purely additive).

- `server_info`: Added `ports`, an array which advertises the RPC and WebSocket ports. This information is also included in the `/crawl` endpoint (which calls `server_info` internally). `grpc` and `peer` ports are also included. (https://github.com/XRPLF/rippled/pull/4427)
  - `ports` contains objects, each containing a `port` for the listening port (a number string), and a `protocol` array listing the supported protocols on that port.
  - This allows crawlers to build a more detailed topology without needing to port-scan nodes.
  - (For peers and other non-admin clients, the info about admin ports is excluded.)
- Clawback: The following additions are gated by the Clawback amendment (`featureClawback`). (https://github.com/XRPLF/rippled/pull/4553)
  - Adds an [AccountRoot flag](https://xrpl.org/accountroot.html#accountroot-flags) called `lsfAllowTrustLineClawback` (https://github.com/XRPLF/rippled/pull/4617)
  - Adds the corresponding `asfAllowTrustLineClawback` [AccountSet Flag](https://xrpl.org/accountset.html#accountset-flags) as well.
  - Clawback is disabled by default, so if an issuer desires the ability to claw back funds, they must use an `AccountSet` transaction to set the AllowTrustLineClawback flag. They must do this before creating any trust lines, offers, escrows, payment channels, or checks.
  - Adds the [Clawback transaction type](https://github.com/XRPLF/XRPL-Standards/blob/master/XLS-39d-clawback/README.md#331-clawback-transaction), containing these fields:
    - `Account`: The issuer of the asset being clawed back. Must also be the sender of the transaction.
    - `Amount`: The amount being clawed back, with the `Amount.issuer` being the token holder's address.
- Adds [AMM](https://github.com/XRPLF/XRPL-Standards/discussions/78) ([#4294](https://github.com/XRPLF/rippled/pull/4294), [#4626](https://github.com/XRPLF/rippled/pull/4626)) feature:
  - Adds `amm_info` API to retrieve AMM information for a given token pair.
  - Adds `AMMCreate` transaction type to create an `AMM` instance.
  - Adds `AMMDeposit` transaction type to deposit funds into an `AMM` instance.
  - Adds `AMMWithdraw` transaction type to withdraw funds from an `AMM` instance.
  - Adds `AMMVote` transaction type to vote for the trading fee of an `AMM` instance.
  - Adds `AMMBid` transaction type to bid for the Auction Slot of an `AMM` instance.
  - Adds `AMMDelete` transaction type to delete an `AMM` instance.
  - Adds `sfAMMID` to `AccountRoot` to indicate that the account is an `AMM`'s account. `AMMID` is used to fetch `ltAMM`.
  - Adds the `lsfAMMNode` `TrustLine` flag to indicate that one side of the `TrustLine` is an `AMM` account.
  - Adds `tfLPToken`, `tfSingleAsset`, `tfTwoAsset`, `tfOneAssetLPToken`, `tfLimitLPToken`, `tfTwoAssetIfEmpty`, `tfWithdrawAll`, and `tfOneAssetWithdrawAll`, which allow a trader to specify different field combinations for `AMMDeposit` and `AMMWithdraw` transactions.
  - Adds new transaction result codes:
    - `tecUNFUNDED_AMM`: insufficient balance to fund AMM. The account does not have funds for liquidity provision.
    - `tecAMM_BALANCE`: AMM has invalid balance. Calculated balances are greater than the current pool balances.
    - `tecAMM_FAILED`: AMM transaction failed. Fails due to a processing failure.
    - `tecAMM_INVALID_TOKENS`: AMM invalid LP tokens. Invalid input values, format, or calculated values.
    - `tecAMM_EMPTY`: AMM is in empty state. Transaction requires AMM in non-empty state (LP tokens > 0).
    - `tecAMM_NOT_EMPTY`: AMM is not in empty state. Transaction requires AMM in empty state (LP tokens == 0).
    - `tecAMM_ACCOUNT`: the operation is not allowed on an AMM account (e.g. clawback of an AMM account).
    - `tecINCOMPLETE`: Some work was completed, but more submissions are required to finish. AMMDelete partially deletes the trustlines.

## XRP Ledger server version 1.11.0

[Version 1.11.0](https://github.com/XRPLF/rippled/releases/tag/1.11.0) was released on Jun 20, 2023.

### Breaking changes in 1.11

- Added the ability to mark amendments as obsolete. For the `feature` admin API, there is a new possible value for the `vetoed` field. (https://github.com/XRPLF/rippled/pull/4291)
  - The value of `vetoed` can now be `true`, `false`, or `"Obsolete"`.
- Removed the acceptance of seeds or public keys in place of account addresses. (https://github.com/XRPLF/rippled/pull/4404)
  - This simplifies the API and encourages better security practices (i.e. seeds should never be sent over the network).
- For the `ledger_data` method, when all entries are filtered out, the `state` field of the response is now an empty list (in other words, an empty array, `[]`). (Previously, it would return `null`.) While this is technically a breaking change, the new behavior is consistent with the documentation, so this is considered only a bug fix. (https://github.com/XRPLF/rippled/pull/4398)
- If and when the `fixNFTokenRemint` amendment activates, there will be a new AccountRoot field, `FirstNFTSequence`. This field is set to the current account sequence when the account issues their first NFT. If an account has not issued any NFTs, then the field is not set. ([#4406](https://github.com/XRPLF/rippled/pull/4406))
  - There is a new account deletion restriction: an account can only be deleted if `FirstNFTSequence` + `MintedNFTokens` + `256` is less than the current ledger sequence.
  - This is potentially a breaking change if clients have logic for determining whether an account can be deleted.
- NetworkID
  - For sidechains and networks with a network ID greater than 1024, there is a new [transaction common field](https://xrpl.org/transaction-common-fields.html), `NetworkID`. (https://github.com/XRPLF/rippled/pull/4370)
  - This field helps to prevent replay attacks and is now required for chains whose network ID is 1025 or higher.
  - The field must be omitted for Mainnet, so there is no change for Mainnet users.
  - There are three new local error codes (sketched below):
    - `telNETWORK_ID_MAKES_TX_NON_CANONICAL`: a `NetworkID` is present but the chain's network ID is less than 1025. Remove the field from the transaction, and try again.
    - `telREQUIRES_NETWORK_ID`: a `NetworkID` is required, but is not present. Add the field to the transaction, and try again.
    - `telWRONG_NETWORK`: a `NetworkID` is specified, but it is for a different network. Submit the transaction to a different server which is connected to the correct network.
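
The three rules fit in a few lines. The sketch below is modeled on, but not copied from, the preflight checks in rippled; the function and enum names are hypothetical:

```cpp
#include <cstdint>
#include <optional>

// Hypothetical result type naming the three local error codes above.
enum class NetworkIDCheck {
    ok,
    telNETWORK_ID_MAKES_TX_NON_CANONICAL,
    telREQUIRES_NETWORK_ID,
    telWRONG_NETWORK,
};

NetworkIDCheck
checkNetworkID(
    std::optional<std::uint32_t> txNetworkID,  // NetworkID field, if present
    std::uint32_t chainNetworkID)              // the chain's configured ID
{
    if (chainNetworkID <= 1024)
    {
        // Low-numbered networks (including Mainnet) must omit the field.
        if (txNetworkID)
            return NetworkIDCheck::telNETWORK_ID_MAKES_TX_NON_CANONICAL;
    }
    else
    {
        // Networks numbered 1025 and higher require a matching NetworkID.
        if (!txNetworkID)
            return NetworkIDCheck::telREQUIRES_NETWORK_ID;
        if (*txNetworkID != chainNetworkID)
            return NetworkIDCheck::telWRONG_NETWORK;
    }
    return NetworkIDCheck::ok;
}
```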

### Additions and bug fixes in 1.11

- Added `nftoken_id`, `nftoken_ids` and `offer_id` meta fields into NFT `tx` and `account_tx` responses. (https://github.com/XRPLF/rippled/pull/4447)
- Added an `account_flags` object to the `account_info` method response. (https://github.com/XRPLF/rippled/pull/4459)
- Added `NFTokenPages` to the `account_objects` RPC. (https://github.com/XRPLF/rippled/pull/4352)
- Fixed: `marker` returned from the `account_lines` command would not work on subsequent commands. (https://github.com/XRPLF/rippled/pull/4361)

## XRP Ledger server version 1.10.0

[Version 1.10.0](https://github.com/XRPLF/rippled/releases/tag/1.10.0) was released on Mar 14, 2023.

### Breaking changes in 1.10

- If the `XRPFees` feature is enabled, the `fee_ref` field will be removed from the [ledger subscription stream](https://xrpl.org/subscribe.html#ledger-stream), because it will no longer have any meaning.

# Unit tests for API changes

The following information is useful to developers contributing to this project:

The purpose of unit tests is to catch bugs and prevent regressions. In general, it often makes sense to create a test function when there is a breaking change to the API. For APIs that have changed in a new API version, the tests should be modified so that both the prior version and the new version are properly tested.

To take one example: for `account_info` version 1, WebSocket and JSON-RPC behavior should be tested. The latest API version, i.e. API version 2, should be tested over WebSocket, JSON-RPC, and command line. A sketch of such a version-spanning test follows.
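
This sketch uses rippled's `beast::unit_test`/`jtx` style; the suite, request shape, and assertion are illustrative assumptions rather than actual test code from the repository:

```cpp
#include <test/jtx.h>

namespace ripple {
namespace test {

class AccountInfoVersions_test : public beast::unit_test::suite
{
public:
    void
    run() override
    {
        using namespace jtx;
        Env env(*this);
        Account const alice{"alice"};
        env.fund(XRP(10000), alice);
        env.close();

        // Exercise the same method under both supported API versions.
        for (unsigned apiVersion : {1u, 2u})
        {
            Json::Value params;
            params[jss::account] = alice.human();
            params[jss::api_version] = apiVersion;
            auto const resp =
                env.rpc("json", "account_info", to_string(params));
            BEAST_EXPECT(resp[jss::result][jss::status] == "success");
        }
    }
};

BEAST_DEFINE_TESTSUITE(AccountInfoVersions, rpc, ripple);

}  // namespace test
}  // namespace ripple
```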

265  BUILD.md
@@ -1,7 +1,7 @@
| :warning: **WARNING** :warning: |
|---|
| These instructions assume you have a C++ development environment ready with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up on Linux, macOS, or Windows, [see this guide](./docs/build/environment.md). |

> These instructions assume you have a C++ development environment ready
> with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up
> on Linux, macOS, or Windows, see [our guide](./docs/build/environment.md).
>
> These instructions also assume a basic familiarity with Conan and CMake.
> If you are unfamiliar with Conan,
> you can read our [crash course](./docs/build/conan.md)
@@ -29,179 +29,102 @@ branch.
git checkout develop
```

## Minimum Requirements

See [System Requirements](https://xrpl.org/system-requirements.html).

Building rippled generally requires git, Python, Conan, CMake, and a C++ compiler. Some guidance on setting up such a [C++ development environment can be found here](./docs/build/environment.md).

- [Python 3.7](https://www.python.org/downloads/)
- [Conan 1.60](https://conan.io/downloads.html)[^1]
- [Conan 1.55](https://conan.io/downloads.html)
- [CMake 3.16](https://cmake.org/download/)

[^1]: It is possible to build with Conan 2.x,
but the instructions are significantly different,
which is why we are not recommending it yet.
Notably, the `conan profile update` command is removed in 2.x.
Profiles must be edited by hand.

`rippled` is written in the C++20 dialect and includes the `<concepts>` header.
The [minimum compiler versions][2] required are:

| Compiler    | Version |
|-------------|---------|
| GCC         | 11      |
| GCC         | 10      |
| Clang       | 13      |
| Apple Clang | 13.1.6  |
| MSVC        | 19.23   |

### Linux
We don't recommend Windows for `rippled` production at this time. As of
January 2023, Ubuntu has the highest level of quality assurance, testing,
and support.

The Ubuntu operating system has received the highest level of
quality assurance, testing, and support.
Windows developers should use Visual Studio 2019. `rippled` isn't
compatible with [Boost](https://www.boost.org/) 1.78 or 1.79, and Conan
can't build earlier Boost versions.

Here are [sample instructions for setting up a C++ development environment on Linux](./docs/build/environment.md#linux).
**Note:** 32-bit Windows development isn't supported.

### Mac

Many rippled engineers use macOS for development.

Here are [sample instructions for setting up a C++ development environment on macOS](./docs/build/environment.md#macos).

### Windows

Windows is not recommended for production use at this time.

- Additionally, 32-bit Windows development is not supported.

[Boost]: https://www.boost.org/

## Steps

### Set Up Conan

After you have a [C++ development environment](./docs/build/environment.md) ready with Git, Python, Conan, CMake, and a C++ compiler, you may need to set up your Conan profile.

These instructions assume a basic familiarity with Conan and CMake.

If you are unfamiliar with Conan, then please read [this crash course](./docs/build/conan.md) or the official [Getting Started][3] walkthrough.

You'll need at least one Conan profile:
1. (Optional) If you've never used Conan, use autodetect to set up a default profile.

```
conan profile new default --detect
```

Update the compiler settings:
2. Update the compiler settings.

```
conan profile update settings.compiler.cppstd=20 default
```

Configure Conan (1.x only) to use recipe revisions:

```
conan config set general.revisions_enabled=1
```

**Linux** developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI:
Linux developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI.

```
conan profile update settings.compiler.libcxx=libstdc++11 default
```

Ensure inter-operability between `boost::string_view` and `std::string_view` types:

```
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_BEAST_USE_STD_STRING_VIEW"]' default
conan profile update 'env.CXXFLAGS="-DBOOST_BEAST_USE_STD_STRING_VIEW"' default
```

If you have other flags in the `conf.tools.build` or `env.CXXFLAGS` sections, make sure to retain the existing flags and append the new ones. You can check them with:
```
conan profile show default
```

**Windows** developers may need to use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
On Windows, you should use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.

Windows developers must also build `rippled` and its dependencies for the x64
architecture:
architecture.

```
conan profile update settings.arch=x86_64 default
```

### Multiple compilers

When `/usr/bin/g++` exists on a platform, it is the default cpp compiler. This
default works for some users.

However, if this compiler cannot build rippled or its dependencies, then you can
install another compiler and set Conan and CMake to use it.
Update the `conf.tools.build:compiler_executables` setting in order to set the correct variables (`CMAKE_<LANG>_COMPILER`) in the
generated CMake toolchain file.
For example, on Ubuntu 20, you may have gcc at `/usr/bin/gcc` and g++ at `/usr/bin/g++`; if that is the case, you can select those compilers with:
```
conan profile update 'conf.tools.build:compiler_executables={"c": "/usr/bin/gcc", "cpp": "/usr/bin/g++"}' default
```

Replace `/usr/bin/gcc` and `/usr/bin/g++` with paths to the desired compilers.

It should choose the compiler for dependencies as well,
but not all of them have a Conan recipe that respects this setting (yet).
For the rest, you can set these environment variables.
Replace `<path>` with paths to the desired compilers:

- `conan profile update env.CC=<path> default`
- `conan profile update env.CXX=<path> default`

Export our [Conan recipe for Snappy](./external/snappy).
It does not explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
3. (Optional) If you have multiple compilers installed on your platform,
make sure that Conan and CMake select the one you want to use.
This setting will set the correct variables (`CMAKE_<LANG>_COMPILER`)
in the generated CMake toolchain file.

```
# Conan 1.x
conan export external/snappy snappy/1.1.10@
# Conan 2.x
conan export --version 1.1.10 external/snappy
conan profile update 'conf.tools.build:compiler_executables={"c": "<path>", "cpp": "<path>"}' default
```

Export our [Conan recipe for RocksDB](./external/rocksdb).
It does not override paths to dependencies when building with Visual Studio.
It should choose the compiler for dependencies as well,
but not all of them have a Conan recipe that respects this setting (yet).
For the rest, you can set these environment variables:

```
# Conan 1.x
conan export external/rocksdb rocksdb/6.29.5@
# Conan 2.x
conan export --version 6.29.5 external/rocksdb
conan profile update env.CC=<path> default
conan profile update env.CXX=<path> default
```

Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
4. Export our [Conan recipe for Snappy](./external/snappy).
It doesn't explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.

```
conan export external/snappy snappy/1.1.9@
```

5. Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.

```
# Conan 1.x
conan export external/soci soci/4.0.3@
# Conan 2.x
conan export --version 4.0.3 external/soci
```

Export our [Conan recipe for NuDB](./external/nudb).
It fixes some source files to add missing `#include`s.

```
# Conan 1.x
conan export external/nudb nudb/2.0.8@
# Conan 2.x
conan export --version 2.0.8 external/nudb
```

### Build and Test

@@ -239,13 +162,13 @@ It fixes some source files to add missing `#include`s.
generated by the first. You can pass the build type on the command line with
`--settings build_type=$BUILD_TYPE` or in the profile itself,
under the section `[settings]` with the key `build_type`.

If you are using a Microsoft Visual C++ compiler,
then you will need to ensure consistency between the `build_type` setting
and the `compiler.runtime` setting.

When `build_type` is `Release`, `compiler.runtime` should be `MT`.

When `build_type` is `Debug`, `compiler.runtime` should be `MTd`.

```
@@ -257,19 +180,19 @@ It fixes some source files to add missing `#include`s.
`$OUTPUT_FOLDER/build/generators/conan_toolchain.cmake`.

Single-config generators:

```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dxrpld=ON -Dtests=ON ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
```

Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
and make sure it matches the `build_type` setting you chose in the previous
step.

Multi-config generators:
Multi-config gnerators:

```
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -Dxrpld=ON -Dtests=ON ..
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
```

**Note:** You can pass build options for `rippled` in this step.
@@ -287,7 +210,7 @@ It fixes some source files to add missing `#include`s.
```

Multi-config generators:

```
cmake --build . --config Release
cmake --build . --config Debug
@@ -312,75 +235,15 @@ It fixes some source files to add missing `#include`s.
generator. Pass `--help` to see the rest of the command line options.

## Coverage report

The coverage report is intended for developers using compilers GCC
or Clang (including Apple Clang). It is generated by the build target `coverage`,
which is only enabled when the `coverage` option is set, e.g. with
`--options coverage=True` in `conan` or the `-Dcoverage=ON` variable in `cmake`.

Prerequisites for the coverage report:

- [gcovr tool][gcovr] (can be installed e.g. with [pip][python-pip])
- `gcov` for GCC (installed with the compiler by default) or
- `llvm-cov` for Clang (installed with the compiler by default)
- `Debug` build type

A coverage report is created when the following steps are completed, in order:

1. `rippled` binary built with instrumentation data, enabled by the `coverage`
   option mentioned above
2. completed run of unit tests, which populates coverage capture data
3. completed run of the `gcovr` tool (which internally invokes either `gcov` or `llvm-cov`)
   to assemble both instrumentation data and the coverage capture data into a coverage report

The above steps are automated into a single target `coverage`. The instrumented
`rippled` binary can also be used for regular development or testing work, at
the cost of extra disk space utilization and a small performance hit
(to store coverage capture). In case of a spurious failure of unit tests, it is
possible to re-run the `coverage` target without rebuilding the `rippled` binary
(since it is simply a dependency of the coverage report target). It is also possible
to select only specific tests for the purpose of the coverage report, by setting
the `coverage_test` variable in `cmake`.

The default coverage report format is `html-details`, but the user
can override it to any of the formats listed in `cmake/CodeCoverage.cmake`
by setting the `coverage_format` variable in `cmake`. It is also possible
to generate more than one format at a time by setting the `coverage_extra_args`
variable in `cmake`. The specific command line used to run the `gcovr` tool will be
displayed if the `CODE_COVERAGE_VERBOSE` variable is set.

By default, the code coverage tool runs parallel unit tests with `--unittest-jobs`
set to the number of available CPU cores. This may cause spurious test
errors on Apple. Developers can override the number of unit test jobs with
the `coverage_test_parallelism` variable in `cmake`.

Example use with some cmake variables set:

```
cd .build
conan install .. --output-folder . --build missing --settings build_type=Debug
cmake -DCMAKE_BUILD_TYPE=Debug -Dcoverage=ON -Dxrpld=ON -Dtests=ON -Dcoverage_test_parallelism=2 -Dcoverage_format=html-details -Dcoverage_extra_args="--json coverage.json" -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake --build . --target coverage
```

After the `coverage` target is completed, the generated coverage report will be
stored inside the build directory, as either of:

- a file named `coverage.`_extension_, with a suitable extension for the report format, or
- a directory named `coverage`, with `index.html` and other files inside, for the `html-details` or `html-nested` report formats.

## Options

| Option | Default Value | Description |
| --- | --- | --- |
| `assert` | OFF | Enable assertions. |
| `coverage` | OFF | Prepare the coverage report. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |
| `tests` | OFF | Build tests. |
| `reporting` | OFF | Build the reporting mode feature. |
| `tests` | ON | Build tests. |
| `unity` | ON | Configure a unity build. |
| `xrpld` | OFF | Build the xrpld (`rippled`) application, and not just the libxrpl library. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |

[Unity builds][5] may be faster for the first build
(at the cost of much more memory) since they concatenate sources into fewer
@@ -416,18 +279,6 @@ conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_R

### call to 'async_teardown' is ambiguous

If you are compiling with an early version of Clang 16, then you might hit
a [regression][6] when compiling C++20 that manifests as an [error in a Boost
header][7]. You can work around it by adding this preprocessor definition:

```
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
```

### recompile with -fPIC

If you get a linker error suggesting that you recompile Boost with
@@ -579,10 +430,6 @@ but it is more convenient to put them in a [profile][profile].

[1]: https://github.com/conan-io/conan-center-index/issues/13168
[5]: https://en.wikipedia.org/wiki/Unity_build
[6]: https://github.com/boostorg/beast/issues/2648
[7]: https://github.com/boostorg/beast/issues/2661
[gcovr]: https://gcovr.com/en/stable/getting-started.html
[python-pip]: https://packaging.python.org/en/latest/guides/installing-using-pip-and-virtual-environments/
[build_type]: https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
[runtime]: https://cmake.org/cmake/help/latest/variable/CMAKE_MSVC_RUNTIME_LIBRARY.html
[toolchain]: https://cmake.org/cmake/help/latest/manual/cmake-toolchains.7.html

465  BUILD_LEDGER.md  (new file)
@@ -0,0 +1,465 @@
# Hash Migration Implementation via BuildLedger

## Overview

This document outlines the approach for implementing SHA-512 Half to BLAKE3 hash migration by performing the state map rekeying operation in the ledger building process, bypassing the metadata generation problem inherent in the transaction processing pipeline.

## The Core Problem

When switching from SHA-512 Half to BLAKE3, every object in the state map needs to be rekeyed because the hash (which IS the key in the SHAMap) changes. This would generate metadata showing:
- Every object deleted at its old SHA-512 key
- Every object created at its new BLAKE3 key
- Total metadata size: 2× the entire state size (potentially gigabytes; see the sketch below)

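A back-of-envelope sketch of that 2× figure, using the same `SHAMap::visitLeaves` interface as the implementation code later in this document (the estimator function itself is hypothetical):

```cpp
// Each leaf would appear twice in naive metadata: once as a deletion at
// its SHA-512 Half key and once as a creation at its BLAKE3 key.
std::size_t
estimateNaiveMetadataBytes(SHAMap const& state)
{
    std::size_t totalLeafBytes = 0;
    state.visitLeaves([&](SHAMapItem const& item) {
        totalLeafBytes += item.slice().size();
    });
    // Twice the serialized state, before per-node bookkeeping overhead.
    return 2 * totalLeafBytes;
}
```
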
## The Solution: Bypass Transaction Processing

Instead of trying to rekey within the transaction processor (which tracks all changes for metadata), perform the rekeying AFTER transaction processing but BEFORE ledger finalization.

## Implementation Location

The key intervention point is in `buildLedgerImpl()` at line 63 of `BuildLedger.cpp`:

```cpp
// BuildLedger.cpp, lines 58-65
{
    OpenView accum(&*built);
    assert(!accum.open());
    applyTxs(accum, built);  // Apply transactions (including pseudo-txns)
    accum.apply(*built);     // Apply accumulated changes to the ledger
}
// <-- INTERVENTION POINT HERE
built->updateSkipList();
```

## Detailed Implementation

### 1. Pseudo-Transaction Role (Simple Flag Setting)

```cpp
// In Change.cpp
TER Change::applyHashMigration()
{
    // The pseudo-transaction just sets a flag.
    // The actual migration happens in BuildLedger.

    JLOG(j_.warn()) << "Hash migration pseudo transaction triggered at ledger "
                    << view().seq();

    // Create a migration flag object
    auto migrationFlag = std::make_shared<SLE>(
        keylet::hashMigrationFlag(
            hash_options{view().seq(), KEYLET_MIGRATION_FLAG}));

    migrationFlag->setFieldU32(sfLedgerSequence, view().seq());
    migrationFlag->setFieldU8(sfMigrationStatus, 1);  // 1 = pending

    view().insert(migrationFlag);

    return tesSUCCESS;
}
```

### 2. BuildLedger Modification

```cpp
// In BuildLedger.cpp, after line 63
template <class ApplyTxs>
std::shared_ptr<Ledger>
buildLedgerImpl(
    std::shared_ptr<Ledger const> const& parent,
    NetClock::time_point closeTime,
    const bool closeTimeCorrect,
    NetClock::duration closeResolution,
    Application& app,
    beast::Journal j,
    ApplyTxs&& applyTxs)
{
    auto built = std::make_shared<Ledger>(*parent, closeTime);

    if (built->isFlagLedger() && built->rules().enabled(featureNegativeUNL))
    {
        built->updateNegativeUNL();
    }

    {
        OpenView accum(&*built);
        assert(!accum.open());
        applyTxs(accum, built);
        accum.apply(*built);
    }

    // NEW: Check for hash migration flag
    if (shouldPerformHashMigration(built, app, j))
    {
        performHashMigration(built, app, j);
    }

    built->updateSkipList();
    // ... rest of function
}

// New helper functions
bool shouldPerformHashMigration(
    std::shared_ptr<Ledger> const& ledger,
    Application& app,
    beast::Journal j)
{
    // Check if we're in the migration window
    constexpr LedgerIndex MIGRATION_START = 20'000'000;
    constexpr LedgerIndex MIGRATION_END = 20'000'010;

    if (ledger->seq() < MIGRATION_START || ledger->seq() >= MIGRATION_END)
        return false;

    // Check for migration flag
    auto const flag = ledger->read(keylet::hashMigrationFlag(
        hash_options{ledger->seq(), KEYLET_MIGRATION_FLAG}));

    if (!flag)
        return false;

    return flag->getFieldU8(sfMigrationStatus) == 1;  // 1 = pending
}

void performHashMigration(
    std::shared_ptr<Ledger> const& ledger,
    Application& app,
    beast::Journal j)
{
    JLOG(j.warn()) << "PERFORMING HASH MIGRATION at ledger " << ledger->seq();

    auto& oldStateMap = ledger->stateMap();

    // Create new state map with BLAKE3 hashing
    SHAMap newStateMap(SHAMapType::STATE, ledger->family());
    newStateMap.setLedgerSeq(ledger->seq());

    // Track statistics
    std::size_t objectCount = 0;
    auto startTime = std::chrono::steady_clock::now();

    // Walk the entire state map and rekey everything
    oldStateMap.visitLeaves([&](SHAMapItem const& item) {
        try {
            // Deserialize the ledger entry
            SerialIter sit(item.slice());
            auto sle = std::make_shared<SLE>(sit, item.key());

            // The new key would be calculated with BLAKE3.
            // For now, we'd need the actual BLAKE3 implementation:
            // uint256 newKey = calculateBlake3Key(sle);

            // For this example, let's assume we have a function that
            // computes the new key based on the SLE type and contents
            uint256 newKey = computeNewHashKey(sle, ledger->seq());

            // Re-serialize the SLE
            Serializer s;
            sle->add(s);

            // Add to new map with new key
            newStateMap.addGiveItem(
                SHAMapNodeType::tnACCOUNT_STATE,
                make_shamapitem(newKey, s.slice()));

            objectCount++;

            if (objectCount % 10000 == 0) {
                JLOG(j.info()) << "Migration progress: " << objectCount
                               << " objects rekeyed";
            }
        }
        catch (std::exception const& e) {
            JLOG(j.error()) << "Failed to migrate object " << item.key()
                            << ": " << e.what();
            throw;
        }
    });

    auto endTime = std::chrono::steady_clock::now();
    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(
        endTime - startTime);

    JLOG(j.warn()) << "Hash migration completed: " << objectCount
                   << " objects rekeyed in " << duration.count() << "ms";

    // Swap the state maps
    oldStateMap = std::move(newStateMap);

    // Update the migration flag to completed
    auto flag = ledger->peek(keylet::hashMigrationFlag(
        hash_options{ledger->seq(), KEYLET_MIGRATION_FLAG}));
    if (flag) {
        flag->setFieldU8(sfMigrationStatus, 2);  // 2 = completed
        ledger->rawReplace(flag);
    }
}

uint256 computeNewHashKey(
    std::shared_ptr<SLE const> const& sle,
    LedgerIndex ledgerSeq)
{
    // This would use BLAKE3 instead of SHA-512 Half.
    // Implementation depends on the BLAKE3 integration.
    // For now, this is a placeholder.

    // The actual implementation would:
    // 1. Determine the object type
    // 2. Extract the identifying fields
    // 3. Hash them with BLAKE3
    // 4. Return the new key

    return uint256();  // Placeholder
}
```

## Why This Approach Works

### 1. No Metadata Explosion
- The rekeying happens AFTER the `OpenView` is destroyed
- No change tracking occurs during the rekeying
- Only the migration flag generates metadata (minimal)

### 2. Direct SHAMap Access
- We have direct access to `built->stateMap()`
- Can manipulate the raw data structure without going through ApplyView
- Can create a new SHAMap and swap it in

### 3. Clean Separation of Concerns
- Pseudo-transaction: "Signal that migration should happen"
- BuildLedger: "Actually perform the migration"
- Transaction processor: Unchanged, doesn't need to handle massive rekeying

### 4. Timing is Perfect
- After all transactions are applied
- Before the ledger is finalized
- Before the skip list is updated
- Before the SHAMap is flushed to disk

## Files Referenced in This Analysis

### Core Implementation Files
- `src/ripple/app/ledger/impl/BuildLedger.cpp` - Main implementation location
- `src/ripple/app/ledger/BuildLedger.h` - Header for build functions
- `src/ripple/app/tx/impl/Change.cpp` - Pseudo-transaction handler
- `src/ripple/app/tx/impl/Change.h` - Change transactor header

### Transaction Processing Pipeline (analyzed but bypassed)
- `src/ripple/app/tx/impl/Transactor.cpp` - Base transaction processor
- `src/ripple/app/tx/impl/Transactor.h` - Transactor header
- `src/ripple/app/tx/impl/apply.cpp` - Transaction application
- `src/ripple/app/tx/impl/applySteps.cpp` - Transaction routing
- `src/ripple/app/tx/impl/ApplyContext.h` - Application context

### Ledger and View Classes
- `src/ripple/app/ledger/Ledger.h` - Ledger class definition
- `src/ripple/app/ledger/Ledger.cpp` - Ledger implementation
- `src/ripple/ledger/ApplyView.h` - View interface
- `src/ripple/ledger/ApplyViewImpl.h` - View implementation header
- `src/ripple/ledger/impl/ApplyViewImpl.cpp` - View implementation
- `src/ripple/ledger/impl/ApplyViewBase.cpp` - Base view implementation
- `src/ripple/ledger/detail/ApplyViewBase.h` - Base view header
- `src/ripple/ledger/OpenView.h` - Open ledger view
- `src/ripple/ledger/RawView.h` - Raw view interface

### SHAMap and Data Structures
- `src/ripple/shamap/SHAMap.h` - SHAMap class definition

### Metadata Generation
- `src/ripple/protocol/TxMeta.h` - Transaction metadata header
- `src/ripple/protocol/impl/TxMeta.cpp` - Metadata implementation

### Consensus and Pseudo-Transaction Injection
- `src/ripple/app/consensus/RCLConsensus.cpp` - Consensus implementation

### Supporting Documents
- `PSEUDO_TRANSACTIONS.md` - Documentation on pseudo-transactions
- `HASH_MIGRATION_CONTEXT.md` - Context for hash migration work


## Key Advantages

1. **Architecturally Clean**: Works within the existing ledger building framework
2. **No Metadata Issues**: Completely bypasses the metadata generation problem
3. **Atomic Operation**: Either the entire state is rekeyed or none of it is
4. **Fail-Safe**: Can be wrapped in try-catch for error handling
5. **Observable**: Can log progress for large state maps
6. **Testable**: Can be tested independently of transaction processing

## Challenges and Considerations

1. **Performance**: Rekeying millions of objects will take time
   - Solution: This happens during consensus; all nodes do it simultaneously

2. **Memory Usage**: Need to hold both old and new SHAMaps temporarily
   - Solution: Could potentially do in-place updates with careful ordering

3. **Verification**: Need to ensure all nodes get the same result
   - Solution: Deterministic rekeying based on the ledger sequence

4. **Rollback**: If migration fails, it must be handled gracefully
   - Solution: Keep the old map until the new map is fully built and verified (see the sketch after this list)
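
To make the rollback point concrete, here is a minimal fail-safe sketch, assuming the `Ledger::migrateToBlake3()` shape introduced in the appendix below; `rekeyInto()` is a hypothetical helper that walks `stateMap_` and fills the new map:

```cpp
// Sketch only: the old SHA-512 Half keyed map is never modified, so any
// failure before the final swap leaves the ledger untouched.
void
Ledger::migrateToBlake3()
{
    SHAMap newStateMap(SHAMapType::STATE, stateMap_.family());
    newStateMap.setLedgerSeq(seq());
    try
    {
        rekeyInto(newStateMap);  // hypothetical: may throw on a bad entry
    }
    catch (std::exception const& e)
    {
        // Abort cleanly: the original map is still intact.
        JLOG(j_.error()) << "BLAKE3 migration aborted: " << e.what();
        return;
    }
    // The swap is the only observable step; it happens all-or-nothing.
    stateMap_ = std::move(newStateMap);
}
```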

## Conclusion

By performing the hash migration at the ledger building level rather than within the transaction processing pipeline, we can rekey the entire state map without generating massive metadata. This approach leverages the existing architecture's separation between transaction processing and ledger construction, providing a clean and efficient solution to what initially appeared to be an intractable problem.

---

## APPENDIX: Revised Implementation Following Ledger Pattern

After reviewing the existing pattern in `BuildLedger.cpp`, it's clear that special ledger operations are implemented as methods on the `Ledger` class itself (e.g., `built->updateNegativeUNL()`). Following this pattern, the hash migration should be implemented as `Ledger::migrateToBlake3()`.

### Updated BuildLedger.cpp Implementation

```cpp
// In BuildLedger.cpp, following the existing pattern
template <class ApplyTxs>
std::shared_ptr<Ledger>
buildLedgerImpl(
    std::shared_ptr<Ledger const> const& parent,
    NetClock::time_point closeTime,
    const bool closeTimeCorrect,
    NetClock::duration closeResolution,
    Application& app,
    beast::Journal j,
    ApplyTxs&& applyTxs)
{
    auto built = std::make_shared<Ledger>(*parent, closeTime);

    if (built->isFlagLedger() && built->rules().enabled(featureNegativeUNL))
    {
        built->updateNegativeUNL();
    }

    {
        OpenView accum(&*built);
        assert(!accum.open());
        applyTxs(accum, built);
        accum.apply(*built);
    }

    // NEW: Check and perform hash migration following the pattern
    if (built->rules().enabled(featureBLAKE3Migration) &&
        built->shouldMigrateToBlake3())
    {
        built->migrateToBlake3();
    }

    built->updateSkipList();
    // ... rest of function
}
```

### Ledger.h Addition

```cpp
// In src/ripple/app/ledger/Ledger.h
class Ledger final : public std::enable_shared_from_this<Ledger>,
                     public DigestAwareReadView,
                     public TxsRawView,
                     public CountedObject<Ledger>
{
public:
    // ... existing methods ...

    /** Update the Negative UNL ledger component. */
    void
    updateNegativeUNL();

    /** Check if hash migration to BLAKE3 should be performed. */
    bool
    shouldMigrateToBlake3() const;

    /** Perform hash migration from SHA-512 Half to BLAKE3.

        This rekeys all objects in the state map with new BLAKE3 hashes.
        Must be called after transactions are applied but before the
        ledger is finalized.
    */
    void
    migrateToBlake3();

    // ... rest of class ...
};
```

### Ledger.cpp Implementation

```cpp
// In src/ripple/app/ledger/Ledger.cpp

bool
Ledger::shouldMigrateToBlake3() const
{
    // Check if we're in the migration window
    constexpr LedgerIndex MIGRATION_START = 20'000'000;
    constexpr LedgerIndex MIGRATION_END = 20'000'010;

    if (seq() < MIGRATION_START || seq() >= MIGRATION_END)
        return false;

    // Check for the migration flag set by the pseudo-transaction
    auto const flag = read(keylet::hashMigrationFlag(
        hash_options{seq(), KEYLET_MIGRATION_FLAG}));

    if (!flag)
        return false;

    return flag->getFieldU8(sfMigrationStatus) == 1;  // 1 = pending
}

void
Ledger::migrateToBlake3()
{
    JLOG(j_.warn()) << "Performing BLAKE3 hash migration at ledger " << seq();

    // Create a new state map with BLAKE3 hashing
    SHAMap newStateMap(SHAMapType::STATE, stateMap_.family());
    newStateMap.setLedgerSeq(seq());

    std::size_t objectCount = 0;
    auto startTime = std::chrono::steady_clock::now();

    // Walk the entire state map and rekey everything
    stateMap_.visitLeaves([&](SHAMapItem const& item) {
        // Deserialize the ledger entry
        SerialIter sit(item.slice());
        auto sle = std::make_shared<SLE>(sit, item.key());

        // Calculate the new BLAKE3-based key
        // (this would use the actual BLAKE3 implementation)
        uint256 newKey = computeBlake3Key(sle);

        // Re-serialize and add to the new map
        Serializer s;
        sle->add(s);

        newStateMap.addGiveItem(
            SHAMapNodeType::tnACCOUNT_STATE,
            make_shamapitem(newKey, s.slice()));

        if (++objectCount % 10000 == 0) {
            JLOG(j_.info()) << "Migration progress: " << objectCount
                            << " objects rekeyed";
        }
    });

    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - startTime);

    JLOG(j_.warn()) << "BLAKE3 migration completed: " << objectCount
                    << " objects rekeyed in " << duration.count() << "ms";

    // Swap the state maps
    stateMap_ = std::move(newStateMap);

    // Update the migration flag to completed
    auto flag = peek(keylet::hashMigrationFlag(
        hash_options{seq(), KEYLET_MIGRATION_FLAG}));
    if (flag) {
        flag->setFieldU8(sfMigrationStatus, 2);  // 2 = completed
        rawReplace(flag);
    }
}
```
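
Returning to the verification concern above, a determinism check could be sketched as follows; this is hypothetical test-style code, assuming the `stateMap()` accessor and that both replicas are built from the same parent ledger:

```cpp
// Hypothetical sketch: two independent rebuilds from the same parent must
// converge on the same BLAKE3-keyed state root, or consensus would fork
// at the migration ledger.
void
checkMigrationDeterminism(
    std::shared_ptr<Ledger const> const& parent,
    NetClock::time_point closeTime)
{
    auto a = std::make_shared<Ledger>(*parent, closeTime);
    auto b = std::make_shared<Ledger>(*parent, closeTime);
    a->migrateToBlake3();
    b->migrateToBlake3();
    assert(a->stateMap().getHash() == b->stateMap().getHash());
}
```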
This approach follows the established pattern in the codebase where special ledger operations are encapsulated as methods on the `Ledger` class itself, making the code more maintainable and consistent with the existing architecture.
207
Builds/CMake/CMakeFuncs.cmake
Normal file
@@ -0,0 +1,207 @@
macro(group_sources_in source_dir curdir)
  file(GLOB children RELATIVE ${source_dir}/${curdir}
    ${source_dir}/${curdir}/*)
  foreach (child ${children})
    if (IS_DIRECTORY ${source_dir}/${curdir}/${child})
      group_sources_in(${source_dir} ${curdir}/${child})
    else()
      string(REPLACE "/" "\\" groupname ${curdir})
      source_group(${groupname} FILES
        ${source_dir}/${curdir}/${child})
    endif()
  endforeach()
endmacro()

macro(group_sources curdir)
  group_sources_in(${PROJECT_SOURCE_DIR} ${curdir})
endmacro()

macro (exclude_from_default target_)
  set_target_properties (${target_} PROPERTIES EXCLUDE_FROM_ALL ON)
  set_target_properties (${target_} PROPERTIES EXCLUDE_FROM_DEFAULT_BUILD ON)
endmacro ()

macro (exclude_if_included target_)
  get_directory_property(has_parent PARENT_DIRECTORY)
  if (has_parent)
    exclude_from_default (${target_})
  endif ()
endmacro ()

function (print_ep_logs _target)
  ExternalProject_Get_Property (${_target} STAMP_DIR)
  add_custom_command(TARGET ${_target} POST_BUILD
    COMMENT "${_target} BUILD OUTPUT"
    COMMAND ${CMAKE_COMMAND}
      -DIN_FILE=${STAMP_DIR}/${_target}-build-out.log
      -P ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/echo_file.cmake
    COMMAND ${CMAKE_COMMAND}
      -DIN_FILE=${STAMP_DIR}/${_target}-build-err.log
      -P ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/echo_file.cmake)
endfunction ()

#[=========================================================[
   This is a function override for one function in the
   standard ExternalProject module. We want to change
   the generated build script slightly to include printing
   the build logs in the case of failure. Those modifications
   have been made here. This function override could break
   in the future if the ExternalProject module changes internal
   function names or changes the way it generates the build
   scripts.
   See:
   https://gitlab.kitware.com/cmake/cmake/blob/df1ddeec128d68cc636f2dde6c2acd87af5658b6/Modules/ExternalProject.cmake#L1855-1952
#]=========================================================]

function(_ep_write_log_script name step cmd_var)
  ExternalProject_Get_Property(${name} stamp_dir)
  set(command "${${cmd_var}}")

  set(make "")
  set(code_cygpath_make "")
  if(command MATCHES "^\\$\\(MAKE\\)")
    # GNU make recognizes the string "$(MAKE)" as recursive make, so
    # ensure that it appears directly in the makefile.
    string(REGEX REPLACE "^\\$\\(MAKE\\)" "\${make}" command "${command}")
    set(make "-Dmake=$(MAKE)")

    if(WIN32 AND NOT CYGWIN)
      set(code_cygpath_make "
if(\${make} MATCHES \"^/\")
  execute_process(
    COMMAND cygpath -w \${make}
    OUTPUT_VARIABLE cygpath_make
    ERROR_VARIABLE cygpath_make
    RESULT_VARIABLE cygpath_error
    OUTPUT_STRIP_TRAILING_WHITESPACE
    )
  if(NOT cygpath_error)
    set(make \${cygpath_make})
  endif()
endif()
")
    endif()
  endif()

  set(config "")
  if("${CMAKE_CFG_INTDIR}" MATCHES "^\\$")
    string(REPLACE "${CMAKE_CFG_INTDIR}" "\${config}" command "${command}")
    set(config "-Dconfig=${CMAKE_CFG_INTDIR}")
  endif()

  # Wrap multiple 'COMMAND' lines up into a second-level wrapper
  # script so all output can be sent to one log file.
  if(command MATCHES "(^|;)COMMAND;")
    set(code_execute_process "
${code_cygpath_make}
execute_process(COMMAND \${command} RESULT_VARIABLE result)
if(result)
  set(msg \"Command failed (\${result}):\\n\")
  foreach(arg IN LISTS command)
    set(msg \"\${msg} '\${arg}'\")
  endforeach()
  message(FATAL_ERROR \"\${msg}\")
endif()
")
    set(code "")
    set(cmd "")
    set(sep "")
    foreach(arg IN LISTS command)
      if("x${arg}" STREQUAL "xCOMMAND")
        if(NOT "x${cmd}" STREQUAL "x")
          string(APPEND code "set(command \"${cmd}\")${code_execute_process}")
        endif()
        set(cmd "")
        set(sep "")
      else()
        string(APPEND cmd "${sep}${arg}")
        set(sep ";")
      endif()
    endforeach()
    string(APPEND code "set(command \"${cmd}\")${code_execute_process}")
    file(GENERATE OUTPUT "${stamp_dir}/${name}-${step}-$<CONFIG>-impl.cmake" CONTENT "${code}")
    set(command ${CMAKE_COMMAND} "-Dmake=\${make}" "-Dconfig=\${config}" -P ${stamp_dir}/${name}-${step}-$<CONFIG>-impl.cmake)
  endif()

  # Wrap the command in a script to log output to files.
  set(script ${stamp_dir}/${name}-${step}-$<CONFIG>.cmake)
  set(logbase ${stamp_dir}/${name}-${step})
  set(code "
${code_cygpath_make}
function (_echo_file _fil)
  file (READ \${_fil} _cont)
  execute_process (COMMAND \${CMAKE_COMMAND} -E echo \"\${_cont}\")
endfunction ()
set(command \"${command}\")
execute_process(
  COMMAND \${command}
  RESULT_VARIABLE result
  OUTPUT_FILE \"${logbase}-out.log\"
  ERROR_FILE \"${logbase}-err.log\"
  )
if(result)
  set(msg \"Command failed: \${result}\\n\")
  foreach(arg IN LISTS command)
    set(msg \"\${msg} '\${arg}'\")
  endforeach()
  execute_process (COMMAND \${CMAKE_COMMAND} -E echo \"Build output for ${logbase} : \")
  _echo_file (\"${logbase}-out.log\")
  _echo_file (\"${logbase}-err.log\")
  set(msg \"\${msg}\\nSee above\\n\")
  message(FATAL_ERROR \"\${msg}\")
else()
  set(msg \"${name} ${step} command succeeded. See also ${logbase}-*.log\")
  message(STATUS \"\${msg}\")
endif()
")
  file(GENERATE OUTPUT "${script}" CONTENT "${code}")
  set(command ${CMAKE_COMMAND} ${make} ${config} -P ${script})
  set(${cmd_var} "${command}" PARENT_SCOPE)
endfunction()

find_package(Git)

# function that calls git log to get current hash
function (git_hash hash_val)
  # note: optional second extra string argument not in signature
  if (NOT GIT_FOUND)
    return ()
  endif ()
  set (_hash "")
  set (_format "%H")
  if (ARGC GREATER_EQUAL 2)
    string (TOLOWER ${ARGV1} _short)
    if (_short STREQUAL "short")
      set (_format "%h")
    endif ()
  endif ()
  execute_process (COMMAND ${GIT_EXECUTABLE} "log" "--pretty=${_format}" "-n1"
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    RESULT_VARIABLE _git_exit_code
    OUTPUT_VARIABLE _temp_hash
    OUTPUT_STRIP_TRAILING_WHITESPACE
    ERROR_QUIET)
  if (_git_exit_code EQUAL 0)
    set (_hash ${_temp_hash})
  endif ()
  set (${hash_val} "${_hash}" PARENT_SCOPE)
endfunction ()

function (git_branch branch_val)
  if (NOT GIT_FOUND)
    return ()
  endif ()
  set (_branch "")
  execute_process (COMMAND ${GIT_EXECUTABLE} "rev-parse" "--abbrev-ref" "HEAD"
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    RESULT_VARIABLE _git_exit_code
    OUTPUT_VARIABLE _temp_branch
    OUTPUT_STRIP_TRAILING_WHITESPACE
    ERROR_QUIET)
  if (_git_exit_code EQUAL 0)
    set (_branch ${_temp_branch})
  endif ()
  set (${branch_val} "${_branch}" PARENT_SCOPE)
endfunction ()
60
Builds/CMake/CMake_sqlite3.txt
Normal file
@@ -0,0 +1,60 @@
#[=========================================================[
   SQLITE doesn't provide build files in the
   standard source-only distribution. So we wrote
   a simple cmake file and we copy it to the
   external project folder so that we can use
   this file to build the lib with ExternalProject
#]=========================================================]

add_library (sqlite3 STATIC sqlite3.c)
#[=========================================================[
   When compiled with SQLITE_THREADSAFE=1, SQLite operates
   in serialized mode. In this mode, SQLite can be safely
   used by multiple threads with no restriction.

   NOTE: This implies a global mutex!

   When compiled with SQLITE_THREADSAFE=2, SQLite can be
   used in a multithreaded program so long as no two
   threads attempt to use the same database connection at
   the same time.

   NOTE: This is the preferred threading model, but not
   currently enabled because we need to investigate our
   use-model and concurrency requirements.

   TODO: consider whether any other options should be
   used: https://www.sqlite.org/compile.html
#]=========================================================]

target_compile_definitions (sqlite3
  PRIVATE
    SQLITE_THREADSAFE=1
    HAVE_USLEEP=1)
target_compile_options (sqlite3
  PRIVATE
    $<$<BOOL:${MSVC}>:
      -wd4100
      -wd4127
      -wd4232
      -wd4244
      -wd4701
      -wd4706
      -wd4996
    >
    $<$<NOT:$<BOOL:${MSVC}>>:-Wno-array-bounds>)
install (
  TARGETS
    sqlite3
  LIBRARY DESTINATION lib
  ARCHIVE DESTINATION lib
  RUNTIME DESTINATION bin
  INCLUDES DESTINATION include)
install (
  FILES
    sqlite3.h
    sqlite3ext.h
  DESTINATION include)

@@ -130,16 +130,7 @@ else ()
      >)
endif ()

if (use_mold)
  # use mold linker if available
  execute_process (
    COMMAND ${CMAKE_CXX_COMPILER} -fuse-ld=mold -Wl,--version
    ERROR_QUIET OUTPUT_VARIABLE LD_VERSION)
  if ("${LD_VERSION}" MATCHES "mold")
    target_link_libraries (common INTERFACE -fuse-ld=mold)
  endif ()
  unset (LD_VERSION)
elseif (use_gold AND is_gcc)
if (use_gold AND is_gcc)
  # use gold linker if available
  execute_process (
    COMMAND ${CMAKE_CXX_COMPILER} -fuse-ld=gold -Wl,--version
@@ -171,7 +162,9 @@ elseif (use_gold AND is_gcc)
      $<$<NOT:$<BOOL:${static}>>:-Wl,--disable-new-dtags>)
  endif ()
  unset (LD_VERSION)
elseif (use_lld)
endif ()

if (use_lld)
  # use lld linker if available
  execute_process (
    COMMAND ${CMAKE_CXX_COMPILER} -fuse-ld=lld -Wl,--version
@@ -182,7 +175,6 @@ elseif (use_lld)
  unset (LD_VERSION)
endif()


if (assert)
  foreach (var_ CMAKE_C_FLAGS_RELEASE CMAKE_CXX_FLAGS_RELEASE)
    STRING (REGEX REPLACE "[-/]DNDEBUG" "" ${var_} "${${var_}}")
1103
Builds/CMake/RippledCore.cmake
Normal file
File diff suppressed because it is too large
98
Builds/CMake/RippledCov.cmake
Normal file
@@ -0,0 +1,98 @@
#[===================================================================[
   coverage report target
#]===================================================================]

if (coverage)
  if (is_clang)
    if (APPLE)
      execute_process (COMMAND xcrun -f llvm-profdata
        OUTPUT_VARIABLE LLVM_PROFDATA
        OUTPUT_STRIP_TRAILING_WHITESPACE)
    else ()
      find_program (LLVM_PROFDATA llvm-profdata)
    endif ()
    if (NOT LLVM_PROFDATA)
      message (WARNING "unable to find llvm-profdata - skipping coverage_report target")
    endif ()

    if (APPLE)
      execute_process (COMMAND xcrun -f llvm-cov
        OUTPUT_VARIABLE LLVM_COV
        OUTPUT_STRIP_TRAILING_WHITESPACE)
    else ()
      find_program (LLVM_COV llvm-cov)
    endif ()
    if (NOT LLVM_COV)
      message (WARNING "unable to find llvm-cov - skipping coverage_report target")
    endif ()

    set (extract_pattern "")
    if (coverage_core_only)
      set (extract_pattern "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/")
    endif ()

    if (LLVM_COV AND LLVM_PROFDATA)
      add_custom_target (coverage_report
        USES_TERMINAL
        COMMAND ${CMAKE_COMMAND} -E echo "Generating coverage - results will be in ${CMAKE_BINARY_DIR}/coverage/index.html."
        COMMAND ${CMAKE_COMMAND} -E echo "Running rippled tests."
        COMMAND rippled --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --quiet --unittest-log
        COMMAND ${LLVM_PROFDATA}
          merge -sparse default.profraw -o rip.profdata
        COMMAND ${CMAKE_COMMAND} -E echo "Summary of coverage:"
        COMMAND ${LLVM_COV}
          report -instr-profile=rip.profdata
          $<TARGET_FILE:rippled> ${extract_pattern}
        # generate html report
        COMMAND ${LLVM_COV}
          show -format=html -output-dir=${CMAKE_BINARY_DIR}/coverage
          -instr-profile=rip.profdata
          $<TARGET_FILE:rippled> ${extract_pattern}
        BYPRODUCTS coverage/index.html)
    endif ()
  elseif (is_gcc)
    find_program (LCOV lcov)
    if (NOT LCOV)
      message (WARNING "unable to find lcov - skipping coverage_report target")
    endif ()

    find_program (GENHTML genhtml)
    if (NOT GENHTML)
      message (WARNING "unable to find genhtml - skipping coverage_report target")
    endif ()

    set (extract_pattern "*")
    if (coverage_core_only)
      set (extract_pattern "*/src/ripple/*")
    endif ()

    if (LCOV AND GENHTML)
      add_custom_target (coverage_report
        USES_TERMINAL
        COMMAND ${CMAKE_COMMAND} -E echo "Generating coverage- results will be in ${CMAKE_BINARY_DIR}/coverage/index.html."
        # create baseline info file
        COMMAND ${LCOV}
          --no-external -d "${CMAKE_CURRENT_SOURCE_DIR}" -c -d . -i -o baseline.info
          | grep -v "ignoring data for external file"
        # run tests
        COMMAND ${CMAKE_COMMAND} -E echo "Running rippled tests for coverage report."
        COMMAND rippled --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --quiet --unittest-log
        # Create test coverage data file
        COMMAND ${LCOV}
          --no-external -d "${CMAKE_CURRENT_SOURCE_DIR}" -c -d . -o tests.info
          | grep -v "ignoring data for external file"
        # Combine baseline and test coverage data
        COMMAND ${LCOV}
          -a baseline.info -a tests.info -o lcov-all.info
        # extract our files
        COMMAND ${LCOV}
          -e lcov-all.info "${extract_pattern}" -o lcov.info
        COMMAND ${CMAKE_COMMAND} -E echo "Summary of coverage:"
        COMMAND ${LCOV} --summary lcov.info
        # generate HTML report
        COMMAND ${GENHTML}
          -o ${CMAKE_BINARY_DIR}/coverage lcov.info
        BYPRODUCTS coverage/index.html)
    endif ()
  endif ()
endif ()
@@ -26,7 +26,8 @@ if (tests)
    src/ripple/*.cpp
    src/ripple/*.md
    src/test/*.h
    src/test/*.md)
    src/test/*.md
    Builds/*/README.md)
  list (APPEND doxygen_input
    README.md
    RELEASENOTES.md
@@ -8,26 +8,13 @@ install (
    opts
    ripple_syslibs
    ripple_boost
    xrpl.libpb
    xrpl.libxrpl
    xrpl_core
  EXPORT RippleExports
  LIBRARY DESTINATION lib
  ARCHIVE DESTINATION lib
  RUNTIME DESTINATION bin
  INCLUDES DESTINATION include)

install(
  DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}/include/xrpl"
  DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}"
)

if(NOT WIN32)
  install(
    CODE "file(CREATE_LINK xrpl \
      \${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple SYMBOLIC)"
  )
endif()

install (EXPORT RippleExports
  FILE RippleTargets.cmake
  NAMESPACE Ripple::
@@ -38,9 +25,14 @@ write_basic_package_version_file (
  VERSION ${rippled_version}
  COMPATIBILITY SameMajorVersion)

if (is_root_project AND TARGET rippled)
if (is_root_project)
  install (TARGETS rippled RUNTIME DESTINATION bin)
  set_target_properties(rippled PROPERTIES INSTALL_RPATH_USE_LINK_PATH ON)
  install (
    FILES
      ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/RippleConfig.cmake
      ${CMAKE_CURRENT_BINARY_DIR}/RippleConfigVersion.cmake
    DESTINATION lib/cmake/ripple)
  # sample configs should not overwrite existing files
  # install if-not-exists workaround as suggested by
  # https://cmake.org/Bug/view.php?id=12646
@@ -55,16 +47,4 @@ if (is_root_project AND TARGET rippled)
    copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/rippled-example.cfg\" etc rippled.cfg)
    copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/validators-example.txt\" etc validators.txt)
  ")
  if(NOT WIN32)
    install(
      CODE "file(CREATE_LINK rippled${suffix} \
        \${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix} SYMBOLIC)"
    )
  endif()
endif ()

install (
  FILES
    ${CMAKE_CURRENT_SOURCE_DIR}/cmake/RippleConfig.cmake
    ${CMAKE_CURRENT_BINARY_DIR}/RippleConfigVersion.cmake
  DESTINATION lib/cmake/ripple)
@@ -23,15 +23,15 @@ target_compile_options (opts
  INTERFACE
    $<$<AND:$<BOOL:${is_gcc}>,$<COMPILE_LANGUAGE:CXX>>:-Wsuggest-override>
    $<$<BOOL:${perf}>:-fno-omit-frame-pointer>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-fprofile-arcs -ftest-coverage>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-fprofile-instr-generate -fcoverage-mapping>
    $<$<BOOL:${profile}>:-pg>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)

target_link_libraries (opts
  INTERFACE
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-fprofile-arcs -ftest-coverage>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-fprofile-instr-generate -fcoverage-mapping>
    $<$<BOOL:${profile}>:-pg>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)

39
Builds/CMake/RippledMultiConfig.cmake
Normal file
@@ -0,0 +1,39 @@
#[===================================================================[
   multiconfig misc
#]===================================================================]

if (is_multiconfig)
  # This code finds all source files in the src subdirectory for inclusion
  # in the IDE file tree as non-compiled sources. Since this file list will
  # have some overlap with files we have already added to our targets to
  # be compiled, we explicitly remove any of these target source files from
  # this list.
  file (GLOB_RECURSE all_sources RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
    CONFIGURE_DEPENDS
    src/*.* Builds/*.md docs/*.md src/*.md Builds/*.cmake)
  file(GLOB md_files RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} CONFIGURE_DEPENDS
    *.md)
  LIST(APPEND all_sources ${md_files})
  foreach (_target secp256k1::secp256k1 ed25519::ed25519 pbufs xrpl_core rippled)
    get_target_property (_type ${_target} TYPE)
    if(_type STREQUAL "INTERFACE_LIBRARY")
      continue()
    endif()
    get_target_property (_src ${_target} SOURCES)
    list (REMOVE_ITEM all_sources ${_src})
  endforeach ()
  target_sources (rippled PRIVATE ${all_sources})
  set_property (
    SOURCE ${all_sources}
    APPEND
    PROPERTY HEADER_FILE_ONLY true)
  if (MSVC)
    set_property(
      DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
      PROPERTY VS_STARTUP_PROJECT rippled)
  endif ()

  group_sources(src)
  group_sources(docs)
  group_sources(Builds)
endif ()
@@ -2,8 +2,6 @@
   convenience variables and sanity checks
#]===================================================================]

include(ProcessorCount)

if (NOT ep_procs)
  ProcessorCount(ep_procs)
  if (ep_procs GREATER 1)
122
Builds/CMake/RippledSettings.cmake
Normal file
@@ -0,0 +1,122 @@
#[===================================================================[
   declare user options/settings
#]===================================================================]

option (assert "Enables asserts, even in release builds" OFF)

option (reporting "Build rippled with reporting mode enabled" OFF)

option (tests "Build tests" ON)

option (unity "Creates a build using UNITY support in cmake. This is the default" ON)
if (unity)
  if (NOT is_ci)
    set (CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
  endif ()
endif ()
if (is_gcc OR is_clang)
  option (coverage "Generates coverage info." OFF)
  option (profile "Add profiling flags" OFF)
  set (coverage_test "" CACHE STRING
    "On gcc & clang, the specific unit test(s) to run for coverage. Default is all tests.")
  if (coverage_test AND NOT coverage)
    set (coverage ON CACHE BOOL "gcc/clang only" FORCE)
  endif ()
  option (coverage_core_only
    "Include only src/ripple files when generating coverage report. \
Set to OFF to include all sources in coverage report."
    ON)
  option (wextra "compile with extra gcc/clang warnings enabled" ON)
else ()
  set (profile OFF CACHE BOOL "gcc/clang only" FORCE)
  set (coverage OFF CACHE BOOL "gcc/clang only" FORCE)
  set (wextra OFF CACHE BOOL "gcc/clang only" FORCE)
endif ()
if (is_linux)
  option (BUILD_SHARED_LIBS "build shared ripple libraries" OFF)
  option (static "link protobuf, openssl, libc++, and boost statically" ON)
  option (perf "Enables flags that assist with perf recording" OFF)
  option (use_gold "enables detection of gold (binutils) linker" ON)
else ()
  # we are not ready to allow shared-libs on windows because it would require
  # export declarations. On macos it's more feasible, but static openssl
  # produces odd linker errors, thus we disable shared lib builds for now.
  set (BUILD_SHARED_LIBS OFF CACHE BOOL "build shared ripple libraries - OFF for win/macos" FORCE)
  set (static ON CACHE BOOL "static link, linux only. ON for WIN/macos" FORCE)
  set (perf OFF CACHE BOOL "perf flags, linux only" FORCE)
  set (use_gold OFF CACHE BOOL "gold linker, linux only" FORCE)
endif ()
if (is_clang)
  option (use_lld "enables detection of lld linker" ON)
else ()
  set (use_lld OFF CACHE BOOL "try lld linker, clang only" FORCE)
endif ()
option (jemalloc "Enables jemalloc for heap profiling" OFF)
option (werr "treat warnings as errors" OFF)
option (local_protobuf
  "Force a local build of protobuf instead of looking for an installed version." OFF)
option (local_grpc
  "Force a local build of gRPC instead of looking for an installed version." OFF)

# this one is a string and therefore can't be an option
set (san "" CACHE STRING "On gcc & clang, add sanitizer instrumentation")
set_property (CACHE san PROPERTY STRINGS ";undefined;memory;address;thread")
if (san)
  string (TOLOWER ${san} san)
  set (SAN_FLAG "-fsanitize=${san}")
  set (SAN_LIB "")
  if (is_gcc)
    if (san STREQUAL "address")
      set (SAN_LIB "asan")
    elseif (san STREQUAL "thread")
      set (SAN_LIB "tsan")
    elseif (san STREQUAL "memory")
      set (SAN_LIB "msan")
    elseif (san STREQUAL "undefined")
      set (SAN_LIB "ubsan")
    endif ()
  endif ()
  set (_saved_CRL ${CMAKE_REQUIRED_LIBRARIES})
  set (CMAKE_REQUIRED_LIBRARIES "${SAN_FLAG};${SAN_LIB}")
  check_cxx_compiler_flag (${SAN_FLAG} COMPILER_SUPPORTS_SAN)
  set (CMAKE_REQUIRED_LIBRARIES ${_saved_CRL})
  if (NOT COMPILER_SUPPORTS_SAN)
    message (FATAL_ERROR "${san} sanitizer does not seem to be supported by your compiler")
  endif ()
endif ()
set (container_label "" CACHE STRING "tag to use for package building containers")
option (packages_only
  "ONLY generate package building targets. This is special use-case and almost \
certainly not what you want. Use with caution as you won't be able to build \
any compiled targets locally." OFF)
option (have_package_container
  "Sometimes you already have the tagged container you want to use for package \
building and you don't want docker to rebuild it. This flag will detach the \
dependency of the package build from the container build. It's an advanced \
use case and most likely you should not be touching this flag." OFF)

# the remaining options are obscure and rarely used
option (beast_no_unit_test_inline
  "Prevents unit test definitions from being inserted into global table"
  OFF)
option (single_io_service_thread
  "Restricts the number of threads calling io_service::run to one. \
This can be useful when debugging."
  OFF)
option (boost_show_deprecated
  "Allow boost to fail on deprecated usage. Only useful if you're trying\
to find deprecated calls."
  OFF)
option (beast_hashers
  "Use local implementations for sha/ripemd hashes (experimental, not recommended)"
  OFF)

if (WIN32)
  option (beast_disable_autolink "Disables autolinking of system libraries on WIN32" OFF)
else ()
  set (beast_disable_autolink OFF CACHE BOOL "WIN32 only" FORCE)
endif ()
if (coverage)
  message (STATUS "coverage build requested - forcing Debug build")
  set (CMAKE_BUILD_TYPE Debug CACHE STRING "build type" FORCE)
endif ()
@@ -2,9 +2,9 @@ option (validator_keys "Enables building of validator-keys-tool as a separate ta

if (validator_keys)
  git_branch (current_branch)
  # default to tracking VK master branch unless we are on release
  if (NOT (current_branch STREQUAL "release"))
    set (current_branch "master")
  # default to tracking VK develop branch unless we are on master/release
  if (NOT (current_branch STREQUAL "master" OR current_branch STREQUAL "release"))
    set (current_branch "develop")
  endif ()
  message (STATUS "tracking ValidatorKeys branch: ${current_branch}")

15
Builds/CMake/RippledVersion.cmake
Normal file
@@ -0,0 +1,15 @@
#[===================================================================[
   read version from source
#]===================================================================]

file (STRINGS src/ripple/protocol/impl/BuildInfo.cpp BUILD_INFO)
foreach (line_ ${BUILD_INFO})
  if (line_ MATCHES "versionString[ ]*=[ ]*\"(.+)\"")
    set (rippled_version ${CMAKE_MATCH_1})
  endif ()
endforeach ()
if (rippled_version)
  message (STATUS "rippled version: ${rippled_version}")
else ()
  message (FATAL_ERROR "unable to determine rippled version")
endif ()
106
Builds/CMake/SociConfig.cmake.patched
Normal file
@@ -0,0 +1,106 @@
################################################################################
# SociConfig.cmake - CMake build configuration of SOCI library
################################################################################
# Copyright (C) 2010 Mateusz Loskot <mateusz@loskot.net>
#
# Distributed under the Boost Software License, Version 1.0.
# (See accompanying file LICENSE_1_0.txt or copy at
# http://www.boost.org/LICENSE_1_0.txt)
################################################################################

include(CheckCXXSymbolExists)

if(WIN32)
  check_cxx_symbol_exists("_M_AMD64" "" SOCI_TARGET_ARCH_X64)
  if(NOT RTC_ARCH_X64)
    check_cxx_symbol_exists("_M_IX86" "" SOCI_TARGET_ARCH_X86)
  endif(NOT RTC_ARCH_X64)
  # add check for arm here
  # see http://msdn.microsoft.com/en-us/library/b0084kay.aspx
else(WIN32)
  check_cxx_symbol_exists("__i386__" "" SOCI_TARGET_ARCH_X86)
  check_cxx_symbol_exists("__x86_64__" "" SOCI_TARGET_ARCH_X64)
  check_cxx_symbol_exists("__arm__" "" SOCI_TARGET_ARCH_ARM)
endif(WIN32)

if(NOT DEFINED LIB_SUFFIX)
  if(SOCI_TARGET_ARCH_X64)
    set(_lib_suffix "64")
  else()
    set(_lib_suffix "")
  endif()
  set(LIB_SUFFIX ${_lib_suffix} CACHE STRING "Specifies suffix for the lib directory")
endif()

#
# C++11 Option
#

if(NOT SOCI_CXX_C11)
  set (SOCI_CXX_C11 OFF CACHE BOOL "Build to the C++11 standard")
endif()

#
# Force compilation flags and set desired warnings level
#

if (MSVC)
  add_definitions(-D_CRT_SECURE_NO_DEPRECATE)
  add_definitions(-D_CRT_SECURE_NO_WARNINGS)
  add_definitions(-D_CRT_NONSTDC_NO_WARNING)
  add_definitions(-D_SCL_SECURE_NO_WARNINGS)

  if(CMAKE_CXX_FLAGS MATCHES "/W[0-4]")
    string(REGEX REPLACE "/W[0-4]" "/W4" CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
  else()
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} /W4 /we4266")
  endif()

else()

  set(SOCI_GCC_CLANG_COMMON_FLAGS "")
  # "-pedantic -Werror -Wno-error=parentheses -Wall -Wextra -Wpointer-arith -Wcast-align -Wcast-qual -Wfloat-equal -Woverloaded-virtual -Wredundant-decls -Wno-long-long")

  if (SOCI_CXX_C11)
    set(SOCI_CXX_VERSION_FLAGS "-std=c++11")
  else()
    set(SOCI_CXX_VERSION_FLAGS "-std=gnu++98")
  endif()

  if("${CMAKE_CXX_COMPILER_ID}" MATCHES "Clang" OR "${CMAKE_CXX_COMPILER}" MATCHES "clang")

    if(NOT CMAKE_CXX_COMPILER_VERSION LESS 3.1 AND SOCI_ASAN)
      set(SOCI_GCC_CLANG_COMMON_FLAGS "${SOCI_GCC_CLANG_COMMON_FLAGS} -fsanitize=address")
    endif()

    # enforce C++11 for Clang
    set(SOCI_CXX_C11 ON)
    set(SOCI_CXX_VERSION_FLAGS "-std=c++11")
    add_definitions(-DCATCH_CONFIG_CPP11_NO_IS_ENUM)

    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${SOCI_GCC_CLANG_COMMON_FLAGS} ${SOCI_CXX_VERSION_FLAGS}")

  elseif(CMAKE_COMPILER_IS_GNUCC OR CMAKE_COMPILER_IS_GNUCXX)

    if(NOT CMAKE_CXX_COMPILER_VERSION LESS 4.8 AND SOCI_ASAN)
      set(SOCI_GCC_CLANG_COMMON_FLAGS "${SOCI_GCC_CLANG_COMMON_FLAGS} -fsanitize=address")
    endif()

    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${SOCI_GCC_CLANG_COMMON_FLAGS} ${SOCI_CXX_VERSION_FLAGS} ")
    if (CMAKE_COMPILER_IS_GNUCXX)
      if (CMAKE_SYSTEM_NAME MATCHES "FreeBSD")
        set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS}")
      else()
        set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wno-variadic-macros")
      endif()
    endif()

  else()
    message(WARNING "Unknown toolset - using default flags to build SOCI")
  endif()

endif()

# Set SOCI_HAVE_* variables for soci-config.h generator
set(SOCI_HAVE_CXX_C11 ${SOCI_CXX_C11} CACHE INTERNAL "Enables C++11 support")
@@ -6,7 +6,6 @@ find_package(Boost 1.83 REQUIRED
  coroutine
  date_time
  filesystem
  json
  program_options
  regex
  system
@@ -30,7 +29,6 @@ target_link_libraries(ripple_boost
  Boost::coroutine
  Boost::date_time
  Boost::filesystem
  Boost::json
  Boost::program_options
  Boost::regex
  Boost::system
@@ -51,4 +49,4 @@ if(san AND is_clang)
    INTERFACE
      # ignore boost headers for sanitizing
      -fsanitize-blacklist=${CMAKE_CURRENT_BINARY_DIR}/san_bl.txt)
endif()
endif()
22
Builds/CMake/conan/Protobuf.cmake
Normal file
@@ -0,0 +1,22 @@
find_package(Protobuf 3.8)

file(MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set(ccbd ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${ccbd})

add_library(pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})
target_include_directories(pbufs SYSTEM PUBLIC
  ${CMAKE_BINARY_DIR}/proto_gen
  ${CMAKE_BINARY_DIR}/proto_gen/src/ripple/proto
)
target_link_libraries(pbufs protobuf::libprotobuf)
target_compile_options(pbufs
  PUBLIC
    $<$<BOOL:${XCODE}>:
      --system-header-prefix="google/protobuf"
      -Wno-deprecated-dynamic-exception-spec
    >
)
add_library(Ripple::pbufs ALIAS pbufs)
62
Builds/CMake/conan/gRPC.cmake
Normal file
@@ -0,0 +1,62 @@
find_package(gRPC 1.23)

#[=================================[
   generate protobuf sources for
   grpc defs and bundle into a
   static lib
#]=================================]
set(GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file(MAKE_DIRECTORY ${GRPC_GEN_DIR})
set(GRPC_PROTO_SRCS)
set(GRPC_PROTO_HDRS)
set(GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES LIST_DIRECTORIES false "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
  get_filename_component(_abs_file ${file} ABSOLUTE)
  get_filename_component(_abs_dir ${_abs_file} DIRECTORY)
  get_filename_component(_basename ${file} NAME_WE)
  get_filename_component(_proto_inc ${GRPC_PROTO_ROOT} DIRECTORY) # updir one level
  file(RELATIVE_PATH _rel_root_file ${_proto_inc} ${_abs_file})
  get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
  file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})

  set(src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
  set(src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
  set(hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
  set(hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
  add_custom_command(
    OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
    COMMAND protobuf::protoc
    ARGS --grpc_out=${GRPC_GEN_DIR}
      --cpp_out=${GRPC_GEN_DIR}
      --plugin=protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
      -I ${_proto_inc} -I ${_rel_dir}
      ${_abs_file}
    DEPENDS ${_abs_file} protobuf::protoc gRPC::grpc_cpp_plugin
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    COMMENT "Running gRPC C++ protocol buffer compiler on ${file}"
    VERBATIM)
  set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES GENERATED TRUE)
  list(APPEND GRPC_PROTO_SRCS ${src_1} ${src_2})
  list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()

add_library(grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories(grpc_pbufs PRIVATE src)
target_include_directories(grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries(grpc_pbufs
  "gRPC::grpc++"
  # libgrpc is missing references.
  absl::random_random
)
target_compile_options(grpc_pbufs
  PRIVATE
    $<$<BOOL:${MSVC}>:-wd4065>
    $<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
  PUBLIC
    $<$<BOOL:${MSVC}>:-wd4996>
    $<$<BOOL:${XCODE}>:
      --system-header-prefix="google/protobuf"
      -Wno-deprecated-dynamic-exception-spec
    >)
add_library(Ripple::grpc_pbufs ALIAS grpc_pbufs)
@@ -54,7 +54,6 @@ find_package(Boost 1.86 REQUIRED
  coroutine
  date_time
  filesystem
  json
  program_options
  regex
  system
@@ -78,7 +77,6 @@ target_link_libraries(ripple_boost
  Boost::coroutine
  Boost::date_time
  Boost::filesystem
  Boost::json
  Boost::iostreams
  Boost::program_options
  Boost::regex
@@ -3,13 +3,13 @@
#]===================================================================]

add_library (ed25519-donna STATIC
  external/ed25519-donna/ed25519.c)
  src/ed25519-donna/ed25519.c)
target_include_directories (ed25519-donna
  PUBLIC
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/external>
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
    $<INSTALL_INTERFACE:include>
  PRIVATE
    ${CMAKE_CURRENT_SOURCE_DIR}/external/ed25519-donna)
    ${CMAKE_CURRENT_SOURCE_DIR}/src/ed25519-donna)
#[=========================================================[
   NOTE for macos:
   https://github.com/floodyberry/ed25519-donna/issues/29
@@ -24,5 +24,5 @@ target_link_libraries (ripple_libs INTERFACE NIH::ed25519-donna)
#]===========================]
install (
  FILES
    external/ed25519-donna/ed25519.h
    src/ed25519-donna/ed25519.h
  DESTINATION include/ed25519-donna)
@@ -129,28 +129,27 @@ else ()
  endif ()
endif ()

set(output_dir ${CMAKE_BINARY_DIR}/proto_gen)
file(MAKE_DIRECTORY ${output_dir})
set(ccbd ${CMAKE_CURRENT_BINARY_DIR})
set(CMAKE_CURRENT_BINARY_DIR ${output_dir})
protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS src/ripple/proto/ripple.proto)
set(CMAKE_CURRENT_BINARY_DIR ${ccbd})
file (MAKE_DIRECTORY ${CMAKE_BINARY_DIR}/proto_gen)
set (save_CBD ${CMAKE_CURRENT_BINARY_DIR})
set (CMAKE_CURRENT_BINARY_DIR ${CMAKE_BINARY_DIR}/proto_gen)
protobuf_generate_cpp (
  PROTO_SRCS
  PROTO_HDRS
  src/ripple/proto/ripple.proto)
set (CMAKE_CURRENT_BINARY_DIR ${save_CBD})

target_include_directories(xrpl_core SYSTEM PUBLIC
  # The generated implementation imports the header relative to the output
  # directory.
  $<BUILD_INTERFACE:${output_dir}>
  $<BUILD_INTERFACE:${output_dir}/src>
)
target_sources(xrpl_core PRIVATE ${output_dir}/src/ripple/proto/ripple.pb.cc)
install(
  FILES ${output_dir}/src/ripple/proto/ripple.pb.h
  DESTINATION include/ripple/proto)
target_link_libraries(xrpl_core PUBLIC protobuf::libprotobuf)
target_compile_options(xrpl_core
add_library (pbufs STATIC ${PROTO_SRCS} ${PROTO_HDRS})

target_include_directories (pbufs PRIVATE src)
target_include_directories (pbufs
  SYSTEM PUBLIC ${CMAKE_BINARY_DIR}/proto_gen)
target_link_libraries (pbufs protobuf::libprotobuf)
target_compile_options (pbufs
  PUBLIC
    $<$<BOOL:${is_xcode}>:
      --system-header-prefix="google/protobuf"
      -Wno-deprecated-dynamic-exception-spec
    >
)
    >)
add_library (Ripple::pbufs ALIAS pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::pbufs)
exclude_if_included (pbufs)
@@ -64,13 +64,13 @@ if (local_rocksdb)
    PATCH_COMMAND
      # only used by windows build
      ${CMAKE_COMMAND} -E copy_if_different
        ${CMAKE_CURRENT_SOURCE_DIR}/cmake/rocks_thirdparty.inc
        ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/rocks_thirdparty.inc
        <SOURCE_DIR>/thirdparty.inc
    COMMAND
      # fixup their build version file to keep the values
      # from changing always
      ${CMAKE_COMMAND} -E copy_if_different
        ${CMAKE_CURRENT_SOURCE_DIR}/cmake/rocksdb_build_version.cc.in
        ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/rocksdb_build_version.cc.in
        <SOURCE_DIR>/util/build_version.cc.in
    CMAKE_ARGS
      -DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
@@ -24,7 +24,7 @@ else()
  set(INSTALL_SECP256K1 true)

  add_library (secp256k1 STATIC
    external/secp256k1/src/secp256k1.c)
    src/secp256k1/src/secp256k1.c)
  target_compile_definitions (secp256k1
    PRIVATE
      USE_NUM_NONE
@@ -34,9 +34,9 @@ else()
      USE_SCALAR_INV_BUILTIN)
  target_include_directories (secp256k1
    PUBLIC
      $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/external>
      $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
      $<INSTALL_INTERFACE:include>
    PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/external/secp256k1)
    PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}/src/secp256k1)
  target_compile_options (secp256k1
    PRIVATE
      $<$<BOOL:${MSVC}>:-wd4319>
@@ -51,7 +51,7 @@ else()
  #]===========================]
  install (
    FILES
      external/secp256k1/include/secp256k1.h
      src/secp256k1/include/secp256k1.h
    DESTINATION include/secp256k1/include)

  add_library (NIH::secp256k1 ALIAS secp256k1)
@@ -52,7 +52,7 @@ else()
  # whenever we update the GIT_TAG above.
  PATCH_COMMAND
    ${CMAKE_COMMAND} -D RIPPLED_SOURCE=${CMAKE_CURRENT_SOURCE_DIR}
      -P ${CMAKE_CURRENT_SOURCE_DIR}/cmake/soci_patch.cmake
      -P ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/soci_patch.cmake
  CMAKE_ARGS
    -DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
    -DCMAKE_C_COMPILER=${CMAKE_C_COMPILER}
@@ -61,7 +61,7 @@ else()
    $<$<BOOL:${VCPKG_TARGET_TRIPLET}>:-DVCPKG_TARGET_TRIPLET=${VCPKG_TARGET_TRIPLET}>
    $<$<BOOL:${unity}>:-DCMAKE_UNITY_BUILD=ON}>
    -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/sqlite3
    -DCMAKE_MODULE_PATH=${CMAKE_CURRENT_SOURCE_DIR}/cmake
    -DCMAKE_MODULE_PATH=${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake
    -DCMAKE_INCLUDE_PATH=$<JOIN:$<TARGET_PROPERTY:sqlite,INTERFACE_INCLUDE_DIRECTORIES>,::>
    -DCMAKE_LIBRARY_PATH=${sqlite_BINARY_DIR}
    -DCMAKE_DEBUG_POSTFIX=_d
@@ -37,7 +37,7 @@ else()
  # for the single amalgamation source file.
  PATCH_COMMAND
    ${CMAKE_COMMAND} -E copy_if_different
      ${CMAKE_CURRENT_SOURCE_DIR}/cmake/CMake_sqlite3.txt
      ${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/CMake_sqlite3.txt
      <SOURCE_DIR>/CMakeLists.txt
  CMAKE_ARGS
    -DCMAKE_CXX_COMPILER=${CMAKE_CXX_COMPILER}
@@ -314,33 +314,25 @@ endif ()
   grpc defs and bundle into a
   static lib
#]=================================]
set(output_dir "${CMAKE_BINARY_DIR}/proto_gen_grpc")
set(GRPC_GEN_DIR "${output_dir}/ripple/proto")
file(MAKE_DIRECTORY ${GRPC_GEN_DIR})
set(GRPC_PROTO_SRCS)
set(GRPC_PROTO_HDRS)
set(GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES "${GRPC_PROTO_ROOT}/*.proto")
set (GRPC_GEN_DIR "${CMAKE_BINARY_DIR}/proto_gen_grpc")
file (MAKE_DIRECTORY ${GRPC_GEN_DIR})
set (GRPC_PROTO_SRCS)
set (GRPC_PROTO_HDRS)
set (GRPC_PROTO_ROOT "${CMAKE_CURRENT_SOURCE_DIR}/src/ripple/proto/org")
file(GLOB_RECURSE GRPC_DEFINITION_FILES LIST_DIRECTORIES false "${GRPC_PROTO_ROOT}/*.proto")
foreach(file ${GRPC_DEFINITION_FILES})
  # /home/user/rippled/src/ripple/proto/org/.../v1/get_ledger.proto
  get_filename_component(_abs_file ${file} ABSOLUTE)
  # /home/user/rippled/src/ripple/proto/org/.../v1
  get_filename_component(_abs_dir ${_abs_file} DIRECTORY)
  # get_ledger
  get_filename_component(_basename ${file} NAME_WE)
  # /home/user/rippled/src/ripple/proto
  get_filename_component(_proto_inc ${GRPC_PROTO_ROOT} DIRECTORY) # updir one level
  # org/.../v1/get_ledger.proto
  file(RELATIVE_PATH _rel_root_file ${_proto_inc} ${_abs_file})
  # org/.../v1
  get_filename_component(_rel_root_dir ${_rel_root_file} DIRECTORY)
  # src/ripple/proto/org/.../v1
  file(RELATIVE_PATH _rel_dir ${CMAKE_CURRENT_SOURCE_DIR} ${_abs_dir})

  set(src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
  set(src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
  set(hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
  set(hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
  set (src_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.cc")
  set (src_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.cc")
  set (hdr_1 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.grpc.pb.h")
  set (hdr_2 "${GRPC_GEN_DIR}/${_rel_root_dir}/${_basename}.pb.h")
  add_custom_command(
    OUTPUT ${src_1} ${src_2} ${hdr_1} ${hdr_2}
    COMMAND protobuf::protoc
@@ -353,32 +345,16 @@ foreach(file ${GRPC_DEFINITION_FILES})
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    COMMENT "Running gRPC C++ protocol buffer compiler on ${file}"
    VERBATIM)
  set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES
    GENERATED TRUE
    SKIP_UNITY_BUILD_INCLUSION ON
  )
  set_source_files_properties(${src_1} ${src_2} ${hdr_1} ${hdr_2} PROPERTIES GENERATED TRUE)
  list(APPEND GRPC_PROTO_SRCS ${src_1} ${src_2})
  list(APPEND GRPC_PROTO_HDRS ${hdr_1} ${hdr_2})
endforeach()

target_include_directories(xrpl_core SYSTEM PUBLIC
  $<BUILD_INTERFACE:${output_dir}>
  $<BUILD_INTERFACE:${output_dir}/ripple/proto>
  # The generated sources include headers relative to this path. Fix it later.
  $<INSTALL_INTERFACE:include/ripple/proto>
)
target_sources(xrpl_core PRIVATE ${GRPC_PROTO_SRCS})
install(
  DIRECTORY ${output_dir}/ripple
  DESTINATION include/
  FILES_MATCHING PATTERN "*.h"
)
target_link_libraries(xrpl_core PUBLIC
  "gRPC::grpc++"
  # libgrpc is missing references.
  absl::random_random
)
target_compile_options(xrpl_core
add_library (grpc_pbufs STATIC ${GRPC_PROTO_SRCS} ${GRPC_PROTO_HDRS})
#target_include_directories (grpc_pbufs PRIVATE src)
target_include_directories (grpc_pbufs SYSTEM PUBLIC ${GRPC_GEN_DIR})
target_link_libraries (grpc_pbufs protobuf::libprotobuf "gRPC::grpc++${grpc_suffix}")
target_compile_options (grpc_pbufs
  PRIVATE
    $<$<BOOL:${MSVC}>:-wd4065>
    $<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
@@ -388,5 +364,6 @@ target_compile_options(xrpl_core
    --system-header-prefix="google/protobuf"
    -Wno-deprecated-dynamic-exception-spec
  >)
# target_link_libraries (ripple_libs INTERFACE Ripple::grpc_pbufs)
# exclude_if_included (grpc_pbufs)
add_library (Ripple::grpc_pbufs ALIAS grpc_pbufs)
target_link_libraries (ripple_libs INTERFACE Ripple::grpc_pbufs)
exclude_if_included (grpc_pbufs)
17
Builds/CMake/echo_file.cmake
Normal file
@@ -0,0 +1,17 @@
#[=========================================================[
   This is a CMake script file that is used to write
   the contents of a file to stdout (using the cmake
   echo command). The input file is passed via the
   IN_FILE variable.
#]=========================================================]

if (EXISTS ${IN_FILE})
  file (READ ${IN_FILE} contents)
  ## only print files that actually have some text in them
  if (contents MATCHES "[a-z0-9A-Z]+")
    execute_process(
      COMMAND
        ${CMAKE_COMMAND} -E echo "${contents}")
  endif ()
endif ()

@@ -8,7 +8,7 @@
# those warnings.
if (RIPPLED_SOURCE)
  execute_process( COMMAND ${CMAKE_COMMAND} -E copy_if_different
    ${RIPPLED_SOURCE}/cmake/SociConfig.cmake.patched
    ${RIPPLED_SOURCE}/Builds/CMake/SociConfig.cmake.patched
    cmake/SociConfig.cmake )
endif ()

1
Builds/README.md
Normal file
@@ -0,0 +1 @@
|
||||
[Please see the BUILD instructions here](../BUILD.md)
|
||||
@@ -13,15 +13,12 @@ then
git clean -ix
fi

# Ensure all sorting is ASCII-order consistently across platforms.
export LANG=C

rm -rfv results
mkdir results
includes="$( pwd )/results/rawincludes.txt"
pushd ../..
echo Raw includes:
grep -r '#include.*/.*\.h' include src | \
grep -r '#include.*/.*\.h' src/ripple/ src/test/ | \
grep -v boost | tee ${includes}
popd
pushd results

@@ -1,3 +1,54 @@
Loop: ripple.app ripple.core
ripple.app > ripple.core

Loop: ripple.app ripple.ledger
ripple.app > ripple.ledger

Loop: ripple.app ripple.net
ripple.app > ripple.net

Loop: ripple.app ripple.nodestore
ripple.app > ripple.nodestore

Loop: ripple.app ripple.overlay
ripple.overlay ~= ripple.app

Loop: ripple.app ripple.peerfinder
ripple.app > ripple.peerfinder

Loop: ripple.app ripple.protocol
ripple.app > ripple.protocol

Loop: ripple.app ripple.rpc
ripple.rpc > ripple.app

Loop: ripple.app ripple.shamap
ripple.app > ripple.shamap

Loop: ripple.basics ripple.core
ripple.core > ripple.basics

Loop: ripple.basics ripple.json
ripple.json ~= ripple.basics

Loop: ripple.basics ripple.protocol
ripple.protocol > ripple.basics

Loop: ripple.core ripple.net
ripple.net > ripple.core

Loop: ripple.ledger ripple.protocol
ripple.ledger > ripple.protocol

Loop: ripple.net ripple.rpc
ripple.rpc > ripple.net

Loop: ripple.nodestore ripple.overlay
ripple.overlay ~= ripple.nodestore

Loop: ripple.overlay ripple.rpc
ripple.rpc ~= ripple.overlay

Loop: test.app test.jtx
test.app > test.jtx

@@ -10,48 +61,3 @@ Loop: test.jtx test.toplevel
Loop: test.jtx test.unit_test
test.unit_test == test.jtx

Loop: xrpl.basics xrpl.json
xrpl.json == xrpl.basics

Loop: xrpl.protocol xrpld.app
xrpld.app > xrpl.protocol

Loop: xrpld.app xrpld.core
xrpld.app > xrpld.core

Loop: xrpld.app xrpld.ledger
xrpld.app > xrpld.ledger

Loop: xrpld.app xrpld.net
xrpld.app > xrpld.net

Loop: xrpld.app xrpld.nodestore
xrpld.app > xrpld.nodestore

Loop: xrpld.app xrpld.overlay
xrpld.overlay ~= xrpld.app

Loop: xrpld.app xrpld.peerfinder
xrpld.app > xrpld.peerfinder

Loop: xrpld.app xrpld.rpc
xrpld.rpc > xrpld.app

Loop: xrpld.app xrpld.shamap
xrpld.app > xrpld.shamap

Loop: xrpld.core xrpld.net
xrpld.net > xrpld.core

Loop: xrpld.core xrpld.perflog
xrpld.perflog == xrpld.core

Loop: xrpld.net xrpld.rpc
xrpld.rpc > xrpld.net

Loop: xrpld.overlay xrpld.rpc
xrpld.rpc ~= xrpld.overlay

Loop: xrpld.perflog xrpld.rpc
xrpld.rpc ~= xrpld.perflog

@@ -1,200 +1,226 @@
libxrpl.basics > xrpl.basics
libxrpl.basics > xrpl.protocol
libxrpl.crypto > xrpl.basics
libxrpl.json > xrpl.basics
libxrpl.json > xrpl.json
libxrpl.protocol > xrpl.basics
libxrpl.protocol > xrpl.hook
libxrpl.protocol > xrpl.json
libxrpl.protocol > xrpl.protocol
libxrpl.resource > xrpl.basics
libxrpl.resource > xrpl.resource
libxrpl.server > xrpl.basics
libxrpl.server > xrpl.json
libxrpl.server > xrpl.protocol
libxrpl.server > xrpl.server
ripple.app > ripple.basics
ripple.app > ripple.beast
ripple.app > ripple.conditions
ripple.app > ripple.consensus
ripple.app > ripple.crypto
ripple.app > ripple.json
ripple.app > ripple.resource
ripple.app > test.unit_test
ripple.basics > ripple.beast
ripple.conditions > ripple.basics
ripple.conditions > ripple.protocol
ripple.consensus > ripple.basics
ripple.consensus > ripple.beast
ripple.consensus > ripple.json
ripple.consensus > ripple.protocol
ripple.core > ripple.beast
ripple.core > ripple.json
ripple.core > ripple.protocol
ripple.crypto > ripple.basics
ripple.json > ripple.beast
ripple.ledger > ripple.basics
ripple.ledger > ripple.beast
ripple.ledger > ripple.core
ripple.ledger > ripple.json
ripple.net > ripple.basics
ripple.net > ripple.beast
ripple.net > ripple.json
ripple.net > ripple.protocol
ripple.net > ripple.resource
ripple.nodestore > ripple.basics
ripple.nodestore > ripple.beast
ripple.nodestore > ripple.core
ripple.nodestore > ripple.json
ripple.nodestore > ripple.protocol
ripple.nodestore > ripple.unity
ripple.overlay > ripple.basics
ripple.overlay > ripple.beast
ripple.overlay > ripple.core
ripple.overlay > ripple.json
ripple.overlay > ripple.peerfinder
ripple.overlay > ripple.protocol
ripple.overlay > ripple.resource
ripple.overlay > ripple.server
ripple.peerfinder > ripple.basics
ripple.peerfinder > ripple.beast
ripple.peerfinder > ripple.core
ripple.peerfinder > ripple.protocol
ripple.perflog > ripple.basics
ripple.perflog > ripple.beast
ripple.perflog > ripple.core
ripple.perflog > ripple.json
ripple.perflog > ripple.nodestore
ripple.perflog > ripple.protocol
ripple.perflog > ripple.rpc
ripple.protocol > ripple.beast
ripple.protocol > ripple.crypto
ripple.protocol > ripple.json
ripple.resource > ripple.basics
ripple.resource > ripple.beast
ripple.resource > ripple.json
ripple.resource > ripple.protocol
ripple.rpc > ripple.basics
ripple.rpc > ripple.beast
ripple.rpc > ripple.core
ripple.rpc > ripple.crypto
ripple.rpc > ripple.json
ripple.rpc > ripple.ledger
ripple.rpc > ripple.nodestore
ripple.rpc > ripple.protocol
ripple.rpc > ripple.resource
ripple.rpc > ripple.server
ripple.rpc > ripple.shamap
ripple.server > ripple.basics
ripple.server > ripple.beast
ripple.server > ripple.crypto
ripple.server > ripple.json
ripple.server > ripple.protocol
ripple.shamap > ripple.basics
ripple.shamap > ripple.beast
ripple.shamap > ripple.crypto
ripple.shamap > ripple.nodestore
ripple.shamap > ripple.protocol
test.app > ripple.app
test.app > ripple.basics
test.app > ripple.beast
test.app > ripple.core
test.app > ripple.json
test.app > ripple.ledger
test.app > ripple.overlay
test.app > ripple.protocol
test.app > ripple.resource
test.app > ripple.rpc
test.app > test.toplevel
test.app > test.unit_test
test.app > xrpl.basics
test.app > xrpld.app
test.app > xrpld.core
test.app > xrpld.ledger
test.app > xrpld.overlay
test.app > xrpld.rpc
test.app > xrpl.hook
test.app > xrpl.json
test.app > xrpl.protocol
test.app > xrpl.resource
test.basics > ripple.basics
test.basics > ripple.beast
test.basics > ripple.json
test.basics > ripple.protocol
test.basics > ripple.rpc
test.basics > test.jtx
test.basics > test.unit_test
test.basics > xrpl.basics
test.basics > xrpld.perflog
test.basics > xrpld.rpc
test.basics > xrpl.json
test.basics > xrpl.protocol
test.beast > xrpl.basics
test.conditions > xrpl.basics
test.conditions > xrpld.conditions
test.beast > ripple.basics
test.beast > ripple.beast
test.conditions > ripple.basics
test.conditions > ripple.beast
test.conditions > ripple.conditions
test.consensus > ripple.app
test.consensus > ripple.basics
test.consensus > ripple.beast
test.consensus > ripple.consensus
test.consensus > ripple.core
test.consensus > ripple.ledger
test.consensus > ripple.protocol
test.consensus > test.csf
test.consensus > test.toplevel
test.consensus > test.unit_test
test.consensus > xrpl.basics
test.consensus > xrpld.app
test.consensus > xrpld.consensus
test.consensus > xrpld.core
test.consensus > xrpld.ledger
test.consensus > xrpl.protocol
test.core > ripple.basics
test.core > ripple.beast
test.core > ripple.core
test.core > ripple.crypto
test.core > ripple.json
test.core > ripple.server
test.core > test.jtx
test.core > test.toplevel
test.core > test.unit_test
test.core > xrpl.basics
test.core > xrpld.core
test.core > xrpld.perflog
test.core > xrpl.json
test.core > xrpl.server
test.csf > xrpl.basics
test.csf > xrpld.consensus
test.csf > xrpl.json
test.csf > xrpl.protocol
test.csf > ripple.basics
test.csf > ripple.beast
test.csf > ripple.consensus
test.csf > ripple.json
test.csf > ripple.protocol
test.json > ripple.beast
test.json > ripple.json
test.json > test.jtx
test.json > xrpl.json
test.jtx > xrpl.basics
test.jtx > xrpld.app
test.jtx > xrpld.consensus
test.jtx > xrpld.core
test.jtx > xrpld.ledger
test.jtx > xrpld.net
test.jtx > xrpld.rpc
test.jtx > xrpl.hook
test.jtx > xrpl.json
test.jtx > xrpl.protocol
test.jtx > xrpl.resource
test.jtx > xrpl.server
test.jtx > ripple.app
test.jtx > ripple.basics
test.jtx > ripple.beast
test.jtx > ripple.consensus
test.jtx > ripple.core
test.jtx > ripple.json
test.jtx > ripple.ledger
test.jtx > ripple.net
test.jtx > ripple.protocol
test.jtx > ripple.server
test.ledger > ripple.app
test.ledger > ripple.basics
test.ledger > ripple.beast
test.ledger > ripple.core
test.ledger > ripple.ledger
test.ledger > ripple.protocol
test.ledger > test.jtx
test.ledger > test.toplevel
test.ledger > xrpl.basics
test.ledger > xrpld.app
test.ledger > xrpld.core
test.ledger > xrpld.ledger
test.ledger > xrpl.protocol
test.net > ripple.net
test.net > test.jtx
test.net > test.toplevel
test.net > test.unit_test
test.nodestore > ripple.app
test.nodestore > ripple.basics
test.nodestore > ripple.beast
test.nodestore > ripple.core
test.nodestore > ripple.nodestore
test.nodestore > ripple.protocol
test.nodestore > ripple.unity
test.nodestore > test.jtx
test.nodestore > test.toplevel
test.nodestore > test.unit_test
test.nodestore > xrpl.basics
test.nodestore > xrpld.core
test.nodestore > xrpld.nodestore
test.nodestore > xrpld.unity
test.overlay > ripple.app
test.overlay > ripple.basics
test.overlay > ripple.beast
test.overlay > ripple.core
test.overlay > ripple.overlay
test.overlay > ripple.peerfinder
test.overlay > ripple.protocol
test.overlay > ripple.shamap
test.overlay > test.jtx
test.overlay > test.unit_test
test.overlay > xrpl.basics
test.overlay > xrpld.app
test.overlay > xrpld.overlay
test.overlay > xrpld.peerfinder
test.overlay > xrpld.shamap
test.overlay > xrpl.protocol
test.peerfinder > ripple.basics
test.peerfinder > ripple.beast
test.peerfinder > ripple.core
test.peerfinder > ripple.peerfinder
test.peerfinder > ripple.protocol
test.peerfinder > test.beast
test.peerfinder > test.unit_test
test.peerfinder > xrpl.basics
test.peerfinder > xrpld.core
test.peerfinder > xrpld.peerfinder
test.peerfinder > xrpl.protocol
test.protocol > ripple.basics
test.protocol > ripple.beast
test.protocol > ripple.crypto
test.protocol > ripple.json
test.protocol > ripple.protocol
test.protocol > test.toplevel
test.protocol > xrpl.basics
test.protocol > xrpl.json
test.protocol > xrpl.protocol
test.resource > ripple.basics
test.resource > ripple.beast
test.resource > ripple.resource
test.resource > test.unit_test
test.resource > xrpl.basics
test.resource > xrpl.resource
test.rpc > ripple.app
test.rpc > ripple.basics
test.rpc > ripple.beast
test.rpc > ripple.core
test.rpc > ripple.json
test.rpc > ripple.net
test.rpc > ripple.nodestore
test.rpc > ripple.overlay
test.rpc > ripple.protocol
test.rpc > ripple.resource
test.rpc > ripple.rpc
test.rpc > test.jtx
test.rpc > test.nodestore
test.rpc > test.toplevel
test.rpc > xrpl.basics
test.rpc > xrpld.app
test.rpc > xrpld.core
test.rpc > xrpld.net
test.rpc > xrpld.overlay
test.rpc > xrpld.rpc
test.rpc > xrpl.hook
test.rpc > xrpl.json
test.rpc > xrpl.protocol
test.rpc > xrpl.resource
test.server > ripple.app
test.server > ripple.basics
test.server > ripple.beast
test.server > ripple.core
test.server > ripple.json
test.server > ripple.rpc
test.server > ripple.server
test.server > test.jtx
test.server > test.toplevel
test.server > test.unit_test
test.server > xrpl.basics
test.server > xrpld.app
test.server > xrpld.core
test.server > xrpld.rpc
test.server > xrpl.json
test.server > xrpl.server
test.shamap > ripple.basics
test.shamap > ripple.beast
test.shamap > ripple.nodestore
test.shamap > ripple.protocol
test.shamap > ripple.shamap
test.shamap > test.unit_test
test.shamap > xrpl.basics
test.shamap > xrpld.nodestore
test.shamap > xrpld.shamap
test.shamap > xrpl.protocol
test.toplevel > ripple.json
test.toplevel > test.csf
test.toplevel > xrpl.json
test.unit_test > xrpl.basics
xrpl.hook > xrpl.basics
xrpl.protocol > xrpl.basics
xrpl.protocol > xrpl.json
xrpl.resource > xrpl.basics
xrpl.resource > xrpl.json
xrpl.resource > xrpl.protocol
xrpl.server > xrpl.basics
xrpl.server > xrpl.json
xrpl.server > xrpl.protocol
xrpld.app > test.unit_test
xrpld.app > xrpl.basics
xrpld.app > xrpld.conditions
xrpld.app > xrpld.consensus
xrpld.app > xrpld.perflog
xrpld.app > xrpl.hook
xrpld.app > xrpl.json
xrpld.app > xrpl.resource
xrpld.conditions > xrpl.basics
xrpld.conditions > xrpl.protocol
xrpld.consensus > xrpl.basics
xrpld.consensus > xrpl.json
xrpld.consensus > xrpl.protocol
xrpld.core > xrpl.basics
xrpld.core > xrpl.json
xrpld.core > xrpl.protocol
xrpld.ledger > xrpl.basics
xrpld.ledger > xrpld.core
xrpld.ledger > xrpl.json
xrpld.ledger > xrpl.protocol
xrpld.net > xrpl.basics
xrpld.net > xrpl.json
xrpld.net > xrpl.protocol
xrpld.net > xrpl.resource
xrpld.nodestore > xrpl.basics
xrpld.nodestore > xrpld.core
xrpld.nodestore > xrpld.unity
xrpld.nodestore > xrpl.json
xrpld.nodestore > xrpl.protocol
xrpld.overlay > xrpl.basics
xrpld.overlay > xrpld.core
xrpld.overlay > xrpld.peerfinder
xrpld.overlay > xrpld.perflog
xrpld.overlay > xrpl.json
xrpld.overlay > xrpl.protocol
xrpld.overlay > xrpl.resource
xrpld.overlay > xrpl.server
xrpld.peerfinder > xrpl.basics
xrpld.peerfinder > xrpld.core
xrpld.peerfinder > xrpl.protocol
xrpld.perflog > xrpl.basics
xrpld.perflog > xrpl.json
xrpld.perflog > xrpl.protocol
xrpld.rpc > xrpl.basics
xrpld.rpc > xrpld.core
xrpld.rpc > xrpld.ledger
xrpld.rpc > xrpld.nodestore
xrpld.rpc > xrpld.shamap
xrpld.rpc > xrpl.json
xrpld.rpc > xrpl.protocol
xrpld.rpc > xrpl.resource
xrpld.rpc > xrpl.server
xrpld.shamap > xrpl.basics
xrpld.shamap > xrpld.nodestore
xrpld.shamap > xrpl.protocol
test.unit_test > ripple.basics
test.unit_test > ripple.beast

@@ -12,19 +12,19 @@ endif()

# Fix "unrecognized escape" issues when passing CMAKE_MODULE_PATH on Windows.
file(TO_CMAKE_PATH "${CMAKE_MODULE_PATH}" CMAKE_MODULE_PATH)
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake")

if(POLICY CMP0144)
cmake_policy(SET CMP0144 NEW)
endif()

project (xrpl)
project (rippled)
set(Boost_NO_BOOST_CMAKE ON)

# make GIT_COMMIT_HASH define available to all sources
find_package(Git)
if(Git_FOUND)
execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git describe --always --abbrev=40
execute_process(COMMAND ${GIT_EXECUTABLE} describe --always --abbrev=40
OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE gch)
if(gch)
set(GIT_COMMIT_HASH "${gch}")
@@ -51,14 +51,15 @@ if(CMAKE_TOOLCHAIN_FILE)
endif()

if (NOT USE_CONAN)
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake/deps")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake")
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/Builds/CMake/deps")
endif()

include (CheckCXXCompilerFlag)
include (FetchContent)
include (ExternalProject)
include (CMakeFuncs) # must come *after* ExternalProject b/c it overrides one function in EP
include (ProcessorCount)
if (target)
message (FATAL_ERROR "The target option has been removed - use native cmake options to control build")
endif ()
@@ -82,10 +83,8 @@ include(RippledInterface)

###
if (NOT USE_CONAN)
set(SECP256K1_INSTALL TRUE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
add_subdirectory(src/secp256k1)
add_subdirectory(src/ed25519-donna)
include(deps/Boost)
include(deps/OpenSSL)
# include(deps/Secp256k1)
@@ -94,11 +93,12 @@ if (NOT USE_CONAN)
include(deps/Libarchive)
include(deps/Sqlite)
include(deps/Soci)
include(deps/Snappy)
include(deps/Rocksdb)
include(deps/Nudb)
include(deps/date)
# include(deps/Protobuf)
# include(deps/gRPC)
include(deps/Protobuf)
include(deps/gRPC)
include(deps/cassandra)
include(deps/Postgres)
include(deps/WasmEdge)
@@ -108,11 +108,8 @@ else()
set_target_properties(OpenSSL::SSL PROPERTIES
INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
set(SECP256K1_INSTALL TRUE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
find_package(gRPC REQUIRED)
add_subdirectory(src/secp256k1)
add_subdirectory(src/ed25519-donna)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
# We need to pull the include directories and imported location properties
@@ -120,6 +117,7 @@ else()
find_package(LibArchive REQUIRED)
find_package(SOCI REQUIRED)
find_package(SQLite3 REQUIRED)
find_package(Snappy REQUIRED)
find_package(wasmedge REQUIRED)
option(rocksdb "Enable RocksDB" ON)
if(rocksdb)
@@ -131,7 +129,9 @@ else()
endif()
find_package(nudb REQUIRED)
find_package(date REQUIRED)
find_package(xxHash REQUIRED)
find_package(BLAKE3 REQUIRED)
include(conan/Protobuf)
include(conan/gRPC)
if(TARGET nudb::core)
set(nudb nudb::core)
elseif(TARGET NuDB::nudb)
@@ -141,29 +141,33 @@ else()
endif()
target_link_libraries(ripple_libs INTERFACE ${nudb})

if(reporting)
find_package(cassandra-cpp-driver REQUIRED)
find_package(PostgreSQL REQUIRED)
target_link_libraries(ripple_libs INTERFACE
cassandra-cpp-driver::cassandra-cpp-driver
PostgreSQL::PostgreSQL
)
endif()
target_link_libraries(ripple_libs INTERFACE
ed25519::ed25519
LibArchive::LibArchive
lz4::lz4
OpenSSL::Crypto
OpenSSL::SSL
# Ripple::grpc_pbufs
# Ripple::pbufs
Ripple::grpc_pbufs
Ripple::pbufs
secp256k1::secp256k1
soci::soci
SQLite::SQLite3
)
endif()

if(coverage)
include(RippledCov)
endif()
###

set(PROJECT_EXPORT_SET RippleExports)
include(RippledCore)
if (NOT USE_CONAN)
include(deps/Protobuf)
include(deps/gRPC)
endif()
include(RippledInstall)
include(RippledCov)
include(RippledMultiConfig)
include(RippledDocs)
include(RippledValidatorKeys)

255
CONTRIBUTING.md
@@ -57,12 +57,12 @@ Ensure that your code compiles according to the build instructions in the
[`documentation`](https://docs.xahau.network/infrastructure/building-xahau).
If you create new source files, they must go under `src/ripple`.
You will need to add them to one of the
[source lists](./cmake/RippledCore.cmake) in CMake.
[source lists](./Builds/CMake/RippledCore.cmake) in CMake.

Please write tests for your code.
If you create new test source files, they must go under `src/test`.
You will need to add them to one of the
[source lists](./cmake/RippledCore.cmake) in CMake.
[source lists](./Builds/CMake/RippledCore.cmake) in CMake.
If your test can be run offline, in under 60 seconds, then it can be an
automatic test run by `rippled --unittest`.
Otherwise, it must be a manual test.
@@ -71,235 +71,27 @@ The source must be formatted according to the style guide below.

Header includes must be [levelized](./Builds/levelization).

Changes should usually be squashed down into a single commit.
Some larger or more complicated change sets make more sense,
and are easier to review, if organized into multiple logical commits.
Either way, all commits should fit the following criteria:
* Changes should be presented in a single commit or a logical
  sequence of commits.
  Specifically, chronological commits that simply
  reflect the history of how the author implemented
  the change, "warts and all", are not useful to
  reviewers.
* Every commit should have a [good message](#good-commit-messages)
  to explain a specific aspect of the change.
* Every commit should be signed.
* Every commit should be well-formed (builds successfully,
  unit tests passing), as this helps to resolve merge
  conflicts, and makes it easier to use `git bisect`
  to find bugs.

### Good commit messages

Refer to
["How to Write a Git Commit Message"](https://cbea.ms/git-commit/)
for general rules on writing a good commit message.

In addition to those guidelines, please add one of the following
prefixes to the subject line if appropriate.
* `fix:` - The primary purpose is to fix an existing bug.
* `perf:` - The primary purpose is performance improvements.
* `refactor:` - The changes refactor code without affecting
  functionality.
* `test:` - The changes _only_ affect unit tests.
* `docs:` - The changes _only_ affect documentation. This can
  include code comments in addition to `.md` files like this one.
* `build:` - The changes _only_ affect the build process,
  including CMake and/or Conan settings.
* `chore:` - Other tasks that don't affect the binary, but don't fit
  any of the other cases. e.g. formatting, git settings, updating
  Github Actions jobs.

Whenever possible, when updating commits after the PR is open, please
add the PR number to the end of the subject line. e.g. `test: Add
unit tests for Feature X (#1234)`.

## Pull requests

In general, pull requests use `develop` as the base branch.

(Hotfixes are an exception.)

If your changes are not quite ready, but you want to make them easily available
for preliminary examination or review, you can create a "Draft" pull request.
While a pull request is marked as a "Draft", you can rebase or reorganize the
commits in the pull request as desired.

Github pull requests are created as "Ready" by default, or you can mark
a "Draft" pull request as "Ready".
Once a pull request is marked as "Ready",
any changes must be added as new commits. Do not
force-push to a branch in a pull request under review.
(This includes rebasing your branch onto the updated base branch.
Use a merge operation instead, or hit the "Update branch" button
at the bottom of the Github PR page.)
Changes to pull requests must be added as new commits.
Once code reviewers have started looking at your code, please avoid
force-pushing a branch in a pull request.
This preserves the ability for reviewers to filter changes since their last
review.

A pull request must obtain **approvals from at least two reviewers**
before it can be considered for merge by a Maintainer.
A pull request must obtain **approvals from at least two reviewers** before it
can be considered for merge by a Maintainer.
Maintainers retain discretion to require more approvals if they feel the
credibility of the existing approvals is insufficient.

Pull requests must be merged by [squash-and-merge][2]
to preserve a linear history for the `develop` branch.

### When and how to merge pull requests

#### "Passed"

A pull request should only have the "Passed" label added when it
meets a few criteria:

1. It must have two approving reviews [as described
   above](#pull-requests). (Exception: PRs that are deemed "trivial"
   only need one approval.)
2. All CI checks must be complete and passed. (One-off failures may
   be acceptable if they are related to a known issue.)
3. The PR must have a [good commit message](#good-commit-messages).
   * If the PR started with a good commit message, and it doesn't
     need to be updated, the author can indicate that in a comment.
   * Any contributor, preferably the author, can leave a comment
     suggesting a commit message.
   * If the author squashes and rebases the code in preparation for
     merge, they should also ensure the commit message(s) are updated
     as well.
4. The PR branch must be up to date with the base branch (usually
   `develop`). This is usually accomplished by merging the base branch
   into the feature branch, but if the other criteria are met, the
   changes can be squashed and rebased on top of the base branch.
5. Finally, and most importantly, the author of the PR must
   positively indicate that the PR is ready to merge. That can be
   accomplished by adding the "Passed" label if their role allows,
   or by leaving a comment to the effect that the PR is ready to
   merge.

Once the "Passed" label is added, a maintainer may merge the PR at
any time, so don't use it lightly.

#### Instructions for maintainers

The maintainer should double-check that the PR has met all the
necessary criteria, and can request additional information from the
owner, or additional reviews, and can always feel free to remove the
"Passed" label if appropriate. The maintainer has final say on
whether a PR gets merged, and is encouraged to communicate any
issues or concerns to other maintainers.

##### Most pull requests: "Squash and merge"

Most pull requests don't need special handling, and can simply be
merged using the "Squash and merge" button on the Github UI. Update
the suggested commit message if necessary.

##### Slightly more complicated pull requests

Some pull requests need to be pushed to `develop` as more than one
commit. There are multiple ways to accomplish this. If the author
describes a process, and it is reasonable, follow it. Otherwise, do
a fast forward only merge (`--ff-only`) on the command line and push.

Either way, check that:
* The commits are based on the current tip of `develop`.
* The commits are clean: No merge commits (except when reverse
  merging), no "[FOLD]" or "fixup!" messages.
* All commits are signed. If the commits are not signed by the author, use
  `git commit --amend -S` to sign them yourself.
* At least one (but preferably all) of the commits has the PR number
  in the commit message.

**Never use the "Create a merge commit" or "Rebase and merge"
functions!**

##### Releases, release candidates, and betas

All releases, including release candidates and betas, are handled
differently from typical PRs. Most importantly, never use
the Github UI to merge a release.

1. There are two possible conditions that the `develop` branch will
   be in when preparing a release.
   1. Ready or almost ready to go: There may be one or two PRs that
      need to be merged, but otherwise, the only change needed is to
      update the version number in `BuildInfo.cpp`. In this case,
      merge those PRs as appropriate, updating the second one, and
      waiting for CI to finish in between. Then update
      `BuildInfo.cpp`.
   2. Several pending PRs: In this case, do not use the Github UI,
      because the delays waiting for CI in between each merge will be
      unnecessarily onerous. Instead, create a working branch (e.g.
      `develop-next`) based off of `develop`. Squash the changes
      from each PR onto the branch, one commit each (unless
      more are needed), being sure to sign each commit and update
      the commit message to include the PR number. You may be able
      to use a fast-forward merge for the first PR. The workflow may
      look something like:
      ```
      git fetch upstream
      git checkout upstream/develop
      git checkout -b develop-next
      # Use -S on the ff-only merge if prbranch1 isn't signed.
      # Or do another branch first.
      git merge --ff-only user1/prbranch1
      git merge --squash user2/prbranch2
      git commit -S
      git merge --squash user3/prbranch3
      git commit -S
      [...]
      git push --set-upstream origin develop-next
      ```
2. Create the Pull Request with `release` as the base branch. If any
   of the included PRs are still open,
   [use closing keywords](https://docs.github.com/articles/closing-issues-using-keywords)
   in the description to ensure they are closed when the code is
   released. e.g. "Closes #1234"
3. Instead of the default template, reuse and update the message from
   the previous release. Include the following verbiage somewhere in
   the description:
   ```
   The base branch is release. All releases (including betas) go in
   release. This PR will be merged with --ff-only (not squashed or
   rebased, and not using the GitHub UI) to both release and develop.
   ```
4. Sign-offs for the three platforms usually occur offline, but at
   least one approval will be needed on the PR.
5. Once everything is ready to go, open a terminal, and do the
   fast-forward merges manually. Do not push any branches until you
   verify that all of them update correctly.
   ```
   git fetch upstream
   git checkout -b upstream--develop -t upstream/develop || git checkout upstream--develop
   git reset --hard upstream/develop
   # develop-next must be signed already!
   git merge --ff-only origin/develop-next
   git checkout -b upstream--release -t upstream/release || git checkout upstream--release
   git reset --hard upstream/release
   git merge --ff-only origin/develop-next
   # Only do these 3 steps if pushing a release. No betas or RCs
   git checkout -b upstream--master -t upstream/master || git checkout upstream--master
   git reset --hard upstream/master
   git merge --ff-only origin/develop-next
   # Check that all of the branches are updated
   git log -1 --oneline
   # The output should look like:
   # 02ec8b7962 (HEAD -> upstream--master, origin/develop-next, upstream--release, upstream--develop, develop-next) Set version to 2.2.0-rc1
   # Note that all of the upstream--develop/release/master are on this commit.
   # (Master will be missing for betas, etc.)
   # Just to be safe, do a dry run first:
   git push --dry-run upstream-push HEAD:develop
   git push --dry-run upstream-push HEAD:release
   # git push --dry-run upstream-push HEAD:master
   # Now push
   git push upstream-push HEAD:develop
   git push upstream-push HEAD:release
   # git push upstream-push HEAD:master
   # Don't forget to tag the release, too.
   git tag <version number>
   git push upstream-push <version number>
   ```
6. Finally
   [create a new release on Github](https://github.com/XRPLF/rippled/releases).


# Style guide

@@ -325,41 +117,12 @@ this:

You can format individual files in place by running `clang-format -i <file>...`
from any directory within this project.

There is a Continuous Integration job that runs clang-format on pull requests. If the code doesn't comply, a patch file that corrects auto-fixable formatting issues is generated.

To download the patch file:

1. Next to `clang-format / check (pull_request) Failing after #s` -> click **Details** to open the details page.
2. Left menu -> click **Summary**
3. Scroll down to near the bottom-right under `Artifacts` -> click **clang-format.patch**
4. Download the zip file and extract it to your local git repository. Run `git apply [patch-file-name]`.
5. Commit and push.

You can install a pre-commit hook to automatically run `clang-format` before every commit:
```
pip3 install pre-commit
pre-commit install
```

## Unit Tests
To execute all unit tests:

```rippled --unittest --unittest-jobs=<number of cores>```

(Note: Using multiple cores on a Mac M1 can cause spurious test failures. The
cause is still under investigation. If you observe this problem, try specifying fewer jobs.)

To run a specific set of test suites:

```
rippled --unittest TestSuiteName
```
Note: In this example, all tests with prefix `TestSuiteName` will be run, so if
`TestSuiteName1` and `TestSuiteName2` both exist, then both tests will run.
Alternatively, if the given name matches a unit test exactly, partial
matching stops: if a unit test with a title of `TestSuiteName`
exists, then no other unit test will be executed, apart from `TestSuiteName`.

## Avoid

1. Proliferation of nearly identical code.
@@ -419,4 +182,4 @@ existing maintainer without a vote.


[1]: https://docs.github.com/en/get-started/quickstart/contributing-to-projects
[2]: https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/incorporating-changes-from-a-pull-request/about-pull-request-merges#squash-and-merge-your-commits
217
DISABLE_OLD_ENTRIES.md
Normal file
@@ -0,0 +1,217 @@
# Auto-Disable Strategy for Hash Migration

## Core Concept
Instead of trying to fix entries with hardcoded old keys, **automatically disable them** during migration. If it contains old keys, it's broken anyway - make that explicit.

## The Algorithm

### Phase 1: Build Complete Old Key Set
```cpp
std::unordered_set<uint256> all_old_keys;

// Collect ALL SHA-512 keys from current state
stateMap_.visitLeaves([&](SHAMapItem const& item) {
    all_old_keys.insert(item.key());

    // Also collect keys from reference fields
    SerialIter sit(item.slice());
    auto sle = std::make_shared<SLE>(sit, item.key());

    // Collect from vector fields
    if (sle->isFieldPresent(sfIndexes)) {
        for (auto& key : sle->getFieldV256(sfIndexes)) {
            all_old_keys.insert(key);
        }
    }
    // ... check all other reference fields
});
```

### Phase 2: Scan and Disable

#### Hook Definitions (WASM Code)
```cpp
bool scanWASMForKeys(Blob const& wasm_code, std::unordered_set<uint256> const& keys) {
    // Blobs shorter than a key cannot contain one (this guard also
    // prevents size_t underflow in the loop bound below)
    if (wasm_code.size() < 32)
        return false;
    // Scan for 32-byte sequences matching known keys
    for (size_t i = 0; i <= wasm_code.size() - 32; i++) {
        uint256 potential_key = extract32Bytes(wasm_code, i);
        if (keys.count(potential_key)) {
            return true; // Found hardcoded key!
        }
    }
    return false;
}

// During migration
for (auto& hookDef : allHookDefinitions) {
    if (scanWASMForKeys(hookDef->getFieldBlob(sfCreateCode), all_old_keys)) {
        hookDef->setFieldU32(sfFlags, hookDef->getFlags() | HOOK_DISABLED_OLD_KEYS);
        disabled_hooks.push_back(hookDef->key());
    }
}
```
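`extract32Bytes` is not defined in this document; a minimal sketch of what such a helper could look like, assuming `Blob` is a contiguous byte container and `uint256` exposes writable storage via `begin()` (as rippled's `base_uint` does):

```cpp
#include <cstring>

// Hypothetical helper: copy the 32 bytes at `offset` into a uint256.
// The caller guarantees offset + 32 <= data.size().
uint256 extract32Bytes(Blob const& data, size_t offset)
{
    uint256 result;
    std::memcpy(result.begin(), data.data() + offset, 32);
    return result;
}
```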
#### Hook State (Arbitrary Data)
```cpp
for (auto& hookState : allHookStates) {
    auto data = hookState->getFieldBlob(sfHookStateData);
    if (containsAnyKey(data, all_old_keys)) {
        hookState->setFieldU32(sfFlags, STATE_INVALID_OLD_KEYS);
        disabled_states.push_back(hookState->key());
    }
}
```
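`containsAnyKey` is likewise left undefined above; under the same assumptions it could be a sliding-window scan, shared with the WASM case:

```cpp
// Hypothetical helper: true if any 32-byte window of `data` matches a
// key in `keys`. Same short-blob guard as scanWASMForKeys.
bool containsAnyKey(Blob const& data, std::unordered_set<uint256> const& keys)
{
    if (data.size() < 32)
        return false;
    for (size_t i = 0; i + 32 <= data.size(); ++i)
        if (keys.count(extract32Bytes(data, i)))
            return true;
    return false;
}
```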
#### Other Vulnerable Entry Types
```cpp
void disableEntriesWithOldKeys(SLE& sle) {
    switch(sle.getType()) {
        case ltHOOK:
            if (hasOldKeys(sle)) {
                sle.setFlag(HOOK_DISABLED_MIGRATION);
            }
            break;

        case ltESCROW:
            // Check if destination/condition contains old keys
            if (containsOldKeyReferences(sle)) {
                sle.setFlag(ESCROW_FROZEN_MIGRATION);
            }
            break;

        case ltPAYCHAN:
            // Payment channels with old key references
            if (hasOldKeyInFields(sle)) {
                sle.setFlag(PAYCHAN_SUSPENDED_MIGRATION);
            }
            break;

        case ltHOOK_STATE:
            // Already handled above
            break;
    }
}
```

## Flag Definitions

```cpp
// New flags for migration-disabled entries
constexpr uint32_t HOOK_DISABLED_OLD_KEYS = 0x00100000;
constexpr uint32_t STATE_INVALID_OLD_KEYS = 0x00200000;
constexpr uint32_t ESCROW_FROZEN_MIGRATION = 0x00400000;
constexpr uint32_t PAYCHAN_SUSPENDED_MIGRATION = 0x00800000;
constexpr uint32_t ENTRY_BROKEN_MIGRATION = 0x01000000;
```

## Execution Prevention

```cpp
// In transaction processing
TER HookExecutor::executeHook(Hook const& hook) {
    if (hook.isFieldPresent(sfFlags)) {
        if (hook.getFlags() & HOOK_DISABLED_OLD_KEYS) {
            return tecHOOK_DISABLED_MIGRATION;
        }
    }
    // Normal execution
}

TER processEscrow(Escrow const& escrow) {
    if (escrow.getFlags() & ESCROW_FROZEN_MIGRATION) {
        return tecESCROW_FROZEN_MIGRATION;
    }
    // Normal processing
}
```

## Re-enabling Process

### For Hooks
Developer must submit a new SetHook transaction with updated WASM:
```cpp
TER SetHook::doApply() {
    // If hook was disabled for migration
    if (oldHook->getFlags() & HOOK_DISABLED_OLD_KEYS) {
        // Verify new WASM doesn't contain old keys
        if (scanWASMForKeys(newWasm, all_old_keys)) {
            return tecSTILL_CONTAINS_OLD_KEYS;
        }
        // Clear the disabled flag
        newHook->clearFlag(HOOK_DISABLED_OLD_KEYS);
    }
}
```

### For Hook State
Must be cleared and rebuilt:
```cpp
TER HookStateModify::doApply() {
    if (state->getFlags() & STATE_INVALID_OLD_KEYS) {
        // Can only delete, not modify
        if (operation != DELETE) {
            return tecSTATE_REQUIRES_REBUILD;
        }
    }
}
```

## Migration Report

```json
{
    "migration_ledger": 20000000,
    "entries_scanned": 620000,
    "entries_disabled": {
        "hooks": 12,
        "hook_definitions": 3,
        "hook_states": 1847,
        "escrows": 5,
        "payment_channels": 2
    },
    "disabled_by_reason": {
        "wasm_contains_keys": 3,
        "state_contains_keys": 1847,
        "reference_old_keys": 19
    },
    "action_required": [
        {
            "type": "HOOK_DEFINITION",
            "key": "0xABCD...",
            "owner": "rXXX...",
            "reason": "WASM contains 3 hardcoded SHA-512 keys",
            "action": "Recompile hook with new keys or remove hardcoding"
        }
    ]
}
```

## Benefits

1. **Safety**: Broken things explicitly disabled, not silently failing
2. **Transparency**: Clear record of what was disabled and why
3. **Natural Cleanup**: Abandoned entries stay disabled forever
4. **Developer Responsibility**: Owners must actively fix and re-enable
5. **No Silent Corruption**: Better to disable than corrupt
6. **Audit Trail**: Complete record of migration casualties

## Implementation Complexity

- **Scanning**: O(n×m) where n=entries, m=data size
- **Memory**: Need all old keys in memory (~40MB)
- **False Positives**: Extremely unlikely (2^-256 probability)
- **Recovery**: Clear path to re-enable fixed entries

## The Nuclear Option

If too many critical entries would be disabled:
```cpp
if (disabled_count > ACCEPTABLE_THRESHOLD) {
    // Abort migration
    return temMIGRATION_TOO_DESTRUCTIVE;
}
```

## Summary

Instead of attempting impossible fixes for hardcoded keys, acknowledge reality: **if it contains old keys, it's broken**. Make that brokenness explicit through disabling, forcing conscious action to repair and re-enable. This turns an impossible problem (fixing hardcoded keys in WASM) into a manageable one (identifying and disabling broken entries).
663
HASH_MIGRATION_CONTEXT.md
Normal file
@@ -0,0 +1,663 @@
# Hash Migration to Blake3 - Work Context

## Build Commands
- **To build**: `ninja -C build`
- **To count errors**: `ninja -C build 2>&1 | grep "error:" | wc -l`
- **To see failed files**: `ninja -C build 2>&1 | grep "^FAILED:" | head -20`
- **DO NOT USE**: `cmake --build` or `make`

## Test Compilation of Single Files

### Quick Method (basic errors only)
```bash
clang++ -std=c++20 -I/Users/nicholasdudfield/projects/xahaud-worktrees/xahaud-map-stats-rpc/src \
    -c src/test/app/NFToken_test.cpp -o /tmp/test.o 2>&1 | head -50
```

### Full Compilation Command (from compile_commands.json)
Extract the exact compilation command for any file:
```bash
# For any specific file:
jq -r '.[] | select(.file | endswith("NFToken_test.cpp")) | .command' build/compile_commands.json

# Or simplified with just the file path:
FILE="src/test/app/NFToken_test.cpp"
jq -r --arg file "$FILE" '.[] | select(.file | endswith($file)) | .command' build/compile_commands.json
```

### compile_commands.json location
- **Location**: `build/compile_commands.json`
- Contains exact compilation commands with all include paths and flags for each source file
- Generated by CMake during configuration

## Objective
Modify Xahaud to pass ledger_index through all hash functions to enable switching from SHA512-Half to Blake3 at a specific ledger index.

## Current Approach
Using a `hash_options` struct containing `ledger_index` that must be passed to all hash functions.

## Structure Added
```cpp
struct hash_options {
    std::uint32_t ledger_index;
    explicit hash_options(std::uint32_t li) : ledger_index(li) {}
};
```
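To make the intended switch concrete, a minimal dispatch sketch follows. The transition constant, `blake3_hasher` wrapper, and `digestHalf` name are assumptions for illustration, not the actual digest.h API:

```cpp
// Hypothetical: only hash_options is real; the rest is illustrative.
constexpr std::uint32_t BLAKE3_TRANSITION_LEDGER = 20'000'000;

template <class T>
uint256 digestHalf(hash_options const& opts, T const& value)
{
    if (opts.ledger_index >= BLAKE3_TRANSITION_LEDGER)
    {
        blake3_hasher h;  // assumed wrapper with the sha512_half_hasher interface
        hash_append(h, value);
        return static_cast<uint256>(h);
    }
    sha512_half_hasher h;
    hash_append(h, value);
    return static_cast<uint256>(h);
}
```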
## CRITICAL: Hash Function Classification Required

### The Historical Ledger Problem
**EVERY** hash operation needs proper classification because even "content hashing" (like transaction IDs, validator manifests, signatures) depends on WHEN it was created:
- Transaction from ledger 10M (pre-transition) → Must use SHA-512 Half
- Transaction from ledger 20M (post-transition) → Must use BLAKE3
- You cannot mix hash algorithms for the same ledger - it's all or nothing

### Classification Constants (to be added to digest.h)
As an interim step, introduce classification constants to make intent explicit:
```cpp
// Special ledger_index values for hash operations
constexpr uint32_t LEDGER_INDEX_UNKNOWN = 0; // DANGEROUS - avoid!
constexpr uint32_t LEDGER_INDEX_TEST_ONLY = std::numeric_limits<uint32_t>::max();
constexpr uint32_t LEDGER_INDEX_NETWORK_PROTOCOL = std::numeric_limits<uint32_t>::max() - 1;
constexpr uint32_t LEDGER_INDEX_CURRENT = std::numeric_limits<uint32_t>::max() - 2; // Use current ledger
```
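Assuming the keylet overloads described under "Files Modified So Far", call sites would then make their intent explicit:

```cpp
// Production code: use the sequence of the ledger being read/modified.
auto const k = keylet::account(accountID, hash_options{view.seq()});

// Test code: loudly marked, never valid against a real network.
auto const kTest = keylet::account(accountID, hash_options{LEDGER_INDEX_TEST_ONLY});
```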
### Classification Categories

1. **Ledger Object Indexing** (MUST use actual ledger_index)
   - All `indexHash()` calls determining WHERE objects live
   - All keylet functions creating/finding ledger objects
   - SHAMap node hashing (builds the Merkle tree)

2. **Historical Content Hashing** (MUST use ledger_index from when it was created)
   - Transaction IDs (use ledger where tx was included)
   - Validator manifests (use ledger when signed)
   - Hook code hashing (use ledger when deployed)
   - Signatures referencing ledger data

3. **Test Code** (use LEDGER_INDEX_TEST_ONLY)
   - Unit tests
   - Mock objects
   - Test fixtures

4. **Network Protocol** (special handling needed)
   - Peer handshakes
   - Protocol messages
   - May need to support both algorithms during transition

### Why hash_options{0} is Dangerous
Using `hash_options{0}` assumes everything is pre-transition (SHA-512 Half forever), which breaks after the transition point. Every usage must be classified and given the appropriate context.
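For example, a call site written as below would silently pin itself to SHA-512 Half and compute the wrong key for any ledger past the transition point (hypothetical call, mirroring the keylet changes described later in this document):

```cpp
// WRONG after transition: ledger_index 0 always selects SHA-512 Half,
// even when the ledger being modified already hashes with BLAKE3.
auto const k = keylet::account(accountID, hash_options{0});
```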
## SHAMap Hash Architecture Analysis

### Current State
- SHAMap stores `ledgerSeq_` and knows which ledger it represents
- Nodes compute hashes WITHOUT ledger context (just use sha512Half directly)
- Nodes can be SHARED between SHAMaps via canonicalization/caching
- Node's `hash_` member stores only ONE hash value

### The Fundamental Problem: Node Sharing vs Hash Migration

When nodes are shared between ledgers through canonicalization:
- A node from ledger 19,999,999 (SHA-512) has hash H1
- Same node referenced by ledger 20,000,000 (BLAKE3) needs hash H2
- **Cannot store both hashes without major memory overhead**

### Merkle Tree Constraint
The Merkle tree structure requires homogeneous hashing:
```
Root (BLAKE3)
├── Child1 (BLAKE3) ✓
└── Child2 (SHA-512) ✗ IMPOSSIBLE - breaks tree integrity
```
You cannot mix hash algorithms within a single tree - all nodes must use the same algorithm.

### Migration Strategies Considered

#### Option 1: Lazy/Gradual Migration ✗
- Store both SHA-512 and BLAKE3 hashes in each node
- Problems:
  - Double memory usage per node
  - Complex cache invalidation logic
  - Still can't mix algorithms in same tree
  - Node sharing between ledgers becomes impossible

#### Option 2: Big Bang Migration ✓ (Recommended)
- At transition ledger:
  - Invalidate ALL cached/stored nodes
  - Rebuild entire state with new hash algorithm
  - Maintain separate caches for pre/post transition
- Benefits:
  - Clean separation of hash epochs
  - Easier to reason about
  - No memory overhead
  - Maintains tree integrity

### Implementation Requirements for Big Bang

1. **Pass ledger_index through all hash operations:**
   ```cpp
   void updateHash(std::uint32_t ledgerSeq);
   SHAMapHash getHash(std::uint32_t ledgerSeq) const;
   ```

2. **Separate node caches by hash epoch** (see the sketch after this list):
   - Pre-transition: SHA-512 node cache
   - Post-transition: BLAKE3 node cache
   - Never share nodes between epochs

3. **Critical functions needing updates:**
   - `SHAMapInnerNode::updateHash()`
   - `SHAMapInnerNode::updateHashDeep()`
   - All `SHAMapLeafNode` subclass hash computations
   - `SHAMap::flushDirty()` and `walkSubTree()`
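A minimal sketch of epoch-separated caching; the struct, the transition constant, and the use of plain `unordered_map` are illustrative stand-ins for the real node-cache types:

```cpp
#include <cstdint>
#include <memory>
#include <unordered_map>

// Hypothetical: nodes hashed with SHA-512 Half and nodes hashed with
// BLAKE3 live in disjoint maps, so a lookup can never return a node
// from the wrong hash epoch.
template <class Key, class Node>
struct EpochNodeCache
{
    static constexpr std::uint32_t transitionLedger = 20'000'000;  // assumed

    std::unordered_map<Key, std::shared_ptr<Node>> sha512Nodes;
    std::unordered_map<Key, std::shared_ptr<Node>> blake3Nodes;

    // Route every lookup and insert through the owning ledger's epoch.
    auto& epoch(std::uint32_t ledgerSeq)
    {
        return ledgerSeq >= transitionLedger ? blake3Nodes : sha512Nodes;
    }
};
```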
### Why Big Bang is Preferred
- **Correctness**: Guarantees Merkle tree integrity
- **Simplicity**: No complex dual-hash logic
- **Performance**: No overhead of maintaining multiple hashes
- **Clear boundaries**: Pre-transition and post-transition are completely separate

The transition point becomes a hard boundary where the entire ledger state is rehashed with the new algorithm.

### Alternative: Heterogeneous Tree - Deep Dive

After deeper analysis, heterogeneous trees are more complex but potentially viable. Here's a comprehensive examination:

#### The Core Insight: Hash Values Are Algorithm-Agnostic

When `SHAMapInnerNode::updateHash()` computes a hash:
```cpp
void updateHash(hash_options const& opts) {
    sha512_half_hasher h(opts);
    hash_append(h, HashPrefix::innerNode);
    iterChildren([&](SHAMapHash const& hh) { hash_append(h, hh); });
    // Parent hashes the HASH VALUES of children, not the raw data
}
```

**Key realization**: Parent nodes hash their children's hash values (256-bit numbers), NOT the children's data. This means a BLAKE3 parent can hash SHA-512 child hashes without issue.

#### How Heterogeneous Trees Would Work

Post-transition rule: **Any NEW hash uses BLAKE3**

```
Ledger 19,999,999 (SHA-512):
Root_SHA512
├── Child1_SHA512
└── Child2_SHA512

Ledger 20,000,000 (BLAKE3) - Child1 modified:
Root_BLAKE3 = BLAKE3(Child1_BLAKE3_hash || Child2_SHA512_hash)
├── Child1_BLAKE3 (NEW hash due to modification)
└── Child2_SHA512 (unchanged, keeps old hash)
```

#### The Canonical Structure Ensures Determinism

**Critical insight**: SHAMap trees are **canonical** - the structure is deterministic based on keys:
- Alice's account always goes in the same tree position
- Bob's account always goes in the same position
- Tree shape is fully determined by the set of keys

Therefore:
- Same modifications → Same tree structure
- Same tree structure → Same nodes get rehashed
- Same rehashing → Same final root hash
- **Consensus is maintained!**

#### The "NEW vs OLD" Detection Problem

The killer issue: How do you know if you're computing a NEW hash vs verifying an OLD one?

```cpp
void updateHash(hash_options const& opts) {
    // Am I computing a NEW hash (use BLAKE3)?
    // Or verifying an OLD hash (could be SHA-512)?
    // This function doesn't know WHY it was called!
}
```

Without explicit context about NEW vs OLD:
- Loading from DB: Don't know if it's SHA-512 or BLAKE3
- Modifying a node: Should use BLAKE3
- Verifying from network: Could be either

Potential solutions:
1. **Try-both approach**: Verify with BLAKE3, fallback to SHA-512 (sketched below)
2. **Version tracking**: Store algorithm version with each node
3. **Context passing**: Thread NEW/OLD context through all calls
|
||||
|
||||
Canonical nodes are immutable and shared, BUT:
|
||||
- They ARE verified when loaded from DB or network
|
||||
- The verification needs to compute the hash to check integrity
|
||||
- This means we need to know WHICH algorithm was used
|
||||
- Can't just trust the hash - must verify data matches
|
||||
|
||||
This actually makes heterogeneous trees HARDER because:
|
||||
```cpp
|
||||
// When loading a canonical node from DB:
|
||||
auto node = SHAMapTreeNode::makeFromPrefix(data, hash);
|
||||
// Need to verify: does hash(data) == provided_hash?
|
||||
// But which hash function? SHA-512 or BLAKE3?
|
||||
// Must try both, adding complexity and ambiguity
|
||||
```
|
||||
|
||||
#### System-Wide Implications
|
||||
|
||||
##### Database Layer
|
||||
- Heterogeneous: Same data might have 2 entries (SHA-512 and BLAKE3 versions)
|
||||
- Big Bang: Clean cutover, old entries become invalid
|
||||
|
||||
##### Network Sync
|
||||
- Heterogeneous: Ambiguous - "I need node 0xABC..." (which algorithm?)
|
||||
- Big Bang: Clear - algorithm determined by ledger context
|
||||
|
||||
##### Consensus
|
||||
- Heterogeneous: Works IF all validators make same NEW/OLD decisions
|
||||
- Big Bang: Simple - everyone uses same algorithm
|
||||
|
||||
##### External Proof Verification
|
||||
- Heterogeneous: Complex - mixed algorithms in Merkle paths
|
||||
- Big Bang: Simple - "before ledger X use SHA-512, after use BLAKE3"
|
||||
|
||||
##### Performance
|
||||
- Heterogeneous: Double verification attempts for old nodes
|
||||
- Big Bang: One-time rehash cost
|
||||
|
||||
#### Gradual Migration Pattern
|
||||
|
||||
With heterogeneous trees, migration happens naturally:
|
||||
```
|
||||
Ledger 20,000,000: 5% BLAKE3, 95% SHA512
|
||||
Ledger 20,001,000: 30% BLAKE3, 70% SHA512
|
||||
Ledger 20,010,000: 60% BLAKE3, 40% SHA512
|
||||
Ledger 20,100,000: 90% BLAKE3, 10% SHA512
|
||||
Eventually: ~100% BLAKE3 (dormant nodes may remain SHA-512)
|
||||
```

#### Remaining Challenges

1. **Context Plumbing**: Need to distinguish NEW vs OLD operations everywhere
2. **Verification Ambiguity**: Failures could be corruption OR the wrong algorithm
3. **Testing Complexity**: Many more edge cases to test
4. **Protocol Complexity**: Merkle proofs need algorithm information
5. **Developer Cognitive Load**: Harder to reason about

### Conclusion: Trade-offs

**Heterogeneous Trees**:
- ✅ No "big bang" transition
- ✅ Natural, incremental migration
- ✅ Maintains consensus (with careful implementation)
- ❌ Permanent complexity throughout the codebase
- ❌ Ambiguous verification
- ❌ Complex testing

**Big Bang Migration**:
- ✅ Clean, simple mental model
- ✅ Clear algorithm boundaries
- ✅ Easier testing and debugging
- ❌ One-time massive performance hit
- ❌ Requires careful coordination
- ❌ Can't easily roll back

The heterogeneous approach is **theoretically viable** but adds significant permanent complexity. Big Bang is simpler but has a painful transition. The choice depends on whether you prefer one-time pain (Big Bang) or permanent complexity (heterogeneous).

## Files Modified So Far

1. `src/ripple/protocol/digest.h` - Added the hash_options struct and modified sha512Half signatures
2. `src/ripple/protocol/impl/Indexes.cpp` - Updated indexHash and all keylet functions to accept hash_options
3. `src/ripple/protocol/Indexes.h` - Updated all function declarations to include hash_options

## Current Status

- Core hash functions modified with backward-compatible overloads
- All keylet functions updated to require hash_options
- Propagating hash_options through the codebase - a MASSIVE undertaking
- 91+ compilation errors remaining after fixing ~20 files
- Every fix exposes more errors as headers propagate changes

## SHAMap Node Factory Methods - Missing Ledger Context

### The Problem with makeFromWire and makeFromPrefix

These factory methods create SHAMap nodes from serialized data but **don't have ledger context**:

```cpp
// Called when receiving nodes from peers over the network
SHAMapTreeNode::makeFromWire(Slice rawNode)

// Called when loading nodes from the database
SHAMapTreeNode::makeFromPrefix(Slice rawNode, SHAMapHash const& hash)
```

These methods:

- Parse serialized node data (from network or database)
- Create `SHAMapInnerNode` or leaf nodes
- Need to call `updateHash()` if a hash isn't provided
- **BUT** don't know which ledger they're building for!

### Why This Is Critical

When a node is loaded from the database or received from the network:

1. The serialized data doesn't include ledger_index
2. The node might be shared across multiple ledgers (pre-transition)
3. After the transition, we need to know if this node uses SHA-512 or BLAKE3
4. Currently using `LEDGER_INDEX_UNKNOWN` as a placeholder

### Implications for P2P Protocol

The network protocol would need changes:

- `TMGetLedger` messages would need to specify ledger_index
- Node responses would need a hash algorithm version
- Database storage would need to track which hash was used

This reinforces why **Big Bang migration** is simpler - no protocol changes needed!

## Next Steps

1. Add hash_options{0} to all keylet call sites (using 0 as a placeholder ledger_index)
2. Get the code compiling first
3. Later: Thread actual ledger_index values through from Views/transactions
4. Eventually: Add BLAKE3 switching logic based on a ledger_index threshold

## Key Insight

Every place that creates a ledger key needs to know what ledger it's operating on. This is a massive change touching:

- All View classes
- All transaction processors
- All RPC handlers
- All tests
- Consensus code

## Compilation Strategy

NEVER use hash_options{0} - always use the proper classification from the HashContext enum in digest.h! (This supersedes the earlier placeholder plan under Next Steps.)
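
To make the rule concrete, a minimal before/after at a hypothetical call site, using the hash_options shape and KEYLET_* classifiers described in this document:

```cpp
// Given some ReadView `view` and AccountID `accountID` in scope:

// Wrong: hash_options{0} loses both the ledger context and the classifier.
auto bad = keylet::account(hash_options{0}, accountID);

// Right: the real ledger sequence plus the matching HashContext classifier.
auto good = keylet::account(
    hash_options{view.seq(), KEYLET_ACCOUNT}, accountID);
```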

## Quick Reference for Fixing Test Files

### Essential Files to Reference

1. **@src/ripple/protocol/digest.h** - Lines 37-107 contain the HashContext enum with ALL valid classifiers
2. **@src/ripple/protocol/Indexes.h** - Shows all keylet function signatures that need hash_options

### Keylet Function Mapping to HashContext

When you see a keylet function, use the corresponding KEYLET_* enum value:

```cpp
keylet::account() → KEYLET_ACCOUNT
keylet::amendments() → KEYLET_AMENDMENTS
keylet::book() → KEYLET_BOOK
keylet::check() → KEYLET_CHECK
keylet::child() → KEYLET_CHILD
keylet::depositPreauth() → KEYLET_DEPOSIT_PREAUTH
keylet::dirPage() → KEYLET_DIR_PAGE
keylet::emittedDir() → KEYLET_EMITTED_DIR
keylet::emittedTxn() → KEYLET_EMITTED_TXN
keylet::escrow() → KEYLET_ESCROW
keylet::fees() → KEYLET_FEES
keylet::hook() → KEYLET_HOOK
keylet::hookDefinition() → KEYLET_HOOK_DEFINITION
keylet::hookState() → KEYLET_HOOK_STATE
keylet::hookStateDir() → KEYLET_HOOK_STATE_DIR
keylet::importVLSeq() → KEYLET_IMPORT_VLSEQ
keylet::negativeUNL() → KEYLET_NEGATIVE_UNL
keylet::nftBuys() → KEYLET_NFT_BUYS
keylet::nftOffer() → KEYLET_NFT_OFFER
keylet::nftPage() → KEYLET_NFT_PAGE
keylet::nftSells() → KEYLET_NFT_SELLS
keylet::offer() → KEYLET_OFFER
keylet::ownerDir() → KEYLET_OWNER_DIR
keylet::payChan() → KEYLET_PAYCHAN
keylet::signers() → KEYLET_SIGNERS
keylet::skip() → KEYLET_SKIP_LIST
keylet::ticket() → KEYLET_TICKET
keylet::trustline() → KEYLET_TRUSTLINE
keylet::unchecked() → KEYLET_UNCHECKED
keylet::UNLReport() → KEYLET_UNL_REPORT
keylet::uriToken() → KEYLET_URI_TOKEN
```

### Non-Keylet Hash Classifications

```cpp
sha512Half() for validator data → VALIDATOR_LIST_HASH
sha512Half() for hook code → HOOK_DEFINITION or LEDGER_INDEX_UNNEEDED
sha512Half() for signatures → CRYPTO_SIGNATURE_HASH
sha512Half() for network protocol → NETWORK_HANDSHAKE_HASH
sha512Half_s() for secure hashing → Same rules apply
```

## Key Insights

### The Scale Problem

Every place that creates a ledger key needs to know what ledger it's operating on:

- All View classes
- All transaction processors (50+ files)
- All RPC handlers (30+ files)
- All tests (hundreds of files)
- Consensus code
- Path finding code
- Payment processing pipelines

### Why This Is Hard

1. **Cascading changes**: Fixing one header file exposes dozens of new errors
2. **Deep call chains**: The ledger index must be threaded through multiple layers
3. **Protocol implications**: Network messages would need to include ledger sequences
4. **No gradual migration**: Can't mix hash algorithms - it's all or nothing
5. **Testing nightmare**: Every test that creates mock ledger objects needs updating

### The Fundamental Challenge

The hash function is so deeply embedded in the architecture that changing it is like replacing the foundation of a building while people are living in it. This is why most blockchains never change their hash functions after launch.

## Key Learnings - Where to Get ledger_index

### 1. ReadView/ApplyView Classes

- `view.seq()` returns the LedgerIndex (found in ReadView.h:193)
- ReadView has an `info()` method that returns a LedgerInfo struct
- LedgerInfo contains a `seq` field, which is the ledger sequence number
- Cast to uint32_t: `static_cast<std::uint32_t>(view.seq())`

### 2. Common Patterns Found

- Functions that take `ReadView const& view` can access `view.seq()`
- Functions that take `ApplyView& view` can also access `view.seq()` (ApplyView inherits from ReadView)

### 3. Files That Need Updates

- View.h - DONE - all 6 keylet calls updated to use view.seq()
- Any file with keylet:: namespace calls
- Transaction processors that create/look up ledger objects

### 4. Progress Log

#### Stats (Updated: 2025-09-09)

- Total files with keylet:: calls: 126
- Files fixed so far: **100+** (including all core files, transaction processors, RPC handlers, and most tests)
- Started with 7 errors, peaked at 105+ as fixes propagated through headers
- **Currently down to 113 errors across 11 test files** (from hundreds of files!)
- ALL keylet function signatures updated to require hash_options
- Pattern emerging: every transaction processor, every RPC handler, every test needs updating

#### Major Milestone Achieved

- Successfully fixed ALL non-test source files
- Fixed the majority of test files using parallel agents
- Demonstrated that the hash migration IS possible despite being a massive undertaking

#### Major Files Fixed

- All core ledger files (View.cpp, ApplyView.cpp, ReadView.cpp, etc.)
- Most transaction processors (Payment, Escrow, CreateOffer, NFToken*, etc.)
- Hook implementation files (applyHook.cpp, SetHook.cpp)
- Infrastructure files (Transactor.cpp, BookDirs.cpp, Directory.cpp)

#### Key Milestone

- Updated ALL keylet function signatures in Indexes.h/cpp to require hash_options
- Even functions that take pre-existing uint256 keys now require hash_options for consistency
- This causes massive cascading compilation errors but ensures consistency

#### Files Fixed So Far

1. **View.h** - Fixed using `view.seq()`
2. **Ledger.cpp** - Fixed using `info_.seq` in the constructor, `seq()` in methods
3. **LocalTxs.cpp** - Fixed using `view.seq()`
4. **NegativeUNLVote.cpp** - Fixed using `prevLedger->seq()`
5. **TxQ.cpp** - Fixed 5 calls using `view.seq()`
6. **SkipListAcquire.cpp** - Fixed (with a protocol issue on line 90)
7. **LedgerReplayMsgHandler.cpp** - Fixed using `info.seq`
8. **RCLValidations.cpp** - Fixed using `ledger->seq()`
9. **Credit.cpp** - Fixed using `view.seq()`
10. **StepChecks.h** - Fixed using `view.seq()`
11. **BookTip.cpp** - Fixed using `view.seq()`
12. **XRPEndpointStep.cpp** - Fixed using `ctx.view.seq()`
13. **DirectStep.cpp** - Fixed using `sb.seq()` and `ctx.view.seq()`
14. **BookStep.cpp** - Fixed using `afView.seq()` and `view.seq()`
15. **CancelCheck.cpp** - Fixed using `ctx.view.seq()` and `view().seq()`
16. **Pathfinder.cpp** - Fixed using `mLedger->seq()`
17. More coming...

#### Common Patterns Found

- **ReadView/ApplyView**: Use `.seq()`
- **Ledger pointer**: Use `->seq()`
- **Transactor classes**: Use `view().seq()`
- **PaymentSandbox**: Use `sb.seq()`
- **StrandContext**: Use `ctx.view.seq()`
- **LedgerInfo struct**: Use the `.seq` field directly

### 5. Architectural Questions

#### Keylets with pre-existing keys

- Functions like `keylet::check(uint256 const& key)` just wrap an existing key
- They don't compute a new hash, just interpret the key as a specific ledger type
- **Question**: Do these really need hash_options?
- **Current approach**: Include hash_options for consistency; might store ledger_index in Keylet for future use
- **Note**: This could be revisited - it might be unnecessary overhead for simple key wrapping

### 6. Known Issues

- SkipListAcquire.cpp line 90: Requesting a skip list by hash without knowing the ledger seq yet
- Can't know the sequence until we GET the ledger
- Using hash_options{0} as a placeholder
- Would need to refactor to fetch the ledger first, THEN request the skip list with the proper seq
- Or a protocol change to handle "skip list for whatever ledger this hash is"

### 7. Network Protocol Has Ledger Sequence!

**Critical Discovery**: The protobuf definitions show that network messages DO carry the ledger sequence in many places:

#### TMLedgerData (what InboundLedger::processData receives):

```protobuf
message TMLedgerData {
    required bytes ledgerHash = 1;
    required uint32 ledgerSeq = 2;  // <-- HAS THE SEQUENCE!
    required TMLedgerInfoType type = 3;
    repeated TMLedgerNode nodes = 4;
}
```

#### TMGetLedger (the request):

```protobuf
message TMGetLedger {
    optional uint32 ledgerSeq = 4;  // Can request by sequence
}
```

#### TMIndexedObject (per-object context):

```protobuf
message TMIndexedObject {
    optional uint32 ledgerSeq = 5;  // Per-object sequence!
}
```

**Implications**:

- The protocol already has infrastructure for ledger context
- `InboundLedger` can use `packet.ledgerseq()` from TMLedgerData
- Network sync might be solvable WITHOUT major protocol changes
- The ledger sequence just needs to be threaded through to the hash functions

**Key Flow**:

```cpp
InboundLedger::processData(protocol::TMLedgerData& packet)
    packet.ledgerseq()  // <-- Extract the sequence from the protocol message
    -> SHAMap::addKnownNode(..., ledgerSeq)
    -> SHAMapTreeNode::makeFromWire(data, ledgerSeq)
    -> updateHash(hash_options{ledgerSeq})
```

This solves a major piece of the puzzle - the network layer CAN provide context for hash verification!

### 8. Test File Patterns

**Key Discovery**: Tests should use actual ledger sequences, NOT placeholders!

#### Getting Ledger Sequence in Tests

Tests typically use `test::jtx::Env`, which provides access to ledger context:

- `env.current()` - Returns a ReadView pointer
- `env.current()->seq()` - Gets the current ledger sequence (already uint32_t)

#### Common Test Patterns

##### Pattern 1: Direct keylet calls

```cpp
// OLD
env.le(keylet::line(alice, bob, currency));

// NEW
env.le(keylet::line(hash_options{env.current()->seq(), KEYLET_TRUSTLINE}, alice, bob, currency));
```

##### Pattern 2: Helper functions need env parameter

```cpp
// OLD
static uint256 getCheckIndex(AccountID const& account, uint32_t seq) {
    return keylet::check(account, seq).key;
}
// Called as: getCheckIndex(alice, seq)

// NEW
static uint256 getCheckIndex(test::jtx::Env& env, AccountID const& account, uint32_t seq) {
    return keylet::check(hash_options{env.current()->seq(), KEYLET_CHECK}, account, seq).key;
}
// Called as: getCheckIndex(env, alice, seq)
```

##### Pattern 3: Fee calculations

```cpp
// Uses env.current() to get fee information
XRPAmount const baseFeeDrops{env.current()->fees().base};
```

#### Test Files Status

- **Total test files needing fixes**: ~12-15
- **Pattern**: All test files that create or look up ledger objects need updating
- **Common test files**:
  - Check_test.cpp - PARTIALLY FIXED
  - AccountDelete_test.cpp
  - Escrow_test.cpp
  - NFToken_test.cpp
  - Flow_test.cpp
  - Import_test.cpp
  - etc.

#### Why NOT to use placeholders in tests

- Tests verify actual ledger behavior
- Using `hash_options{0}` would test the wrong behavior after the transition
- Tests need to work both pre and post hash migration
- `env.current()->seq()` gives the actual test ledger sequence

### 9. Ways to Get Ledger Sequence

#### From Application (app_):

- `app_.getLedgerMaster()` gives you LedgerMaster

#### From LedgerMaster:

- `getLedgerByHash(hash)` - Get a ledger by hash, then call `->seq()` on it
- `getLedgerBySeq(index)` - Get a ledger by sequence directly
- `getCurrentLedger()` - Current open ledger, call `->seq()`
- `getClosedLedger()` - Last closed ledger, call `->seq()`
- `getValidatedLedger()` - Last validated ledger, call `->seq()`
- `getPublishedLedger()` - Last published ledger, call `->seq()`
- `getCurrentLedgerIndex()` - Direct sequence number
- `getValidLedgerIndex()` - Direct sequence number
- `walkHashBySeq()` - Walk the ledger chain to find a hash by sequence

#### From Ledger object:

- `ledger->seq()` - Direct method
- `ledger->info().seq` - Through LedgerInfo

#### From ReadView/OpenView/ApplyView:

- `view.seq()` - All views have this method
- `view.info().seq` - Through LedgerInfo

#### Special Case - SkipListAcquire:

- Line 67: `app_.getLedgerMaster().getLedgerByHash(hash_)`
- If the ledger exists locally, we get it and can use its seq()
- If not, we're requesting it from peers - we don't know the seq yet!

108
LAST_TESTAMENT.md
Normal file
108
LAST_TESTAMENT.md
Normal file
@@ -0,0 +1,108 @@

# Last Testament: SHA-512 to BLAKE3 Migration Deep Dive

## The Journey

Started with "just change the hash function" - ended up discovering why hash functions are permanent blockchain decisions.

## Key Discoveries

### 1. Ledger Implementation Added

- `Ledger::shouldMigrateToBlake3()` - checks migration window/flags
- `Ledger::migrateToBlake3()` - placeholder for the actual migration
- Hook in `BuildLedger.cpp` after transaction processing
- Migration happens OUTSIDE the transaction pipeline to avoid metadata explosion

### 2. The Rekeying Nightmare (REKEYING_ISSUES.md)

Every object key changes SHA-512→BLAKE3, but keys are EVERYWHERE:

- **DirectoryNode.sfIndexes** - vectors of keys
- **sfPreviousTxnID, sfIndexNext, sfIndexPrevious** - direct key refs
- **Order books** - sorted by key value (order changes!)
- **Hook state** - arbitrary blobs with embedded keys
- **620k objects** with millions of interconnected references

### 3. The LUT Approach

```cpp
// Build lookup table: old_key → new_key (40MB for 620k entries)
// O(n) to build, O(n×m) to update all fields
// Must check EVERY uint256 field in EVERY object
```

**Problems:**

- LUT check on EVERY lookup forever (performance tax)
- Can't know if a key is old or new without checking
- Might need a bidirectional LUT (80MB)
- Can NEVER remove it (WASM has hardcoded keys!)

### 4. The WASM Hook Bomb

Hook code can hardcode keys in compiled WASM:

- Can't modify it without changing the hook hash
- Changing the hash breaks all references to the hook
- Literally unfixable without breaking hooks

### 5. MapStats Enhancement

Added ledger entry type tracking:

- Count, total bytes, avg size per type
- Uses `LedgerFormats::getInstance().findByType()` for names
- Shows 124k HookState entries (potential key references!)

### 6. Current Ledger Stats

- 620k total objects, 98MB total
- 117k DirectoryNodes (full of references)
- 124k HookState entries (arbitrary data)
- 80 HookDefinitions (WASM code)
- Millions of internal key references

## Why It's Impossible

### The Lookup Problem

```cpp
auto key = keylet::account(Alice).key;
// Which hash function? SHA-512 or BLAKE3?
// NO WAY TO KNOW without timestamp/LUT/double-lookup
```

### The Fundamental Issues

1. **No key timestamp** - Can't tell if a key is pre- or post-migration
2. **Embedded references everywhere** - sfIndexes, hook state, WASM code
3. **Permanent LUT required** - Check on every operation forever
4. **Performance death** - 2x lookups or a LUT check on everything
5. **Dual-key SHAMap impossible** - Breaks ordering/structure

### The "Solutions" That Don't Work

- **Lazy migration**: LUT forever, complexity forever
- **Big bang**: Still need a permanent LUT for old references
- **Heterogeneous tree**: Can't mix hash functions in a Merkle tree
- **Binary search-replace**: Could corrupt data, no validation
- **Import to v2 chain**: Same reference update problems

## The Verdict

After threading ledger_index through 1000+ functions for hash context, the migration faces insurmountable challenges:

1. **WASM hooks** contain unfixable hardcoded keys
2. **Every lookup** needs a LUT check forever (performance tax)
3. **Can't determine key age** from the identifier alone
4. **Millions of references** need perfect updating
5. **One mistake** = corrupted ledger, lost funds

**Conclusion**: This is a "v2 blockchain" problem, not a migration problem. SHA-512 is forever.

## Lessons Learned

- Hash functions are THE fundamental addressing system
- Deeper than consensus, deeper than data structures
- Once chosen, essentially permanent
- The attempt revealed the true complexity of blockchain internals
- Even 98MB of data has millions of interconnected references
- WASM hooks make any migration effectively impossible

**Final Status**: Technically possible with a permanent LUT and massive complexity. Practically impossible due to hooks, performance, and risk. SHA-512 until the heat death of the universe.

## Code Artifacts

- Hash classification system in `digest.h`
- `hash_options` threaded through Indexes/keylets
- Ledger migration methods (placeholder)
- MapStats with entry type breakdown
- REKEYING_ISSUES.md with the full analysis
- 100+ files modified to thread ledger context

The migration died not from lack of effort, but from the discovery that some architectural decisions are truly permanent.

71
LEDGER_INDEX_PREVIOUS.md
Normal file
71
LEDGER_INDEX_PREVIOUS.md
Normal file
@@ -0,0 +1,71 @@

Yes! You're onto something quite elegant there. Let me break it down:

---

### 🔑 LedgerEntry "PriorKey" field

If each SLE carried an **optional `LedgerIndexPriorSha512`** (or whatever name), then:

* When you rekey at cutover, you don't lose continuity.
* Anyone walking the ledger can trivially build the old→new LUT on the fly.
* For "day-to-day" operations, the node only needs the canonical keys.

This basically lets you say:

* **Canonical view** = new BLAKE3 keys.
* **Continuity view** = traverse new keys, but if you need to resolve old references, follow the `LedgerIndexPriorSha512` link.

---

### 📊 For analysis and data science

You can now reconstruct historical continuity without keeping a giant external LUT file. Just walk the ledger and collect `(old, new)` pairs from the entries.
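
A minimal sketch of that walk, assuming the hypothetical `sfLedgerIndexPriorSha512` field above were added to the SLE format:

```cpp
// Sketch only: rebuild the old→new LUT by walking the state and
// collecting the (old SHA-512 key, new BLAKE3 key) pair from each entry.
std::unordered_map<uint256, uint256>
buildContinuityLut(ReadView const& view)
{
    std::unordered_map<uint256, uint256> oldToNew;
    for (auto const& sle : view.sles)  // iterate every state entry
        if (sle->isFieldPresent(sfLedgerIndexPriorSha512))  // hypothetical field
            oldToNew[sle->getFieldH256(sfLedgerIndexPriorSha512)] =
                sle->key();
    return oldToNew;
}
```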

This means explorers, forensic tools, compliance, etc. can:

* Map old keys to new at any point.
* Handle partially migrated networks (some ledgers pre-cutover, some post).

---

### 🟢 Operational benefits

* Validators don't need to bloat memory with a global LUT — it's **distributed inside the ledger itself**.
* New ledgers "just work" for CRUD.
* Old references (hooks, weird integrations) can be handled until people migrate.

---

### ⚠️ Caveats

1. **Storage overhead**: each SLE gets ~32 bytes extra. But given only ~600k objects in the state tree, that's trivial compared to the win.
2. **Hardcoded keys in hooks/wasm**: still a landmine — if someone has *literally baked in* the old canonical key, they'll need the LUT or breakage handling at ApplyView.
3. **Consensus rules**: adding this field changes serialization — so it's an amendment, not just an operator convenience.

---

### 🔄 Proposal workflow

1. Cutover ledger → during bookkeeping, every object is rekeyed into the new canonical tree.
2. At the same time, each new object gets `LedgerIndexPriorSha512 = old_key`.
3. From then on, **every ledger post-cutover contains the LUT implicitly**.
4. After some deprecation period, once nobody references `LedgerIndexPriorSha512`, you can prune support.

---

So in practice:

* **Validators** only care about the canonical map.
* **Explorers, auditors, backward-compat systems** can lazily reconstruct the LUT.
* You don't poison `succ()` or iteration with mixed domains.

---

This actually solves both your goals:

* Day-to-day simplicity (BLAKE3-only canonical).
* Historical continuity (walk the field if you need it).

---

Want me to sketch how `ReadView::read(Keylet)` could transparently try LUT lookups only if the canonical miss happens, using that new field? That would make it zero-effort for old call sites.

1093
PSEUDO_TRANSACTIONS.md
Normal file
1093
PSEUDO_TRANSACTIONS.md
Normal file
File diff suppressed because it is too large
332
REKEYING_ISSUES.md
Normal file
332
REKEYING_ISSUES.md
Normal file
@@ -0,0 +1,332 @@

# Hash Migration Rekeying Issues

## The Fundamental Problem

When migrating from SHA-512 Half to BLAKE3, we're not just changing a hash function - we're changing the **keys** that identify every object in the ledger's state map. Since the hash IS the key in the SHAMap, every object needs a new address.

## What Needs to be Rekeyed

### 1. Primary Ledger Objects

Every SLE (STLedgerEntry) in the state map has its key computed from:

- Its type (Account, Offer, RippleState, etc.)
- Its identifying data (AccountID, offer sequence, etc.)

When we change the hash function, EVERY object gets a new key.

### 2. Directory Structures

Directories are ledger objects that contain **lists of other objects' keys**:

#### Owner Directories (`keylet::ownerDir`)

- Contain the keys of all objects owned by an account
- Every offer, escrow, check, payment channel, etc. key is stored here
- When those objects are rekeyed, ALL these references must be updated

#### Order Book Directories (`keylet::book`, `keylet::quality`)

- Contain the keys of offers at specific quality levels
- All offer keys must be updated to their new BLAKE3 values

#### NFT Directories (`keylet::nft_buys`, `keylet::nft_sells`)

- Contain the keys of NFT offers
- All NFT offer keys must be updated

#### Hook State Directories (`keylet::hookStateDir`)

- Contain the keys of hook state entries
- All hook state keys must be updated

### 3. Cross-References Between Objects

Many objects contain direct references to other objects:

#### Account Objects

- `sfNFTokenPage` - References to NFT page keys
- Previous/Next links in directory pages

#### Directory Pages

- `sfIndexes` - Vector of object keys
- `sfPreviousTxnID` - Transaction hash references
- `sfIndexPrevious`/`sfIndexNext` - Links to other directory pages

#### NFT Pages

- References to previous/next pages in the chain

## The Cascade Effect

Re-keying isn't a simple one-pass operation:

```
1. Rekey Account A (SHA-512 → BLAKE3)
   ↓
2. Update Account A's owner directory
   ↓
3. Rekey the owner directory itself
   ↓
4. Update all objects IN the directory with new keys
   ↓
5. Update any directories THOSE objects appear in
   ↓
6. Continue cascading...
```

## Implementation Challenges

### Challenge 1: Directory Entry Updates

```cpp
// Current directory structure
STVector256 indexes = directory->getFieldV256(sfIndexes);
// Contains: [sha512_key1, sha512_key2, sha512_key3, ...]

// After migration needs to be:
// [blake3_key1, blake3_key2, blake3_key3, ...]
```

### Challenge 2: Finding All References

There's no reverse index - given an object's key, you can't easily find all directories that reference it. You'd need to (a brute-force sketch follows the list):

1. Walk the entire state map
2. Check every directory's `sfIndexes` field
3. Update any matching keys
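
In the same pseudocode style as the LUT code later in this document (the `visitLeaves` callback shape is approximated, not taken from the current headers):

```cpp
// Sketch: walk the whole state map looking for directories whose
// sfIndexes vector references the target key. O(state size) per query.
std::vector<uint256>
findReferencingDirectories(SHAMap const& state, uint256 const& target)
{
    std::vector<uint256> hits;
    state.visitLeaves([&](SHAMapItem const& item) {
        SerialIter sit(item.slice());
        auto sle = std::make_shared<SLE>(sit, item.key());
        if (!sle->isFieldPresent(sfIndexes))
            return;
        auto const& refs = sle->getFieldV256(sfIndexes);
        if (std::find(refs.begin(), refs.end(), target) != refs.end())
            hits.push_back(item.key());
    });
    return hits;
}
```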

### Challenge 3: Maintaining Consistency

During migration, you need to ensure:

- No orphaned references (keys pointing to non-existent objects)
- No duplicate entries
- Proper ordering in sorted structures (offer books)

### Challenge 4: Page Links

Directory pages link to each other:

```cpp
uint256 prevPage = dir->getFieldU256(sfIndexPrevious);
uint256 nextPage = dir->getFieldU256(sfIndexNext);
```

These links are also keys that need updating!

## Why This Makes Migration Complex

### Option A: Big Bang Migration

- Must update EVERYTHING atomically
- Need to track old→new key mappings for the entire ledger
- Memory requirements: ~2x the state size for the mapping table
- Risk: Any missed reference breaks the ledger

### Option B: Heterogeneous Tree

- Old nodes keep SHA-512 keys
- New/modified nodes use BLAKE3
- Problem: How do you know which hash to use for lookups?
- Problem: Directories contain a mix of old and new keys?

### Option C: Double Storage

- Store objects under BOTH keys temporarily
- Gradually migrate references
- Problem: Massive storage overhead
- Problem: Synchronization between copies

## Example: Rekeying an Offer

Consider rekeying a single offer:

1. **The Offer Itself**
   - Old key: `sha512Half(OFFER, account, sequence)`
   - New key: `blake3(OFFER, account, sequence)`

2. **Owner Directory**
   - Must update `sfIndexes` to replace the old offer key with the new one

3. **Order Book Directory**
   - Must update `sfIndexes` in the quality directory
   - May need to update multiple quality levels if the offer moved

4. **Account Object**
   - Update offer count/reserve tracking if needed

5. **The Directories Themselves**
   - Owner directory key: `sha512Half(OWNER_DIR, account, page)`
   - New key: `blake3(OWNER_DIR, account, page)`
   - Order book key: `sha512Half(BOOK_DIR, ...)`
   - New key: `blake3(BOOK_DIR, ...)`

## Potential Solutions

### 1. Migration Ledger Object

Create a temporary "migration map" ledger object:

```cpp
sfOldKey → sfNewKey mappings
```

But this could be gigabytes for millions of objects.

### 2. Deterministic Rekeying

Since we can determine an object's type from its `LedgerEntryType`, we could:

1. Load each SLE
2. Determine its type
3. Recompute its key with BLAKE3
4. Track the mapping

But we still need to update all references.

### 3. Lazy Migration

Only rekey objects when they're modified:

- Pro: Spreads the migration over time
- Con: Permanent complexity in the codebase
- Con: Must support both hash types forever

### 4. New State Structure

Instead of migrating in place, build an entirely new state map (a rough sketch follows the steps):

1. Create a new empty BLAKE3 SHAMap
2. Walk the old map, inserting with new keys
3. Update all references during the copy
4. Atomically swap the maps
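
Under this note's assumptions: the `old_to_new` LUT is already built, and `rewriteReferences` is a hypothetical helper that applies the field updates described in the LUT section below:

```cpp
// Sketch: copy the state into a fresh BLAKE3-keyed map, rewriting both
// each item's own key and the key references embedded inside it.
SHAMap newState(SHAMapType::STATE, family);
oldState.visitLeaves([&](SHAMapItem const& item) {
    // rewriteReferences is hypothetical: applies old_to_new to every
    // reference field and returns the reserialized blob.
    Serializer s = rewriteReferences(item, old_to_new);
    newState.addItem(
        SHAMapNodeType::tnACCOUNT_STATE,
        make_shamapitem(old_to_new.at(item.key()), s.slice()));
});
// Swap oldState/newState atomically once the walk completes.
```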

This is essentially what BUILD_LEDGER.md suggests, but the reference updating remains complex.

## The Lookup Table Approach

After further analysis, a lookup table (LUT) based approach might actually be feasible:

### Algorithm Overview

#### Phase 1: Build the LUT (O(n))

```cpp
std::unordered_map<uint256, uint256> old_to_new;

stateMap_.visitLeaves([&](SHAMapItem const& item) {
    SerialIter sit(item.slice());
    auto sle = std::make_shared<SLE>(sit, item.key());

    // Determine the type from the SLE
    LedgerEntryType type = sle->getType();

    // Recompute the key with BLAKE3 based on the type
    uint256 newKey = computeBlake3Key(sle, type);
    old_to_new[item.key()] = newKey;
});
// Results in ~620k entries in the LUT
```
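
For completeness, a hedged sketch of what `computeBlake3Key` would have to do. The dispatch is illustrative only, and `indexHashBlake3` is a hypothetical BLAKE3 counterpart of the existing SHA-512 index-hash helper:

```cpp
// Illustrative only: re-derive an object's key from its identifying
// fields, but under BLAKE3 instead of SHA-512 Half.
static uint256
computeBlake3Key(std::shared_ptr<SLE> const& sle, LedgerEntryType type)
{
    switch (type)
    {
        case ltACCOUNT_ROOT:
            // Same inputs as keylet::account, different hash function.
            return indexHashBlake3(
                LedgerNameSpace::ACCOUNT, sle->getAccountID(sfAccount));
        // ... one case per LedgerEntryType ...
        default:
            Throw<std::runtime_error>("unhandled ledger entry type");
    }
}
```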

#### Phase 2: Update ALL uint256 Fields (O(n × m))

Walk every object and check every uint256 field against the LUT:

```cpp
stateMap_.visitLeaves([&](SHAMapItem& item) {
    SerialIter sit(item.slice());
    auto sle = std::make_shared<SLE>(sit, item.key());
    bool modified = false;

    // Check every possible uint256 field
    modified |= updateField(sle, sfPreviousTxnID, old_to_new);
    modified |= updateField(sle, sfIndexPrevious, old_to_new);
    modified |= updateField(sle, sfIndexNext, old_to_new);
    modified |= updateField(sle, sfBookNode, old_to_new);

    // Vector fields
    modified |= updateVector(sle, sfIndexes, old_to_new);
    modified |= updateVector(sle, sfHashes, old_to_new);
    modified |= updateVector(sle, sfAmendments, old_to_new);
    modified |= updateVector(sle, sfNFTokenOffers, old_to_new);

    if (modified) {
        // Re-serialize with updated references
        Serializer s;
        sle->add(s);
        // Create a new item with the new key
        item = make_shamapitem(old_to_new[item.key()], s.slice());
    }
});
```
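
The `updateField`/`updateVector` helpers above are assumed rather than existing code; a minimal sketch of the scalar one follows (the vector variant would rewrite matching elements of the `STVector256` the same way):

```cpp
// Sketch: rewrite a single uint256 field in place when its current
// value appears in the old→new LUT. Returns true if anything changed.
static bool
updateField(
    std::shared_ptr<SLE> const& sle,
    SField const& field,
    std::unordered_map<uint256, uint256> const& lut)
{
    if (!sle->isFieldPresent(field))
        return false;
    auto const it = lut.find(sle->getFieldH256(field));
    if (it == lut.end())
        return false;
    sle->setFieldH256(field, it->second);
    return true;
}
```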

### Complexity Analysis

- **Phase 1**: O(n) where n = number of objects (~620k)
- **Phase 2**: O(n × m) where m = average fields per object
- **Hash lookups**: O(1) average case
- **Total**: Linear in the number of objects!

### Memory Requirements

- LUT size: 620k entries × (32 bytes + 32 bytes) = ~40 MB
- Reasonable to keep in memory during the migration

### Implementation Challenges

#### 1. Comprehensive Field Coverage

Must check EVERY field that could contain a key:

```cpp
// Singleton uint256 fields
sfPreviousTxnID, sfIndexPrevious, sfIndexNext, sfBookNode,
sfNFTokenID, sfEmitParentTxnID, sfHookOn, sfHookStateKey...

// Vector256 fields
sfIndexes, sfHashes, sfAmendments, sfNFTokenOffers,
sfHookNamespaces, sfURITokenIDs...

// Nested structures
STArray fields containing STObjects with uint256 fields
```

#### 2. False Positive Risk

Any uint256 that happens to match a key would be updated:

- Could corrupt data if a non-key field matches
- Mitigation: Only update known reference fields
- Risk: Missing custom fields added by hooks

#### 3. Order Book Sorting

Order books are sorted by key value. After rekeying:

- The sort order changes completely
- Need to rebuild the book directories
- Quality levels might shift

### Alternative: Persistent Migration Map

Instead of a one-time migration, store the mapping permanently:

```cpp
// Special ledger entries (one per ~1000 mappings)
MigrationMap_0000: {
    sfOldKeys: [old_hash_0, old_hash_1, ...],
    sfNewKeys: [new_hash_0, new_hash_1, ...]
}
MigrationMap_0001: { ... }
// ~620 of these objects
```

Pros:

- Can verify historical references
- Debugging is easier
- Can be pruned later if needed

Cons:

- Permanent state bloat (~40MB)
- Must be loaded on every node forever
- Lookup overhead for historical operations

### The Nuclear Option: Binary Search-Replace

For maximum chaos (don't actually do this):

```cpp
// Build a LUT keyed on the raw 32-byte values
std::map<std::array<uint8_t, 32>, std::array<uint8_t, 32>> binary_lut;

// Scan every serialized blob and blindly replace any 32-byte window
// that matches an old key
for (auto& node : shamap) {
    auto data = node.getData();
    for (size_t i = 0; i + 32 <= data.size(); i++) {
        std::array<uint8_t, 32> window;
        std::memcpy(window.data(), &data[i], 32);
        if (auto it = binary_lut.find(window); it != binary_lut.end())
            std::memcpy(&data[i], it->second.data(), 32);
    }
}
```

Why this is insane:

- False positives would corrupt data
- No validation of what you're replacing
- Breaks all checksums and signatures
- Impossible to debug when it goes wrong

## Conclusion

The rekeying problem is not just about changing hash functions - it's about maintaining referential integrity across millions of interlinked objects. Every key change cascades through the reference graph, making this one of the most complex migrations possible in a blockchain system.

The lookup table approach makes it algorithmically feasible (linear time rather than quadratic), but the implementation complexity and risk remain enormous. You need to:

1. Find every single field that could contain a key
2. Update them all correctly
3. Handle sorting changes in order books
4. Avoid false positives
5. Deal with custom fields from hooks
6. Maintain consistency across the entire state

This is likely why most blockchains never change their hash functions after genesis - even with an efficient algorithm, the complexity and risk are enormous.

741
RELEASENOTES.md
741
RELEASENOTES.md
@@ -7,747 +7,6 @@ This document contains the release notes for `rippled`, the reference server imp
|
||||
|
||||
Have new ideas? Need help with setting up your node? [Please open an issue here](https://github.com/xrplf/rippled/issues/new/choose).
|
||||
|
||||
# Version 2.2.3
|
||||
|
||||
Version 2.2.3 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release fixes a problem that can cause full-history servers to run out of space in their SQLite databases, depending on configuration. There are no new amendments in this release.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
## Background
|
||||
|
||||
The `rippled` server uses a SQLite database for tracking transactions, in addition to the main data store (usually NuDB) for ledger data. In servers keeping a large amount of history, this database can run out of space based on the configured number and size of database pages, even if the machine has disk space available. Based on the size of full history on Mainnet, servers with the default SQLite page size of 4096 may now run out of space if they store full history. In this case, your server may shut down with an error such as the following:
|
||||
|
||||
```text
|
||||
Free SQLite space for transaction db is less than 512MB. To fix this, rippled
|
||||
must be executed with the vacuum <sqlitetmpdir> parameter before restarting.
|
||||
Note that this activity can take multiple days, depending on database size.
|
||||
```
|
||||
|
||||
The exact timing of when a server runs out of space can vary based on a few factors. Server operators who encountered a similar problem in 2018 and followed steps to [increase the SQLite transaction database page size issue](../../../docs/infrastructure/troubleshooting/fix-sqlite-tx-db-page-size-issue) may not encounter this problem at all. The `--vacuum` commandline option to `rippled` from that time may work to free up space in the database, but requires extended downtime.
|
||||
|
||||
Version 2.2.3 of `rippled` reconfigures the maximum number of SQLite pages so that the issue does not occur.
|
||||
|
||||
Clio servers providing full history are not affected by this issue.
|
||||
|
||||
|
||||
## Action Required
|
||||
|
||||
If you run an [XRP Ledger full history server](https://xrpl.org/docs/infrastructure/configuration/data-retention/configure-full-history), upgrading to version 2.2.3 may prevent the server from crashing when `transaction.db` exceeds approximately 8.7 terabytes.
|
||||
|
||||
Additionally, five amendments introduced in version 2.2.0 are open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators. If you operate an XRP Ledger server older than version 2.2.0, upgrade by Sep 23, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Bug Fixes
|
||||
|
||||
- Update SQLite3 max_page_count to match current defaults ([#5114](https://github.com/XRPLF/rippled/pull/5114))
|
||||
|
||||
### GitHub
|
||||
|
||||
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
|
||||
|
||||
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
|
||||
|
||||
|
||||
## Credits
|
||||
|
||||
The following people contributed directly to this release:
|
||||
|
||||
J. Scott Branson <the@rabbitkick.club>
|
||||
|
||||
|
||||
Bug Bounties and Responsible Disclosures:
|
||||
|
||||
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
|
||||
|
||||
To report a bug, please send a detailed report to: <bugs@xrpl.org>
|
||||
|
||||
|
||||
# Version 2.2.2
|
||||
|
||||
Version 2.2.2 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release fixes an ongoing issue with Mainnet where validators can stall during consensus processing due to lock contention, preventing ledgers from being validated for up to two minutes. There are no new amendments in this release.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
## Action Required
|
||||
|
||||
If you run an XRP Ledger validator, upgrade to version 2.2.2 as soon as possible to ensure stable and uninterrupted network behavior.
|
||||
|
||||
Additionally, five amendments introduced in version 2.2.0 are open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators. If you operate an XRP Ledger server older than version 2.2.0, upgrade by September 17, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network. Version 2.2.2 is recommended because of known bugs affecting stability of versions 2.2.0 and 2.2.1.
|
||||
|
||||
If you operate a Clio server, Clio needs to be updated to 2.1.2 before updating to rippled 2.2.0. Clio will be blocked if it is not updated.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Amendments and New Features
|
||||
|
||||
- None
|
||||
|
||||
### Bug Fixes and Performance Improvements
|
||||
|
||||
- Allow only 1 job queue slot for acquiring inbound ledger [#5115](https://github.com/XRPLF/rippled/pull/5115) ([7741483](https://github.com/XRPLF/rippled/commit/774148389467781aca7c01bac90af2fba870570c))
|
||||
|
||||
- Allow only 1 job queue slot for each validation ledger check [#5115](https://github.com/XRPLF/rippled/pull/5115) ([fbbea9e](https://github.com/XRPLF/rippled/commit/fbbea9e6e25795a8a6bd1bf64b780771933a9579))
|
||||
|
||||
### Other improvements
|
||||
|
||||
- Track latencies of certain code blocks, and log if they take too long [#5115](https://github.com/XRPLF/rippled/pull/5115) ([00ed7c9](https://github.com/XRPLF/rippled/commit/00ed7c942436f02644a13169002b5123f4e2a116))
|
||||
|
||||
### Docs and Build System
|
||||
|
||||
- None
|
||||
|
||||
### GitHub
|
||||
|
||||
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
|
||||
|
||||
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
|
||||
|
||||
|
||||
## Credits
|
||||
|
||||
The following people contributed directly to this release:
|
||||
|
||||
Mark Travis <mtrippled@users.noreply.github.com>
|
||||
Valentin Balaschenko <13349202+vlntb@users.noreply.github.com>
|
||||
|
||||
Bug Bounties and Responsible Disclosures:
|
||||
|
||||
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
|
||||
|
||||
To report a bug, please send a detailed report to: <bugs@xrpl.org>
|
||||
|
||||
# Version 2.2.1
|
||||
|
||||
Version 2.2.1 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release fixes a critical bug introduced in 2.2.0 handling some types of RPC requests.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
## Action Required
|
||||
|
||||
If you run an XRP Ledger validator, upgrade to version 2.2.1 as soon as possible to ensure stable and uninterrupted network behavior.
|
||||
|
||||
Additionally, five amendments introduced in version 2.2.0 are open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators. If you operate an XRP Ledger server older than version 2.2.0, upgrade by August 14, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network. Version 2.2.1 is recommended because of known bugs affecting stability of versions 2.2.0.
|
||||
|
||||
If you operate a Clio server, Clio needs to be updated to 2.2.2 before updating to rippled 2.2.1. Clio will be blocked if it is not updated.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Amendments and New Features
|
||||
|
||||
- None
|
||||
|
||||
### Bug Fixes and Performance Improvements
|
||||
|
||||
- Improve error handling in some RPC commands. [#5078](https://github.com/XRPLF/rippled/pull/5078)
|
||||
|
||||
- Use error codes throughout fast Base58 implementation. [#5078](https://github.com/XRPLF/rippled/pull/5078)
|
||||
|
||||
### Docs and Build System
|
||||
|
||||
- None
|
||||
|
||||
### GitHub
|
||||
|
||||
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
|
||||
|
||||
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
|
||||
|
||||
|
||||
## Credits
|
||||
|
||||
The following people contributed directly to this release:
|
||||
|
||||
John Freeman <jfreeman08@gmail.com>
|
||||
Mayukha Vadari <mvadari@gmail.com>
|
||||
|
||||
Bug Bounties and Responsible Disclosures:
|
||||
|
||||
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
|
||||
|
||||
To report a bug, please send a detailed report to: <bugs@xrpl.org>
|
||||
|
||||
|
||||
# Version 2.2.0
|
||||
|
||||
Version 2.2.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release adds performance optimizations, several bug fixes, and introduces the `featurePriceOracle`, `fixEmptyDID`, `fixXChainRewardRounding`, `fixPreviousTxnID`, and `fixAMMv1_1` amendments.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
## Action Required
|
||||
|
||||
Five new amendments are now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.
|
||||
|
||||
If you operate an XRP Ledger server, upgrade to version 2.2.0 by June 17, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
|
||||
|
||||
If you operate a Clio server, Clio needs to be updated to 2.1.2 before updating to rippled 2.2.0. Clio will be blocked if it is not updated.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Amendments and New Features
|
||||
(These are changes which may impact or be useful to end users. For example, you may be able to update your code/workflow to take advantage of these changes.)
|
||||
|
||||
- **featurePriceOracle** amendment: Implements a price oracle as defined in the [XLS-47](https://github.com/XRPLF/XRPL-Standards/blob/master/XLS-47d-PriceOracles/README.md) spec. A Price Oracle is used to bring real-world data, such as market prices, onto the blockchain, enabling dApps to access and utilize information that resides outside the blockchain. [#4789](https://github.com/XRPLF/rippled/pull/4789)
|
||||
|
||||
- **fixEmptyDID** amendment: Modifies the behavior of the DID amendment: adds an additional check to ensure that DIDs are non-empty when created, and returns a `tecEMPTY_DID` error if the DID would be empty. [#4950](https://github.com/XRPLF/rippled/pull/4950)
|
||||
|
||||
- **fixXChainRewardRounding** amendment: Modifies the behavior of the XChainBridge amendment: fixes rounding so reward shares are always rounded down, even when the `fixUniversalNumber` amendment is active. [#4933](https://github.com/XRPLF/rippled/pull/4933)
|
||||
|
||||
- **fixPreviousTxnID** amendment: Adds `PreviousTxnID` and `PreviousTxnLgrSequence` as fields to all ledger entries that did not already have them included (`DirectoryNode`, `Amendments`, `FeeSettings`, `NegativeUNL`, and `AMM`). Existing ledger entries will gain the fields whenever transactions modify those entries. [#4751](https://github.com/XRPLF/rippled/pull/4751).
|
||||
|
||||
- **fixAMMv1_1** amendment: Fixes AMM offer rounding and low quality order book offers from blocking the AMM. [#4983](https://github.com/XRPLF/rippled/pull/4983)
|
||||
|
||||
- Add a non-admin version of `feature` API method. [#4781](https://github.com/XRPLF/rippled/pull/4781)
|
||||
|
||||
### Bug Fixes and Performance Improvements
|
||||
(These are behind-the-scenes improvements, such as internal changes to the code, which are not expected to impact end users.)
|
||||
|
||||
- Optimize the base58 encoder and decoder. The algorithm is now about 10 times faster for encoding and 15 times faster for decoding. [#4327](https://github.com/XRPLF/rippled/pull/4327)
|
||||
|
||||
- Optimize the `account_tx` SQL query. [#4955](https://github.com/XRPLF/rippled/pull/4955)
|
||||
|
||||
- Don't reach consensus as quickly if no other proposals are seen. [#4763](https://github.com/XRPLF/rippled/pull/4763)
|
||||
|
||||
- Fix a potential deadlock in the database module. [#4989](https://github.com/XRPLF/rippled/pull/4989)
|
||||
|
||||
- Enforce no duplicate slots from incoming connections. [#4944](https://github.com/XRPLF/rippled/pull/4944)
|
||||
|
||||
- Fix an order book update variable swap. [#4890](https://github.com/XRPLF/rippled/pull/4890)
|
||||
|
||||
### Docs and Build System
|
||||
|
||||
- Add unit test to raise the test coverage of the AMM. [#4971](https://github.com/XRPLF/rippled/pull/4971)
|
||||
|
||||
- Improve test coverage reporting. [#4977](https://github.com/XRPLF/rippled/pull/4977)
|
||||
|
||||
### GitHub
|
||||
|
||||
The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.
|
||||
|
||||
We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.
|
||||
|
||||
|
||||
## Credits
|
||||
|
||||
The following people contributed directly to this release:
|
||||
|
||||
Alex Kremer <akremer@ripple.com>
|
||||
Alloy Networks <45832257+alloynetworks@users.noreply.github.com>
|
||||
Bronek Kozicki <brok@incorrekt.com>
|
||||
Chenna Keshava <ckeshavabs@gmail.com>
|
||||
Denis Angell <dangell@transia.co>
|
||||
Ed Hennis <ed@ripple.com>
|
||||
Gregory Tsipenyuk <gtsipenyuk@ripple.com>
|
||||
Howard Hinnant <howard.hinnant@gmail.com>
|
||||
John Freeman <jfreeman08@gmail.com>
|
||||
Mark Travis <mtrippled@users.noreply.github.com>
|
||||
Mayukha Vadari <mvadari@gmail.com>
|
||||
Michael Legleux <mlegleux@ripple.com>
|
||||
Nik Bougalis <nikb@bougalis.net>
|
||||
Olek <115580134+oleks-rip@users.noreply.github.com>
|
||||
Scott Determan <scott.determan@yahoo.com>
|
||||
Snoppy <michaleli@foxmail.com>
|
||||
|
||||
Bug Bounties and Responsible Disclosures:
|
||||
|
||||
We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.
|
||||
|
||||
To report a bug, please send a detailed report to: <bugs@xrpl.org>
|
||||
|
||||
|
||||
## Version 2.1.1
|
||||
|
||||
The `rippled` 2.1.1 release fixes a critical bug in the integration of AMMs with the payment engine.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
|
||||
## Action Required
|
||||
|
||||
One new amendment is now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.
|
||||
|
||||
If you operate an XRP Ledger server, upgrade to version 2.1.1 by April 8, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Amendments
|
||||
|
||||
- **fixAMMOverflowOffer**: Fix improper handling of large synthetic AMM offers in the payment engine. Due to the importance of this fix, the default vote in the source code has been set to YES. For information on how to configure your validator's amendment voting, see [Configure Amendment Voting](https://xrpl.org/docs/infrastructure/configuration/configure-amendment-voting).
|
||||
|
||||
# Introducing XRP Ledger version 2.1.0
|
||||
|
||||
Version 2.1.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release adds a bug fix, build improvements, and introduces the `fixNFTokenReserve` and `fixInnerObjTemplate` amendments.
|
||||
|
||||
[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)
|
||||
|
||||
<!-- BREAK -->
|
||||
|
||||
|
||||
## Action Required
|
||||
|
||||
Two new amendments are now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.
|
||||
|
||||
If you operate an XRP Ledger server, upgrade to version 2.1.0 by March 5, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.
|
||||
|
||||
## Changelog
|
||||
|
||||
### Amendments
|
||||
(These are changes which may impact or be useful to end users. For example, you may be able to update your code/workflow to take advantage of these changes.)
|
||||
|
||||
- **fixNFTokenReserve**: Adds a check to the `NFTokenAcceptOffer` transactor to see if the `OwnerCount` changed. If it did, it checks that the reserve requirement is met. [#4767](https://github.com/XRPLF/rippled/pull/4767)
|
||||
|
||||
- **fixInnerObjTemplate**: Adds an `STObject` constructor overload that includes an additional boolean argument to set the inner object template; currently, the inner object template isn't set upon object creation. In some circumstances, this causes a `tefEXCEPTION` error when trying to access the AMM `sfTradingFee` and `sfDiscountedFee` fields in the inner objects of `sfVoteEntry` and `sfAuctionSlot`. [#4906](https://github.com/XRPLF/rippled/pull/4906)

### Bug Fixes and Performance Improvements

(These are behind-the-scenes improvements, such as internal changes to the code, which are not expected to impact end users.)

- Fixed a bug that prevented the gRPC port info from being specified in the `rippled` config file. [#4728](https://github.com/XRPLF/rippled/pull/4728)

### Docs and Build System

- Added unit tests to check that payees and payers aren't the same account. [#4860](https://github.com/XRPLF/rippled/pull/4860)

- Removed a workaround that bypassed Windows CI unit test failures. [#4871](https://github.com/XRPLF/rippled/pull/4871)

- Updated library names to be platform-agnostic in Conan recipes. [#4831](https://github.com/XRPLF/rippled/pull/4831)

- Added headers required in the Conan package to build xbridge witness servers. [#4885](https://github.com/XRPLF/rippled/pull/4885)

- Improved object lifetime management when creating a temporary `Rules` object, fixing a crash in Windows unit tests. [#4917](https://github.com/XRPLF/rippled/pull/4917)

### GitHub

The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.

We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.

## Credits

The following people contributed directly to this release:

- Bronek Kozicki <brok@incorrekt.com>
- CJ Cobb <cj@axelar.network>
- Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
- Ed Hennis <ed@ripple.com>
- Elliot Lee <github.public@intelliot.com>
- Gregory Tsipenyuk <gregtatcam@users.noreply.github.com>
- John Freeman <jfreeman08@gmail.com>
- Michael Legleux <legleux@users.noreply.github.com>
- Ryan Molley
- Shawn Xie <35279399+shawnxie999@users.noreply.github.com>

Bug Bounties and Responsible Disclosures:

We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.

To report a bug, please send a detailed report to: <bugs@xrpl.org>

# Introducing XRP Ledger version 2.0.1

Version 2.0.1 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release includes minor fixes, unit test improvements, and doc updates.

[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)

<!-- BREAK -->

## Action Required

If you operate an XRP Ledger server, upgrade to version 2.0.1 to take advantage of the changes included in this update. Nodes on version 1.12 should upgrade as soon as possible.

## Changelog

### Changes

(These are changes which may impact or be useful to end users. For example, you may be able to update your code/workflow to take advantage of these changes.)

- Updated the `send_queue_limit` to 500 in the default `rippled` config to handle increased transaction loads. [#4867](https://github.com/XRPLF/rippled/pull/4867)

### Bug Fixes and Performance Improvements

(These are behind-the-scenes improvements, such as internal changes to the code, which are not expected to impact end users.)

- Fixed an assertion that occurred when `rippled` was under heavy websocket client load. [#4848](https://github.com/XRPLF/rippled/pull/4848)

- Improved lifetime management of serialized type ledger entries to improve memory usage. [#4822](https://github.com/XRPLF/rippled/pull/4822)

- Fixed a clang warning about deprecated `sprintf` usage. [#4747](https://github.com/XRPLF/rippled/pull/4747)

### Docs and Build System

- Added `DeliverMax` to more JSONRPC tests. [#4826](https://github.com/XRPLF/rippled/pull/4826)

- Updated the pull request template to include a `Type of Change` checkbox and additional contextual questions. [#4875](https://github.com/XRPLF/rippled/pull/4875)

- Updated help messages for unit tests pattern matching. [#4846](https://github.com/XRPLF/rippled/pull/4846)

- Improved the time it takes to generate coverage reports. [#4849](https://github.com/XRPLF/rippled/pull/4849)

- Fixed broken links in the Conan build docs. [#4699](https://github.com/XRPLF/rippled/pull/4699)

- Spurious codecov uploads are now retried if there's an error uploading them the first time. [#4896](https://github.com/XRPLF/rippled/pull/4896)

### GitHub

The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.

We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.

## Credits

The following people contributed directly to this release:

- Bronek Kozicki <brok@incorrekt.com>
- Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
- Ed Hennis <ed@ripple.com>
- Elliot Lee <github.public@intelliot.com>
- Lathan Britz <jucallme@gmail.com>
- Mark Travis <mtrippled@users.noreply.github.com>
- nixer89 <pbnixer@gmail.com>

Bug Bounties and Responsible Disclosures:

We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.

To report a bug, please send a detailed report to: <bugs@xrpl.org>

# Introducing XRP Ledger version 2.0.0

Version 2.0.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release adds new features and bug fixes, and introduces these amendments:

- `DID`
- `XChainBridge`
- `fixDisallowIncomingV1`
- `fixFillOrKill`

[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)

<!-- BREAK -->

## Action Required

Four new amendments are now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.

If you operate an XRP Ledger server, upgrade to version 2.0.0 by January 22, 2024 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.

## Changelog

### Amendments, New Features, and Changes

(These are changes which may impact or be useful to end users. For example, you may be able to update your code/workflow to take advantage of these changes.)

- **XChainBridge**: Introduces cross-chain bridges, enabling interoperability between the XRP Ledger and sidechains. [#4292](https://github.com/XRPLF/rippled/pull/4292)

- **DID**: Introduces decentralized identifiers. [#4636](https://github.com/XRPLF/rippled/pull/4636)

- **fixDisallowIncomingV1**: Fixes an issue that occurs when users try to authorize a trustline while the `lsfDisallowIncomingTrustline` flag is enabled on their account. [#4721](https://github.com/XRPLF/rippled/pull/4721)

- **fixFillOrKill**: Fixes an issue introduced in the `flowCross` amendment. The `tfFillOrKill` and `tfSell` flags are now properly handled to allow offers to cross in certain scenarios. [#4694](https://github.com/XRPLF/rippled/pull/4694)

- **API v2 released with these changes:**
  - Accepts currency codes in ASCII, using the full alphabet. [#4566](https://github.com/XRPLF/rippled/pull/4566)
  - Added test to verify the `check` field is a string. [#4630](https://github.com/XRPLF/rippled/pull/4630)
  - Added errors for malformed `account_tx` and `noripple_check` fields. [#4620](https://github.com/XRPLF/rippled/pull/4620)
  - Added errors for malformed `gateway_balances` and `channel_authorize` requests. [#4618](https://github.com/XRPLF/rippled/pull/4618)
  - Added a `DeliverMax` alias to `Amount` and removed `Amount`. [#4733](https://github.com/XRPLF/rippled/pull/4733)
  - Removed `tx_history` and `ledger_header` methods. Also updated `RPC::Handler` to allow for version-specific methods. [#4759](https://github.com/XRPLF/rippled/pull/4759)
  - Standardized the JSON serialization format of transactions. [#4727](https://github.com/XRPLF/rippled/issues/4727)
  - Bumped API support to v2, but kept the command-line interface for `rippled` and unit tests at v1. [#4803](https://github.com/XRPLF/rippled/pull/4803)
  - Standardized `ledger_index` to return as a number. [#4820](https://github.com/XRPLF/rippled/pull/4820)

- Added a `server_definitions` command that returns an SDK-compatible `definitions.json` file, generated from the `rippled` instance currently running. [#4703](https://github.com/XRPLF/rippled/pull/4703)

- Improved unit test command line input and run times. [#4634](https://github.com/XRPLF/rippled/pull/4634)

- Added the link compression setting to the `rippled-example.cfg` file. [#4753](https://github.com/XRPLF/rippled/pull/4753)

- Changed the reserved hook error code name from `tecHOOK_ERROR` to `tecHOOK_REJECTED`. [#4559](https://github.com/XRPLF/rippled/pull/4559)

### Bug Fixes and Performance Improvements

(These are behind-the-scenes improvements, such as internal changes to the code, which are not expected to impact end users.)

- Simplified `TxFormats` common fields logic. [#4637](https://github.com/XRPLF/rippled/pull/4637)

- Improved transaction throughput by asynchronously writing batches to *NuDB*. [#4503](https://github.com/XRPLF/rippled/pull/4503)

- Removed 2 unused functions. [#4708](https://github.com/XRPLF/rippled/pull/4708)

- Removed an unused variable that caused clang 14 build errors. [#4672](https://github.com/XRPLF/rippled/pull/4672)

- Fixed a comment about the return value of `LedgerHistory::fixIndex`. [#4574](https://github.com/XRPLF/rippled/pull/4574)

- Updated `secp256k1` to 0.3.2. [#4653](https://github.com/XRPLF/rippled/pull/4653)

- Removed built-in SNTP clock issues. [#4628](https://github.com/XRPLF/rippled/pull/4628)

- Fixed amendment flapping. This issue usually occurred when an amendment was on the verge of gaining majority, but a validator not in favor of the amendment went offline. [#4410](https://github.com/XRPLF/rippled/pull/4410)

- Fixed an ASan stack-use-after-scope issue. [#4676](https://github.com/XRPLF/rippled/pull/4676)

- Transactions and pseudo-transactions share the same `commonFields` again. [#4715](https://github.com/XRPLF/rippled/pull/4715)

- Reduced boilerplate in `applySteps.cpp`. When a new transactor is added, only one function needs to be modified now. [#4710](https://github.com/XRPLF/rippled/pull/4710)

- Removed an incorrect assert. [#4743](https://github.com/XRPLF/rippled/pull/4743)

- Replaced some asserts in `PeerFinder::Logic` with `LogicError` to better indicate the nature of server crashes. [#4562](https://github.com/XRPLF/rippled/pull/4562)

- Fixed an issue with enabling new amendments on a network with an ID greater than 1024. [#4737](https://github.com/XRPLF/rippled/pull/4737)

### Docs and Build System

- Updated `rippled-example.cfg` docs to clarify usage of *ssl_cert* vs *ssl_chain*. [#4667](https://github.com/XRPLF/rippled/pull/4667)

- Updated `BUILD.md`:
  - Made the `environment.md` link easier to find. Also made it easier to find platform-specific info. [#4507](https://github.com/XRPLF/rippled/pull/4507)
  - Fixed a typo. [#4718](https://github.com/XRPLF/rippled/pull/4718)
  - Updated the minimum compiler requirements. [#4700](https://github.com/XRPLF/rippled/pull/4700)
  - Added a note about enabling `XRPFees`. [#4741](https://github.com/XRPLF/rippled/pull/4741)

- Updated `API-CHANGELOG.md`:
  - Explained that API v2 is releasing with `rippled` 2.0.0. [#4633](https://github.com/XRPLF/rippled/pull/4633)
  - Clarified the location of the `signer_lists` field in the `account_info` response for API v2. [#4724](https://github.com/XRPLF/rippled/pull/4724)
  - Added documentation for the new `DeliverMax` field. [#4784](https://github.com/XRPLF/rippled/pull/4784)
  - Removed references to API v2 being "in progress" and "in beta". [#4828](https://github.com/XRPLF/rippled/pull/4828)
  - Clarified that all breaking API changes will now occur in API v3 or later. [#4773](https://github.com/XRPLF/rippled/pull/4773)

- Fixed a mistake in the overlay README. [#4635](https://github.com/XRPLF/rippled/pull/4635)

- Fixed an early return from `RippledRelease.cmake` that prevented targets from being created during packaging. [#4707](https://github.com/XRPLF/rippled/pull/4707)

- Fixed a build error with Intel Macs. [#4632](https://github.com/XRPLF/rippled/pull/4632)

- Added `.build` to `.gitignore`. [#4722](https://github.com/XRPLF/rippled/pull/4722)

- Fixed a `uint is not universally defined` Windows build error. [#4731](https://github.com/XRPLF/rippled/pull/4731)

- Re-enabled the Windows CI build with Artifactory support. [#4596](https://github.com/XRPLF/rippled/pull/4596)

- Fixed the output of the remote step in the Nix workflow. [#4746](https://github.com/XRPLF/rippled/pull/4746)

- Fixed a broken link in `conan.md`. [#4740](https://github.com/XRPLF/rippled/pull/4740)

- Added a `python` call to fix the `pip` upgrade command in Windows CI. [#4768](https://github.com/XRPLF/rippled/pull/4768)

- Added an API Impact section to `pull_request_template.md`. [#4757](https://github.com/XRPLF/rippled/pull/4757)

- Set permissions for the Doxygen workflow. [#4756](https://github.com/XRPLF/rippled/pull/4756)

- Switched to Unity builds to speed up Windows CI. [#4780](https://github.com/XRPLF/rippled/pull/4780)

- Clarified what makes consensus healthy in `FeeEscalation.md`. [#4729](https://github.com/XRPLF/rippled/pull/4729)

- Removed a dependency on the `<ranges>` header for unit tests. [#4788](https://github.com/XRPLF/rippled/pull/4788)

- Fixed a clang `unused-but-set-variable` warning. [#4677](https://github.com/XRPLF/rippled/pull/4677)

- Removed an unused Dockerfile. [#4791](https://github.com/XRPLF/rippled/pull/4791)

- Fixed unit tests to work with API v2. [#4785](https://github.com/XRPLF/rippled/pull/4785)

- Added support for the mold linker on Linux. [#4807](https://github.com/XRPLF/rippled/pull/4807)

- Updated the Linux distributions that `rippled` smoke tests run on. [#4813](https://github.com/XRPLF/rippled/pull/4813)

- Added codename `bookworm` to the distribution matrix during Artifactory uploads, enabling Debian 12 clients to install `rippled` packages. [#4836](https://github.com/XRPLF/rippled/pull/4836)

- Added a workaround for compilation errors with GCC 13 and other compilers relying on libstdc++ version 13. [#4817](https://github.com/XRPLF/rippled/pull/4817)

- Fixed a minor typo in the code comments of `AMMCreate.h`. [#4821](https://github.com/XRPLF/rippled/pull/4821)

### GitHub

The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.

We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.

## Credits

The following people contributed directly to this release:

- Bronek Kozicki <brok@incorrekt.com>
- Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
- Denis Angell <dangell@transia.co>
- Ed Hennis <ed@ripple.com>
- Elliot Lee <github.public@intelliot.com>
- Florent <36513774+florent-uzio@users.noreply.github.com>
- ForwardSlashBack <142098649+ForwardSlashBack@users.noreply.github.com>
- Gregory Tsipenyuk <gregtatcam@users.noreply.github.com>
- Howard Hinnant <howard.hinnant@gmail.com>
- Hussein Badakhchani <husseinb01@gmail.com>
- Jackson Mills <aim4math@gmail.com>
- John Freeman <jfreeman08@gmail.com>
- Manoj Doshi <mdoshi@ripple.com>
- Mark Pevec <mark.pevec@gmail.com>
- Mark Travis <mtrippled@users.noreply.github.com>
- Mayukha Vadari <mvadari@gmail.com>
- Michael Legleux <legleux@users.noreply.github.com>
- Nik Bougalis <nikb@bougalis.net>
- Peter Chen <34582813+PeterChen13579@users.noreply.github.com>
- Rome Reginelli <rome@ripple.com>
- Scott Determan <scott.determan@yahoo.com>
- Scott Schurr <scott@ripple.com>
- Sophia Xie <106177003+sophiax851@users.noreply.github.com>
- Stefan van Kessel <van_kessel@freenet.de>
- pwang200 <354723+pwang200@users.noreply.github.com>
- shichengsg002 <147461171+shichengsg002@users.noreply.github.com>
- sokkaofthewatertribe <140777955+sokkaofthewatertribe@users.noreply.github.com>

Bug Bounties and Responsible Disclosures:

We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.

To report a bug, please send a detailed report to: <bugs@xrpl.org>

# Introducing XRP Ledger version 1.12.0

Version 1.12.0 of `rippled`, the reference server implementation of the XRP Ledger protocol, is now available. This release adds new features and bug fixes, and introduces these amendments:

- `AMM`
- `Clawback`
- `fixReducedOffersV1`

[Sign Up for Future Release Announcements](https://groups.google.com/g/ripple-server)

<!-- BREAK -->

## Action Required

Three new amendments are now open for voting according to the XRP Ledger's [amendment process](https://xrpl.org/amendments.html), which enables protocol changes following two weeks of >80% support from trusted validators.

If you operate an XRP Ledger server, upgrade to version 1.12.0 by September 20, 2023 to ensure service continuity. The exact time that protocol changes take effect depends on the voting decisions of the decentralized network.

## Install / Upgrade

On supported platforms, see the [instructions on installing or updating `rippled`](https://xrpl.org/install-rippled.html).

The XRPL Foundation publishes portable binaries, which are drop-in replacements for the `rippled` daemon. [See information and downloads for the portable binaries](https://github.com/XRPLF/rippled-portable-builds#portable-builds-of-the-rippled-server). This will work on most distributions, including Ubuntu 16.04, 18.04, 20.04, and 22.04; CentOS; and others. Please test and open issues on GitHub if there are problems.

## Changelog

### Amendments, New Features, and Changes

(These are changes which may impact or be useful to end users. For example, you may be able to update your code/workflow to take advantage of these changes.)

- **`AMM`**: Introduces an automated market maker (AMM) protocol to the XRP Ledger's decentralized exchange, enabling you to trade assets without a counterparty. For more information about AMMs, see: [Automated Market Maker](https://opensource.ripple.com/docs/xls-30d-amm/amm-uc/). [#4294](https://github.com/XRPLF/rippled/pull/4294)

- **`Clawback`**: Adds a setting, *Allow Clawback*, which lets an issuer recover, or _claw back_, tokens that they previously issued. Issuers cannot enable this setting if they have issued tokens already. For additional documentation on this feature, see: [#4553](https://github.com/XRPLF/rippled/pull/4553).

- **`fixReducedOffersV1`**: Reduces the occurrence of order books that are blocked by reduced offers. [#4512](https://github.com/XRPLF/rippled/pull/4512)

- Added WebSocket and RPC port info to `server_info` responses. [#4427](https://github.com/XRPLF/rippled/pull/4427)

- Removed the deprecated `accepted`, `seqNum`, `hash`, and `totalCoins` fields from the `ledger` method. [#4244](https://github.com/XRPLF/rippled/pull/4244)

### Bug Fixes and Performance Improvements

(These are behind-the-scenes improvements, such as internal changes to the code, which are not expected to impact end users.)

- Added a pre-commit hook that runs the clang-format linter locally before committing changes. To install this feature, see: [CONTRIBUTING](https://github.com/XRPLF/xrpl-dev-portal/blob/master/CONTRIBUTING.md). [#4599](https://github.com/XRPLF/rippled/pull/4599)

- To make it more straightforward to catch and handle overflows, changed the output type of the `mulDiv()` function from `std::pair<bool, uint64_t>` to `std::optional` (see the sketch after this list). [#4243](https://github.com/XRPLF/rippled/pull/4243)

- Updated `Handler::Condition` enum values to make the code less brittle. [#4239](https://github.com/XRPLF/rippled/pull/4239)

- Renamed `ServerHandlerImp` to `ServerHandler`. [#4516](https://github.com/XRPLF/rippled/pull/4516), [#4592](https://github.com/XRPLF/rippled/pull/4592)

- Replaced hand-rolled code with `std::from_chars` for better maintainability. [#4473](https://github.com/XRPLF/rippled/pull/4473)

- Removed an unused `TypedField` move constructor. [#4567](https://github.com/XRPLF/rippled/pull/4567)
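
A rough illustration of how the `std::optional` return type from the `mulDiv()` change above reads at call sites. This is a sketch only; `value`, `mul`, `div`, `consume`, and the failure path are assumed stand-ins, not code from the PR.

```cpp
// Sketch: overflow handling becomes explicit at the call site.
// Before #4243 the result came back as std::pair<bool, uint64_t>.
if (auto const product = mulDiv(value, mul, div))
{
    // No overflow: dereference the optional to get the product.
    consume(*product);
}
else
{
    // Overflow: the empty optional forces the caller to handle it.
    return tecINTERNAL;  // hypothetical error path
}
```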

### Docs and Build System

- Updated checkout versions to resolve warnings during GitHub jobs. [#4598](https://github.com/XRPLF/rippled/pull/4598)

- Fixed an issue with the Debian package build. [#4591](https://github.com/XRPLF/rippled/pull/4591)

- Updated build instructions with additional steps to take after updating dependencies. [#4623](https://github.com/XRPLF/rippled/pull/4623)

- Updated the contributing doc to clarify that beta releases should also be pushed to the `release` branch. [#4589](https://github.com/XRPLF/rippled/pull/4589)

- Enabled the `BETA_RPC_API` flag in the default unit tests config, making the API v2 (beta) available to unit tests. [#4573](https://github.com/XRPLF/rippled/pull/4573)

- Conan dependency management:
  - Fixed package definitions for Conan. [#4485](https://github.com/XRPLF/rippled/pull/4485)
  - Updated build dependencies to the most recent versions in Conan Center. [#4595](https://github.com/XRPLF/rippled/pull/4595)
  - Updated the Conan recipe for NuDB. [#4615](https://github.com/XRPLF/rippled/pull/4615)

- Added binary hardening and linker flags to enhance security during the build process. [#4603](https://github.com/XRPLF/rippled/pull/4603)

- Added an Artifactory to the `nix` workflow to improve build times. [#4556](https://github.com/XRPLF/rippled/pull/4556)

- Added quality-of-life improvements to workflows, using new [concurrency control](https://docs.github.com/en/actions/using-jobs/using-concurrency) features. [#4597](https://github.com/XRPLF/rippled/pull/4597)

[Full Commit Log](https://github.com/XRPLF/rippled/compare/1.11.0...1.12.0)

### GitHub

The public source code repository for `rippled` is hosted on GitHub at <https://github.com/XRPLF/rippled>.

We welcome all contributions and invite everyone to join the community of XRP Ledger developers to help build the Internet of Value.

## Credits

The following people contributed directly to this release:

- Alphonse N. Mousse <39067955+a-noni-mousse@users.noreply.github.com>
- Arihant Kothari <arihantkothari17@gmail.com>
- Chenna Keshava B S <21219765+ckeshava@users.noreply.github.com>
- Denis Angell <dangell@transia.co>
- Ed Hennis <ed@ripple.com>
- Elliot Lee <github.public@intelliot.com>
- Gregory Tsipenyuk <gregtatcam@users.noreply.github.com>
- Howard Hinnant <howard.hinnant@gmail.com>
- Ikko Eltociear Ashimine <eltociear@gmail.com>
- John Freeman <jfreeman08@gmail.com>
- Manoj Doshi <mdoshi@ripple.com>
- Mark Travis <mtravis@ripple.com>
- Mayukha Vadari <mvadari@gmail.com>
- Michael Legleux <legleux@users.noreply.github.com>
- Peter Chen <34582813+PeterChen13579@users.noreply.github.com>
- RichardAH <richard.holland@starstone.co.nz>
- Rome Reginelli <rome@ripple.com>
- Scott Schurr <scott@ripple.com>
- Shawn Xie <35279399+shawnxie999@users.noreply.github.com>
- drlongle <drlongle@gmail.com>

Bug Bounties and Responsible Disclosures:

We welcome reviews of the `rippled` code and urge researchers to responsibly disclose any issues they may find.

To report a bug, please send a detailed report to: <bugs@xrpl.org>

# Introducing XRP Ledger version 1.11.0

179
TEST_FILES_TO_FIX.md
Normal file
@@ -0,0 +1,179 @@

# Test Files That Need hash_options Fixes

## How to Check Compilation Errors

Use the `compile_single_v2.py` script to check individual files:

```bash
# Check compilation errors for a specific file
./compile_single_v2.py src/test/app/SomeFile_test.cpp -e 3 --errors-only

# Get just the last few lines to see if it compiled successfully
./compile_single_v2.py src/test/app/SomeFile_test.cpp 2>&1 | tail -5
```

## Originally Fixed Files (11 files)

1. **src/test/app/Import_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

2. **src/test/app/LedgerReplay_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

3. **src/test/app/Offer_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

4. **src/test/app/SetHook_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

5. **src/test/app/SetHookTSH_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

6. **src/test/app/ValidatorList_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

7. **src/test/app/XahauGenesis_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options, keylet::fees() needs hash_options

8. **src/test/consensus/NegativeUNL_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

9. **src/test/consensus/UNLReport_test.cpp**
   - Status: Needs fixing
   - Errors: keylet functions missing hash_options

10. **src/test/jtx/impl/balance.cpp**
    - Status: Needs fixing
    - Errors: keylet functions missing hash_options

11. **src/test/jtx/impl/Env.cpp**
    - Status: Needs fixing
    - Errors: keylet functions missing hash_options

## Fix Strategy

Each file needs:

1. keylet function calls updated to include `hash_options{ledger_seq, classifier}`
2. The `ledger_seq` typically comes from `env.current()->seq()` or `view.seq()`
3. The classifier matches the keylet type (e.g., `KEYLET_ACCOUNT`, `KEYLET_FEES`, etc.)

## Progress Tracking

- [x] Import_test.cpp - FIXED
- [x] LedgerReplay_test.cpp - FIXED
- [x] Offer_test.cpp - FIXED
- [x] SetHook_test.cpp - FIXED
- [x] SetHookTSH_test.cpp - FIXED
- [x] ValidatorList_test.cpp - FIXED (sha512Half calls updated with VALIDATOR_LIST_HASH classifier)
- [x] XahauGenesis_test.cpp - FIXED (removed duplicate hash_options parameters)
- [x] NegativeUNL_test.cpp - FIXED
- [x] UNLReport_test.cpp - FIXED
- [x] balance.cpp - FIXED
- [x] Env.cpp - FIXED

## All original 11 files have been successfully fixed!

## Remaining Files Still Needing Fixes (9 files)

### Status: NOT STARTED

These files still have compilation errors and need hash_options fixes:

1. **src/test/jtx/impl/uritoken.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/jtx/impl/uritoken.cpp -e 3 --errors-only`

2. **src/test/jtx/impl/utility.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/jtx/impl/utility.cpp -e 3 --errors-only`

3. **src/test/ledger/Directory_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/ledger/Directory_test.cpp -e 3 --errors-only`

4. **src/test/ledger/Invariants_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/ledger/Invariants_test.cpp -e 3 --errors-only`

5. **src/test/overlay/compression_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/overlay/compression_test.cpp -e 3 --errors-only`

6. **src/test/rpc/AccountSet_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/rpc/AccountSet_test.cpp -e 3 --errors-only`

7. **src/test/rpc/AccountTx_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/rpc/AccountTx_test.cpp -e 3 --errors-only`

8. **src/test/rpc/Book_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/rpc/Book_test.cpp -e 3 --errors-only`

9. **src/test/rpc/Catalogue_test.cpp**
   - Status: Needs fixing
   - Check errors: `./compile_single_v2.py src/test/rpc/Catalogue_test.cpp -e 3 --errors-only`

## CRITICAL INSTRUCTIONS FOR FIXING

### 1. Read the HashContext enum from digest.h

**ALWAYS** check @src/ripple/protocol/digest.h lines 37-107 for the complete HashContext enum.
This enum defines ALL the valid classifiers you can use in hash_options.

### 2. Understanding hash_options constructor

The hash_options struct (lines 110-126 in digest.h) has TWO constructors (sketched after this list):

- `hash_options(HashContext ctx)` - classifier only, no ledger index
- `hash_options(std::uint32_t li, HashContext ctx)` - ledger index AND classifier
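
Based on those two constructors, the struct presumably looks roughly like this. This is a sketch for orientation only, not the actual `digest.h` source; the field names are assumptions.

```cpp
// Sketch of hash_options as described above; see digest.h lines 110-126
// for the real definition.
struct hash_options
{
    std::uint32_t ledger_index = 0;  // optional ledger sequence
    HashContext context;             // required classifier

    // Classifier only, no ledger index.
    explicit hash_options(HashContext ctx) : context(ctx)
    {
    }

    // Ledger index AND classifier.
    hash_options(std::uint32_t li, HashContext ctx)
        : ledger_index(li), context(ctx)
    {
    }
};
```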

### 3. How to classify each hash operation

#### For keylet functions:

- Match the keylet function name to the KEYLET_* enum value
- Examples (see also the sketch after this list):
  - `keylet::account()` → use `KEYLET_ACCOUNT`
  - `keylet::fees()` → use `KEYLET_FEES`
  - `keylet::trustline()` → use `KEYLET_TRUSTLINE`
  - `keylet::negativeUNL()` → use `KEYLET_NEGATIVE_UNL`
  - `keylet::UNLReport()` → use `KEYLET_UNL_REPORT`
  - `keylet::hook()` → use `KEYLET_HOOK`
  - `keylet::uriToken()` → use `KEYLET_URI_TOKEN`
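
Putting those mappings together with a ledger sequence, calls end up looking like this. A sketch that assumes the migrated keylet signatures described in this document; `view` and `alice` are placeholders.

```cpp
// Sketch: pair each keylet function with its matching classifier.
auto const acctKey = keylet::account(
    hash_options{view.seq(), KEYLET_ACCOUNT}, alice);
auto const feesKey = keylet::fees(
    hash_options{view.seq(), KEYLET_FEES});
auto const nUnlKey = keylet::negativeUNL(
    hash_options{view.seq(), KEYLET_NEGATIVE_UNL});
```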

#### For sha512Half calls:

- Validator manifests/lists → use `VALIDATOR_LIST_HASH`
- Hook code hashing → use `HOOK_DEFINITION` or `LEDGER_INDEX_UNNEEDED`
- Network protocol → use appropriate context from enum

#### For test environments:

- Use `env.current()->seq()` to get the ledger sequence (it's already `uint32_t`, NO CAST NEEDED)
- Use `ledger->seq()` for Ledger pointers
- Use `view.seq()` for ReadView/ApplyView references

### 4. IMPORTANT: Read the entire file first!

When fixing a file, ALWAYS:

1. Read the ENTIRE file first (or at least 500+ lines) to understand the context
2. Look for patterns of how the test is structured
3. Check what types of ledger objects are being tested
4. Then fix ALL occurrences systematically

### 5. Common patterns to fix:

```cpp
// OLD - missing hash_options
env.le(keylet::account(alice));

// NEW - with proper classification
env.le(keylet::account(hash_options{env.current()->seq(), KEYLET_ACCOUNT}, alice));

// OLD - sha512Half without context
auto hash = sha512Half(data);

// NEW - with proper classification
auto hash = sha512Half(hash_options{VALIDATOR_LIST_HASH}, data);
```
204
analyze_keylet_calls.py
Normal file
@@ -0,0 +1,204 @@
#!/usr/bin/env python3

import re
import os
from pathlib import Path
from collections import defaultdict
from typing import Set, Dict, List, Tuple


def find_keylet_calls(root_dir: str) -> Tuple[Dict[str, List[Tuple[str, int, str]]], Set[str]]:
    """
    Find all keylet:: function calls with hash_options as first parameter.
    Returns a dict mapping keylet function names to lists of
    (file, line, full_match) tuples, plus the set of unique
    hash_options{...} contents.
    """
    # Pattern to match keylet::<function>(hash_options{...}, ...) calls
    # This captures:
    #   1. The keylet function name
    #   2. The entire first argument (hash_options{...})
    #   3. The content inside hash_options{...}
    pattern = re.compile(
        r'keylet::(\w+)\s*\(\s*(hash_options\s*\{([^}]*)\})',
        re.MULTILINE | re.DOTALL
    )

    results = defaultdict(list)
    unique_first_args = set()

    # Walk through all C++ source files
    for root, dirs, files in os.walk(Path(root_dir) / "src" / "ripple"):
        # Skip certain directories
        dirs[:] = [d for d in dirs if d not in ['.git', 'build', '__pycache__']]

        for file in files:
            if file.endswith(('.cpp', '.h', '.hpp')):
                filepath = os.path.join(root, file)
                try:
                    with open(filepath, 'r', encoding='utf-8', errors='ignore') as f:
                        content = f.read()

                    # Find all matches in this file
                    for match in pattern.finditer(content):
                        func_name = match.group(1)
                        full_first_arg = match.group(2)
                        inner_content = match.group(3).strip()

                        # Get line number
                        line_num = content[:match.start()].count('\n') + 1

                        # Store the result
                        rel_path = os.path.relpath(filepath, root_dir)
                        results[func_name].append((rel_path, line_num, full_first_arg))
                        unique_first_args.add(inner_content)

                except Exception as e:
                    print(f"Error reading {filepath}: {e}")

    return results, unique_first_args


def analyze_hash_options_content(unique_args: Set[str]) -> Tuple[Dict[str, int], List[str]]:
    """Analyze the content of hash_options{...} arguments."""
    categories = {
        'literal_0': 0,
        'literal_number': 0,
        'view_seq': 0,
        'ledger_seq': 0,
        'info_seq': 0,
        'ctx_view_seq': 0,
        'sb_seq': 0,
        'env_current_seq': 0,
        'other': 0
    }

    other_patterns = []

    for arg in unique_args:
        arg_clean = arg.strip()

        if arg_clean == '0':
            categories['literal_0'] += 1
        elif arg_clean.isdigit():
            categories['literal_number'] += 1
        # Check the more specific 'ctx.view.seq()' before the general
        # 'view.seq()', which would otherwise shadow it as a substring.
        elif 'ctx.view.seq()' in arg_clean:
            categories['ctx_view_seq'] += 1
        elif 'view.seq()' in arg_clean or 'view().seq()' in arg_clean:
            categories['view_seq'] += 1
        elif 'ledger->seq()' in arg_clean or 'ledger.seq()' in arg_clean:
            categories['ledger_seq'] += 1
        elif 'info.seq' in arg_clean or 'info_.seq' in arg_clean:
            categories['info_seq'] += 1
        elif 'sb.seq()' in arg_clean:
            categories['sb_seq'] += 1
        elif 'env.current()->seq()' in arg_clean:
            categories['env_current_seq'] += 1
        else:
            categories['other'] += 1
            other_patterns.append(arg_clean)

    return categories, other_patterns


def print_report(results: Dict[str, List], unique_args: Set[str]):
    """Print a detailed report of findings."""
    print("=" * 80)
    print("KEYLET FUNCTION CALL ANALYSIS")
    print("=" * 80)

    # Summary
    total_calls = sum(len(calls) for calls in results.values())
    print(f"\nTotal keylet calls found: {total_calls}")
    print(f"Unique keylet functions: {len(results)}")
    print(f"Unique hash_options arguments: {len(unique_args)}")

    # Function frequency
    print("\n" + "=" * 80)
    print("KEYLET FUNCTIONS BY FREQUENCY:")
    print("=" * 80)

    sorted_funcs = sorted(results.items(), key=lambda x: len(x[1]), reverse=True)
    for func_name, calls in sorted_funcs[:20]:  # Top 20
        print(f"  {func_name:30} {len(calls):4} calls")

    if len(sorted_funcs) > 20:
        print(f"  ... and {len(sorted_funcs) - 20} more functions")

    # Analyze hash_options content
    print("\n" + "=" * 80)
    print("HASH_OPTIONS ARGUMENT PATTERNS:")
    print("=" * 80)

    categories, other_patterns = analyze_hash_options_content(unique_args)

    for category, count in sorted(categories.items(), key=lambda x: x[1], reverse=True):
        if count > 0:
            print(f"  {category:25} {count:4} occurrences")

    if other_patterns:
        print("\n" + "=" * 80)
        print("OTHER PATTERNS (need review):")
        print("=" * 80)
        for i, pattern in enumerate(sorted(set(other_patterns))[:10], 1):
            # Truncate long patterns
            display = pattern if len(pattern) <= 60 else pattern[:57] + "..."
            print(f"  {i:2}. {display}")

    # Sample calls for most common functions
    print("\n" + "=" * 80)
    print("SAMPLE CALLS (top 5 functions):")
    print("=" * 80)

    for func_name, calls in sorted_funcs[:5]:
        print(f"\n{func_name}:")
        for filepath, line_num, arg in calls[:3]:  # Show first 3 examples
            print(f"  {filepath}:{line_num}")
            print(f"    {arg}")
        if len(calls) > 3:
            print(f"  ... and {len(calls) - 3} more")


def generate_replacement_script(results: Dict[str, List], unique_args: Set[str]):
    """Generate a script to help with replacements."""
    print("\n" + "=" * 80)
    print("SUGGESTED MIGRATION STRATEGY:")
    print("=" * 80)

    print("""
The goal is to migrate from:
    keylet::func(hash_options{ledger_seq})

To either:
    keylet::func(hash_options{ledger_seq, KEYLET_CLASSIFIER})

Where KEYLET_CLASSIFIER would be a specific HashContext enum value
based on the keylet function type.

Suggested mappings:
- keylet::account() -> LEDGER_HEADER_HASH (or new KEYLET_ACCOUNT)
- keylet::line() -> LEDGER_HEADER_HASH (or new KEYLET_TRUSTLINE)
- keylet::offer() -> LEDGER_HEADER_HASH (or new KEYLET_OFFER)
- keylet::ownerDir() -> LEDGER_HEADER_HASH (or new KEYLET_OWNER_DIR)
- keylet::page() -> LEDGER_HEADER_HASH (or new KEYLET_DIR_PAGE)
- keylet::fees() -> LEDGER_HEADER_HASH (or new KEYLET_FEES)
- keylet::amendments() -> LEDGER_HEADER_HASH (or new KEYLET_AMENDMENTS)
- keylet::check() -> LEDGER_HEADER_HASH (or new KEYLET_CHECK)
- keylet::escrow() -> LEDGER_HEADER_HASH (or new KEYLET_ESCROW)
- keylet::payChan() -> LEDGER_HEADER_HASH (or new KEYLET_PAYCHAN)
- keylet::signers() -> LEDGER_HEADER_HASH (or new KEYLET_SIGNERS)
- keylet::ticket() -> LEDGER_HEADER_HASH (or new KEYLET_TICKET)
- keylet::nftpage_*() -> LEDGER_HEADER_HASH (or new KEYLET_NFT_PAGE)
- keylet::nftoffer() -> LEDGER_HEADER_HASH (or new KEYLET_NFT_OFFER)
- keylet::depositPreauth() -> LEDGER_HEADER_HASH (or new KEYLET_DEPOSIT_PREAUTH)
""")


if __name__ == "__main__":
    # Get the project root directory
    project_root = "/Users/nicholasdudfield/projects/xahaud-worktrees/xahaud-map-stats-rpc"

    print(f"Analyzing keylet calls in: {project_root}")
    print("This may take a moment...\n")

    # Find all keylet calls
    results, unique_args = find_keylet_calls(project_root)

    # Print the report
    print_report(results, unique_args)

    # Generate replacement suggestions
    generate_replacement_script(results, unique_args)
218
bin/physical.sh
@@ -1,218 +0,0 @@
#!/bin/bash

set -o errexit

marker_base=34be0ce4fef20c978df2923c29321ad6cc17facc
marker_commit=${1:-${marker_base}}

if [ $(git merge-base ${marker_commit} ${marker_base}) != ${marker_base} ]; then
  echo "first marker commit not an ancestor: ${marker_commit}"
  exit 1
fi

if [ $(git merge-base ${marker_commit} HEAD) != $(git rev-parse --verify ${marker_commit}) ]; then
  echo "given marker commit not an ancestor: ${marker_commit}"
  exit 1
fi

if [ -e Builds/CMake ]; then
  echo move CMake
  git mv Builds/CMake cmake
  git add --update .
  git commit -m 'Move CMake directory' --author 'Pretty Printer <cpp@ripple.com>'
fi

if [ -e src/ripple ]; then

  echo move protocol buffers
  mkdir -p include/xrpl
  if [ -e src/ripple/proto ]; then
    git mv src/ripple/proto include/xrpl
  fi

  extract_list() {
    git show ${marker_commit}:Builds/CMake/RippledCore.cmake | \
      awk "/END ${1}/ { p = 0 } p && /src\/ripple/; /BEGIN ${1}/ { p = 1 }" | \
      sed -e 's#src/ripple/##' -e 's#[^a-z]\+$##'
  }

  move_files() {
    oldroot="$1"; shift
    newroot="$1"; shift
    detail="$1"; shift
    files=("$@")
    for file in ${files[@]}; do
      if [ ! -e ${oldroot}/${file} ]; then
        continue
      fi
      dir=$(dirname ${file})
      if [ $(basename ${dir}) == 'details' ]; then
        dir=$(dirname ${dir})
      fi
      if [ $(basename ${dir}) == 'impl' ]; then
        dir="$(dirname ${dir})/${detail}"
      fi
      mkdir -p ${newroot}/${dir}
      git mv ${oldroot}/${file} ${newroot}/${dir}
    done
  }

  echo move libxrpl headers
  files=$(extract_list 'LIBXRPL HEADERS')
  files+=(
    basics/SlabAllocator.h

    beast/asio/io_latency_probe.h
    beast/container/aged_container.h
    beast/container/aged_container_utility.h
    beast/container/aged_map.h
    beast/container/aged_multimap.h
    beast/container/aged_multiset.h
    beast/container/aged_set.h
    beast/container/aged_unordered_map.h
    beast/container/aged_unordered_multimap.h
    beast/container/aged_unordered_multiset.h
    beast/container/aged_unordered_set.h
    beast/container/detail/aged_associative_container.h
    beast/container/detail/aged_container_iterator.h
    beast/container/detail/aged_ordered_container.h
    beast/container/detail/aged_unordered_container.h
    beast/container/detail/empty_base_optimization.h
    beast/core/LockFreeStack.h
    beast/insight/Collector.h
    beast/insight/Counter.h
    beast/insight/CounterImpl.h
    beast/insight/Event.h
    beast/insight/EventImpl.h
    beast/insight/Gauge.h
    beast/insight/GaugeImpl.h
    beast/insight/Group.h
    beast/insight/Groups.h
    beast/insight/Hook.h
    beast/insight/HookImpl.h
    beast/insight/Insight.h
    beast/insight/Meter.h
    beast/insight/MeterImpl.h
    beast/insight/NullCollector.h
    beast/insight/StatsDCollector.h
    beast/test/fail_counter.h
    beast/test/fail_stream.h
    beast/test/pipe_stream.h
    beast/test/sig_wait.h
    beast/test/string_iostream.h
    beast/test/string_istream.h
    beast/test/string_ostream.h
    beast/test/test_allocator.h
    beast/test/yield_to.h
    beast/utility/hash_pair.h
    beast/utility/maybe_const.h
    beast/utility/temp_dir.h

    # included by only json/impl/json_assert.h
    json/json_errors.h

    protocol/PayChan.h
    protocol/RippleLedgerHash.h
    protocol/messages.h
    protocol/st.h
  )
  files+=(
    basics/README.md
    crypto/README.md
    json/README.md
    protocol/README.md
    resource/README.md
  )
  move_files src/ripple include/xrpl detail ${files[@]}

  echo move libxrpl sources
  files=$(extract_list 'LIBXRPL SOURCES')
  move_files src/ripple src/libxrpl "" ${files[@]}

  echo check leftovers
  dirs=$(cd include/xrpl; ls -d */)
  dirs=$(cd src/ripple; ls -d ${dirs} 2>/dev/null || true)
  files="$(cd src/ripple; find ${dirs} -type f)"
  if [ -n "${files}" ]; then
    echo "leftover files:"
    echo ${files}
    exit
  fi

  echo remove empty directories
  empty_dirs="$(cd src/ripple; find ${dirs} -depth -type d)"
  for dir in ${empty_dirs[@]}; do
    if [ -e ${dir} ]; then
      rmdir ${dir}
    fi
  done

  echo move xrpld sources
  files=$(
    extract_list 'XRPLD SOURCES'
    cd src/ripple
    find * -regex '.*\.\(h\|ipp\|md\|pu\|uml\|png\)'
  )
  move_files src/ripple src/xrpld detail ${files[@]}

  files="$(cd src/ripple; find . -type f)"
  if [ -n "${files}" ]; then
    echo "leftover files:"
    echo ${files}
    exit
  fi

fi

rm -rf src/ripple

echo rename .hpp to .h
find include src -name '*.hpp' -exec bash -c 'f="{}"; git mv "${f}" "${f%hpp}h"' \;

echo move PerfLog.h
if [ -e include/xrpl/basics/PerfLog.h ]; then
  git mv include/xrpl/basics/PerfLog.h src/xrpld/perflog
fi

# Make sure all protobuf includes have the correct prefix.
protobuf_replace='s:^#include\s*["<].*org/xrpl\([^">]\+\)[">]:#include <xrpl/proto/org/xrpl\1>:'
# Make sure first-party includes use angle brackets and .h extension.
ripple_replace='s:include\s*["<]ripple/\(.*\)\.h\(pp\)\?[">]:include <ripple/\1.h>:'
beast_replace='s:include\s*<beast/:include <xrpl/beast/:'
# Rename impl directories to detail.
impl_rename='s:\(<xrpl.*\)/impl\(/details\)\?/:\1/detail/:'

echo rewrite includes in libxrpl
find include/xrpl src/libxrpl -type f -exec sed -i \
  -e "${protobuf_replace}" \
  -e "${ripple_replace}" \
  -e "${beast_replace}" \
  -e 's:^#include <ripple/:#include <xrpl/:' \
  -e "${impl_rename}" \
  {} +

echo rewrite includes in xrpld
# https://www.baeldung.com/linux/join-multiple-lines
libxrpl_dirs="$(cd include/xrpl; ls -d1 */ | sed 's:/$::')"
# libxrpl_dirs='a\nb\nc\n'
readarray -t libxrpl_dirs <<< "${libxrpl_dirs}"
# libxrpl_dirs=(a b c)
libxrpl_dirs=$(printf -v txt '%s\\|' "${libxrpl_dirs[@]}"; echo "${txt%\\|}")
# libxrpl_dirs='a\|b\|c'
find src/xrpld src/test -type f -exec sed -i \
  -e "${protobuf_replace}" \
  -e "${ripple_replace}" \
  -e "${beast_replace}" \
  -e "s:^#include <ripple/basics/PerfLog.h>:#include <xrpld/perflog/PerfLog.h>:" \
  -e "s:^#include <ripple/\(${libxrpl_dirs}\)/:#include <xrpl/\1/:" \
  -e 's:^#include <ripple/:#include <xrpld/:' \
  -e "${impl_rename}" \
  {} +

git commit -m 'Rearrange sources' --author 'Pretty Printer <cpp@ripple.com>'
find include src -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format-10 -i {} +
git add --update .
git commit -m 'Rewrite includes' --author 'Pretty Printer <cpp@ripple.com>'
./Builds/levelization/levelization.sh
git add --update .
git commit -m 'Recompute loops' --author 'Pretty Printer <cpp@ripple.com>'
@@ -27,8 +27,8 @@ if [[ "$?" -ne "0" ]]; then
exit 127
fi

perl -i -pe "s/^(\\s*)-DBUILD_SHARED_LIBS=OFF/\\1-DBUILD_SHARED_LIBS=OFF\\n\\1-DROCKSDB_BUILD_SHARED=OFF/g" cmake/deps/Rocksdb.cmake &&
mv cmake/deps/WasmEdge.cmake cmake/deps/WasmEdge.old &&
perl -i -pe "s/^(\\s*)-DBUILD_SHARED_LIBS=OFF/\\1-DBUILD_SHARED_LIBS=OFF\\n\\1-DROCKSDB_BUILD_SHARED=OFF/g" Builds/CMake/deps/Rocksdb.cmake &&
mv Builds/CMake/deps/WasmEdge.cmake Builds/CMake/deps/WasmEdge.old &&
echo "find_package(LLVM REQUIRED CONFIG)
message(STATUS \"Found LLVM \${LLVM_PACKAGE_VERSION}\")
message(STATUS \"Using LLVMConfig.cmake in: \${LLVM_DIR}\")
@@ -37,7 +37,7 @@ set_target_properties(wasmedge PROPERTIES IMPORTED_LOCATION \${WasmEdge_LIB})
target_link_libraries (ripple_libs INTERFACE wasmedge)
add_library (wasmedge::wasmedge ALIAS wasmedge)
message(\"WasmEdge DONE\")
" > cmake/deps/WasmEdge.cmake &&
" > Builds/CMake/deps/WasmEdge.cmake &&
git checkout src/ripple/protocol/impl/BuildInfo.cpp &&
sed -i s/\"0.0.0\"/\"$(date +%Y).$(date +%-m).$(date +%-d)-$(git rev-parse --abbrev-ref HEAD)+$4\"/g src/ripple/protocol/impl/BuildInfo.cpp &&
cd release-build &&
@@ -69,8 +69,8 @@ fi
cd ..;

mv src/ripple/net/impl/RegisterSSLCerts.cpp.old src/ripple/net/impl/RegisterSSLCerts.cpp;
mv cmake/deps/Rocksdb.cmake.old cmake/deps/Rocksdb.cmake;
mv cmake/deps/WasmEdge.old cmake/deps/WasmEdge.cmake;
mv Builds/CMake/deps/Rocksdb.cmake.old Builds/CMake/deps/Rocksdb.cmake;
mv Builds/CMake/deps/WasmEdge.old Builds/CMake/deps/WasmEdge.cmake;

echo "END INSIDE CONTAINER - CORE"

@@ -155,7 +155,7 @@ cp -r include/api/wasmedge /usr/include/ &&
cd /io/ &&
echo "-- Build Rippled --" &&
pwd &&
cp cmake/deps/Rocksdb.cmake cmake/deps/Rocksdb.cmake.old &&
cp Builds/CMake/deps/Rocksdb.cmake Builds/CMake/deps/Rocksdb.cmake.old &&

echo "MOVING TO [ build-core.sh ]"
cd /io;

@@ -13,7 +13,7 @@
|
||||
#
|
||||
# 4. HTTPS Client
|
||||
#
|
||||
# 5. <vacated>
|
||||
# 5. Reporting Mode
|
||||
#
|
||||
# 6. Database
|
||||
#
|
||||
@@ -283,14 +283,12 @@
|
||||
# ssl_cert
|
||||
#
|
||||
# Specifies the path to the SSL certificate file in PEM format.
|
||||
# This is not needed if the chain includes it. Use ssl_chain if
|
||||
# your certificate includes one or more intermediates.
|
||||
# This is not needed if the chain includes it.
|
||||
#
|
||||
# ssl_chain
|
||||
#
|
||||
# If you need a certificate chain, specify the path to the
|
||||
# certificate chain here. The chain may include the end certificate.
|
||||
# This must be used if the certificate includes intermediates.
|
||||
#
|
||||
# ssl_ciphers = <cipherlist>
|
||||
#
|
||||
@@ -389,21 +387,6 @@
|
||||
#
|
||||
#
|
||||
#
|
||||
# [compression]
|
||||
#
|
||||
# true or false
|
||||
#
|
||||
# true - enables compression
|
||||
# false - disables compression [default].
|
||||
#
|
||||
# The rippled server can save bandwidth by compressing its peer-to-peer communications,
|
||||
# at a cost of greater CPU usage. If you enable link compression,
|
||||
# the server automatically compresses communications with peer servers
|
||||
# that also have link compression enabled.
|
||||
# https://xrpl.org/enable-link-compression.html
|
||||
#
|
||||
#
|
||||
#
|
||||
# [ips]
|
||||
#
|
||||
# List of hostnames or ips where the Ripple protocol is served. A default
|
||||
@@ -478,6 +461,19 @@
|
||||
#
|
||||
#
|
||||
#
|
||||
# [sntp_servers]
|
||||
#
|
||||
# IP address or domain of NTP servers to use for time synchronization.
|
||||
#
|
||||
# These NTP servers are suitable for rippled servers located in the United
|
||||
# States:
|
||||
# time.windows.com
|
||||
# time.apple.com
|
||||
# time.nist.gov
|
||||
# pool.ntp.org
|
||||
#
|
||||
#
|
||||
#
|
||||
# [max_transactions]
|
||||
#
|
||||
# Configure the maximum number of transactions to have in the job queue
|
||||
@@ -884,6 +880,119 @@
|
||||
#
|
||||
#-------------------------------------------------------------------------------
|
||||
#
|
||||
# 5. Reporting Mode
|
||||
#
|
||||
#------------
|
||||
#
|
||||
# rippled has an optional operating mode called Reporting Mode. In Reporting
|
||||
# Mode, rippled does not connect to the peer to peer network. Instead, rippled
|
||||
# will continuously extract data from one or more rippled servers that are
|
||||
# connected to the peer to peer network (referred to as an ETL source).
|
||||
# Reporting mode servers will forward RPC requests that require access to the
|
||||
# peer to peer network (submit, fee, etc) to an ETL source.
|
||||
#
|
||||
# [reporting] Settings for Reporting Mode. If and only if this section is
|
||||
# present, rippled will start in reporting mode. This section
|
||||
# contains a list of ETL source names, and key-value pairs. The
|
||||
# ETL source names each correspond to a configuration file
|
||||
# section; the names must match exactly. The key-value pairs are
|
||||
# optional.
|
||||
#
|
||||
#
|
||||
# [<name>]
|
||||
#
|
||||
# A series of key/value pairs that specify an ETL source.
|
||||
#
|
||||
# source_ip = <IP-address>
|
||||
#
|
||||
# Required. IP address of the ETL source. Can also be a DNS record.
|
||||
#
|
||||
# source_ws_port = <number>
|
||||
#
|
||||
# Required. Port on which ETL source is accepting unencrypted websocket
|
||||
# connections.
|
||||
#
|
||||
# source_grpc_port = <number>
|
||||
#
|
||||
# Required for ETL. Port on which ETL source is accepting gRPC requests.
|
||||
# If this option is ommitted, this ETL source cannot actually be used for
|
||||
# ETL; the Reporting Mode server can still forward RPCs to this ETL
|
||||
# source, but cannot extract data from this ETL source.
|
#
#
#   Key-value pairs (all optional):
#
#   read_only      Valid values: 0, 1. Default is 0. If set to 1, the server
#                  will start in strict read-only mode, and will not perform
#                  ETL. The server will still handle RPC requests, and will
#                  still forward RPC requests that require access to the p2p
#                  network.
#
#   start_sequence
#                  Sequence of first ledger to extract if the database is empty.
#                  ETL extracts ledgers in order. If this setting is absent and
#                  the database is empty, ETL will start with the next ledger
#                  validated by the network. If this setting is present and the
#                  database is not empty, an exception is thrown.
#
#   num_markers    Degree of parallelism used during the initial ledger
#                  download. Only used if the database is empty. Valid values
#                  are 1-256. A higher degree of parallelism results in a
#                  faster download, but puts more load on the ETL source.
#                  Default is 2.
#
# Example:
#
# [reporting]
# etl_source1
# etl_source2
# read_only=0
# start_sequence=32570
# num_markers=8
#
# [etl_source1]
# source_ip=1.2.3.4
# source_ws_port=6005
# source_grpc_port=50051
#
# [etl_source2]
# source_ip=5.6.7.8
# source_ws_port=6005
# source_grpc_port=50051
#
# Minimal Example:
#
# [reporting]
# etl_source1
#
# [etl_source1]
# source_ip=1.2.3.4
# source_ws_port=6005
# source_grpc_port=50051
#
#
# Notes:
#
# Reporting Mode requires Postgres (instead of SQLite). The Postgres
# connection info is specified under the [ledger_tx_tables] config section;
# see the Database section for further documentation.
#
# Each ETL source specified must have gRPC enabled (by adding a [port_grpc]
# section to the config). It is recommended to add a secure_gateway entry to
# the gRPC section, in order to bypass the server's rate limiting.
# This section needs to be added to the config of the ETL source, not
# the config of the reporting node. In the example below, the
# reporting server is running at 127.0.0.1. Multiple IPs can be
# specified in secure_gateway via a comma separated list.
#
# [port_grpc]
# ip = 0.0.0.0
# port = 50051
# secure_gateway = 127.0.0.1
#
#
#-------------------------------------------------------------------------------
#
# 6. Database
#
#------------
@@ -891,7 +1000,13 @@
#   rippled creates 4 SQLite databases to hold bookkeeping information
#   about transactions, local credentials, and various other things.
#   It also creates the NodeDB, which holds all the objects that
#   make up the current and historical ledgers.
#   make up the current and historical ledgers. In Reporting Mode, rippled
#   uses a Postgres database instead of SQLite.
#
#   The simplest way to work with Postgres is to install it locally.
#   When it is running, execute the initdb.sh script in the current
#   directory as: sudo -u postgres ./initdb.sh
#   This will create the rippled user and an empty database of the same name.
#
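#   For reference, a minimal sketch of the equivalent manual setup, assuming
#   the standard Postgres client tools (the initdb.sh script itself is
#   authoritative):
#
#       sudo -u postgres createuser rippled
#       sudo -u postgres createdb --owner=rippled rippled
#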
#   The size of the NodeDB grows in proportion to the amount of new data and the
#   amount of historical data (a configurable setting) so the performance of the
@@ -933,6 +1048,14 @@
#       keeping full history is not advised, and using online delete is
#       recommended.
#
#       type = Cassandra
#
#       Apache Cassandra is an open-source, distributed key-value store - see
#       https://cassandra.apache.org/ for more details.
#
#       Cassandra is an alternative backend to be used only with Reporting Mode.
#       See the Reporting Mode section for more details about Reporting Mode.
#
#       type = RWDB
#
#       RWDB is a high-performance memory store written by XRPL-Labs and optimized
@@ -948,6 +1071,21 @@
#
#       path            Location to store the database
#
#       Required keys for Cassandra:
#
#       contact_points  IP of a node in the Cassandra cluster
#
#       port            CQL Native Transport Port
#
#       secure_connect_bundle
#                       Absolute path to a secure connect bundle. When using
#                       a secure connect bundle, contact_points and port are
#                       not required.
#
#       keyspace        Name of Cassandra keyspace to use
#
#       table_name      Name of table in above keyspace to use
#
#       Optional keys
#
#       cache_size      Size of cache for database records. Default is 16384.
@@ -964,7 +1102,7 @@
#                       default value for the unspecified parameter.
#
#                       Note: the cache will not be created if online_delete
#                       is specified.
#                       is specified, or if shards are used.
#
#       fast_load       Boolean. If set, load the last persisted ledger
#                       from disk upon process start before syncing to
@@ -977,6 +1115,10 @@
#       earliest_seq    The default is 32570 to match the XRP ledger
#                       network's earliest allowed sequence. Alternate
#                       networks may set this value. Minimum value of 1.
#                       If a [shard_db] section is defined, and this
#                       value is present in either [node_db] or [shard_db],
#                       it must be defined with the same value in both
#                       sections.
#
#       online_delete   Minimum value of 256. Enable automatic purging
#                       of older ledger information. Maintain at least this
@@ -1024,6 +1166,25 @@
#                       checking until healthy.
#                       Default is 5.
#
#       Optional keys for Cassandra:
#
#       username        Username to use if Cassandra cluster requires
#                       authentication
#
#       password        Password to use if Cassandra cluster requires
#                       authentication
#
#       max_requests_outstanding
#                       Limits the maximum number of concurrent database
#                       writes. Default is 10 million. For slower clusters,
#                       large numbers of concurrent writes can overload the
#                       cluster. Setting this option can help eliminate
#                       write timeouts and other write errors due to the
#                       cluster being overloaded.
#       io_threads
#                       Set the number of IO threads used by the
#                       Cassandra driver. Defaults to 4.
#
#   Notes:
#   The 'node_db' entry configures the primary, persistent storage.
#
@@ -1040,6 +1201,32 @@
#   your rippled.cfg file.
#   Partial pathnames are relative to the location of the rippled executable.
#
#   [shard_db]  Settings for the Shard Database (optional)
#
#   Format (without spaces):
#       One or more lines of case-insensitive key / value pairs:
#       <key> '=' <value>
#       ...
#
#   Example:
#       path=db/shards/nudb
#
#   Required keys:
#       path            Location to store the database
#
#   Optional keys:
#       max_historical_shards
#                       The maximum number of historical shards
#                       to store.
#
#   [historical_shard_paths]  Additional storage paths for the Shard Database (optional)
#
#   Format (without spaces):
#       One or more lines, each expressing a full path for storing historical shards:
#       /mnt/disk1
#       /mnt/disk2
#       ...
#
#   [sqlite]   Tuning settings for the SQLite databases (optional)
#
#   Format (without spaces):
@@ -1119,18 +1306,40 @@
#                   This setting may not be combined with the
#                   "safety_level" setting.
#
#    page_size      Valid values: integer (MUST be a power of 2 between 512 and 65536)
#                   The default is 4096 bytes. This setting determines
#                   the size of a page in the transaction.db file.
#                   See https://www.sqlite.org/pragma.html#pragma_page_size
#                   for more details about the available options.
#   [ledger_tx_tables] (optional)
#
#    journal_size_limit  Valid values: integer
#                   The default is 1582080. This setting limits
#                   the size of the journal for the transaction.db file. When
#                   the limit is reached, older entries will be deleted.
#                   See https://www.sqlite.org/pragma.html#pragma_journal_size_limit
#                   for more details about the available options.
#   conninfo        Info for connecting to Postgres. Format is
#                   postgres://[username]:[password]@[ip]/[database].
#                   The database and user must already exist. If this
#                   section is missing and rippled is running in
#                   Reporting Mode, rippled will connect as the
#                   user running rippled to a database with the
#                   same name. On Linux and Mac OS X, the connection
#                   will take place using the server's UNIX domain
#                   socket. On Windows, through the localhost IP
#                   address. Default is empty.
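#
#                   A concrete example, with hypothetical credentials:
#                   conninfo=postgres://rippled:secret@127.0.0.1/rippled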
#
#   use_tx_tables   Valid values: 1, 0
#                   The default is 1 (true). Determines whether to use
#                   the SQLite transaction database. If set to 0,
#                   rippled will not write to the transaction database,
#                   and will reject tx, account_tx and tx_history RPCs.
#                   In Reporting Mode, this setting is ignored.
#
#   max_connections Valid values: any positive integer up to 64 bit
#                   storage length. This configures the maximum
#                   number of concurrent connections to postgres.
#                   Default is the maximum possible value to
#                   fit in a 64 bit integer.
#
#   timeout         Number of seconds after which idle postgres
#                   connections are disconnected. If set to 0,
#                   connections never time out. Default is 600.
#
#
#   remember_ip     Valid values: 1, 0
#                   Default is 1 (true). Whether to cache host and
#                   port connection settings.
#
#
#-------------------------------------------------------------------------------
@@ -1396,12 +1605,6 @@
#   Admin level API commands over Secure Websockets, when originating
#   from the same machine (via the loopback adapter at 127.0.0.1).
#
#   "grpc"
#
#   ETL commands for Clio. We recommend setting secure_gateway
#   in this section to a comma-separated list of the addresses
#   of your Clio servers, in order to bypass rippled's rate limiting.
#
#   This port is commented out but can be enabled by removing
#   the '#' from each corresponding line, including the entry under [server].
#
@@ -1446,7 +1649,6 @@ port = 6006
ip = 127.0.0.1
admin = 127.0.0.1
protocol = ws
send_queue_limit = 500

[port_grpc]
port = 50051
@@ -1457,7 +1659,6 @@ secure_gateway = 127.0.0.1
#port = 6005
#ip = 127.0.0.1
#protocol = wss
#send_queue_limit = 500

#-------------------------------------------------------------------------------

@@ -1480,15 +1681,45 @@ path=/var/lib/rippled/db/nudb
online_delete=512
advisory_delete=0

# This is the persistent datastore for shards. It is important for the health
# of the ripple network that rippled operators shard as much as practical.
# NuDB requires SSD storage. Helpful information can be found at
# https://xrpl.org/history-sharding.html
#[shard_db]
#path=/var/lib/rippled/db/shards/nudb
#max_historical_shards=50
#
# This optional section can be configured with a list
# of paths to use for storing historical shards. Each
# path must correspond to a unique filesystem.
#[historical_shard_paths]
#/path/1
#/path/2

[database_path]
/var/lib/rippled/db


# To use Postgres, uncomment this section and fill in the appropriate connection
# info. Postgres can only be used in Reporting Mode.
# To disable writing to the transaction database, uncomment this section, and
# set use_tx_tables=0
# [ledger_tx_tables]
# conninfo = postgres://[username]:[password]@[ip]/[database]
# use_tx_tables=1


# This needs to be an absolute directory reference, not a relative one.
# Modify this value as required.
[debug_logfile]
/var/log/rippled/debug.log

[sntp_servers]
time.windows.com
time.apple.com
time.nist.gov
pool.ntp.org

# To use the XRP test network
# (see https://xrpl.org/connect-your-rippled-to-the-xrp-test-net.html),
# use the following [ips] section:
@@ -1511,3 +1742,15 @@ validators.txt
# set ssl_verify to 0.
[ssl_verify]
1


# To run in Reporting Mode, uncomment this section and fill in the appropriate
# connection info for one or more ETL sources.
# [reporting]
# etl_source
#
#
# [etl_source]
# source_grpc_port=50051
# source_ws_port=6005
# source_ip=127.0.0.1

cfg/rippled-reporting.cfg (new file, 1703 lines): file diff suppressed because it is too large
@@ -1,48 +0,0 @@
macro(group_sources_in source_dir curdir)
    file(GLOB children RELATIVE ${source_dir}/${curdir}
        ${source_dir}/${curdir}/*)
    foreach (child ${children})
        if (IS_DIRECTORY ${source_dir}/${curdir}/${child})
            group_sources_in(${source_dir} ${curdir}/${child})
        else()
            string(REPLACE "/" "\\" groupname ${curdir})
            source_group(${groupname} FILES
                ${source_dir}/${curdir}/${child})
        endif()
    endforeach()
endmacro()

macro(group_sources curdir)
    group_sources_in(${PROJECT_SOURCE_DIR} ${curdir})
endmacro()

macro (exclude_from_default target_)
    set_target_properties (${target_} PROPERTIES EXCLUDE_FROM_ALL ON)
    set_target_properties (${target_} PROPERTIES EXCLUDE_FROM_DEFAULT_BUILD ON)
endmacro ()

macro (exclude_if_included target_)
    get_directory_property(has_parent PARENT_DIRECTORY)
    if (has_parent)
        exclude_from_default (${target_})
    endif ()
endmacro ()

find_package(Git)

function (git_branch branch_val)
    if (NOT GIT_FOUND)
        return ()
    endif ()
    set (_branch "")
    execute_process (COMMAND ${GIT_EXECUTABLE} "rev-parse" "--abbrev-ref" "HEAD"
        WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
        RESULT_VARIABLE _git_exit_code
        OUTPUT_VARIABLE _temp_branch
        OUTPUT_STRIP_TRAILING_WHITESPACE
        ERROR_QUIET)
    if (_git_exit_code EQUAL 0)
        set (_branch ${_temp_branch})
    endif ()
    set (${branch_val} "${_branch}" PARENT_SCOPE)
endfunction ()
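
# Example use of git_branch, as a sketch: capture the current branch name
# and surface it at configure time:
#
#   git_branch(current_branch)
#   if(current_branch)
#       message(STATUS "Configuring from git branch: ${current_branch}")
#   endif()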
@@ -1,440 +0,0 @@
# Copyright (c) 2012 - 2017, Lars Bilke
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice, this
#    list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice,
#    this list of conditions and the following disclaimer in the documentation
#    and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors
#    may be used to endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
# ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
# ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
# CHANGES:
#
# 2012-01-31, Lars Bilke
# - Enable Code Coverage
#
# 2013-09-17, Joakim Söderberg
# - Added support for Clang.
# - Some additional usage instructions.
#
# 2016-02-03, Lars Bilke
# - Refactored functions to use named parameters
#
# 2017-06-02, Lars Bilke
# - Merged with modified version from github.com/ufz/ogs
#
# 2019-05-06, Anatolii Kurotych
# - Remove unnecessary --coverage flag
#
# 2019-12-13, FeRD (Frank Dana)
# - Deprecate COVERAGE_LCOVR_EXCLUDES and COVERAGE_GCOVR_EXCLUDES lists in favor
#   of tool-agnostic COVERAGE_EXCLUDES variable, or EXCLUDE setup arguments.
# - CMake 3.4+: All excludes can be specified relative to BASE_DIRECTORY
# - All setup functions: accept BASE_DIRECTORY, EXCLUDE list
# - Set lcov basedir with -b argument
# - Add automatic --demangle-cpp in lcovr, if 'c++filt' is available (can be
#   overridden with NO_DEMANGLE option in setup_target_for_coverage_lcovr().)
# - Delete output dir, .info file on 'make clean'
# - Remove Python detection, since version mismatches will break gcovr
# - Minor cleanup (lowercase function names, update examples...)
#
# 2019-12-19, FeRD (Frank Dana)
# - Rename Lcov outputs, make filtered file canonical, fix cleanup for targets
#
# 2020-01-19, Bob Apthorpe
# - Added gfortran support
#
# 2020-02-17, FeRD (Frank Dana)
# - Make all add_custom_target()s VERBATIM to auto-escape wildcard characters
#   in EXCLUDEs, and remove manual escaping from gcovr targets
#
# 2021-01-19, Robin Mueller
# - Add CODE_COVERAGE_VERBOSE option which will allow to print out commands which are run
# - Added the option for users to set the GCOVR_ADDITIONAL_ARGS variable to supply additional
#   flags to the gcovr command
#
# 2020-05-04, Mihchael Davis
# - Add -fprofile-abs-path to make gcno files contain absolute paths
# - Fix BASE_DIRECTORY not working when defined
# - Change BYPRODUCT from folder to index.html to stop ninja from complaining about double defines
#
# 2021-05-10, Martin Stump
# - Check if the generator is multi-config before warning about non-Debug builds
#
# 2022-02-22, Marko Wehle
# - Change gcovr output from -o <filename> to --xml <filename> and --html <filename> output respectively.
#   This will allow for multiple output formats at the same time by making use of GCOVR_ADDITIONAL_ARGS, e.g. GCOVR_ADDITIONAL_ARGS "--txt".
#
# 2022-09-28, Sebastian Mueller
# - fix append_coverage_compiler_flags_to_target to correctly add flags
# - replace "-fprofile-arcs -ftest-coverage" with "--coverage" (equivalent)
#
# 2024-01-04, Bronek Kozicki
# - remove setup_target_for_coverage_lcov (slow) and setup_target_for_coverage_fastcov (no support for Clang)
# - fix Clang support by adding find_program( ... llvm-cov )
# - add Apple Clang support by adding execute_process( COMMAND xcrun -f llvm-cov ... )
# - add CODE_COVERAGE_GCOV_TOOL to explicitly select gcov tool and disable find_program
# - replace both functions setup_target_for_coverage_gcovr_* with a single setup_target_for_coverage_gcovr
# - add support for all gcovr output formats
#
# USAGE:
#
# 1. Copy this file into your cmake modules path.
#
# 2. Add the following line to your CMakeLists.txt (best inside an if-condition
#    using a CMake option() to enable it just optionally):
#      include(CodeCoverage)
#
# 3. Append necessary compiler flags for all supported source files:
#      append_coverage_compiler_flags()
#    Or for a specific target:
#      append_coverage_compiler_flags_to_target(YOUR_TARGET_NAME)
#
# 3.a (OPTIONAL) Set appropriate optimization flags, e.g. -O0, -O1 or -Og
#
# 4. If you need to exclude additional directories from the report, specify them
#    using full paths in the COVERAGE_EXCLUDES variable before calling
#    setup_target_for_coverage_*().
#    Example:
#      set(COVERAGE_EXCLUDES
#          '${PROJECT_SOURCE_DIR}/src/dir1/*'
#          '/path/to/my/src/dir2/*')
#    Or, use the EXCLUDE argument to setup_target_for_coverage_*().
#    Example:
#      setup_target_for_coverage_gcovr(
#          NAME coverage
#          EXECUTABLE testrunner
#          EXCLUDE "${PROJECT_SOURCE_DIR}/src/dir1/*" "/path/to/my/src/dir2/*")
#
# 4.a NOTE: With CMake 3.4+, COVERAGE_EXCLUDES or EXCLUDE can also be set
#     relative to the BASE_DIRECTORY (default: PROJECT_SOURCE_DIR)
#     Example:
#       set(COVERAGE_EXCLUDES "dir1/*")
#       setup_target_for_coverage_gcovr(
#           NAME coverage
#           EXECUTABLE testrunner
#           FORMAT html-details
#           BASE_DIRECTORY "${PROJECT_SOURCE_DIR}/src"
#           EXCLUDE "dir2/*")
#
# 4.b If you need to pass specific options to gcovr, specify them in
#     the GCOVR_ADDITIONAL_ARGS variable.
#     Example:
#       set (GCOVR_ADDITIONAL_ARGS --exclude-throw-branches --exclude-noncode-lines -s)
#       setup_target_for_coverage_gcovr(
#           NAME coverage
#           EXECUTABLE testrunner
#           EXCLUDE "src/dir1" "src/dir2")
#
# 5. Use the functions described below to create a custom make target which
#    runs your test executable and produces a code coverage report.
#
# 6. Build a Debug build:
#      cmake -DCMAKE_BUILD_TYPE=Debug ..
#      make
#      make my_coverage_target

include(CMakeParseArguments)

option(CODE_COVERAGE_VERBOSE "Verbose information" FALSE)

# Check prereqs
find_program( GCOVR_PATH gcovr PATHS ${CMAKE_SOURCE_DIR}/scripts/test)

if(DEFINED CODE_COVERAGE_GCOV_TOOL)
    set(GCOV_TOOL "${CODE_COVERAGE_GCOV_TOOL}")
elseif(DEFINED ENV{CODE_COVERAGE_GCOV_TOOL})
    set(GCOV_TOOL "$ENV{CODE_COVERAGE_GCOV_TOOL}")
elseif("${CMAKE_CXX_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
    if(APPLE)
        execute_process( COMMAND xcrun -f llvm-cov
            OUTPUT_VARIABLE LLVMCOV_PATH
            OUTPUT_STRIP_TRAILING_WHITESPACE
        )
    else()
        find_program( LLVMCOV_PATH llvm-cov )
    endif()
    if(LLVMCOV_PATH)
        set(GCOV_TOOL "${LLVMCOV_PATH} gcov")
    endif()
elseif("${CMAKE_CXX_COMPILER_ID}" MATCHES "GNU")
    find_program( GCOV_PATH gcov )
    set(GCOV_TOOL "${GCOV_PATH}")
endif()

# Check supported compiler (Clang, GNU and Flang)
get_property(LANGUAGES GLOBAL PROPERTY ENABLED_LANGUAGES)
foreach(LANG ${LANGUAGES})
    if("${CMAKE_${LANG}_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
        if("${CMAKE_${LANG}_COMPILER_VERSION}" VERSION_LESS 3)
            message(FATAL_ERROR "Clang version must be 3.0.0 or greater! Aborting...")
        endif()
    elseif(NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES "GNU"
        AND NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES "(LLVM)?[Ff]lang")
        message(FATAL_ERROR "Compiler is not GNU or Flang! Aborting...")
    endif()
endforeach()

set(COVERAGE_COMPILER_FLAGS "-g --coverage"
    CACHE INTERNAL "")
if(CMAKE_CXX_COMPILER_ID MATCHES "(GNU|Clang)")
    include(CheckCXXCompilerFlag)
    check_cxx_compiler_flag(-fprofile-abs-path HAVE_cxx_fprofile_abs_path)
    if(HAVE_cxx_fprofile_abs_path)
        set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
    endif()
    include(CheckCCompilerFlag)
    check_c_compiler_flag(-fprofile-abs-path HAVE_c_fprofile_abs_path)
    if(HAVE_c_fprofile_abs_path)
        set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
    endif()
endif()

set(CMAKE_Fortran_FLAGS_COVERAGE
    ${COVERAGE_COMPILER_FLAGS}
    CACHE STRING "Flags used by the Fortran compiler during coverage builds."
    FORCE )
set(CMAKE_CXX_FLAGS_COVERAGE
    ${COVERAGE_COMPILER_FLAGS}
    CACHE STRING "Flags used by the C++ compiler during coverage builds."
    FORCE )
set(CMAKE_C_FLAGS_COVERAGE
    ${COVERAGE_COMPILER_FLAGS}
    CACHE STRING "Flags used by the C compiler during coverage builds."
    FORCE )
set(CMAKE_EXE_LINKER_FLAGS_COVERAGE
    ""
    CACHE STRING "Flags used for linking binaries during coverage builds."
    FORCE )
set(CMAKE_SHARED_LINKER_FLAGS_COVERAGE
    ""
    CACHE STRING "Flags used by the shared libraries linker during coverage builds."
    FORCE )
mark_as_advanced(
    CMAKE_Fortran_FLAGS_COVERAGE
    CMAKE_CXX_FLAGS_COVERAGE
    CMAKE_C_FLAGS_COVERAGE
    CMAKE_EXE_LINKER_FLAGS_COVERAGE
    CMAKE_SHARED_LINKER_FLAGS_COVERAGE )

get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if(NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG))
    message(WARNING "Code coverage results with an optimised (non-Debug) build may be misleading")
endif() # NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG)

if(CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
    link_libraries(gcov)
endif()

# Defines a target for running and collecting code coverage information.
# Builds dependencies, runs the given executable and outputs reports.
# NOTE! The executable should always exit with a ZERO code; otherwise
# the coverage generation will not complete.
#
# setup_target_for_coverage_gcovr(
#     NAME ctest_coverage                    # New target name
#     EXECUTABLE ctest -j ${PROCESSOR_COUNT} # Executable in PROJECT_BINARY_DIR
#     DEPENDENCIES executable_target         # Dependencies to build first
#     BASE_DIRECTORY "../"                   # Base directory for report
#                                            # (defaults to PROJECT_SOURCE_DIR)
#     FORMAT "cobertura"                     # Output format, one of:
#                                            # xml cobertura sonarqube json-summary
#                                            # json-details coveralls csv txt
#                                            # html-single html-nested html-details
#                                            # (xml is an alias to cobertura;
#                                            # if no format is set, defaults to xml)
#     EXCLUDE "src/dir1/*" "src/dir2/*"      # Patterns to exclude (can be relative
#                                            # to BASE_DIRECTORY, with CMake 3.4+)
# )
# The user can set the variable GCOVR_ADDITIONAL_ARGS to supply additional flags to the
# GCOVR command.
function(setup_target_for_coverage_gcovr)
    set(options NONE)
    set(oneValueArgs BASE_DIRECTORY NAME FORMAT)
    set(multiValueArgs EXCLUDE EXECUTABLE EXECUTABLE_ARGS DEPENDENCIES)
    cmake_parse_arguments(Coverage "${options}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN})

    if(NOT GCOV_TOOL)
        message(FATAL_ERROR "Could not find gcov or llvm-cov tool! Aborting...")
    endif()

    if(NOT GCOVR_PATH)
        message(FATAL_ERROR "Could not find gcovr tool! Aborting...")
    endif()

    # Set base directory (as absolute path), or default to PROJECT_SOURCE_DIR
    if(DEFINED Coverage_BASE_DIRECTORY)
        get_filename_component(BASEDIR ${Coverage_BASE_DIRECTORY} ABSOLUTE)
    else()
        set(BASEDIR ${PROJECT_SOURCE_DIR})
    endif()

    if(NOT DEFINED Coverage_FORMAT)
        set(Coverage_FORMAT xml)
    endif()

    if("--output" IN_LIST GCOVR_ADDITIONAL_ARGS)
        message(FATAL_ERROR "Unsupported --output option detected in GCOVR_ADDITIONAL_ARGS! Aborting...")
    else()
        if((Coverage_FORMAT STREQUAL "html-details")
            OR (Coverage_FORMAT STREQUAL "html-nested"))
            set(GCOVR_OUTPUT_FILE ${PROJECT_BINARY_DIR}/${Coverage_NAME}/index.html)
            set(GCOVR_CREATE_FOLDER ${PROJECT_BINARY_DIR}/${Coverage_NAME})
        elseif(Coverage_FORMAT STREQUAL "html-single")
            set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.html)
        elseif((Coverage_FORMAT STREQUAL "json-summary")
            OR (Coverage_FORMAT STREQUAL "json-details")
            OR (Coverage_FORMAT STREQUAL "coveralls"))
            set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.json)
        elseif(Coverage_FORMAT STREQUAL "txt")
            set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.txt)
        elseif(Coverage_FORMAT STREQUAL "csv")
            set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.csv)
        else()
            set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.xml)
        endif()
    endif()

    if((Coverage_FORMAT STREQUAL "cobertura")
        OR (Coverage_FORMAT STREQUAL "xml"))
        list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura "${GCOVR_OUTPUT_FILE}" )
        list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura-pretty )
        set(Coverage_FORMAT cobertura) # overwrite xml
    elseif(Coverage_FORMAT STREQUAL "sonarqube")
        list(APPEND GCOVR_ADDITIONAL_ARGS --sonarqube "${GCOVR_OUTPUT_FILE}" )
    elseif(Coverage_FORMAT STREQUAL "json-summary")
        list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary "${GCOVR_OUTPUT_FILE}" )
        list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary-pretty)
    elseif(Coverage_FORMAT STREQUAL "json-details")
        list(APPEND GCOVR_ADDITIONAL_ARGS --json "${GCOVR_OUTPUT_FILE}" )
        list(APPEND GCOVR_ADDITIONAL_ARGS --json-pretty)
    elseif(Coverage_FORMAT STREQUAL "coveralls")
        list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls "${GCOVR_OUTPUT_FILE}" )
        list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls-pretty)
    elseif(Coverage_FORMAT STREQUAL "csv")
        list(APPEND GCOVR_ADDITIONAL_ARGS --csv "${GCOVR_OUTPUT_FILE}" )
    elseif(Coverage_FORMAT STREQUAL "txt")
        list(APPEND GCOVR_ADDITIONAL_ARGS --txt "${GCOVR_OUTPUT_FILE}" )
    elseif(Coverage_FORMAT STREQUAL "html-single")
        list(APPEND GCOVR_ADDITIONAL_ARGS --html "${GCOVR_OUTPUT_FILE}" )
        list(APPEND GCOVR_ADDITIONAL_ARGS --html-self-contained)
    elseif(Coverage_FORMAT STREQUAL "html-nested")
        list(APPEND GCOVR_ADDITIONAL_ARGS --html-nested "${GCOVR_OUTPUT_FILE}" )
    elseif(Coverage_FORMAT STREQUAL "html-details")
        list(APPEND GCOVR_ADDITIONAL_ARGS --html-details "${GCOVR_OUTPUT_FILE}" )
    else()
        message(FATAL_ERROR "Unsupported output style ${Coverage_FORMAT}! Aborting...")
    endif()

    # Collect excludes (CMake 3.4+: Also compute absolute paths)
    set(GCOVR_EXCLUDES "")
    foreach(EXCLUDE ${Coverage_EXCLUDE} ${COVERAGE_EXCLUDES} ${COVERAGE_GCOVR_EXCLUDES})
        if(CMAKE_VERSION VERSION_GREATER 3.4)
            get_filename_component(EXCLUDE ${EXCLUDE} ABSOLUTE BASE_DIR ${BASEDIR})
        endif()
        list(APPEND GCOVR_EXCLUDES "${EXCLUDE}")
    endforeach()
    list(REMOVE_DUPLICATES GCOVR_EXCLUDES)

    # Combine excludes to several -e arguments
    set(GCOVR_EXCLUDE_ARGS "")
    foreach(EXCLUDE ${GCOVR_EXCLUDES})
        list(APPEND GCOVR_EXCLUDE_ARGS "-e")
        list(APPEND GCOVR_EXCLUDE_ARGS "${EXCLUDE}")
    endforeach()

    # Set up commands which will be run to generate coverage data
    # Run tests
    set(GCOVR_EXEC_TESTS_CMD
        ${Coverage_EXECUTABLE} ${Coverage_EXECUTABLE_ARGS}
    )

    # Create folder
    if(DEFINED GCOVR_CREATE_FOLDER)
        set(GCOVR_FOLDER_CMD
            ${CMAKE_COMMAND} -E make_directory ${GCOVR_CREATE_FOLDER})
    else()
        set(GCOVR_FOLDER_CMD echo) # dummy
    endif()

    # Running gcovr
    set(GCOVR_CMD
        ${GCOVR_PATH}
        --gcov-executable ${GCOV_TOOL}
        --gcov-ignore-parse-errors=negative_hits.warn_once_per_file
        -r ${BASEDIR}
        ${GCOVR_ADDITIONAL_ARGS}
        ${GCOVR_EXCLUDE_ARGS}
        --object-directory=${PROJECT_BINARY_DIR}
    )

    if(CODE_COVERAGE_VERBOSE)
        message(STATUS "Executed command report")

        message(STATUS "Command to run tests: ")
        string(REPLACE ";" " " GCOVR_EXEC_TESTS_CMD_SPACED "${GCOVR_EXEC_TESTS_CMD}")
        message(STATUS "${GCOVR_EXEC_TESTS_CMD_SPACED}")

        if(NOT GCOVR_FOLDER_CMD STREQUAL "echo")
            message(STATUS "Command to create a folder: ")
            string(REPLACE ";" " " GCOVR_FOLDER_CMD_SPACED "${GCOVR_FOLDER_CMD}")
            message(STATUS "${GCOVR_FOLDER_CMD_SPACED}")
        endif()

        message(STATUS "Command to generate gcovr coverage data: ")
        string(REPLACE ";" " " GCOVR_CMD_SPACED "${GCOVR_CMD}")
        message(STATUS "${GCOVR_CMD_SPACED}")
    endif()

    add_custom_target(${Coverage_NAME}
        COMMAND ${GCOVR_EXEC_TESTS_CMD}
        COMMAND ${GCOVR_FOLDER_CMD}
        COMMAND ${GCOVR_CMD}

        BYPRODUCTS ${GCOVR_OUTPUT_FILE}
        WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
        DEPENDS ${Coverage_DEPENDENCIES}
        VERBATIM # Protect arguments to commands
        COMMENT "Running gcovr to produce code coverage report."
    )

    # Show info where to find the report
    add_custom_command(TARGET ${Coverage_NAME} POST_BUILD
        COMMAND ;
        COMMENT "Code coverage report saved in ${GCOVR_OUTPUT_FILE} formatted as ${Coverage_FORMAT}"
    )
endfunction() # setup_target_for_coverage_gcovr

function(append_coverage_compiler_flags)
    set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
    set(CMAKE_Fortran_FLAGS "${CMAKE_Fortran_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
    message(STATUS "Appending code coverage compiler flags: ${COVERAGE_COMPILER_FLAGS}")
endfunction() # append_coverage_compiler_flags

# Setup coverage for a specific library
function(append_coverage_compiler_flags_to_target name)
    separate_arguments(_flag_list NATIVE_COMMAND "${COVERAGE_COMPILER_FLAGS}")
    target_compile_options(${name} PRIVATE ${_flag_list})
    if(CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
        target_link_libraries(${name} PRIVATE gcov)
    endif()
endfunction()
@@ -1,155 +0,0 @@
#[===================================================================[
Exported targets.
#]===================================================================]

include(target_protobuf_sources)

# Protocol buffers cannot participate in a unity build,
# because all the generated sources
# define a bunch of `static const` variables with the same names,
# so we just build them as a separate library.
add_library(xrpl.libpb)
target_protobuf_sources(xrpl.libpb xrpl/proto
    LANGUAGE cpp
    IMPORT_DIRS include/xrpl/proto
    PROTOS include/xrpl/proto/ripple.proto
)

file(GLOB_RECURSE protos "include/xrpl/proto/org/*.proto")
target_protobuf_sources(xrpl.libpb xrpl/proto
    LANGUAGE cpp
    IMPORT_DIRS include/xrpl/proto
    PROTOS "${protos}"
)
target_protobuf_sources(xrpl.libpb xrpl/proto
    LANGUAGE grpc
    IMPORT_DIRS include/xrpl/proto
    PROTOS "${protos}"
    PLUGIN protoc-gen-grpc=$<TARGET_FILE:gRPC::grpc_cpp_plugin>
    GENERATE_EXTENSIONS .grpc.pb.h .grpc.pb.cc
)

target_compile_options(xrpl.libpb
    PUBLIC
        $<$<BOOL:${MSVC}>:-wd4996>
        $<$<BOOL:${XCODE}>:
            --system-header-prefix="google/protobuf"
            -Wno-deprecated-dynamic-exception-spec
        >
    PRIVATE
        $<$<BOOL:${MSVC}>:-wd4065>
        $<$<NOT:$<BOOL:${MSVC}>>:-Wno-deprecated-declarations>
)

target_link_libraries(xrpl.libpb
    PUBLIC
        protobuf::libprotobuf
        gRPC::grpc++
)

add_library(xrpl.libxrpl)
set_target_properties(xrpl.libxrpl PROPERTIES OUTPUT_NAME xrpl)
if(unity)
    set_target_properties(xrpl.libxrpl PROPERTIES UNITY_BUILD ON)
endif()

# Try to find the ACL library
find_library(ACL_LIBRARY NAMES acl)

# Check if ACL was found
if(ACL_LIBRARY)
    message(STATUS "Found ACL: ${ACL_LIBRARY}")
else()
    message(STATUS "ACL not found, continuing without ACL support")
endif()

add_library(xrpl::libxrpl ALIAS xrpl.libxrpl)

file(GLOB_RECURSE sources CONFIGURE_DEPENDS
    "${CMAKE_CURRENT_SOURCE_DIR}/src/libxrpl/*.cpp"
)
target_sources(xrpl.libxrpl PRIVATE ${sources})

target_include_directories(xrpl.libxrpl
    PUBLIC
        $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
        $<INSTALL_INTERFACE:include>)

target_compile_definitions(xrpl.libxrpl
    PUBLIC
        BOOST_ASIO_USE_TS_EXECUTOR_AS_DEFAULT
        BOOST_CONTAINER_FWD_BAD_DEQUE
        HAS_UNCAUGHT_EXCEPTIONS=1)

target_compile_options(xrpl.libxrpl
    PUBLIC
        $<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
)

target_link_libraries(xrpl.libxrpl
    PUBLIC
        LibArchive::LibArchive
        OpenSSL::Crypto
        Ripple::boost
        wasmedge::wasmedge
        Ripple::opts
        Ripple::syslibs
        absl::random_random
        date::date
        ed25519::ed25519
        secp256k1::secp256k1
        xrpl.libpb
        xxHash::xxhash
)

if(xrpld)
    add_executable(rippled)
    if(unity)
        set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
    endif()
    if(tests)
        target_compile_definitions(rippled PUBLIC ENABLE_TESTS)
    endif()
    target_include_directories(rippled
        PRIVATE
            $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
    )

    file(GLOB_RECURSE sources CONFIGURE_DEPENDS
        "${CMAKE_CURRENT_SOURCE_DIR}/src/xrpld/*.cpp"
    )
    target_sources(rippled PRIVATE ${sources})

    if(tests)
        file(GLOB_RECURSE sources CONFIGURE_DEPENDS
            "${CMAKE_CURRENT_SOURCE_DIR}/src/test/*.cpp"
        )
        target_sources(rippled PRIVATE ${sources})
    endif()

    target_link_libraries(rippled
        Ripple::boost
        Ripple::opts
        Ripple::libs
        xrpl.libxrpl
    )
    exclude_if_included(rippled)
    # define a macro for tests that might need to
    # be excluded or run differently in CI environment
    if(is_ci)
        target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
    endif ()

    # any files that don't play well with unity should be added here
    if(tests)
        set_source_files_properties(
            # these two seem to produce conflicts in beast teardown template methods
            src/test/rpc/ValidatorRPC_test.cpp
            src/test/ledger/Invariants_test.cpp
            PROPERTIES SKIP_UNITY_BUILD_INCLUSION TRUE)
    endif()
endif()

if(ACL_LIBRARY)
    target_link_libraries(rippled ${ACL_LIBRARY})
endif()
@@ -1,38 +0,0 @@
#[===================================================================[
coverage report target
#]===================================================================]

if(NOT coverage)
    message(FATAL_ERROR "Code coverage not enabled! Aborting ...")
endif()

if(CMAKE_CXX_COMPILER_ID MATCHES "MSVC")
    message(WARNING "Code coverage on Windows is not supported, ignoring 'coverage' flag")
    return()
endif()

include(CodeCoverage)

# The instructions for these commands come from the `CodeCoverage` module,
# which was copied from https://github.com/bilke/cmake-modules, commit fb7d2a3,
# then locally changed (see CHANGES: section in `CodeCoverage.cmake`)

set(GCOVR_ADDITIONAL_ARGS ${coverage_extra_args})
if(NOT GCOVR_ADDITIONAL_ARGS STREQUAL "")
    separate_arguments(GCOVR_ADDITIONAL_ARGS)
endif()

list(APPEND GCOVR_ADDITIONAL_ARGS
    --exclude-throw-branches
    --exclude-noncode-lines
    --exclude-unreachable-branches -s
    -j ${coverage_test_parallelism})

setup_target_for_coverage_gcovr(
    NAME coverage
    FORMAT ${coverage_format}
    EXECUTABLE rippled
    EXECUTABLE_ARGS --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --unittest-jobs ${coverage_test_parallelism} --quiet --unittest-log
    EXCLUDE "src/test" "include/xrpl/beast/test" "include/xrpl/beast/unit_test" "${CMAKE_BINARY_DIR}/pb-xrpl.libpb"
    DEPENDENCIES rippled
)
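
# A typical invocation of the coverage target defined above, as a sketch
# (option names as declared in the settings module; run in a fresh build
# directory):
#
#   cmake -Dcoverage=ON -Dcoverage_format=html-details ..
#   cmake --build . --target coverage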
@@ -1,130 +0,0 @@
#[===================================================================[
declare user options/settings
#]===================================================================]

include(ProcessorCount)

ProcessorCount(PROCESSOR_COUNT)

option(assert "Enables asserts, even in release builds" OFF)

option(xrpld "Build xrpld" ON)

option(tests "Build tests" ON)

option(unity "Creates a build using UNITY support in cmake. This is the default" ON)
if(unity)
    if(NOT is_ci)
        set(CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
    endif()
endif()
if(is_gcc OR is_clang)
    option(coverage "Generates coverage info." OFF)
    option(profile "Add profiling flags" OFF)
    set(coverage_test_parallelism "${PROCESSOR_COUNT}" CACHE STRING
        "Unit tests parallelism for the purpose of coverage report.")
    set(coverage_format "html-details" CACHE STRING
        "Output format of the coverage report.")
    set(coverage_extra_args "" CACHE STRING
        "Additional arguments to pass to gcovr.")
    set(coverage_test "" CACHE STRING
        "On gcc & clang, the specific unit test(s) to run for coverage. Default is all tests.")
    if(coverage_test AND NOT coverage)
        set(coverage ON CACHE BOOL "gcc/clang only" FORCE)
    endif()
    option(wextra "compile with extra gcc/clang warnings enabled" ON)
else()
    set(profile OFF CACHE BOOL "gcc/clang only" FORCE)
    set(coverage OFF CACHE BOOL "gcc/clang only" FORCE)
    set(wextra OFF CACHE BOOL "gcc/clang only" FORCE)
endif()
if(is_linux)
    option(BUILD_SHARED_LIBS "build shared ripple libraries" OFF)
    option(static "link protobuf, openssl, libc++, and boost statically" ON)
    option(perf "Enables flags that assist with perf recording" OFF)
    option(use_gold "enables detection of gold (binutils) linker" ON)
    option(use_mold "enables detection of mold (binutils) linker" ON)
else()
    # we are not ready to allow shared-libs on windows because it would require
    # export declarations. On macos it's more feasible, but static openssl
    # produces odd linker errors, thus we disable shared lib builds for now.
    set(BUILD_SHARED_LIBS OFF CACHE BOOL "build shared ripple libraries - OFF for win/macos" FORCE)
    set(static ON CACHE BOOL "static link, linux only. ON for WIN/macos" FORCE)
    set(perf OFF CACHE BOOL "perf flags, linux only" FORCE)
    set(use_gold OFF CACHE BOOL "gold linker, linux only" FORCE)
    set(use_mold OFF CACHE BOOL "mold linker, linux only" FORCE)
endif()
if(is_clang)
    option(use_lld "enables detection of lld linker" ON)
else()
    set(use_lld OFF CACHE BOOL "try lld linker, clang only" FORCE)
endif()
option(jemalloc "Enables jemalloc for heap profiling" OFF)
option(werr "treat warnings as errors" OFF)
option(local_protobuf
    "Force a local build of protobuf instead of looking for an installed version." OFF)
option(local_grpc
    "Force a local build of gRPC instead of looking for an installed version." OFF)

# this one is a string and therefore can't be an option
set(san "" CACHE STRING "On gcc & clang, add sanitizer instrumentation")
set_property(CACHE san PROPERTY STRINGS ";undefined;memory;address;thread")
if(san)
    string(TOLOWER ${san} san)
    set(SAN_FLAG "-fsanitize=${san}")
    set(SAN_LIB "")
    if(is_gcc)
        if(san STREQUAL "address")
            set(SAN_LIB "asan")
        elseif(san STREQUAL "thread")
            set(SAN_LIB "tsan")
        elseif(san STREQUAL "memory")
            set(SAN_LIB "msan")
        elseif(san STREQUAL "undefined")
            set(SAN_LIB "ubsan")
        endif()
    endif()
    set(_saved_CRL ${CMAKE_REQUIRED_LIBRARIES})
    set(CMAKE_REQUIRED_LIBRARIES "${SAN_FLAG};${SAN_LIB}")
    check_cxx_compiler_flag(${SAN_FLAG} COMPILER_SUPPORTS_SAN)
    set(CMAKE_REQUIRED_LIBRARIES ${_saved_CRL})
    if(NOT COMPILER_SUPPORTS_SAN)
        message(FATAL_ERROR "${san} sanitizer does not seem to be supported by your compiler")
    endif()
endif()
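
# Example (sketch): configure an AddressSanitizer-instrumented build via the
# cache string above:
#
#   cmake -Dsan=address ..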
set(container_label "" CACHE STRING "tag to use for package building containers")
option(packages_only
    "ONLY generate package building targets. This is a special use-case and almost \
certainly not what you want. Use with caution as you won't be able to build \
any compiled targets locally." OFF)
option(have_package_container
    "Sometimes you already have the tagged container you want to use for package \
building and you don't want docker to rebuild it. This flag will detach the \
dependency of the package build from the container build. It's an advanced \
use case and most likely you should not be touching this flag." OFF)

# the remaining options are obscure and rarely used
option(beast_no_unit_test_inline
    "Prevents unit test definitions from being inserted into global table"
    OFF)
option(single_io_service_thread
    "Restricts the number of threads calling io_service::run to one. \
This can be useful when debugging."
    OFF)
option(boost_show_deprecated
    "Allow boost to fail on deprecated usage. Only useful if you're trying \
to find deprecated calls."
    OFF)
option(beast_hashers
    "Use local implementations for sha/ripemd hashes (experimental, not recommended)"
    OFF)

if(WIN32)
    option(beast_disable_autolink "Disables autolinking of system libraries on WIN32" OFF)
else()
    set(beast_disable_autolink OFF CACHE BOOL "WIN32 only" FORCE)
endif()
if(coverage)
    message(STATUS "coverage build requested - forcing Debug build")
    set(CMAKE_BUILD_TYPE Debug CACHE STRING "build type" FORCE)
endif()
@@ -1,15 +0,0 @@
#[===================================================================[
read version from source
#]===================================================================]

file(STRINGS src/libxrpl/protocol/BuildInfo.cpp BUILD_INFO)
foreach(line_ ${BUILD_INFO})
    if(line_ MATCHES "versionString[ ]*=[ ]*\"(.+)\"")
        set(rippled_version ${CMAKE_MATCH_1})
    endif()
endforeach()
if(rippled_version)
    message(STATUS "rippled version: ${rippled_version}")
else()
    message(FATAL_ERROR "unable to determine rippled version")
endif()
@@ -1,62 +0,0 @@
find_package(Protobuf REQUIRED)

# .proto files import each other like this:
#
#     import "path/to/file.proto";
#
# For the protobuf compiler to find these imports,
# the parent directory of "path" must be in the import path.
#
# When generating C++,
# it turns into an include statement like this:
#
#     #include "path/to/file.pb.h"
#
# and the header is generated at a path relative to the output directory
# that matches the given .proto path relative to the source directory
# minus the first matching prefix on the import path.
#
# In other words, a file `include/package/path/to/file.proto`
# with import path [`include/package`, `include`]
# will generate files `output/path/to/file.pb.{h,cc}`
# with includes like `#include "path/to/file.pb.h"`.
#
# During build, the generated files can find each other if the output
# directory is an include directory, but we want to install that directory
# under our package's include directory (`include/package`), not as a sibling.
# After install, they can find each other if that subdirectory is an include
# directory.

# Add protocol buffer sources to an existing library target.
# target:
#     The name of the library target.
# prefix:
#     The install prefix for headers relative to `CMAKE_INSTALL_INCLUDEDIR`.
#     This prefix should appear at the start of all your consumer includes.
# ARGN:
#     A list of .proto files.
function(target_protobuf_sources target prefix)
    set(dir "${CMAKE_CURRENT_BINARY_DIR}/pb-${target}")
    file(MAKE_DIRECTORY "${dir}/${prefix}")

    protobuf_generate(
        TARGET ${target}
        PROTOC_OUT_DIR "${dir}/${prefix}"
        "${ARGN}"
    )
    target_include_directories(${target} SYSTEM PUBLIC
        # Allows #include <package/path/to/file.proto> used by consumer files.
        $<BUILD_INTERFACE:${dir}>
        # Allows #include "path/to/file.proto" used by generated files.
        $<BUILD_INTERFACE:${dir}/${prefix}>
        # Allows #include <package/path/to/file.proto> used by consumer files.
        $<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>
        # Allows #include "path/to/file.proto" used by generated files.
        $<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}/${prefix}>
    )
    install(
        DIRECTORY ${dir}/
        DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}
        FILES_MATCHING PATTERN "*.h"
    )
endfunction()
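
# Example usage, as a sketch with a hypothetical target and proto layout:
#
#   target_protobuf_sources(mylib mypkg/proto
#       LANGUAGE cpp
#       IMPORT_DIRS include/mypkg/proto
#       PROTOS include/mypkg/proto/example.proto)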
compile_single.py (new executable file, 127 lines)
@@ -0,0 +1,127 @@
#!/usr/bin/env python3
"""
Compile a single file using commands from compile_commands.json
"""

import json
import os
import sys
import subprocess
import argparse
from pathlib import Path


def find_compile_command(compile_commands, file_path):
    """Find the compile command for a given file path."""
    # Normalize the input path
    abs_path = os.path.abspath(file_path)

    for entry in compile_commands:
        # Check if this entry matches our file
        entry_file = os.path.abspath(entry['file'])
        if entry_file == abs_path:
            return entry

    # Try relative path matching as fallback
    for entry in compile_commands:
        if entry['file'].endswith(file_path) or file_path.endswith(entry['file']):
            return entry

    return None


def main():
    parser = argparse.ArgumentParser(
        description='Compile a single file using compile_commands.json'
    )
    parser.add_argument(
        'file',
        help='Path to the source file to compile'
    )
    parser.add_argument(
        '--verbose', '-v',
        action='store_true',
        help='Show the compile command being executed'
    )
    parser.add_argument(
        '--dump-output', '-d',
        action='store_true',
        help='Dump the full output from the compiler'
    )
    parser.add_argument(
        '--compile-db',
        default='build/compile_commands.json',
        help='Path to compile_commands.json (default: build/compile_commands.json)'
    )

    args = parser.parse_args()

    # Check if compile_commands.json exists
    if not os.path.exists(args.compile_db):
        print(f"Error: {args.compile_db} not found", file=sys.stderr)
        print("Make sure you've run cmake with -DCMAKE_EXPORT_COMPILE_COMMANDS=ON", file=sys.stderr)
        sys.exit(1)

    # Load compile commands
    try:
        with open(args.compile_db, 'r') as f:
            compile_commands = json.load(f)
    except json.JSONDecodeError as e:
        print(f"Error parsing {args.compile_db}: {e}", file=sys.stderr)
        sys.exit(1)

    # Find the compile command for the requested file
    entry = find_compile_command(compile_commands, args.file)

    if not entry:
        print(f"Error: No compile command found for {args.file}", file=sys.stderr)
        print(f"Available files in {args.compile_db}:", file=sys.stderr)
        # Show first 10 files as examples
        for i, cmd in enumerate(compile_commands[:10]):
            print(f"  {cmd['file']}", file=sys.stderr)
        if len(compile_commands) > 10:
            print(f"  ... and {len(compile_commands) - 10} more", file=sys.stderr)
        sys.exit(1)

    # Extract the command and directory
    command = entry['command']
    directory = entry.get('directory', '.')

    if args.verbose:
        print(f"Directory: {directory}", file=sys.stderr)
        print(f"Command: {command}", file=sys.stderr)
        print("-" * 80, file=sys.stderr)

    # Execute the compile command
    try:
        result = subprocess.run(
            command,
            shell=True,
            cwd=directory,
            capture_output=not args.dump_output,
            text=True
        )

        if args.dump_output:
            # Output was already printed to stdout/stderr
            pass
        else:
            # Only show output if there were errors or warnings
            if result.stderr:
                print(result.stderr, file=sys.stderr)
            if result.stdout:
                print(result.stdout)

        # Exit with the same code as the compiler
        sys.exit(result.returncode)

    except subprocess.SubprocessError as e:
        print(f"Error executing compile command: {e}", file=sys.stderr)
        sys.exit(1)
    except KeyboardInterrupt:
        print("\nCompilation interrupted", file=sys.stderr)
        sys.exit(130)


if __name__ == '__main__':
    main()
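
# Example usage, with hypothetical source paths (run from the repository root
# after configuring with -DCMAKE_EXPORT_COMPILE_COMMANDS=ON):
#
#   ./compile_single.py src/ripple/app/main/Main.cpp --verbose
#   ./compile_single.py --compile-db build/compile_commands.json -d src/foo.cpp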
compile_single_v2.py (new executable file, 311 lines)
@@ -0,0 +1,311 @@
#!/usr/bin/env python3
"""
Compile a single file using commands from compile_commands.json
Enhanced version with error context display
"""

import json
import os
import sys
import subprocess
import argparse
import re
import logging
from pathlib import Path


def setup_logging(level):
    """Setup logging configuration."""
    numeric_level = getattr(logging, level.upper(), None)
    if not isinstance(numeric_level, int):
        raise ValueError(f'Invalid log level: {level}')

    logging.basicConfig(
        level=numeric_level,
        format='[%(levelname)s] %(message)s',
        stream=sys.stderr
    )


def find_compile_command(compile_commands, file_path):
    """Find the compile command for a given file path."""
    # Normalize the input path
    abs_path = os.path.abspath(file_path)
    logging.debug(f"Looking for compile command for: {abs_path}")

    for entry in compile_commands:
        # Check if this entry matches our file
        entry_file = os.path.abspath(entry['file'])
        if entry_file == abs_path:
            logging.debug(f"Found exact match: {entry_file}")
            return entry

    # Try relative path matching as fallback
    for entry in compile_commands:
        if entry['file'].endswith(file_path) or file_path.endswith(entry['file']):
            logging.debug(f"Found relative match: {entry['file']}")
            return entry

    logging.debug("No compile command found")
    return None


def extract_errors_with_context(output, file_path, context_lines=3):
    """Extract error messages with context from compiler output."""
    lines = output.split('\n')
    errors = []

    logging.debug(f"Parsing {len(lines)} lines of compiler output")
    logging.debug(f"Looking for errors in file: {file_path}")

    # Pattern to match error lines from clang/gcc
    # Matches: filename:line:col: error: message
    # Also handle color codes
    error_pattern = re.compile(r'([^:]+):(\d+):(\d+):\s*(?:\x1b\[[0-9;]*m)?\s*(error|warning):\s*(?:\x1b\[[0-9;]*m)?\s*(.*?)(?:\x1b\[[0-9;]*m)?$')
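    # For reference, the pattern above is meant to match diagnostics such as
    # (hypothetical path):
    #   src/foo.cpp:42:13: error: use of undeclared identifier 'bar'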
||||
|
||||
    for i, line in enumerate(lines):
        # Strip ANSI color codes for pattern matching
        clean_line = re.sub(r'\x1b\[[0-9;]*m', '', line)
        match = error_pattern.search(clean_line)

        if match:
            filename = match.group(1)
            line_num = int(match.group(2))
            col_num = int(match.group(3))
            error_type = match.group(4)
            message = match.group(5)

            logging.debug(f"Found {error_type} at {filename}:{line_num}:{col_num}")

            # Check if this error is from the file we're compiling
            # Be more flexible with path matching
            if (file_path in filename or
                    filename.endswith(os.path.basename(file_path)) or
                    os.path.basename(filename) == os.path.basename(file_path)):

                logging.debug(f"  -> Including {error_type}: {message[:50]}...")

                error_info = {
                    'line': line_num,
                    'col': col_num,
                    'type': error_type,
                    'message': message,
                    'full_line': line,  # Keep original line with colors
                    'context_before': [],
                    'context_after': []
                }

                # Get context lines from compiler output
                for j in range(max(0, i - context_lines), i):
                    error_info['context_before'].append(lines[j])

                for j in range(i + 1, min(len(lines), i + context_lines + 1)):
                    error_info['context_after'].append(lines[j])

                errors.append(error_info)
            else:
                logging.debug(f"  -> Skipping (different file: {filename})")

    logging.info(f"Found {len(errors)} errors/warnings")
    return errors


def read_source_context(file_path, line_num, context_lines=3):
    """Read context from the source file around a specific line."""
    try:
        with open(file_path, 'r') as f:
            lines = f.readlines()

        start = max(0, line_num - context_lines - 1)
        end = min(len(lines), line_num + context_lines)

        context = []
        for i in range(start, end):
            line_marker = '>>> ' if i == line_num - 1 else '    '
            context.append(f"{i+1:4d}:{line_marker}{lines[i].rstrip()}")

        return '\n'.join(context)
    except Exception as e:
        logging.warning(f"Could not read source context: {e}")
        return None


def format_error_with_context(error, file_path, show_source_context=False):
    """Format an error with its context."""
    output = []
    output.append(f"\n{'='*80}")
    output.append(f"Error at line {error['line']}, column {error['col']}:")
    output.append(f"  {error['message']}")

    if show_source_context:
        source_context = read_source_context(file_path, error['line'], 3)
        if source_context:
            output.append("\nSource context:")
            output.append(source_context)

    if error['context_before'] or error['context_after']:
        output.append("\nCompiler output context:")
        for line in error['context_before']:
            output.append(f"  {line}")
        output.append(f">>> {error['full_line']}")
        for line in error['context_after']:
            output.append(f"  {line}")

    return '\n'.join(output)


def main():
    parser = argparse.ArgumentParser(
        description='Compile a single file using compile_commands.json with enhanced error display'
    )
    parser.add_argument(
        'file',
        help='Path to the source file to compile'
    )
    parser.add_argument(
        '--verbose', '-v',
        action='store_true',
        help='Show the compile command being executed'
    )
    parser.add_argument(
        '--dump-output', '-d',
        action='store_true',
        help='Dump the full output from the compiler'
    )
    parser.add_argument(
        '--show-error-context', '-e',
        type=int,
        metavar='N',
        help='Show N lines of context around each error (implies capturing output)'
    )
    parser.add_argument(
        '--show-source-context', '-s',
        action='store_true',
        help='Show source file context around errors'
    )
    parser.add_argument(
        '--errors-only',
        action='store_true',
        help='Only show errors, not warnings'
    )
    parser.add_argument(
        '--compile-db',
        default='build/compile_commands.json',
        help='Path to compile_commands.json (default: build/compile_commands.json)'
    )
    parser.add_argument(
        '--log-level', '-l',
        default='WARNING',
        choices=['DEBUG', 'INFO', 'WARNING', 'ERROR'],
        help='Set logging level (default: WARNING)'
    )

    args = parser.parse_args()

    # Setup logging
    setup_logging(args.log_level)

    # Check if compile_commands.json exists
    if not os.path.exists(args.compile_db):
        print(f"Error: {args.compile_db} not found", file=sys.stderr)
        print("Make sure you've run cmake with -DCMAKE_EXPORT_COMPILE_COMMANDS=ON", file=sys.stderr)
        sys.exit(1)

    # Load compile commands
    try:
        with open(args.compile_db, 'r') as f:
            compile_commands = json.load(f)
        logging.info(f"Loaded {len(compile_commands)} compile commands")
    except json.JSONDecodeError as e:
        print(f"Error parsing {args.compile_db}: {e}", file=sys.stderr)
        sys.exit(1)

    # Find the compile command for the requested file
    entry = find_compile_command(compile_commands, args.file)

    if not entry:
        print(f"Error: No compile command found for {args.file}", file=sys.stderr)
        print(f"Available files in {args.compile_db}:", file=sys.stderr)
        # Show first 10 files as examples
        for i, cmd in enumerate(compile_commands[:10]):
            print(f"  {cmd['file']}", file=sys.stderr)
        if len(compile_commands) > 10:
            print(f"  ... and {len(compile_commands) - 10} more", file=sys.stderr)
        sys.exit(1)

    # Extract the command and directory
    command = entry['command']
    directory = entry.get('directory', '.')
    source_file = entry['file']

    if args.verbose:
        print(f"Directory: {directory}", file=sys.stderr)
        print(f"Command: {command}", file=sys.stderr)
        print("-" * 80, file=sys.stderr)

    logging.info(f"Compiling {source_file}")
    logging.debug(f"Working directory: {directory}")
    logging.debug(f"Command: {command}")

    # Execute the compile command
    try:
        # If we need to show error context, we must capture output
        capture = not args.dump_output or args.show_error_context is not None

        logging.debug(f"Running compiler (capture={capture})")
        result = subprocess.run(
            command,
            shell=True,
            cwd=directory,
            capture_output=capture,
            text=True
        )

        logging.info(f"Compiler returned code: {result.returncode}")

        if args.dump_output and not args.show_error_context:
            # Output was already printed to stdout/stderr
            pass
        elif args.show_error_context is not None:
            # Parse and display errors with context
            all_output = result.stderr + "\n" + result.stdout

            # Log first few lines of output for debugging
            output_lines = all_output.split('\n')[:10]
            for line in output_lines:
                logging.debug(f"Output: {line}")

            errors = extract_errors_with_context(all_output, args.file, args.show_error_context)

            if args.errors_only:
                errors = [e for e in errors if e['type'] == 'error']
                logging.info(f"Filtered to {len(errors)} errors only")

            print(f"\nFound {len(errors)} {'error' if args.errors_only else 'error/warning'}(s) in {args.file}:\n")

            for error in errors:
                print(format_error_with_context(error, source_file, args.show_source_context))

            if errors:
                print(f"\n{'='*80}")
                print(f"Total: {len(errors)} {'error' if args.errors_only else 'error/warning'}(s)")
        else:
            # Default behavior - show output if there were errors or warnings
            if result.stderr:
                print(result.stderr, file=sys.stderr)
            if result.stdout:
                print(result.stdout)

        # Exit with the same code as the compiler
        sys.exit(result.returncode)

    except subprocess.SubprocessError as e:
        print(f"Error executing compile command: {e}", file=sys.stderr)
        sys.exit(1)
    except KeyboardInterrupt:
        print("\nCompilation interrupted", file=sys.stderr)
        sys.exit(130)


if __name__ == '__main__':
    main()
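For reference, a minimal way to drive the script above from another tool — a sketch only; the script name and source path are illustrative, and the flags mirror the argparse definitions:

```python
# Sketch: invoke the helper on one translation unit and propagate its
# exit code.  Assumes the script above is saved as compile_file.py.
import subprocess
import sys

result = subprocess.run([
    sys.executable, 'compile_file.py',
    'src/ripple/protocol/impl/BuildInfo.cpp',
    '--show-error-context', '3',
    '--errors-only',
    '--compile-db', 'build/compile_commands.json',
])
sys.exit(result.returncode)
```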
146
conanfile.py
@@ -15,28 +15,29 @@ class Xrpl(ConanFile):
        'coverage': [True, False],
        'fPIC': [True, False],
        'jemalloc': [True, False],
        'reporting': [True, False],
        'rocksdb': [True, False],
        'shared': [True, False],
        'static': [True, False],
        'tests': [True, False],
        'unity': [True, False],
        'xrpld': [True, False],
    }

    requires = [
        'blake3/1.5.0@xahaud/stable',
        'boost/1.86.0',
        'date/3.0.1',
        'libarchive/3.6.0',
        'lz4/1.9.3',
        'grpc/1.50.1',
        'libarchive/3.6.2',
        'nudb/2.0.8',
        'openssl/1.1.1u',
        'soci/4.0.3',
        'xxhash/0.8.2',
        'wasmedge/0.11.2',
        'zlib/1.2.13',
    ]

    tool_requires = [
        'openssl/3.3.2',
        'protobuf/3.21.9',
        'snappy/1.1.10',
        'soci/4.0.3',
        'sqlite3/3.42.0',
        'zlib/1.3.1',
        'wasmedge/0.11.2',
    ]

    default_options = {
@@ -44,51 +45,53 @@ class Xrpl(ConanFile):
        'coverage': False,
        'fPIC': True,
        'jemalloc': False,
        'reporting': False,
        'rocksdb': True,
        'shared': False,
        'static': True,
        'tests': False,
        'tests': True,
        'unity': False,
        'xrpld': False,

        'date/*:header_only': True,
        'grpc/*:shared': False,
        'grpc/*:secure': True,
        'libarchive/*:shared': False,
        'libarchive/*:with_acl': False,
        'libarchive/*:with_bzip2': False,
        'libarchive/*:with_cng': False,
        'libarchive/*:with_expat': False,
        'libarchive/*:with_iconv': False,
        'libarchive/*:with_libxml2': False,
        'libarchive/*:with_lz4': True,
        'libarchive/*:with_lzma': False,
        'libarchive/*:with_lzo': False,
        'libarchive/*:with_nettle': False,
        'libarchive/*:with_openssl': False,
        'libarchive/*:with_pcreposix': False,
        'libarchive/*:with_xattr': False,
        'libarchive/*:with_zlib': False,
        'lz4/*:shared': False,
        'openssl/*:shared': False,
        'protobuf/*:shared': False,
        'protobuf/*:with_zlib': True,
        'rocksdb/*:enable_sse': False,
        'rocksdb/*:lite': False,
        'rocksdb/*:shared': False,
        'rocksdb/*:use_rtti': True,
        'rocksdb/*:with_jemalloc': False,
        'rocksdb/*:with_lz4': True,
        'rocksdb/*:with_snappy': True,
        'snappy/*:shared': False,
        'soci/*:shared': False,
        'soci/*:with_sqlite3': True,
        'soci/*:with_boost': True,
        'xxhash/*:shared': False,
        'blake3:simd': False,  # Disable SIMD for testing
        'cassandra-cpp-driver:shared': False,
        'date:header_only': True,
        'grpc:shared': False,
        'grpc:secure': True,
        'libarchive:shared': False,
        'libarchive:with_acl': False,
        'libarchive:with_bzip2': False,
        'libarchive:with_cng': False,
        'libarchive:with_expat': False,
        'libarchive:with_iconv': False,
        'libarchive:with_libxml2': False,
        'libarchive:with_lz4': True,
        'libarchive:with_lzma': False,
        'libarchive:with_lzo': False,
        'libarchive:with_nettle': False,
        'libarchive:with_openssl': False,
        'libarchive:with_pcreposix': False,
        'libarchive:with_xattr': False,
        'libarchive:with_zlib': False,
        'libpq:shared': False,
        'lz4:shared': False,
        'openssl:shared': False,
        'protobuf:shared': False,
        'protobuf:with_zlib': True,
        'rocksdb:enable_sse': False,
        'rocksdb:lite': False,
        'rocksdb:shared': False,
        'rocksdb:use_rtti': True,
        'rocksdb:with_jemalloc': False,
        'rocksdb:with_lz4': True,
        'rocksdb:with_snappy': True,
        'snappy:shared': False,
        'soci:shared': False,
        'soci:with_sqlite3': True,
        'soci:with_boost': True,
    }

    def set_version(self):
        path = f'{self.recipe_folder}/src/libxrpl/protocol/BuildInfo.cpp'
        path = f'{self.recipe_folder}/src/ripple/protocol/impl/BuildInfo.cpp'
        regex = r'versionString\s?=\s?\"(.*)\"'
        with open(path, 'r') as file:
            matches = (re.search(regex, line) for line in file)
@@ -100,23 +103,16 @@ class Xrpl(ConanFile):
        self.options['boost'].visibility = 'global'

    def requirements(self):
        self.requires('boost/1.86.0', force=True)
        self.requires('lz4/1.9.3', force=True)
        self.requires('protobuf/3.21.9', force=True)
        self.requires('sqlite3/3.42.0', force=True)
        if self.options.jemalloc:
            self.requires('jemalloc/5.3.0')
            self.requires('jemalloc/5.2.1')
        if self.options.reporting:
            self.requires('cassandra-cpp-driver/2.15.3')
            self.requires('libpq/13.6')
        if self.options.rocksdb:
            self.requires('rocksdb/6.29.5')
            self.requires('rocksdb/6.27.3')

    exports_sources = (
        'CMakeLists.txt',
        'bin/getRippledInfo',
        'cfg/*',
        'cmake/*',
        'external/*',
        'include/*',
        'src/*',
        'CMakeLists.txt', 'Builds/*', 'bin/getRippledInfo', 'src/*', 'cfg/*'
    )

    def layout(self):
@@ -132,11 +128,11 @@ class Xrpl(ConanFile):
        tc.variables['assert'] = self.options.assertions
        tc.variables['coverage'] = self.options.coverage
        tc.variables['jemalloc'] = self.options.jemalloc
        tc.variables['reporting'] = self.options.reporting
        tc.variables['rocksdb'] = self.options.rocksdb
        tc.variables['BUILD_SHARED_LIBS'] = self.options.shared
        tc.variables['static'] = self.options.static
        tc.variables['unity'] = self.options.unity
        tc.variables['xrpld'] = self.options.xrpld
        tc.generate()

    def build(self):
@@ -153,27 +149,9 @@ class Xrpl(ConanFile):
    def package_info(self):
        libxrpl = self.cpp_info.components['libxrpl']
        libxrpl.libs = [
            'xrpl',
            'xrpl.libpb',
            'ed25519',
            'secp256k1',
            'libxrpl_core.a',
            'libed25519.a',
            'libsecp256k1.a',
        ]
        # TODO: Fix the protobufs to include each other relative to
        # `include/`, not `include/ripple/proto/`.
        libxrpl.includedirs = ['include', 'include/ripple/proto']
        libxrpl.requires = [
            'boost::boost',
            'date::date',
            'grpc::grpc++',
            'libarchive::libarchive',
            'lz4::lz4',
            'nudb::nudb',
            'openssl::crypto',
            'protobuf::libprotobuf',
            'soci::soci',
            'sqlite3::sqlite',
            'xxhash::xxhash',
            'zlib::zlib',
        ]
        if self.options.rocksdb:
            libxrpl.requires.append('rocksdb::librocksdb')
        libxrpl.includedirs = ['include']
        libxrpl.requires = ['boost::boost']
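The default_options hunk above swaps between two option-scoping syntaxes: the `pkg/*:option` pattern form used by Conan 2 and the older Conan 1 `pkg:option` form. A minimal sketch of the pattern form (package and option chosen for illustration only):

```python
# Sketch (Conan 2.x): options for a dependency are keyed by a package
# pattern; 'zlib/*' matches any zlib version in the dependency graph.
from conan import ConanFile


class Example(ConanFile):
    settings = 'os', 'compiler', 'build_type', 'arch'
    requires = 'zlib/1.2.13'
    default_options = {
        'zlib/*:shared': False,  # Conan 1 wrote this as 'zlib:shared'
    }
```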
2
docs/build/environment.md
vendored
@@ -23,7 +23,7 @@ direction.

```
apt update
apt install --yes curl git libssl-dev python3.10-dev python3-pip make g++-11 libprotobuf-dev protobuf-compiler
apt install --yes curl git libssl-dev python3.10-dev python3-pip make g++-11

curl --location --remote-name \
    "https://github.com/Kitware/CMake/releases/download/v3.25.1/cmake-3.25.1.tar.gz"
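For completeness, the same tarball fetch can be done without curl — a sketch using only the Python standard library and the URL shown above:

```python
# Sketch: download the CMake source tarball named in the docs hunk above.
import urllib.request

url = ('https://github.com/Kitware/CMake/releases/download/'
       'v3.25.1/cmake-3.25.1.tar.gz')
urllib.request.urlretrieve(url, 'cmake-3.25.1.tar.gz')
print('saved cmake-3.25.1.tar.gz')
```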
@@ -1,16 +0,0 @@
cmake_minimum_required(VERSION 3.21)

set(name example)
set(version 0.1.0)

project(
  ${name}
  VERSION ${version}
  LANGUAGES CXX
)

find_package(xrpl REQUIRED)

add_executable(example)
target_sources(example PRIVATE src/example.cpp)
target_link_libraries(example PRIVATE xrpl::libxrpl)
@@ -1,59 +0,0 @@
from conan import ConanFile, conan_version
from conan.tools.cmake import CMake, cmake_layout


class Example(ConanFile):

    def set_name(self):
        if self.name is None:
            self.name = 'example'

    def set_version(self):
        if self.version is None:
            self.version = '0.1.0'

    license = 'ISC'
    author = 'John Freeman <jfreeman08@gmail.com>'

    settings = 'os', 'compiler', 'build_type', 'arch'
    options = {'shared': [True, False], 'fPIC': [True, False]}
    default_options = {
        'shared': False,
        'fPIC': True,
        'xrpl:xrpld': False,
    }

    requires = ['xrpl/2.2.0-rc1@jfreeman/nodestore']
    generators = ['CMakeDeps', 'CMakeToolchain']

    exports_sources = [
        'CMakeLists.txt',
        'cmake/*',
        'external/*',
        'include/*',
        'src/*',
    ]

    # For out-of-source build.
    # https://docs.conan.io/en/latest/reference/build_helpers/cmake.html#configure
    no_copy_source = True

    def layout(self):
        cmake_layout(self)

    def config_options(self):
        if self.settings.os == 'Windows':
            del self.options.fPIC

    def build(self):
        cmake = CMake(self)
        cmake.configure(variables={'BUILD_TESTING': 'NO'})
        cmake.build()

    def package(self):
        cmake = CMake(self)
        cmake.install()

    def package_info(self):
        path = f'{self.package_folder}/share/{self.name}/cpp_info.py'
        with open(path, 'r') as file:
            exec(file.read(), {}, {'self': self.cpp_info})
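The deleted recipe's `package_info` runs a generated `cpp_info.py` with `self` bound to the package's `cpp_info` object. A self-contained sketch of that exec pattern (the snippet contents are hypothetical, not from the repo):

```python
# Sketch of the exec() pattern above: code from a generated file runs with
# 'self' bound to an object standing in for the package's cpp_info.
class FakeCppInfo:
    libs = []
    includedirs = []

snippet = "self.libs = ['xrpl_core']\nself.includedirs = ['include']"
info = FakeCppInfo()
exec(snippet, {}, {'self': info})
print(info.libs, info.includedirs)  # ['xrpl_core'] ['include']
```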
@@ -1,8 +0,0 @@
#include <cstdio>

#include <xrpl/protocol/BuildInfo.h>

int main(int argc, char const** argv) {
    std::printf("%s\n", ripple::BuildInfo::getVersionString().c_str());
    return 0;
}
11
external/README.md
vendored
@@ -1,11 +0,0 @@
The subdirectories in this directory contain either copies or Conan recipes
of external libraries used by rippled.
The Conan recipes include patches we have not yet pushed upstream.

| Folder | Upstream | Description |
|:----------------|:---------------------------------------------|:------------|
| `ed25519-donna` | [Project](https://github.com/floodyberry/ed25519-donna) | [Ed25519](http://ed25519.cr.yp.to/) digital signatures |
| `rocksdb` | [Recipe](https://github.com/conan-io/conan-center-index/tree/master/recipes/rocksdb) | Fast key/value database. (Supports rotational disks better than NuDB.) |
| `secp256k1` | [Project](https://github.com/bitcoin-core/secp256k1) | ECDSA digital signatures using the **secp256k1** curve |
| `snappy` | [Recipe](https://github.com/conan-io/conan-center-index/tree/master/recipes/snappy) | "Snappy" lossless compression algorithm. |
| `soci` | [Recipe](https://github.com/conan-io/conan-center-index/tree/master/recipes/soci) | Abstraction layer for database access. |
Some files were not shown because too many files have changed in this diff.