mirror of https://github.com/XRPLF/rippled.git
synced 2025-11-04 11:15:56 +00:00

Compare commits: q73zhao/re ... a1q123456/ (357 commits)

.clang-format

@@ -1,5 +1,21 @@
---
Language: Cpp
BreakBeforeBraces: Custom
BraceWrapping:
  AfterClass: true
  AfterControlStatement: true
  AfterEnum: false
  AfterFunction: true
  AfterNamespace: false
  AfterObjCDeclaration: true
  AfterStruct: true
  AfterUnion: true
  BeforeCatch: true
  BeforeElse: true
  IndentBraces: false
KeepEmptyLinesAtTheStartOfBlocks: false
MaxEmptyLinesToKeep: 1
---
Language: Cpp
AccessModifierOffset: -4
AlignAfterOpenBracket: AlwaysBreak
AlignConsecutiveAssignments: false
@@ -18,51 +34,41 @@ AlwaysBreakBeforeMultilineStrings: true
AlwaysBreakTemplateDeclarations: true
BinPackArguments: false
BinPackParameters: false
BraceWrapping:
  AfterClass: true
  AfterControlStatement: true
  AfterEnum: false
  AfterFunction: true
  AfterNamespace: false
  AfterObjCDeclaration: true
  AfterStruct: true
  AfterUnion: true
  BeforeCatch: true
  BeforeElse: true
  IndentBraces: false
BreakBeforeBinaryOperators: false
BreakBeforeBraces: Custom
BreakBeforeTernaryOperators: true
BreakConstructorInitializersBeforeComma: true
ColumnLimit: 80
CommentPragmas: '^ IWYU pragma:'
ColumnLimit: 80
CommentPragmas: "^ IWYU pragma:"
ConstructorInitializerAllOnOneLineOrOnePerLine: true
ConstructorInitializerIndentWidth: 4
ContinuationIndentWidth: 4
Cpp11BracedListStyle: true
DerivePointerAlignment: false
DisableFormat: false
DisableFormat: false
ExperimentalAutoDetectBinPacking: false
ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
ForEachMacros: [Q_FOREACH, BOOST_FOREACH]
IncludeBlocks: Regroup
IncludeCategories:
  - Regex: '^<(test)/'
    Priority: 0
  - Regex: '^<(xrpld)/'
    Priority: 1
  - Regex: '^<(xrpl)/'
    Priority: 2
  - Regex: '^<(boost)/'
    Priority: 3
  - Regex: '.*'
    Priority: 4
IncludeIsMainRegex: '$'
  - Regex: "^<(test)/"
    Priority: 0
  - Regex: "^<(xrpld)/"
    Priority: 1
  - Regex: "^<(xrpl)/"
    Priority: 2
  - Regex: "^<(boost)/"
    Priority: 3
  - Regex: "^.*/"
    Priority: 4
  - Regex: '^.*\.h'
    Priority: 5
  - Regex: ".*"
    Priority: 6
IncludeIsMainRegex: "$"
IndentCaseLabels: true
IndentFunctionDeclarationAfterType: false
IndentRequiresClause: true
IndentWidth: 4
IndentWidth: 4
IndentWrappedFunctionNames: false
KeepEmptyLinesAtTheStartOfBlocks: false
MaxEmptyLinesToKeep: 1
NamespaceIndentation: None
ObjCSpaceAfterProperty: false
ObjCSpaceBeforeProtocolList: false
@@ -73,19 +79,25 @@ PenaltyBreakString: 1000
PenaltyExcessCharacter: 1000000
PenaltyReturnTypeOnItsOwnLine: 200
PointerAlignment: Left
ReflowComments: true
ReflowComments: true
RequiresClausePosition: OwnLine
SortIncludes: true
SortIncludes: true
SpaceAfterCStyleCast: false
SpaceBeforeAssignmentOperators: true
SpaceBeforeParens: ControlStatements
SpaceInEmptyParentheses: false
SpacesBeforeTrailingComments: 2
SpacesInAngles: false
SpacesInAngles: false
SpacesInContainerLiterals: true
SpacesInCStyleCastParentheses: false
SpacesInParentheses: false
SpacesInSquareBrackets: false
Standard: Cpp11
TabWidth: 8
UseTab: Never
Standard: Cpp11
TabWidth: 8
UseTab: Never
QualifierAlignment: Right
---
Language: Proto
BasedOnStyle: Google
ColumnLimit: 0
IndentWidth: 2

codecov.yml

@@ -7,13 +7,13 @@ comment:
  show_carryforward_flags: false

coverage:
  range: "60..80"
  range: "70..85"
  precision: 1
  round: nearest
  status:
    project:
      default:
        target: 60%
        target: 75%
        threshold: 2%
    patch:
      default:
@@ -27,11 +27,12 @@ github_checks:
parsers:
  cobertura:
    partials_as_hits: true
    handle_missing_conditions : true
    handle_missing_conditions: true

slack_app: false

ignore:
  - "src/test/"
  - "src/tests/"
  - "include/xrpl/beast/test/"
  - "include/xrpl/beast/unit_test/"

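Before committing edits like the ones above, the file can be checked against Codecov's configuration validator; a minimal sketch (the validation endpoint is Codecov's documented service, but confirm it against the current Codecov docs before relying on it):

```bash
# Validate codecov.yml locally before pushing; a non-2xx response or an
# error message in the body indicates a malformed configuration.
curl --fail --data-binary @codecov.yml https://codecov.io/validate
```
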
.git-blame-ignore-revs

@@ -11,3 +11,6 @@ b9d007813378ad0ff45660dc07285b823c7e9855
fe9a5365b8a52d4acc42eb27369247e6f238a4f9
9a93577314e6a8d4b4a8368cc9d2b15a5d8303e8
552377c76f55b403a1c876df873a23d780fcc81c
97f0747e103f13e26e45b731731059b32f7679ac
b13370ac0d207217354f1fc1c29aef87769fb8a1
896b8c3b54a22b0497cb0d1ce95e1095f9a227ce

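Listing formatting-only commits in this file keeps them out of `git blame` output; a short sketch of how a contributor might wire it up locally (standard git behavior, not specific to this repository):

```bash
# Tell git blame to skip the commits listed in .git-blame-ignore-revs.
git config blame.ignoreRevsFile .git-blame-ignore-revs

# Blame now attributes lines to the last substantive change instead of a
# reformatting commit such as 97f0747e10 above. The file path here is just
# an example from this repository.
git blame src/libxrpl/protocol/BuildInfo.cpp
```
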
.github/CODEOWNERS (vendored, new file, 8 lines)

@@ -0,0 +1,8 @@
# Allow anyone to review any change by default.
*

# Require the rpc-reviewers team to review changes to the rpc code.
include/xrpl/protocol/ @xrplf/rpc-reviewers
src/libxrpl/protocol/ @xrplf/rpc-reviewers
src/xrpld/rpc/ @xrplf/rpc-reviewers
src/xrpld/app/misc/ @xrplf/rpc-reviewers

.github/ISSUE_TEMPLATE/bug_report.md (vendored, 13 lines changed)

@@ -2,30 +2,35 @@
name: Bug Report
about: Create a report to help us improve rippled
title: "[Title with short description] (Version: [rippled version])"
labels: ''
assignees: ''

labels: ""
assignees: ""
---

<!-- Please search existing issues to avoid creating duplicates.-->

## Issue Description

<!--Provide a summary for your issue/bug.-->

## Steps to Reproduce

<!--List in detail the exact steps to reproduce the unexpected behavior of the software.-->

## Expected Result

<!--Explain in detail what behavior you expected to happen.-->

## Actual Result

<!--Explain in detail what behavior actually happened.-->

## Environment

<!--Please describe your environment setup (such as Ubuntu 18.04 with Boost 1.70).-->
<!-- If you are using a formal release, please use the version returned by './rippled --version' as the version number-->
<!-- If you are working off of develop, please add the git hash via 'git rev-parse HEAD'-->

## Supporting Files

<!--If you have supporting files such as a log, feel free to post a link here using Github Gist.-->
<!--Consider adding configuration files with private information removed via Github Gist. -->

.github/ISSUE_TEMPLATE/feature_request.md (vendored, 8 lines changed)

@@ -3,19 +3,23 @@ name: Feature Request
about: Suggest a new feature for the rippled project
title: "[Title with short description] (Version: [rippled version])"
labels: Feature Request
assignees: ''

assignees: ""
---

<!-- Please search existing issues to avoid creating duplicates.-->

## Summary

<!-- Provide a summary to the feature request-->

## Motivation

<!-- Why do we need this feature?-->

## Solution

<!-- What is the solution?-->

## Paths Not Taken

<!-- What other alternatives have been considered?-->

.github/actions/build-deps/action.yml (vendored, new file, 44 lines)

@@ -0,0 +1,44 @@
name: Build Conan dependencies
description: "Install Conan dependencies, optionally forcing a rebuild of all dependencies."

# Note that actions do not support 'type' and all inputs are strings, see
# https://docs.github.com/en/actions/reference/workflows-and-actions/metadata-syntax#inputs.
inputs:
  verbosity:
    description: "The build verbosity."
    required: false
    default: "verbose"
  build_dir:
    description: "The directory where to build."
    required: true
  build_type:
    description: 'The build type to use ("Debug", "Release").'
    required: true
  force_build:
    description: 'Force building of all dependencies ("true", "false").'
    required: false
    default: "false"

runs:
  using: composite
  steps:
    - name: Install Conan dependencies
      shell: bash
      env:
        BUILD_DIR: ${{ inputs.build_dir }}
        BUILD_OPTION: ${{ inputs.force_build == 'true' && '*' || 'missing' }}
        BUILD_TYPE: ${{ inputs.build_type }}
        VERBOSITY: ${{ inputs.verbosity }}
      run: |
        echo 'Installing dependencies.'
        mkdir -p "${BUILD_DIR}"
        cd "${BUILD_DIR}"
        conan install \
          --output-folder . \
          --build="${BUILD_OPTION}" \
          --options:host='&:tests=True' \
          --options:host='&:xrpld=True' \
          --settings:all build_type="${BUILD_TYPE}" \
          --conf:all tools.build:verbosity="${VERBOSITY}" \
          --conf:all tools.compilation:verbosity="${VERBOSITY}" \
          ..

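For local debugging, the same dependency install can be reproduced outside CI; a minimal sketch assuming a Conan 2 setup matching the profile CI uses (the `.build` directory name and Debug build type are illustrative):

```bash
# Mirror the action's install step: build only missing binaries, with the
# same host options the action passes ('&:' scopes them to the consumer).
mkdir -p .build && cd .build
conan install \
  --output-folder . \
  --build=missing \
  --options:host='&:tests=True' \
  --options:host='&:xrpld=True' \
  --settings:all build_type=Debug \
  ..
```
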
.github/actions/build/action.yml (vendored, removed, 34 lines)

@@ -1,34 +0,0 @@
name: build
inputs:
  generator:
    default: null
  configuration:
    required: true
  cmake-args:
    default: null
  cmake-target:
    default: all
# An implicit input is the environment variable `build_dir`.
runs:
  using: composite
  steps:
    - name: configure
      shell: bash
      run: |
        cd ${build_dir}
        cmake \
          ${{ inputs.generator && format('-G "{0}"', inputs.generator) || '' }} \
          -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
          -DCMAKE_BUILD_TYPE=${{ inputs.configuration }} \
          -Dtests=TRUE \
          -Dxrpld=TRUE \
          ${{ inputs.cmake-args }} \
          ..
    - name: build
      shell: bash
      run: |
        cmake \
          --build ${build_dir} \
          --config ${{ inputs.configuration }} \
          --parallel ${NUM_PROCESSORS:-$(nproc)} \
          --target ${{ inputs.cmake-target }}

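The two steps of the removed action map directly onto a local build; a compact sketch of the equivalent commands (the build directory name and Debug configuration are illustrative, and the Conan toolchain file must already have been generated by a prior `conan install`):

```bash
# Configure against the Conan-generated toolchain, as the removed action did.
cd .build
cmake \
  -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Debug \
  -Dtests=TRUE \
  -Dxrpld=TRUE \
  ..

# Build with all available cores.
cmake --build . --config Debug --parallel $(nproc)
```
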
.github/actions/dependencies/action.yml (vendored, removed, 60 lines)

@@ -1,60 +0,0 @@
name: dependencies
inputs:
  configuration:
    required: true
# An implicit input is the environment variable `build_dir`.
runs:
  using: composite
  steps:
    - name: unlock Conan
      shell: bash
      run: conan remove --locks
    - name: export custom recipes
      shell: bash
      run: |
        conan config set general.revisions_enabled=1
        conan export external/snappy snappy/1.1.10@
        conan export external/rocksdb rocksdb/6.29.5@
        conan export external/soci soci/4.0.3@
    - name: add Ripple Conan remote
      shell: bash
      run: |
        conan remote list
        conan remote remove ripple || true
        # Do not quote the URL. An empty string will be accepted (with
        # a non-fatal warning), but a missing argument will not.
        conan remote add ripple ${{ env.CONAN_URL }} --insert 0
    - name: try to authenticate to Ripple Conan remote
      id: remote
      shell: bash
      run: |
        # `conan user` implicitly uses the environment variables
        # CONAN_LOGIN_USERNAME_<REMOTE> and CONAN_PASSWORD_<REMOTE>.
        # https://docs.conan.io/1/reference/commands/misc/user.html#using-environment-variables
        # https://docs.conan.io/1/reference/env_vars.html#conan-login-username-conan-login-username-remote-name
        # https://docs.conan.io/1/reference/env_vars.html#conan-password-conan-password-remote-name
        echo outcome=$(conan user --remote ripple --password >&2 \
          && echo success || echo failure) | tee ${GITHUB_OUTPUT}
    - name: list missing binaries
      id: binaries
      shell: bash
      # Print the list of dependencies that would need to be built locally.
      # A non-empty list means we have "failed" to cache binaries remotely.
      run: |
        echo missing=$(conan info . --build missing --settings build_type=${{ inputs.configuration }} --json 2>/dev/null | grep '^\[') | tee ${GITHUB_OUTPUT}
    - name: install dependencies
      shell: bash
      run: |
        mkdir ${build_dir}
        cd ${build_dir}
        conan install \
          --output-folder . \
          --build missing \
          --options tests=True \
          --options xrpld=True \
          --settings build_type=${{ inputs.configuration }} \
          ..
    - name: upload dependencies to remote
      if: (steps.binaries.outputs.missing != '[]') && (steps.remote.outputs.outcome == 'success')
      shell: bash
      run: conan upload --remote ripple '*' --all --parallel --confirm

.github/actions/print-env/action.yml (vendored, new file, 43 lines)

@@ -0,0 +1,43 @@
name: Print build environment
description: "Print environment and some tooling versions"

runs:
  using: composite
  steps:
    - name: Check configuration (Windows)
      if: ${{ runner.os == 'Windows' }}
      shell: bash
      run: |
        echo 'Checking environment variables.'
        set

        echo 'Checking CMake version.'
        cmake --version

        echo 'Checking Conan version.'
        conan --version

    - name: Check configuration (Linux and macOS)
      if: ${{ runner.os == 'Linux' || runner.os == 'macOS' }}
      shell: bash
      run: |
        echo 'Checking path.'
        echo ${PATH} | tr ':' '\n'

        echo 'Checking environment variables.'
        env | sort

        echo 'Checking CMake version.'
        cmake --version

        echo 'Checking compiler version.'
        ${{ runner.os == 'Linux' && '${CC}' || 'clang' }} --version

        echo 'Checking Conan version.'
        conan --version

        echo 'Checking Ninja version.'
        ninja --version

        echo 'Checking nproc version.'
        nproc --version

.github/actions/setup-conan/action.yml (vendored, new file, 46 lines)

@@ -0,0 +1,46 @@
name: Setup Conan
description: "Set up Conan configuration, profile, and remote."

inputs:
  conan_remote_name:
    description: "The name of the Conan remote to use."
    required: false
    default: xrplf
  conan_remote_url:
    description: "The URL of the Conan endpoint to use."
    required: false
    default: https://conan.ripplex.io

runs:
  using: composite

  steps:
    - name: Set up Conan configuration
      shell: bash
      run: |
        echo 'Installing configuration.'
        cat conan/global.conf ${{ runner.os == 'Linux' && '>>' || '>' }} $(conan config home)/global.conf

        echo 'Conan configuration:'
        conan config show '*'

    - name: Set up Conan profile
      shell: bash
      run: |
        echo 'Installing profile.'
        conan config install conan/profiles/default -tf $(conan config home)/profiles/

        echo 'Conan profile:'
        conan profile show

    - name: Set up Conan remote
      shell: bash
      env:
        CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
        CONAN_REMOTE_URL: ${{ inputs.conan_remote_url }}
      run: |
        echo "Adding Conan remote '${CONAN_REMOTE_NAME}' at '${CONAN_REMOTE_URL}'."
        conan remote add --index 0 --force "${CONAN_REMOTE_NAME}" "${CONAN_REMOTE_URL}"

        echo 'Listing Conan remotes.'
        conan remote list

.github/scripts/levelization/README.md

@@ -25,32 +25,32 @@ more dependencies listed later.
**tl;dr:** The modules listed first are more independent than the modules
listed later.

| Level / Tier | Module(s) |
|--------------|-----------------------------------------------|
| 01 | ripple/beast ripple/unity
| 02 | ripple/basics
| 03 | ripple/json ripple/crypto
| 04 | ripple/protocol
| 05 | ripple/core ripple/conditions ripple/consensus ripple/resource ripple/server
| 06 | ripple/peerfinder ripple/ledger ripple/nodestore ripple/net
| 07 | ripple/shamap ripple/overlay
| 08 | ripple/app
| 09 | ripple/rpc
| 10 | ripple/perflog
| 11 | test/jtx test/beast test/csf
| 12 | test/unit_test
| 13 | test/crypto test/conditions test/json test/resource test/shamap test/peerfinder test/basics test/overlay
| 14 | test
| 15 | test/net test/protocol test/ledger test/consensus test/core test/server test/nodestore
| 16 | test/rpc test/app
| Level / Tier | Module(s) |
| ------------ | -------------------------------------------------------------------------------------------------------- |
| 01 | ripple/beast ripple/unity |
| 02 | ripple/basics |
| 03 | ripple/json ripple/crypto |
| 04 | ripple/protocol |
| 05 | ripple/core ripple/conditions ripple/consensus ripple/resource ripple/server |
| 06 | ripple/peerfinder ripple/ledger ripple/nodestore ripple/net |
| 07 | ripple/shamap ripple/overlay |
| 08 | ripple/app |
| 09 | ripple/rpc |
| 10 | ripple/perflog |
| 11 | test/jtx test/beast test/csf |
| 12 | test/unit_test |
| 13 | test/crypto test/conditions test/json test/resource test/shamap test/peerfinder test/basics test/overlay |
| 14 | test |
| 15 | test/net test/protocol test/ledger test/consensus test/core test/server test/nodestore |
| 16 | test/rpc test/app |

(Note that `test` levelization is *much* less important and *much* less
(Note that `test` levelization is _much_ less important and _much_ less
strictly enforced than `ripple` levelization, other than the requirement
that `test` code should *never* be included in `ripple` code.)
that `test` code should _never_ be included in `ripple` code.)

## Validation

The [levelization.sh](levelization.sh) script takes no parameters,
The [levelization](generate.sh) script takes no parameters,
reads no environment variables, and can be run from any directory,
as long as it is in the expected location in the rippled repo.
It can be run at any time from within a checked out repo, and will
@@ -59,48 +59,48 @@ the rippled source. The only caveat is that it runs much slower
under Windows than in Linux. It hasn't yet been tested under MacOS.
It generates many files of [results](results):

* `rawincludes.txt`: The raw dump of the `#includes`
* `paths.txt`: A second dump grouping the source module
- `rawincludes.txt`: The raw dump of the `#includes`
- `paths.txt`: A second dump grouping the source module
  to the destination module, deduped, and with frequency counts.
* `includes/`: A directory where each file represents a module and
- `includes/`: A directory where each file represents a module and
  contains a list of modules and counts that the module _includes_.
* `includedby/`: Similar to `includes/`, but the other way around. Each
- `includedby/`: Similar to `includes/`, but the other way around. Each
  file represents a module and contains a list of modules and counts
  that _include_ the module.
* [`loops.txt`](results/loops.txt): A list of direct loops detected
- [`loops.txt`](results/loops.txt): A list of direct loops detected
  between modules as they actually exist, as opposed to how they are
  desired as described above. In a perfect repo, this file will be
  empty.
  This file is committed to the repo, and is used by the [levelization
  Github workflow](../../.github/workflows/levelization.yml) to validate
  Github workflow](../../workflows/reusable-check-levelization.yml) to validate
  that nothing changed.
* [`ordering.txt`](results/ordering.txt): A list showing relationships
- [`ordering.txt`](results/ordering.txt): A list showing relationships
  between modules where there are no loops as they actually exist, as
  opposed to how they are desired as described above.
  This file is committed to the repo, and is used by the [levelization
  Github workflow](../../.github/workflows/levelization.yml) to validate
  Github workflow](../../workflows/reusable-check-levelization.yml) to validate
  that nothing changed.
* [`levelization.yml`](../../.github/workflows/levelization.yml)
- [`levelization.yml`](../../workflows/reusable-check-levelization.yml)
  Github Actions workflow to test that levelization loops haven't
  changed. Unfortunately, if changes are detected, it can't tell if
  changed. Unfortunately, if changes are detected, it can't tell if
  they are improvements or not, so if you have resolved any issues or
  done anything else to improve levelization, run `levelization.sh`,
  and commit the updated results.

The `loops.txt` and `ordering.txt` files relate the modules
The `loops.txt` and `ordering.txt` files relate the modules
using comparison signs, which indicate the number of times each
module is included in the other.

* `A > B` means that A should probably be at a higher level than B,
- `A > B` means that A should probably be at a higher level than B,
  because B is included in A significantly more than A is included in B.
  These results can be included in both `loops.txt` and `ordering.txt`.
  Because `ordering.txt`only includes relationships where B is not
  included in A at all, it will only include these types of results.
* `A ~= B` means that A and B are included in each other a different
- `A ~= B` means that A and B are included in each other a different
  number of times, but the values are so close that the script can't
  definitively say that one should be above the other. These results
  will only be included in `loops.txt`.
* `A == B` means that A and B include each other the same number of
- `A == B` means that A and B include each other the same number of
  times, so the script has no clue which should be higher. These results
  will only be included in `loops.txt`.

@@ -110,5 +110,5 @@ get those details locally.

1. Run `levelization.sh`
2. Grep the modules in `paths.txt`.
   * For example, if a cycle is found `A ~= B`, simply `grep -w
     A Builds/levelization/results/paths.txt | grep -w B`
   - For example, if a cycle is found `A ~= B`, simply `grep -w
     A .github/scripts/levelization/results/paths.txt | grep -w B`

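Putting the README's workflow together, a loop reported in loops.txt can be traced back to the offending `#include` lines; a small sketch following the paths shown in the diff above (the module pair xrpld.app / xrpld.overlay is taken from loops.txt below as an example):

```bash
# Regenerate the levelization results, then find the includes that create
# the reported cycle between two modules.
cd .github/scripts/levelization
./generate.sh
grep -w xrpld.app results/paths.txt | grep -w xrpld.overlay
```
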
.github/scripts/levelization/generate.sh

@@ -1,6 +1,6 @@
#!/bin/bash

# Usage: levelization.sh
# Usage: generate.sh
# This script takes no parameters, reads no environment variables,
# and can be run from any directory, as long as it is in the expected
# location in the repo.
@@ -19,9 +19,9 @@ export LANG=C
rm -rfv results
mkdir results
includes="$( pwd )/results/rawincludes.txt"
pushd ../..
pushd ../../..
echo Raw includes:
grep -r '#include.*/.*\.h' include src | \
grep -r '^[ ]*#include.*/.*\.h' include src | \
grep -v boost | tee ${includes}
popd
pushd results

.github/scripts/levelization/results/loops.txt

@@ -4,23 +4,14 @@ Loop: test.jtx test.toplevel
Loop: test.jtx test.unit_test
test.unit_test == test.jtx

Loop: xrpl.basics xrpl.json
xrpl.json == xrpl.basics

Loop: xrpld.app xrpld.core
xrpld.app > xrpld.core

Loop: xrpld.app xrpld.ledger
xrpld.app > xrpld.ledger

Loop: xrpld.app xrpld.net
xrpld.app > xrpld.net

Loop: xrpld.app xrpld.overlay
xrpld.overlay == xrpld.app
xrpld.overlay > xrpld.app

Loop: xrpld.app xrpld.peerfinder
xrpld.app > xrpld.peerfinder
xrpld.peerfinder ~= xrpld.app

Loop: xrpld.app xrpld.rpc
xrpld.rpc > xrpld.app
@@ -28,15 +19,9 @@ Loop: xrpld.app xrpld.rpc
Loop: xrpld.app xrpld.shamap
xrpld.app > xrpld.shamap

Loop: xrpld.core xrpld.net
xrpld.net > xrpld.core

Loop: xrpld.core xrpld.perflog
xrpld.perflog == xrpld.core

Loop: xrpld.net xrpld.rpc
xrpld.rpc ~= xrpld.net

Loop: xrpld.overlay xrpld.rpc
xrpld.rpc ~= xrpld.overlay

.github/scripts/levelization/results/ordering.txt

@@ -1,12 +1,18 @@
libxrpl.basics > xrpl.basics
libxrpl.basics > xrpl.protocol
libxrpl.crypto > xrpl.basics
libxrpl.json > xrpl.basics
libxrpl.json > xrpl.json
libxrpl.ledger > xrpl.basics
libxrpl.ledger > xrpl.json
libxrpl.ledger > xrpl.ledger
libxrpl.ledger > xrpl.protocol
libxrpl.net > xrpl.basics
libxrpl.net > xrpl.net
libxrpl.protocol > xrpl.basics
libxrpl.protocol > xrpl.json
libxrpl.protocol > xrpl.protocol
libxrpl.resource > xrpl.basics
libxrpl.resource > xrpl.json
libxrpl.resource > xrpl.resource
libxrpl.server > xrpl.basics
libxrpl.server > xrpl.json
@@ -19,10 +25,11 @@ test.app > test.unit_test
test.app > xrpl.basics
test.app > xrpld.app
test.app > xrpld.core
test.app > xrpld.ledger
test.app > xrpld.nodestore
test.app > xrpld.overlay
test.app > xrpld.rpc
test.app > xrpl.json
test.app > xrpl.ledger
test.app > xrpl.protocol
test.app > xrpl.resource
test.basics > test.jtx
@@ -41,7 +48,8 @@ test.consensus > test.unit_test
test.consensus > xrpl.basics
test.consensus > xrpld.app
test.consensus > xrpld.consensus
test.consensus > xrpld.ledger
test.consensus > xrpl.json
test.consensus > xrpl.ledger
test.core > test.jtx
test.core > test.toplevel
test.core > test.unit_test
@@ -58,12 +66,11 @@ test.json > test.jtx
test.json > xrpl.json
test.jtx > xrpl.basics
test.jtx > xrpld.app
test.jtx > xrpld.consensus
test.jtx > xrpld.core
test.jtx > xrpld.ledger
test.jtx > xrpld.net
test.jtx > xrpld.rpc
test.jtx > xrpl.json
test.jtx > xrpl.ledger
test.jtx > xrpl.net
test.jtx > xrpl.protocol
test.jtx > xrpl.resource
test.jtx > xrpl.server
@@ -72,7 +79,7 @@ test.ledger > test.toplevel
test.ledger > xrpl.basics
test.ledger > xrpld.app
test.ledger > xrpld.core
test.ledger > xrpld.ledger
test.ledger > xrpl.ledger
test.ledger > xrpl.protocol
test.nodestore > test.jtx
test.nodestore > test.toplevel
@@ -108,7 +115,6 @@ test.rpc > test.toplevel
test.rpc > xrpl.basics
test.rpc > xrpld.app
test.rpc > xrpld.core
test.rpc > xrpld.net
test.rpc > xrpld.overlay
test.rpc > xrpld.rpc
test.rpc > xrpl.json
@@ -131,6 +137,12 @@ test.shamap > xrpl.protocol
test.toplevel > test.csf
test.toplevel > xrpl.json
test.unit_test > xrpl.basics
tests.libxrpl > xrpl.basics
tests.libxrpl > xrpl.net
xrpl.json > xrpl.basics
xrpl.ledger > xrpl.basics
xrpl.ledger > xrpl.protocol
xrpl.net > xrpl.basics
xrpl.protocol > xrpl.basics
xrpl.protocol > xrpl.json
xrpl.resource > xrpl.basics
@@ -146,6 +158,8 @@ xrpld.app > xrpld.consensus
xrpld.app > xrpld.nodestore
xrpld.app > xrpld.perflog
xrpld.app > xrpl.json
xrpld.app > xrpl.ledger
xrpld.app > xrpl.net
xrpld.app > xrpl.protocol
xrpld.app > xrpl.resource
xrpld.conditions > xrpl.basics
@@ -155,15 +169,8 @@ xrpld.consensus > xrpl.json
xrpld.consensus > xrpl.protocol
xrpld.core > xrpl.basics
xrpld.core > xrpl.json
xrpld.core > xrpl.net
xrpld.core > xrpl.protocol
xrpld.ledger > xrpl.basics
xrpld.ledger > xrpld.core
xrpld.ledger > xrpl.json
xrpld.ledger > xrpl.protocol
xrpld.net > xrpl.basics
xrpld.net > xrpl.json
xrpld.net > xrpl.protocol
xrpld.net > xrpl.resource
xrpld.nodestore > xrpl.basics
xrpld.nodestore > xrpld.core
xrpld.nodestore > xrpld.unity
@@ -182,12 +189,12 @@ xrpld.peerfinder > xrpld.core
xrpld.peerfinder > xrpl.protocol
xrpld.perflog > xrpl.basics
xrpld.perflog > xrpl.json
xrpld.perflog > xrpl.protocol
xrpld.rpc > xrpl.basics
xrpld.rpc > xrpld.core
xrpld.rpc > xrpld.ledger
xrpld.rpc > xrpld.nodestore
xrpld.rpc > xrpl.json
xrpld.rpc > xrpl.ledger
xrpld.rpc > xrpl.net
xrpld.rpc > xrpl.protocol
xrpld.rpc > xrpl.resource
xrpld.rpc > xrpl.server

.github/scripts/strategy-matrix/generate.py (vendored, new executable file, 197 lines)

@@ -0,0 +1,197 @@
#!/usr/bin/env python3
import argparse
import itertools
import json
from pathlib import Path
from dataclasses import dataclass

THIS_DIR = Path(__file__).parent.resolve()


@dataclass
class Config:
    architecture: list[dict]
    os: list[dict]
    build_type: list[str]
    cmake_args: list[str]


'''
Generate a strategy matrix for GitHub Actions CI.

On each PR commit we will build a selection of Debian, RHEL, Ubuntu, MacOS, and
Windows configurations, while upon merge into the develop, release, or master
branches, we will build all configurations, and test most of them.

We will further set additional CMake arguments as follows:
- All builds will have the `tests`, `werr`, and `xrpld` options.
- All builds will have the `wextra` option except for GCC 12 and Clang 16.
- All release builds will have the `assert` option.
- Certain Debian Bookworm configurations will change the reference fee, enable
  codecov, and enable voidstar in PRs.
'''
def generate_strategy_matrix(all: bool, config: Config) -> list:
    configurations = []
    for architecture, os, build_type, cmake_args in itertools.product(config.architecture, config.os, config.build_type, config.cmake_args):
        # The default CMake target is 'all' for Linux and MacOS and 'install'
        # for Windows, but it can get overridden for certain configurations.
        cmake_target = 'install' if os["distro_name"] == 'windows' else 'all'

        # We build and test all configurations by default, except for Windows in
        # Debug, because it is too slow, as well as when code coverage is
        # enabled as that mode already runs the tests.
        build_only = False
        if os['distro_name'] == 'windows' and build_type == 'Debug':
            build_only = True

        # Only generate a subset of configurations in PRs.
        if not all:
            # Debian:
            # - Bookworm using GCC 13: Release and Unity on linux/amd64, set
            #   the reference fee to 500.
            # - Bookworm using GCC 15: Debug and no Unity on linux/amd64, enable
            #   code coverage (which will be done below).
            # - Bookworm using Clang 16: Debug and no Unity on linux/arm64,
            #   enable voidstar.
            # - Bookworm using Clang 17: Release and no Unity on linux/amd64,
            #   set the reference fee to 1000.
            # - Bookworm using Clang 20: Debug and Unity on linux/amd64.
            if os['distro_name'] == 'debian':
                skip = True
                if os['distro_version'] == 'bookworm':
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-13' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
                        cmake_args = f'-DUNIT_TEST_REFERENCE_FEE=500 {cmake_args}'
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-15' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-16' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/arm64':
                        cmake_args = f'-Dvoidstar=ON {cmake_args}'
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-17' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
                        cmake_args = f'-DUNIT_TEST_REFERENCE_FEE=1000 {cmake_args}'
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-20' and build_type == 'Debug' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                if skip:
                    continue

            # RHEL:
            # - 9 using GCC 12: Debug and Unity on linux/amd64.
            # - 10 using Clang: Release and no Unity on linux/amd64.
            if os['distro_name'] == 'rhel':
                skip = True
                if os['distro_version'] == '9':
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-12' and build_type == 'Debug' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                elif os['distro_version'] == '10':
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-any' and build_type == 'Release' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                if skip:
                    continue

            # Ubuntu:
            # - Jammy using GCC 12: Debug and no Unity on linux/arm64.
            # - Noble using GCC 14: Release and Unity on linux/amd64.
            # - Noble using Clang 18: Debug and no Unity on linux/amd64.
            # - Noble using Clang 19: Release and Unity on linux/arm64.
            if os['distro_name'] == 'ubuntu':
                skip = True
                if os['distro_version'] == 'jammy':
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-12' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/arm64':
                        skip = False
                elif os['distro_version'] == 'noble':
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-14' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-18' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
                        skip = False
                    if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-19' and build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'linux/arm64':
                        skip = False
                if skip:
                    continue

            # MacOS:
            # - Debug and no Unity on macos/arm64.
            if os['distro_name'] == 'macos' and not (build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'macos/arm64'):
                continue

            # Windows:
            # - Release and Unity on windows/amd64.
            if os['distro_name'] == 'windows' and not (build_type == 'Release' and '-Dunity=ON' in cmake_args and architecture['platform'] == 'windows/amd64'):
                continue

        # Additional CMake arguments.
        cmake_args = f'{cmake_args} -Dtests=ON -Dwerr=ON -Dxrpld=ON'
        if not f'{os['compiler_name']}-{os['compiler_version']}' in ['gcc-12', 'clang-16']:
            cmake_args = f'{cmake_args} -Dwextra=ON'
        if build_type == 'Release':
            cmake_args = f'{cmake_args} -Dassert=ON'

        # We skip all RHEL on arm64 due to a build failure that needs further
        # investigation.
        if os['distro_name'] == 'rhel' and architecture['platform'] == 'linux/arm64':
            continue

        # We skip all clang-20 on arm64 due to boost 1.86 build error
        if f'{os['compiler_name']}-{os['compiler_version']}' == 'clang-20' and architecture['platform'] == 'linux/arm64':
            continue

        # Enable code coverage for Debian Bookworm using GCC 15 in Debug and no
        # Unity on linux/amd64
        if f'{os['compiler_name']}-{os['compiler_version']}' == 'gcc-15' and build_type == 'Debug' and '-Dunity=OFF' in cmake_args and architecture['platform'] == 'linux/amd64':
            cmake_args = f'-Dcoverage=ON -Dcoverage_format=xml -DCODE_COVERAGE_VERBOSE=ON -DCMAKE_C_FLAGS=-O0 -DCMAKE_CXX_FLAGS=-O0 {cmake_args}'
            cmake_target = 'coverage'
            build_only = True

        # Generate a unique name for the configuration, e.g. macos-arm64-debug
        # or debian-bookworm-gcc-12-amd64-release-unity.
        config_name = os['distro_name']
        if (n := os['distro_version']) != '':
            config_name += f'-{n}'
        if (n := os['compiler_name']) != '':
            config_name += f'-{n}'
        if (n := os['compiler_version']) != '':
            config_name += f'-{n}'
        config_name += f'-{architecture['platform'][architecture['platform'].find('/')+1:]}'
        config_name += f'-{build_type.lower()}'
        if '-Dunity=ON' in cmake_args:
            config_name += '-unity'

        # Add the configuration to the list, with the most unique fields first,
        # so that they are easier to identify in the GitHub Actions UI, as long
        # names get truncated.
        configurations.append({
            'config_name': config_name,
            'cmake_args': cmake_args,
            'cmake_target': cmake_target,
            'build_only': build_only,
            'build_type': build_type,
            'os': os,
            'architecture': architecture,
        })

    return configurations


def read_config(file: Path) -> Config:
    config = json.loads(file.read_text())
    if config['architecture'] is None or config['os'] is None or config['build_type'] is None or config['cmake_args'] is None:
        raise Exception('Invalid configuration file.')

    return Config(**config)


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-a', '--all', help='Set to generate all configurations (generally used when merging a PR) or leave unset to generate a subset of configurations (generally used when committing to a PR).', action="store_true")
    parser.add_argument('-c', '--config', help='Path to the JSON file containing the strategy matrix configurations.', required=False, type=Path)
    args = parser.parse_args()

    matrix = []
    if args.config is None or args.config == '':
        matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "linux.json"))
        matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "macos.json"))
        matrix += generate_strategy_matrix(args.all, read_config(THIS_DIR / "windows.json"))
    else:
        matrix += generate_strategy_matrix(args.all, read_config(args.config))

    # Generate the strategy matrix.
    print(f'matrix={json.dumps({"include": matrix})}')

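The script is meant to feed a workflow's strategy via `$GITHUB_OUTPUT`; a brief usage sketch (the jq inspection at the end is illustrative, not part of the repository):

```bash
# PR-style run: emits the reduced matrix on stdout as 'matrix={...}'.
python3 .github/scripts/strategy-matrix/generate.py

# Merge-style run with all configurations, appended to a job's outputs.
python3 .github/scripts/strategy-matrix/generate.py --all >> "${GITHUB_OUTPUT}"

# Peek at the generated configuration names (illustrative):
python3 .github/scripts/strategy-matrix/generate.py --all \
  | sed 's/^matrix=//' | jq -r '.include[].config_name'
```
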
.github/scripts/strategy-matrix/linux.json (vendored, new file, 184 lines)

@@ -0,0 +1,184 @@
{
  "architecture": [
    {
      "platform": "linux/amd64",
      "runner": ["self-hosted", "Linux", "X64", "heavy"]
    },
    {
      "platform": "linux/arm64",
      "runner": ["self-hosted", "Linux", "ARM64", "heavy-arm64"]
    }
  ],
  "os": [
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "gcc",
      "compiler_version": "12",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "gcc",
      "compiler_version": "13",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "gcc",
      "compiler_version": "14",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "gcc",
      "compiler_version": "15",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "clang",
      "compiler_version": "16",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "clang",
      "compiler_version": "17",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "clang",
      "compiler_version": "18",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "clang",
      "compiler_version": "19",
      "image_sha": "6948666"
    },
    {
      "distro_name": "debian",
      "distro_version": "bookworm",
      "compiler_name": "clang",
      "compiler_version": "20",
      "image_sha": "6948666"
    },
    {
      "distro_name": "rhel",
      "distro_version": "8",
      "compiler_name": "gcc",
      "compiler_version": "14",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "8",
      "compiler_name": "clang",
      "compiler_version": "any",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "9",
      "compiler_name": "gcc",
      "compiler_version": "12",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "9",
      "compiler_name": "gcc",
      "compiler_version": "13",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "9",
      "compiler_name": "gcc",
      "compiler_version": "14",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "9",
      "compiler_name": "clang",
      "compiler_version": "any",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "10",
      "compiler_name": "gcc",
      "compiler_version": "14",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "rhel",
      "distro_version": "10",
      "compiler_name": "clang",
      "compiler_version": "any",
      "image_sha": "10e69b4"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "jammy",
      "compiler_name": "gcc",
      "compiler_version": "12",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "gcc",
      "compiler_version": "13",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "gcc",
      "compiler_version": "14",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "clang",
      "compiler_version": "16",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "clang",
      "compiler_version": "17",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "clang",
      "compiler_version": "18",
      "image_sha": "6948666"
    },
    {
      "distro_name": "ubuntu",
      "distro_version": "noble",
      "compiler_name": "clang",
      "compiler_version": "19",
      "image_sha": "6948666"
    }
  ],
  "build_type": ["Debug", "Release"],
  "cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}

.github/scripts/strategy-matrix/macos.json (vendored, new file, 22 lines)

@@ -0,0 +1,22 @@
{
  "architecture": [
    {
      "platform": "macos/arm64",
      "runner": ["self-hosted", "macOS", "ARM64", "mac-runner-m1"]
    }
  ],
  "os": [
    {
      "distro_name": "macos",
      "distro_version": "",
      "compiler_name": "",
      "compiler_version": "",
      "image_sha": ""
    }
  ],
  "build_type": ["Debug", "Release"],
  "cmake_args": [
    "-Dunity=OFF -DCMAKE_POLICY_VERSION_MINIMUM=3.5",
    "-Dunity=ON -DCMAKE_POLICY_VERSION_MINIMUM=3.5"
  ]
}

.github/scripts/strategy-matrix/windows.json (vendored, new file, 19 lines)

@@ -0,0 +1,19 @@
{
  "architecture": [
    {
      "platform": "windows/amd64",
      "runner": ["self-hosted", "Windows", "devbox"]
    }
  ],
  "os": [
    {
      "distro_name": "windows",
      "distro_version": "",
      "compiler_name": "",
      "compiler_version": "",
      "image_sha": ""
    }
  ],
  "build_type": ["Debug", "Release"],
  "cmake_args": ["-Dunity=OFF", "-Dunity=ON"]
}

.github/workflows/clang-format.yml (vendored, removed, 59 lines)

@@ -1,59 +0,0 @@
name: clang-format

on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-24.04
    env:
      CLANG_VERSION: 18
    steps:
      - uses: actions/checkout@v4
      - name: Install clang-format
        run: |
          codename=$( lsb_release --codename --short )
          sudo tee /etc/apt/sources.list.d/llvm.list >/dev/null <<EOF
          deb http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
          deb-src http://apt.llvm.org/${codename}/ llvm-toolchain-${codename}-${CLANG_VERSION} main
          EOF
          wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add
          sudo apt-get update
          sudo apt-get install clang-format-${CLANG_VERSION}
      - name: Format first-party sources
        run: find include src -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format-${CLANG_VERSION} -i {} +
      - name: Check for differences
        id: assert
        run: |
          set -o pipefail
          git diff --exit-code | tee "clang-format.patch"
      - name: Upload patch
        if: failure() && steps.assert.outcome == 'failure'
        uses: actions/upload-artifact@v3
        continue-on-error: true
        with:
          name: clang-format.patch
          if-no-files-found: ignore
          path: clang-format.patch
      - name: What happened?
        if: failure() && steps.assert.outcome == 'failure'
        env:
          PREAMBLE: |
            If you are reading this, you are looking at a failed Github Actions
            job. That means you pushed one or more files that did not conform
            to the formatting specified in .clang-format. That may be because
            you neglected to run 'git clang-format' or 'clang-format' before
            committing, or that your version of clang-format has an
            incompatibility with the one on this
            machine, which is:
          SUGGESTION: |

            To fix it, you can do one of two things:
            1. Download and apply the patch generated as an artifact of this
               job to your repo, commit, and push.
            2. Run 'git-clang-format --extensions cpp,h,hpp,ipp develop'
               in your repo, commit, and push.
        run: |
          echo "${PREAMBLE}"
          clang-format-${CLANG_VERSION} --version
          echo "${SUGGESTION}"
          exit 1

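The removed workflow's failure message already points at the local equivalent; a minimal sketch of the same check run by hand (clang-format 18, matching the workflow's CLANG_VERSION, is assumed to be installed):

```bash
# Reformat first-party sources exactly as the CI job did...
find include src -type f \
  \( -name '*.cpp' -o -name '*.hpp' -o -name '*.h' -o -name '*.ipp' \) \
  -exec clang-format-18 -i {} +

# ...then fail if anything changed, mirroring the 'Check for differences' step.
git diff --exit-code
```
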
.github/workflows/doxygen.yml (vendored, removed, 37 lines)

@@ -1,37 +0,0 @@
name: Build and publish Doxygen documentation
# To test this workflow, push your changes to your fork's `develop` branch.
on:
  push:
    branches:
      - develop
      - doxygen
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  job:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    steps:
      - name: checkout
        uses: actions/checkout@v4
      - name: check environment
        run: |
          echo ${PATH} | tr ':' '\n'
          cmake --version
          doxygen --version
          env | sort
      - name: build
        run: |
          mkdir build
          cd build
          cmake -Donly_docs=TRUE ..
          cmake --build . --target docs --parallel $(nproc)
      - name: publish
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: build/docs/html


.github/workflows/levelization.yml (vendored, deleted, 49 lines)
@@ -1,49 +0,0 @@
name: levelization

on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-latest
    env:
      CLANG_VERSION: 10
    steps:
      - uses: actions/checkout@v4
      - name: Check levelization
        run: Builds/levelization/levelization.sh
      - name: Check for differences
        id: assert
        run: |
          set -o pipefail
          git diff --exit-code | tee "levelization.patch"
      - name: Upload patch
        if: failure() && steps.assert.outcome == 'failure'
        uses: actions/upload-artifact@v3
        continue-on-error: true
        with:
          name: levelization.patch
          if-no-files-found: ignore
          path: levelization.patch
      - name: What happened?
        if: failure() && steps.assert.outcome == 'failure'
        env:
          MESSAGE: |
            If you are reading this, you are looking at a failed Github
            Actions job. That means you changed the dependency relationships
            between the modules in rippled. That may be an improvement or a
            regression. This check doesn't judge.

            A rule of thumb, though, is that if your changes caused
            something to be removed from loops.txt, that's probably an
            improvement. If something was added, it's probably a regression.

            To fix it, you can do one of two things:
            1. Download and apply the patch generated as an artifact of this
               job to your repo, commit, and push.
            2. Run './Builds/levelization/levelization.sh' in your repo,
               commit, and push.

            See Builds/levelization/README.md for more info.
        run: |
          echo "${MESSAGE}"
          exit 1
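The same recovery pattern applies here; a minimal local sketch using only the commands the workflow itself runs:

    # Regenerate the levelization results, then check whether anything changed;
    # a non-zero exit from 'git diff' reproduces the CI failure.
    ./Builds/levelization/levelization.sh
    git diff --exit-code | tee levelization.patch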

.github/workflows/libxrpl.yml (vendored, deleted, 88 lines)
@@ -1,88 +0,0 @@
name: Check libXRPL compatibility with Clio
env:
  CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
  CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
  CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
on:
  pull_request:
    paths:
      - 'src/libxrpl/protocol/BuildInfo.cpp'
      - '.github/workflows/libxrpl.yml'
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  publish:
    name: Publish libXRPL
    outputs:
      outcome: ${{ steps.upload.outputs.outcome }}
      version: ${{ steps.version.outputs.version }}
      channel: ${{ steps.channel.outputs.channel }}
    runs-on: [self-hosted, heavy]
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    steps:
      - name: Wait for essential checks to succeed
        uses: lewagon/wait-on-check-action@v1.3.4
        with:
          ref: ${{ github.event.pull_request.head.sha || github.sha }}
          running-workflow-name: wait-for-check-regexp
          check-regexp: '(dependencies|test).*linux.*' # Ignore windows and mac tests but make sure linux passes
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          wait-interval: 10
      - name: Checkout
        uses: actions/checkout@v4
      - name: Generate channel
        id: channel
        shell: bash
        run: |
          echo channel="clio/pr_${{ github.event.pull_request.number }}" | tee ${GITHUB_OUTPUT}
      - name: Export new package
        shell: bash
        run: |
          conan export . ${{ steps.channel.outputs.channel }}
      - name: Add Ripple Conan remote
        shell: bash
        run: |
          conan remote list
          conan remote remove ripple || true
          # Do not quote the URL. An empty string will be accepted (with a non-fatal warning), but a missing argument will not.
          conan remote add ripple ${{ env.CONAN_URL }} --insert 0
      - name: Parse new version
        id: version
        shell: bash
        run: |
          echo version="$(cat src/libxrpl/protocol/BuildInfo.cpp | grep "versionString =" \
            | awk -F '"' '{print $2}')" | tee ${GITHUB_OUTPUT}
      - name: Try to authenticate to Ripple Conan remote
        id: remote
        shell: bash
        run: |
          # `conan user` implicitly uses the environment variables CONAN_LOGIN_USERNAME_<REMOTE> and CONAN_PASSWORD_<REMOTE>.
          # https://docs.conan.io/1/reference/commands/misc/user.html#using-environment-variables
          # https://docs.conan.io/1/reference/env_vars.html#conan-login-username-conan-login-username-remote-name
          # https://docs.conan.io/1/reference/env_vars.html#conan-password-conan-password-remote-name
          echo outcome=$(conan user --remote ripple --password >&2 \
            && echo success || echo failure) | tee ${GITHUB_OUTPUT}
      - name: Upload new package
        id: upload
        if: (steps.remote.outputs.outcome == 'success')
        shell: bash
        run: |
          echo "conan upload version ${{ steps.version.outputs.version }} on channel ${{ steps.channel.outputs.channel }}"
          echo outcome=$(conan upload xrpl/${{ steps.version.outputs.version }}@${{ steps.channel.outputs.channel }} --remote ripple --confirm >&2 \
            && echo success || echo failure) | tee ${GITHUB_OUTPUT}
  notify_clio:
    name: Notify Clio
    runs-on: ubuntu-latest
    needs: publish
    env:
      GH_TOKEN: ${{ secrets.CLIO_NOTIFY_TOKEN }}
    steps:
      - name: Notify Clio about new version
        if: (needs.publish.outputs.outcome == 'success')
        shell: bash
        run: |
          gh api --method POST -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" \
            /repos/xrplf/clio/dispatches -f "event_type=check_libxrpl" \
            -F "client_payload[version]=${{ needs.publish.outputs.version }}@${{ needs.publish.outputs.channel }}"
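For clarity, the package reference this workflow publishes is assembled from the version string in BuildInfo.cpp plus a per-PR channel; a bash sketch of the same steps, using a hypothetical PR number 9999:

    # Same parsing as the 'Parse new version' step above.
    version=$(grep 'versionString =' src/libxrpl/protocol/BuildInfo.cpp | awk -F '"' '{print $2}')
    channel="clio/pr_9999"   # hypothetical PR number
    conan export . "${channel}"                                   # yields xrpl/<version>@clio/pr_9999
    conan upload "xrpl/${version}@${channel}" --remote ripple --confirm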

.github/workflows/macos.yml (vendored, deleted, 71 lines)
@@ -1,71 +0,0 @@
name: macos
on:
  pull_request:
  push:
    # If the branches list is ever changed, be sure to change it on all
    # build/test jobs (nix, macos, windows)
    branches:
      # Always build the package branches
      - develop
      - release
      - master
      # Branches that opt-in to running
      - 'ci/**'
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:

  test:
    strategy:
      matrix:
        platform:
          - macos
        generator:
          - Ninja
        configuration:
          - Release
    runs-on: [self-hosted, macOS]
    env:
      # The `build` action requires these variables.
      build_dir: .build
      NUM_PROCESSORS: 12
    steps:
      - name: checkout
        uses: actions/checkout@v4
      - name: install Conan
        run: |
          brew install conan@1
          echo '/opt/homebrew/opt/conan@1/bin' >> $GITHUB_PATH
      - name: install Ninja
        if: matrix.generator == 'Ninja'
        run: brew install ninja
      - name: check environment
        run: |
          env | sort
          echo ${PATH} | tr ':' '\n'
          python --version
          conan --version
          cmake --version
      - name: configure Conan
        run: |
          conan profile new default --detect || true
          conan profile update settings.compiler.cppstd=20 default
          conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
      - name: build dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
          CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
          CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
        with:
          configuration: ${{ matrix.configuration }}
      - name: build
        uses: ./.github/actions/build
        with:
          generator: ${{ matrix.generator }}
          configuration: ${{ matrix.configuration }}
      - name: test
        run: |
          ${build_dir}/rippled --unittest

.github/workflows/nix.yml (vendored, deleted, 296 lines)
@@ -1,296 +0,0 @@
name: nix
on:
  pull_request:
  push:
    # If the branches list is ever changed, be sure to change it on all
    # build/test jobs (nix, macos, windows)
    branches:
      # Always build the package branches
      - develop
      - release
      - master
      # Branches that opt-in to running
      - 'ci/**'
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

# This workflow has two job matrixes.
# They can be considered phases because the second matrix ("test")
# depends on the first ("dependencies").
#
# The first phase has a job in the matrix for each combination of
# variables that affects dependency ABI:
# platform, compiler, and configuration.
# It creates a GitHub artifact holding the Conan profile,
# and builds and caches binaries for all the dependencies.
# If an Artifactory remote is configured, they are cached there.
# If not, they are added to the GitHub artifact.
# GitHub's "cache" action has a size limit (10 GB) that is too small
# to hold the binaries if they are built locally.
# We must use the "{upload,download}-artifact" actions instead.
#
# The second phase has a job in the matrix for each test configuration.
# It installs dependency binaries from the cache, whichever was used,
# and builds and tests rippled.

jobs:

  dependencies:
    strategy:
      fail-fast: false
      matrix:
        platform:
          - linux
        compiler:
          - gcc
          - clang
        configuration:
          - Debug
          - Release
        include:
          - compiler: gcc
            profile:
              version: 11
              cc: /usr/bin/gcc
              cxx: /usr/bin/g++
          - compiler: clang
            profile:
              version: 14
              cc: /usr/bin/clang-14
              cxx: /usr/bin/clang++-14
    runs-on: [self-hosted, heavy]
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    env:
      build_dir: .build
    steps:
      - name: upgrade conan
        run: |
          pip install --upgrade "conan<2"
      - name: checkout
        uses: actions/checkout@v4
      - name: check environment
        run: |
          echo ${PATH} | tr ':' '\n'
          conan --version
          cmake --version
          env | sort
      - name: configure Conan
        run: |
          conan profile new default --detect
          conan profile update settings.compiler.cppstd=20 default
          conan profile update settings.compiler=${{ matrix.compiler }} default
          conan profile update settings.compiler.version=${{ matrix.profile.version }} default
          conan profile update settings.compiler.libcxx=libstdc++11 default
          conan profile update env.CC=${{ matrix.profile.cc }} default
          conan profile update env.CXX=${{ matrix.profile.cxx }} default
          conan profile update conf.tools.build:compiler_executables='{"c": "${{ matrix.profile.cc }}", "cpp": "${{ matrix.profile.cxx }}"}' default
      - name: archive profile
        # Create this archive before dependencies are added to the local cache.
        run: tar -czf conan.tar -C ~/.conan .
      - name: build dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
          CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
          CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
        with:
          configuration: ${{ matrix.configuration }}
      - name: upload archive
        uses: actions/upload-artifact@v3
        with:
          name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
          path: conan.tar
          if-no-files-found: error

  test:
    strategy:
      fail-fast: false
      matrix:
        platform:
          - linux
        compiler:
          - gcc
          - clang
        configuration:
          - Debug
          - Release
        cmake-args:
          -
          - "-Dunity=ON"
    needs: dependencies
    runs-on: [self-hosted, heavy]
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    env:
      build_dir: .build
    steps:
      - name: upgrade conan
        run: |
          pip install --upgrade "conan<2"
      - name: download cache
        uses: actions/download-artifact@v3
        with:
          name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
      - name: extract cache
        run: |
          mkdir -p ~/.conan
          tar -xzf conan.tar -C ~/.conan
      - name: check environment
        run: |
          env | sort
          echo ${PATH} | tr ':' '\n'
          conan --version
          cmake --version
      - name: checkout
        uses: actions/checkout@v4
      - name: dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
        with:
          configuration: ${{ matrix.configuration }}
      - name: build
        uses: ./.github/actions/build
        with:
          generator: Ninja
          configuration: ${{ matrix.configuration }}
          cmake-args: ${{ matrix.cmake-args }}
      - name: test
        run: |
          ${build_dir}/rippled --unittest --unittest-jobs $(nproc)

  coverage:
    strategy:
      fail-fast: false
      matrix:
        platform:
          - linux
        compiler:
          - gcc
        configuration:
          - Debug
    needs: dependencies
    runs-on: [self-hosted, heavy]
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    env:
      build_dir: .build
    steps:
      - name: upgrade conan
        run: |
          pip install --upgrade "conan<2"
      - name: download cache
        uses: actions/download-artifact@v3
        with:
          name: ${{ matrix.platform }}-${{ matrix.compiler }}-${{ matrix.configuration }}
      - name: extract cache
        run: |
          mkdir -p ~/.conan
          tar -xzf conan.tar -C ~/.conan
      - name: install gcovr
        run: pip install "gcovr>=7,<8"
      - name: check environment
        run: |
          echo ${PATH} | tr ':' '\n'
          conan --version
          cmake --version
          gcovr --version
          env | sort
          ls ~/.conan
      - name: checkout
        uses: actions/checkout@v4
      - name: dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
        with:
          configuration: ${{ matrix.configuration }}
      - name: build
        uses: ./.github/actions/build
        with:
          generator: Ninja
          configuration: ${{ matrix.configuration }}
          cmake-args: >-
            -Dcoverage=ON
            -Dcoverage_format=xml
            -DCODE_COVERAGE_VERBOSE=ON
            -DCMAKE_CXX_FLAGS="-O0"
            -DCMAKE_C_FLAGS="-O0"
          cmake-target: coverage
      - name: move coverage report
        shell: bash
        run: |
          mv "${build_dir}/coverage.xml" ./
      - name: archive coverage report
        uses: actions/upload-artifact@v3
        with:
          name: coverage.xml
          path: coverage.xml
          retention-days: 30
      - name: upload coverage report
        uses: wandalen/wretry.action@v1.4.10
        with:
          action: codecov/codecov-action@v4.5.0
          with: |
            files: coverage.xml
            fail_ci_if_error: true
            disable_search: true
            verbose: true
            plugin: noop
            token: ${{ secrets.CODECOV_TOKEN }}
          attempt_limit: 5
          attempt_delay: 210000 # in milliseconds

  conan:
    needs: dependencies
    runs-on: [self-hosted, heavy]
    container: rippleci/rippled-build-ubuntu:aaf5e3e
    env:
      build_dir: .build
      configuration: Release
    steps:
      - name: upgrade conan
        run: |
          pip install --upgrade "conan<2"
      - name: download cache
        uses: actions/download-artifact@v3
        with:
          name: linux-gcc-${{ env.configuration }}
      - name: extract cache
        run: |
          mkdir -p ~/.conan
          tar -xzf conan.tar -C ~/.conan
      - name: check environment
        run: |
          env | sort
          echo ${PATH} | tr ':' '\n'
          conan --version
          cmake --version
      - name: checkout
        uses: actions/checkout@v4
      - name: dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
        with:
          configuration: ${{ env.configuration }}
      - name: export
        run: |
          version=$(conan inspect --raw version .)
          reference="xrpl/${version}@local/test"
          conan remove -f ${reference} || true
          conan export . local/test
          echo "reference=${reference}" >> "${GITHUB_ENV}"
      - name: build
        run: |
          cd examples/example
          mkdir ${build_dir}
          cd ${build_dir}
          conan install .. --output-folder . \
            --require-override ${reference} --build missing
          cmake .. \
            -DCMAKE_TOOLCHAIN_FILE:FILEPATH=./build/${configuration}/generators/conan_toolchain.cmake \
            -DCMAKE_BUILD_TYPE=${configuration}
          cmake --build .
          ./example | grep '^[[:digit:]]\+\.[[:digit:]]\+\.[[:digit:]]\+'
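The artifact-based cache that ties the two phases together is simply the Conan home directory, archived in the producer job and restored in each consumer; a minimal sketch of the round-trip used above:

    # Producer ('dependencies'): snapshot ~/.conan and upload conan.tar as an artifact.
    tar -czf conan.tar -C ~/.conan .
    # Consumers ('test', 'coverage', 'conan'): download the artifact, then restore it.
    mkdir -p ~/.conan
    tar -xzf conan.tar -C ~/.conan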

.github/workflows/on-pr.yml (vendored, new file, 133 lines)
@@ -0,0 +1,133 @@
# This workflow runs all workflows to check, build and test the project on
# various Linux flavors, as well as on MacOS and Windows, on every push to a
# user branch. However, it will not run if the pull request is a draft unless it
# has the 'DraftRunCI' label.
name: PR

on:
  merge_group:
    types:
      - checks_requested
  pull_request:
    types:
      - opened
      - reopened
      - synchronize
      - ready_for_review

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

defaults:
  run:
    shell: bash

jobs:
  # This job determines whether the rest of the workflow should run. It runs
  # when the PR is not a draft (which should also cover merge-group) or
  # has the 'DraftRunCI' label.
  should-run:
    if: ${{ !github.event.pull_request.draft || contains(github.event.pull_request.labels.*.name, 'DraftRunCI') }}
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
      - name: Determine changed files
        # This step checks whether any files have changed that should
        # cause the next jobs to run. We do it this way rather than
        # using `paths` in the `on:` section, because all required
        # checks must pass, even for changes that do not modify anything
        # that affects those checks. We would therefore like to make the
        # checks required only if the job runs, but GitHub does not
        # support that directly. By always executing the workflow on new
        # commits and by using the changed-files action below, we ensure
        # that Github considers any skipped jobs to have passed, and in
        # turn the required checks as well.
        id: changes
        uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
        with:
          files: |
            # These paths are unique to `on-pr.yml`.
            .github/scripts/levelization/**
            .github/workflows/reusable-check-levelization.yml
            .github/workflows/reusable-notify-clio.yml
            .github/workflows/on-pr.yml

            # Keep the paths below in sync with those in `on-trigger.yml`.
            .github/actions/build-deps/**
            .github/actions/build-test/**
            .github/actions/setup-conan/**
            .github/scripts/strategy-matrix/**
            .github/workflows/reusable-build.yml
            .github/workflows/reusable-build-test-config.yml
            .github/workflows/reusable-build-test.yml
            .github/workflows/reusable-strategy-matrix.yml
            .github/workflows/reusable-test.yml
            .codecov.yml
            cmake/**
            conan/**
            external/**
            include/**
            src/**
            tests/**
            CMakeLists.txt
            conanfile.py
            conan.lock
      - name: Check whether to run
        # This step determines whether the rest of the workflow should
        # run. The rest of the workflow will run if this job runs AND at
        # least one of:
        # * Any of the files checked in the `changes` step were modified
        # * The PR is NOT a draft and is labeled "Ready to merge"
        # * The workflow is running from the merge queue
        id: go
        env:
          FILES: ${{ steps.changes.outputs.any_changed }}
          DRAFT: ${{ github.event.pull_request.draft }}
          READY: ${{ contains(github.event.pull_request.labels.*.name, 'Ready to merge') }}
          MERGE: ${{ github.event_name == 'merge_group' }}
        run: |
          echo "go=${{ (env.DRAFT != 'true' && env.READY == 'true') || env.FILES == 'true' || env.MERGE == 'true' }}" >> "${GITHUB_OUTPUT}"
          cat "${GITHUB_OUTPUT}"
    outputs:
      go: ${{ steps.go.outputs.go == 'true' }}

  check-levelization:
    needs: should-run
    if: ${{ needs.should-run.outputs.go == 'true' }}
    uses: ./.github/workflows/reusable-check-levelization.yml

  build-test:
    needs: should-run
    if: ${{ needs.should-run.outputs.go == 'true' }}
    uses: ./.github/workflows/reusable-build-test.yml
    strategy:
      fail-fast: false
      matrix:
        os: [linux, macos, windows]
    with:
      os: ${{ matrix.os }}
    secrets:
      CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

  notify-clio:
    needs:
      - should-run
      - build-test
    if: ${{ needs.should-run.outputs.go == 'true' && contains(fromJSON('["release", "master"]'), github.ref_name) }}
    uses: ./.github/workflows/reusable-notify-clio.yml
    secrets:
      clio_notify_token: ${{ secrets.CLIO_NOTIFY_TOKEN }}
      conan_remote_username: ${{ secrets.CONAN_REMOTE_USERNAME }}
      conan_remote_password: ${{ secrets.CONAN_REMOTE_PASSWORD }}

  passed:
    if: failure() || cancelled()
    needs:
      - build-test
      - check-levelization
    runs-on: ubuntu-latest
    steps:
      - name: Fail
        run: false
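The `go` gate above is written as a GitHub Actions expression; restated as a bash sketch over the same step-level environment variables (each holding the string 'true' or 'false'):

    # Run the rest of the workflow if the PR is ready and labeled, or if
    # relevant files changed, or if we are running from the merge queue.
    if { [ "${DRAFT}" != "true" ] && [ "${READY}" = "true" ]; } ||
       [ "${FILES}" = "true" ] || [ "${MERGE}" = "true" ]; then
        echo "go=true" >> "${GITHUB_OUTPUT}"
    else
        echo "go=false" >> "${GITHUB_OUTPUT}"
    fi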

.github/workflows/on-trigger.yml (vendored, new file, 75 lines)
@@ -0,0 +1,75 @@
# This workflow runs all workflows to build the dependencies required for the
# project on various Linux flavors, as well as on MacOS and Windows, on a
# scheduled basis, on merge into the 'develop', 'release', or 'master' branches,
# or manually. The missing commits check is only run when the code is merged
# into the 'develop' or 'release' branches, and the documentation is built when
# the code is merged into the 'develop' branch.
name: Trigger

on:
  push:
    branches:
      - "develop"
      - "release*"
      - "master"
    paths:
      # These paths are unique to `on-trigger.yml`.
      - ".github/workflows/reusable-check-missing-commits.yml"
      - ".github/workflows/on-trigger.yml"
      - ".github/workflows/publish-docs.yml"

      # Keep the paths below in sync with those in `on-pr.yml`.
      - ".github/actions/build-deps/**"
      - ".github/actions/build-test/**"
      - ".github/actions/setup-conan/**"
      - ".github/scripts/strategy-matrix/**"
      - ".github/workflows/reusable-build.yml"
      - ".github/workflows/reusable-build-test-config.yml"
      - ".github/workflows/reusable-build-test.yml"
      - ".github/workflows/reusable-strategy-matrix.yml"
      - ".github/workflows/reusable-test.yml"
      - ".codecov.yml"
      - "cmake/**"
      - "conan/**"
      - "external/**"
      - "include/**"
      - "src/**"
      - "tests/**"
      - "CMakeLists.txt"
      - "conanfile.py"
      - "conan.lock"

  # Run at 06:32 UTC on every day of the week from Monday through Friday. This
  # will force all dependencies to be rebuilt, which is useful to verify that
  # all dependencies can be built successfully. Only the dependencies that
  # are actually missing from the remote will be uploaded.
  schedule:
    - cron: "32 6 * * 1-5"

  # Run when manually triggered via the GitHub UI or API.
  workflow_dispatch:

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

defaults:
  run:
    shell: bash

jobs:
  check-missing-commits:
    if: ${{ github.event_name == 'push' && github.ref_type == 'branch' && contains(fromJSON('["develop", "release"]'), github.ref_name) }}
    uses: ./.github/workflows/reusable-check-missing-commits.yml

  build-test:
    uses: ./.github/workflows/reusable-build-test.yml
    strategy:
      fail-fast: ${{ github.event_name == 'merge_group' }}
      matrix:
        os: [linux, macos, windows]
    with:
      os: ${{ matrix.os }}
      strategy_matrix: ${{ github.event_name == 'schedule' && 'all' || 'minimal' }}
    secrets:
      CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

.github/workflows/pre-commit.yml (vendored, new file, 15 lines)
@@ -0,0 +1,15 @@
name: Run pre-commit hooks

on:
  pull_request:
  push:
    branches: [develop, release, master]
  workflow_dispatch:

jobs:
  # Call the workflow in the XRPLF/actions repo that runs the pre-commit hooks.
  run-hooks:
    uses: XRPLF/actions/.github/workflows/pre-commit.yml@34790936fae4c6c751f62ec8c06696f9c1a5753a
    with:
      runs_on: ubuntu-latest
      container: '{ "image": "ghcr.io/xrplf/ci/tools-rippled-pre-commit:sha-a8c7be1" }'

.github/workflows/publish-docs.yml (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
# This workflow builds the documentation for the repository, and publishes it to
# GitHub Pages when changes are merged into the default branch.
name: Build and publish documentation

on:
  push:
    paths:
      - ".github/workflows/publish-docs.yml"
      - "*.md"
      - "**/*.md"
      - "docs/**"
      - "include/**"
      - "src/libxrpl/**"
      - "src/xrpld/**"

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

defaults:
  run:
    shell: bash

env:
  BUILD_DIR: .build

jobs:
  publish:
    runs-on: ubuntu-latest
    container: ghcr.io/xrplf/ci/tools-rippled-documentation:sha-a8c7be1
    permissions:
      contents: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
      - name: Check configuration
        run: |
          echo 'Checking path.'
          echo ${PATH} | tr ':' '\n'

          echo 'Checking environment variables.'
          env | sort

          echo 'Checking CMake version.'
          cmake --version

          echo 'Checking Doxygen version.'
          doxygen --version
      - name: Build documentation
        run: |
          mkdir -p "${BUILD_DIR}"
          cd "${BUILD_DIR}"
          cmake -Donly_docs=ON ..
          cmake --build . --target docs --parallel $(nproc)
      - name: Publish documentation
        if: ${{ github.ref_type == 'branch' && github.ref_name == github.event.repository.default_branch }}
        uses: peaceiris/actions-gh-pages@4f9cc6602d3f66b9c108549d475ec49e8ef4d45e # v4.0.0
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ${{ env.BUILD_DIR }}/docs/html

.github/workflows/reusable-build-test-config.yml (vendored, new file, 69 lines)
@@ -0,0 +1,69 @@
name: Build and test configuration

on:
  workflow_call:
    inputs:
      build_dir:
        description: "The directory where to build."
        required: true
        type: string
      build_only:
        description: 'Whether to only build or to build and test the code ("true", "false").'
        required: true
        type: boolean
      build_type:
        description: 'The build type to use ("Debug", "Release").'
        type: string
        required: true
      cmake_args:
        description: "Additional arguments to pass to CMake."
        required: false
        type: string
        default: ""
      cmake_target:
        description: "The CMake target to build."
        type: string
        required: true

      runs_on:
        description: Runner to run the job on as a JSON string
        required: true
        type: string
      image:
        description: "The image to run in (leave empty to run natively)"
        required: true
        type: string

      config_name:
        description: "The configuration string (used for naming artifacts and such)."
        required: true
        type: string

    secrets:
      CODECOV_TOKEN:
        description: "The Codecov token to use for uploading coverage reports."
        required: true

jobs:
  build:
    uses: ./.github/workflows/reusable-build.yml
    with:
      build_dir: ${{ inputs.build_dir }}
      build_type: ${{ inputs.build_type }}
      cmake_args: ${{ inputs.cmake_args }}
      cmake_target: ${{ inputs.cmake_target }}
      runs_on: ${{ inputs.runs_on }}
      image: ${{ inputs.image }}
      config_name: ${{ inputs.config_name }}
    secrets:
      CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

  test:
    needs: build
    uses: ./.github/workflows/reusable-test.yml
    with:
      run_tests: ${{ !inputs.build_only }}
      verify_voidstar: ${{ contains(inputs.cmake_args, '-Dvoidstar=ON') }}
      runs_on: ${{ inputs.runs_on }}
      image: ${{ inputs.image }}
      config_name: ${{ inputs.config_name }}

.github/workflows/reusable-build-test.yml (vendored, new file, 58 lines)
@@ -0,0 +1,58 @@
# This workflow builds and tests the binary for various configurations.
name: Build and test

# This workflow can only be triggered by other workflows. Note that the
# workflow_call event does not support the 'choice' input type, see
# https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-syntax#onworkflow_callinputsinput_idtype,
# so we use 'string' instead.
on:
  workflow_call:
    inputs:
      build_dir:
        description: "The directory where to build."
        required: false
        type: string
        default: ".build"
      os:
        description: 'The operating system to use for the build ("linux", "macos", "windows").'
        required: true
        type: string
      strategy_matrix:
        # TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
        description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
        required: false
        type: string
        default: "minimal"
    secrets:
      CODECOV_TOKEN:
        description: "The Codecov token to use for uploading coverage reports."
        required: true

jobs:
  # Generate the strategy matrix to be used by the following job.
  generate-matrix:
    uses: ./.github/workflows/reusable-strategy-matrix.yml
    with:
      os: ${{ inputs.os }}
      strategy_matrix: ${{ inputs.strategy_matrix }}

  # Build and test the binary for each configuration.
  build-test-config:
    needs:
      - generate-matrix
    uses: ./.github/workflows/reusable-build-test-config.yml
    strategy:
      fail-fast: ${{ github.event_name == 'merge_group' }}
      matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
      max-parallel: 10
    with:
      build_dir: ${{ inputs.build_dir }}
      build_only: ${{ matrix.build_only }}
      build_type: ${{ matrix.build_type }}
      cmake_args: ${{ matrix.cmake_args }}
      cmake_target: ${{ matrix.cmake_target }}
      runs_on: ${{ toJSON(matrix.architecture.runner) }}
      image: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || '' }}
      config_name: ${{ matrix.config_name }}
    secrets:
      CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}

.github/workflows/reusable-build.yml (vendored, new file, 138 lines)
@@ -0,0 +1,138 @@
name: Build rippled

on:
  workflow_call:
    inputs:
      build_dir:
        description: "The directory where to build."
        required: true
        type: string
      build_type:
        description: 'The build type to use ("Debug", "Release").'
        required: true
        type: string
      cmake_args:
        description: "Additional arguments to pass to CMake."
        required: true
        type: string
      cmake_target:
        description: "The CMake target to build."
        required: true
        type: string

      runs_on:
        description: Runner to run the job on as a JSON string
        required: true
        type: string
      image:
        description: "The image to run in (leave empty to run natively)"
        required: true
        type: string

      config_name:
        description: "The name of the configuration."
        required: true
        type: string

    secrets:
      CODECOV_TOKEN:
        description: "The Codecov token to use for uploading coverage reports."
        required: true

defaults:
  run:
    shell: bash

jobs:
  build:
    name: Build ${{ inputs.config_name }}
    runs-on: ${{ fromJSON(inputs.runs_on) }}
    container: ${{ inputs.image != '' && inputs.image || null }}
    timeout-minutes: 60
    steps:
      - name: Cleanup workspace
        if: ${{ runner.os == 'macOS' }}
        uses: XRPLF/actions/.github/actions/cleanup-workspace@3f044c7478548e3c32ff68980eeb36ece02b364e

      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0

      - name: Prepare runner
        uses: XRPLF/actions/.github/actions/prepare-runner@638e0dc11ea230f91bd26622fb542116bb5254d5
        with:
          disable_ccache: false

      - name: Print build environment
        uses: ./.github/actions/print-env

      - name: Setup Conan
        uses: ./.github/actions/setup-conan

      - name: Build dependencies
        uses: ./.github/actions/build-deps
        with:
          build_dir: ${{ inputs.build_dir }}
          build_type: ${{ inputs.build_type }}

      - name: Configure CMake
        shell: bash
        working-directory: ${{ inputs.build_dir }}
        env:
          BUILD_TYPE: ${{ inputs.build_type }}
          CMAKE_ARGS: ${{ inputs.cmake_args }}
        run: |
          cmake \
            -G '${{ runner.os == 'Windows' && 'Visual Studio 17 2022' || 'Ninja' }}' \
            -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
            -DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
            ${CMAKE_ARGS} \
            ..

      - name: Build the binary
        shell: bash
        working-directory: ${{ inputs.build_dir }}
        env:
          BUILD_TYPE: ${{ inputs.build_type }}
          CMAKE_TARGET: ${{ inputs.cmake_target }}
        run: |
          cmake \
            --build . \
            --config "${BUILD_TYPE}" \
            --parallel $(nproc) \
            --target "${CMAKE_TARGET}"

      - name: Put built binaries in one location
        shell: bash
        working-directory: ${{ inputs.build_dir }}
        env:
          BUILD_TYPE_DIR: ${{ runner.os == 'Windows' && inputs.build_type || '' }}
          CMAKE_TARGET: ${{ inputs.cmake_target }}
        run: |
          mkdir -p ./binaries/doctest/

          cp ./${BUILD_TYPE_DIR}/rippled* ./binaries/
          if [ "${CMAKE_TARGET}" != 'coverage' ]; then
            cp ./src/tests/libxrpl/${BUILD_TYPE_DIR}/xrpl.test.* ./binaries/doctest/
          fi

      - name: Upload rippled artifact
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        env:
          BUILD_DIR: ${{ inputs.build_dir }}
        with:
          name: rippled-${{ inputs.config_name }}
          path: ${{ env.BUILD_DIR }}/binaries/
          retention-days: 3
          if-no-files-found: error

      - name: Upload coverage report
        if: ${{ inputs.cmake_target == 'coverage' }}
        uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
        with:
          disable_search: true
          disable_telem: true
          fail_ci_if_error: true
          files: ${{ inputs.build_dir }}/coverage.xml
          plugins: noop
          token: ${{ secrets.CODECOV_TOKEN }}
          verbose: true

.github/workflows/reusable-check-levelization.yml (vendored, new file, 46 lines)
@@ -0,0 +1,46 @@
# This workflow checks if the dependencies between the modules are correctly
# indexed.
name: Check levelization

# This workflow can only be triggered by other workflows.
on: workflow_call

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}-levelization
  cancel-in-progress: true

defaults:
  run:
    shell: bash

jobs:
  levelization:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
      - name: Check levelization
        run: .github/scripts/levelization/generate.sh
      - name: Check for differences
        env:
          MESSAGE: |

            The dependency relationships between the modules in rippled have
            changed, which may be an improvement or a regression.

            A rule of thumb is that if your changes caused something to be
            removed from loops.txt, it's probably an improvement, while if
            something was added, it's probably a regression.

            Run '.github/scripts/levelization/generate.sh' in your repo, commit
            and push the changes. See .github/scripts/levelization/README.md for
            more info.
        run: |
          DIFF=$(git status --porcelain)
          if [ -n "${DIFF}" ]; then
            # Print the differences to give the contributor a hint about what to
            # expect when running levelization on their own machine.
            git diff
            echo "${MESSAGE}"
            exit 1
          fi

.github/workflows/reusable-check-missing-commits.yml (vendored, new file, 62 lines)
@@ -0,0 +1,62 @@
# This workflow checks that all commits in the "master" branch are also in the
# "release" and "develop" branches, and that all commits in the "release" branch
# are also in the "develop" branch.
name: Check for missing commits

# This workflow can only be triggered by other workflows.
on: workflow_call

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}-missing-commits
  cancel-in-progress: true

defaults:
  run:
    shell: bash

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
        with:
          fetch-depth: 0
      - name: Check for missing commits
        env:
          MESSAGE: |

            If you are reading this, then the commits indicated above are missing
            from the "develop" and/or "release" branch. Do a reverse-merge as soon
            as possible. See CONTRIBUTING.md for instructions.
        run: |
          set -o pipefail
          # Branches are ordered by how "canonical" they are. Every commit in one
          # branch should be in all the branches behind it.
          order=(master release develop)
          branches=()
          for branch in "${order[@]}"; do
            # Check that the branches exist so that this job will work on forked
            # repos, which don't necessarily have master and release branches.
            echo "Checking if ${branch} exists."
            if git ls-remote --exit-code --heads origin \
              refs/heads/${branch} > /dev/null; then
              branches+=(origin/${branch})
            fi
          done

          prior=()
          for branch in "${branches[@]}"; do
            if [[ ${#prior[@]} -ne 0 ]]; then
              echo "Checking ${prior[@]} for commits missing from ${branch}."
              git log --oneline --no-merges "${prior[@]}" \
                ^$branch | tee -a "missing-commits.txt"
              echo
            fi
            prior+=("${branch}")
          done

          if [[ $(cat missing-commits.txt | wc -l) -ne 0 ]]; then
            echo "${MESSAGE}"
            exit 1
          fi
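The core git technique in the step above is negative revision selection: listing commits reachable from one branch but not another. A standalone example:

    # Commits on master that have not yet been reverse-merged into develop;
    # empty output means develop already contains every commit on master.
    git log --oneline --no-merges origin/master ^origin/develop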

.github/workflows/reusable-notify-clio.yml (vendored, new file, 91 lines)
@@ -0,0 +1,91 @@
# This workflow exports the built libxrpl package to the Conan remote on a
# channel named after the pull request, and notifies the Clio repository about
# the new version so it can check for compatibility.
name: Notify Clio

# This workflow can only be triggered by other workflows.
on:
  workflow_call:
    inputs:
      conan_remote_name:
        description: "The name of the Conan remote to use."
        required: false
        type: string
        default: xrplf
      conan_remote_url:
        description: "The URL of the Conan endpoint to use."
        required: false
        type: string
        default: https://conan.ripplex.io
    secrets:
      clio_notify_token:
        description: "The GitHub token to notify Clio about new versions."
        required: true
      conan_remote_username:
        description: "The username for logging into the Conan remote."
        required: true
      conan_remote_password:
        description: "The password for logging into the Conan remote."
        required: true

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}-clio
  cancel-in-progress: true

defaults:
  run:
    shell: bash

jobs:
  upload:
    if: ${{ github.event.pull_request.head.repo.full_name == github.repository }}
    runs-on: ubuntu-latest
    container: ghcr.io/xrplf/ci/ubuntu-noble:gcc-13-sha-5dd7158
    steps:
      - name: Checkout repository
        uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
      - name: Generate outputs
        id: generate
        env:
          PR_NUMBER: ${{ github.event.pull_request.number }}
        run: |
          echo 'Generating user and channel.'
          echo "user=clio" >> "${GITHUB_OUTPUT}"
          echo "channel=pr_${PR_NUMBER}" >> "${GITHUB_OUTPUT}"
          echo 'Extracting version.'
          echo "version=$(cat src/libxrpl/protocol/BuildInfo.cpp | grep "versionString =" | awk -F '"' '{print $2}')" >> "${GITHUB_OUTPUT}"
      - name: Calculate conan reference
        id: conan_ref
        run: |
          echo "conan_ref=${{ steps.generate.outputs.version }}@${{ steps.generate.outputs.user }}/${{ steps.generate.outputs.channel }}" >> "${GITHUB_OUTPUT}"
      - name: Set up Conan
        uses: ./.github/actions/setup-conan
        with:
          conan_remote_name: ${{ inputs.conan_remote_name }}
          conan_remote_url: ${{ inputs.conan_remote_url }}
      - name: Log into Conan remote
        env:
          CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
        run: conan remote login "${CONAN_REMOTE_NAME}" "${{ secrets.conan_remote_username }}" --password "${{ secrets.conan_remote_password }}"
      - name: Upload package
        env:
          CONAN_REMOTE_NAME: ${{ inputs.conan_remote_name }}
        run: |
          conan export --user=${{ steps.generate.outputs.user }} --channel=${{ steps.generate.outputs.channel }} .
          conan upload --confirm --check --remote="${CONAN_REMOTE_NAME}" xrpl/${{ steps.conan_ref.outputs.conan_ref }}
    outputs:
      conan_ref: ${{ steps.conan_ref.outputs.conan_ref }}

  notify:
    needs: upload
    runs-on: ubuntu-latest
    steps:
      - name: Notify Clio
        env:
          GH_TOKEN: ${{ secrets.clio_notify_token }}
          PR_URL: ${{ github.event.pull_request.html_url }}
        run: |
          gh api --method POST -H "Accept: application/vnd.github+json" -H "X-GitHub-Api-Version: 2022-11-28" \
            /repos/xrplf/clio/dispatches -f "event_type=check_libxrpl" \
            -F "client_payload[conan_ref]=${{ needs.upload.outputs.conan_ref }}" \
            -F "client_payload[pr_url]=${PR_URL}"
.github/workflows/reusable-strategy-matrix.yml
vendored
Normal file
41
.github/workflows/reusable-strategy-matrix.yml
vendored
Normal file
@@ -0,0 +1,41 @@
|
||||
name: Generate strategy matrix
|
||||
|
||||
on:
|
||||
workflow_call:
|
||||
inputs:
|
||||
os:
|
||||
description: 'The operating system to use for the build ("linux", "macos", "windows").'
|
||||
required: false
|
||||
type: string
|
||||
strategy_matrix:
|
||||
# TODO: Support additional strategies, e.g. "ubuntu" for generating all Ubuntu configurations.
|
||||
description: 'The strategy matrix to use for generating the configurations ("minimal", "all").'
|
||||
required: false
|
||||
type: string
|
||||
default: "minimal"
|
||||
outputs:
|
||||
matrix:
|
||||
description: "The generated strategy matrix."
|
||||
value: ${{ jobs.generate-matrix.outputs.matrix }}
|
||||
|
||||
jobs:
|
||||
generate-matrix:
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
matrix: ${{ steps.generate.outputs.matrix }}
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
|
||||
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
|
||||
with:
|
||||
python-version: 3.13
|
||||
|
||||
- name: Generate strategy matrix
|
||||
working-directory: .github/scripts/strategy-matrix
|
||||
id: generate
|
||||
env:
|
||||
GENERATE_CONFIG: ${{ inputs.os != '' && format('--config={0}.json', inputs.os) || '' }}
|
||||
GENERATE_OPTION: ${{ inputs.strategy_matrix == 'all' && '--all' || '' }}
|
||||
run: ./generate.py ${GENERATE_OPTION} ${GENERATE_CONFIG} >> "${GITHUB_OUTPUT}"
|
||||
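The contract between generate.py and this workflow is that the script prints a single matrix=<json> line, which is appended to ${GITHUB_OUTPUT} and later expanded with fromJson() in the calling workflows. A sketch of the expected shape (the exact keys come from the scripts under .github/scripts/strategy-matrix and are assumptions here):

    ./generate.py --config=linux.json >> "${GITHUB_OUTPUT}"
    # i.e. something along the lines of:
    # matrix={"include":[{"build_type":"Debug","cmake_args":"-Dunity=ON", ...}]}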

.github/workflows/reusable-test.yml (vendored, new file, 87 lines)
@@ -0,0 +1,87 @@
name: Test rippled

on:
  workflow_call:
    inputs:
      verify_voidstar:
        description: "Whether to verify the presence of voidstar instrumentation."
        required: true
        type: boolean
      run_tests:
        description: "Whether to run unit tests"
        required: true
        type: boolean

      runs_on:
        description: Runner to run the job on as a JSON string
        required: true
        type: string
      image:
        description: "The image to run in (leave empty to run natively)"
        required: true
        type: string

      config_name:
        description: "The name of the configuration."
        required: true
        type: string

jobs:
  test:
    name: Test ${{ inputs.config_name }}
    runs-on: ${{ fromJSON(inputs.runs_on) }}
    container: ${{ inputs.image != '' && inputs.image || null }}
    timeout-minutes: 30
    steps:
      - name: Cleanup workspace
        if: ${{ runner.os == 'macOS' }}
        uses: XRPLF/actions/.github/actions/cleanup-workspace@3f044c7478548e3c32ff68980eeb36ece02b364e

      - name: Download rippled artifact
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: rippled-${{ inputs.config_name }}

      - name: Make binary executable (Linux and macOS)
        shell: bash
        if: ${{ runner.os == 'Linux' || runner.os == 'macOS' }}
        run: |
          chmod +x ./rippled

      - name: Check linking (Linux)
        if: ${{ runner.os == 'Linux' }}
        shell: bash
        run: |
          ldd ./rippled
          if [ "$(ldd ./rippled | grep -E '(libstdc\+\+|libgcc)' | wc -l)" -eq 0 ]; then
            echo 'The binary is statically linked.'
          else
            echo 'The binary is dynamically linked.'
            exit 1
          fi

      - name: Verifying presence of instrumentation
        if: ${{ inputs.verify_voidstar }}
        shell: bash
        run: |
          ./rippled --version | grep libvoidstar

      - name: Run the embedded tests
        if: ${{ inputs.run_tests }}
        shell: bash
        run: |
          ./rippled --unittest --unittest-jobs $(nproc)

      - name: Run the separate tests
        if: ${{ inputs.run_tests }}
        shell: bash
        run: |
          for test_file in ./doctest/*; do
            echo "Executing $test_file"
            chmod +x "$test_file"
            if [[ "${{ runner.os }}" == "Windows" && "$test_file" == "./doctest/xrpl.test.net.exe" ]]; then
              echo "Skipping $test_file on Windows"
            else
              "$test_file"
            fi
          done

.github/workflows/upload-conan-deps.yml (vendored, new file, 94 lines)
@@ -0,0 +1,94 @@
name: Upload Conan Dependencies

on:
  schedule:
    - cron: "0 3 * * 2-6"
  workflow_dispatch:
    inputs:
      force_source_build:
        description: "Force source build of all dependencies"
        required: false
        default: false
        type: boolean
      force_upload:
        description: "Force upload of all dependencies"
        required: false
        default: false
        type: boolean
  pull_request:
    branches: [develop]
    paths:
      # This allows testing changes to the upload workflow in a PR
      - .github/workflows/upload-conan-deps.yml
  push:
    branches: [develop]
    paths:
      - .github/workflows/upload-conan-deps.yml
      - .github/workflows/reusable-strategy-matrix.yml
      - .github/actions/build-deps/action.yml
      - .github/actions/setup-conan/action.yml
      - ".github/scripts/strategy-matrix/**"
      - conanfile.py
      - conan.lock

env:
  CONAN_REMOTE_NAME: xrplf
  CONAN_REMOTE_URL: https://conan.ripplex.io

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  # Generate the strategy matrix to be used by the following job.
  generate-matrix:
    uses: ./.github/workflows/reusable-strategy-matrix.yml
    with:
      strategy_matrix: ${{ github.event_name == 'pull_request' && 'minimal' || 'all' }}

  # Build and upload the dependencies for each configuration.
  run-upload-conan-deps:
    needs:
      - generate-matrix
    strategy:
      fail-fast: false
      matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
      max-parallel: 10
    runs-on: ${{ matrix.architecture.runner }}
    container: ${{ contains(matrix.architecture.platform, 'linux') && format('ghcr.io/xrplf/ci/{0}-{1}:{2}-{3}-sha-{4}', matrix.os.distro_name, matrix.os.distro_version, matrix.os.compiler_name, matrix.os.compiler_version, matrix.os.image_sha) || null }}
    steps:
      - name: Cleanup workspace
        if: ${{ runner.os == 'macOS' }}
        uses: XRPLF/actions/.github/actions/cleanup-workspace@3f044c7478548e3c32ff68980eeb36ece02b364e

      - uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
      - name: Prepare runner
        uses: XRPLF/actions/.github/actions/prepare-runner@638e0dc11ea230f91bd26622fb542116bb5254d5
        with:
          disable_ccache: false

      - name: Setup Conan
        uses: ./.github/actions/setup-conan
        with:
          conan_remote_name: ${{ env.CONAN_REMOTE_NAME }}
          conan_remote_url: ${{ env.CONAN_REMOTE_URL }}

      - name: Build dependencies
        uses: ./.github/actions/build-deps
        with:
          build_dir: .build
          build_type: ${{ matrix.build_type }}
          force_build: ${{ github.event_name == 'schedule' || github.event.inputs.force_source_build == 'true' }}
          # The verbosity is set to "quiet" for Windows to avoid an excessive amount of logs, while it
          # is set to "verbose" otherwise to provide more information during the build process.
          verbosity: ${{ runner.os == 'Windows' && 'quiet' || 'verbose' }}

      - name: Log into Conan remote
        if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' }}
        run: conan remote login "${CONAN_REMOTE_NAME}" "${{ secrets.CONAN_REMOTE_USERNAME }}" --password "${{ secrets.CONAN_REMOTE_PASSWORD }}"

      - name: Upload Conan packages
        if: ${{ github.repository_owner == 'XRPLF' && github.event_name != 'pull_request' && github.event_name != 'schedule' }}
        env:
          FORCE_OPTION: ${{ github.event.inputs.force_upload == 'true' && '--force' || '' }}
        run: conan upload "*" --remote="${CONAN_REMOTE_NAME}" --confirm ${FORCE_OPTION}

.github/workflows/windows.yml (vendored, deleted, 91 lines)
@@ -1,91 +0,0 @@
name: windows

on:
  pull_request:
  push:
    # If the branches list is ever changed, be sure to change it on all
    # build/test jobs (nix, macos, windows)
    branches:
      # Always build the package branches
      - develop
      - release
      - master
      # Branches that opt-in to running
      - 'ci/**'

# https://docs.github.com/en/actions/using-jobs/using-concurrency
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:

  test:
    strategy:
      fail-fast: false
      matrix:
        generator:
          - Visual Studio 16 2019
        configuration:
          - Release
          # Github hosted runners tend to hang when running Debug unit tests.
          # Instead of trying to work around it, disable the Debug job until
          # something beefier (i.e. a heavy self-hosted runner) becomes
          # available.
          # - Debug
    runs-on: windows-2019
    env:
      build_dir: .build
    steps:
      - name: checkout
        uses: actions/checkout@v4
      - name: choose Python
        uses: actions/setup-python@v5
        with:
          python-version: 3.9
      - name: learn Python cache directory
        id: pip-cache
        shell: bash
        run: |
          python -m pip install --upgrade pip
          echo "dir=$(pip cache dir)" | tee ${GITHUB_OUTPUT}
      - name: restore Python cache directory
        uses: actions/cache@v4
        with:
          path: ${{ steps.pip-cache.outputs.dir }}
          key: ${{ runner.os }}-${{ hashFiles('.github/workflows/windows.yml') }}
      - name: install Conan
        run: pip install wheel 'conan<2'
      - name: check environment
        run: |
          dir env:
          $env:PATH -split ';'
          python --version
          conan --version
          cmake --version
      - name: configure Conan
        shell: bash
        run: |
          conan profile new default --detect
          conan profile update settings.compiler.cppstd=20 default
          conan profile update settings.compiler.runtime=MT${{ matrix.configuration == 'Debug' && 'd' || '' }} default
      - name: build dependencies
        uses: ./.github/actions/dependencies
        env:
          CONAN_URL: http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
          CONAN_LOGIN_USERNAME_RIPPLE: ${{ secrets.CONAN_USERNAME }}
          CONAN_PASSWORD_RIPPLE: ${{ secrets.CONAN_TOKEN }}
        with:
          configuration: ${{ matrix.configuration }}
      - name: build
        uses: ./.github/actions/build
        with:
          generator: '${{ matrix.generator }}'
          configuration: ${{ matrix.configuration }}
          # Hard code for now. Move to the matrix if varied options are needed
          cmake-args: '-Dassert=ON -Dreporting=OFF -Dunity=ON'
          cmake-target: install
      - name: test
        shell: bash
        run: |
          ${build_dir}/${{ matrix.configuration }}/rippled --unittest --unittest-jobs $(nproc)
.gitignore (vendored)
@@ -37,10 +37,9 @@ Release/*.*
*.gcov

# Levelization checking
Builds/levelization/results/rawincludes.txt
Builds/levelization/results/paths.txt
Builds/levelization/results/includes/
Builds/levelization/results/includedby/
.github/scripts/levelization/results/*
!.github/scripts/levelization/results/loops.txt
!.github/scripts/levelization/results/ordering.txt

# Ignore tmp directory.
tmp

@@ -111,4 +110,4 @@ bld.rippled/
.vscode

# Suggested in-tree build directory
/.build/
/.build*/
.pre-commit-config.yaml
@@ -1,6 +1,39 @@
# .pre-commit-config.yaml
# To run pre-commit hooks, first install pre-commit:
# - `pip install pre-commit==${PRE_COMMIT_VERSION}`
#
# Then, run the following command to install the git hook scripts:
# - `pre-commit install`
# You can run all configured hooks against all files with:
# - `pre-commit run --all-files`
# To manually run a specific hook, use:
# - `pre-commit run <hook_id> --all-files`
# To run the hooks against only the staged files, use:
# - `pre-commit run`
repos:
  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: v18.1.3
    hooks:
      - id: clang-format
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: 3e8a8703264a2f4a69428a0aa4dcb512790b2c8c # frozen: v6.0.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: mixed-line-ending
      - id: check-merge-conflict
        args: [--assume-in-merge]

  - repo: https://github.com/pre-commit/mirrors-clang-format
    rev: 7d85583be209cb547946c82fbe51f4bc5dd1d017 # frozen: v18.1.8
    hooks:
      - id: clang-format
        args: [--style=file]
        "types_or": [c++, c, proto]

  - repo: https://github.com/rbubley/mirrors-prettier
    rev: 5ba47274f9b181bce26a5150a725577f3c336011 # frozen: v3.6.2
    hooks:
      - id: prettier

exclude: |
  (?x)^(
    external/.*|
    .github/scripts/levelization/results/.*\.txt|
    conan\.lock
  )$
.prettierignore (new file)
@@ -0,0 +1 @@
external
API-CHANGELOG.md
@@ -83,28 +83,60 @@ The [commandline](https://xrpl.org/docs/references/http-websocket-apis/api-conve

The `network_id` field was added in the `server_info` response in version 1.5.0 (2019), but it is not returned in [reporting mode](https://xrpl.org/rippled-server-modes.html#reporting-mode). However, use of reporting mode is now discouraged, in favor of using [Clio](https://github.com/XRPLF/clio) instead.

## XRP Ledger server version 2.2.0
## XRP Ledger server version 2.5.0

The following is a non-breaking addition to the API.
As of 2025-04-04, version 2.5.0 is in development. You can use a pre-release version by building from source or [using the `nightly` package](https://xrpl.org/docs/infrastructure/installation/install-rippled-on-ubuntu).

- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))
### Additions and bugfixes in 2.5.0

### Breaking change in 2.3
- `channel_authorize`: If `signing_support` is not enabled in the config, the RPC is disabled.

## XRP Ledger server version 2.4.0

[Version 2.4.0](https://github.com/XRPLF/rippled/releases/tag/2.4.0) was released on March 4, 2025.

### Additions and bugfixes in 2.4.0

- `ledger_entry`: `state` is added as an alias for `ripple_state`.
- `ledger_entry`: Enables case-insensitive filtering by canonical name in addition to case-sensitive filtering by RPC name.
- `validators`: Added new field `validator_list_threshold` in response.
- `simulate`: A new RPC that executes a [dry run of a transaction submission](https://github.com/XRPLF/XRPL-Standards/tree/master/XLS-0069d-simulate#2-rpc-simulate) (a sketch of such a request follows this list).
- Signing methods autofill fees better and properly handle transactions that don't have a base fee, and will also autofill the `NetworkID` field.
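As an illustration only (not part of the changelog): a dry-run request against a local server might look like the sketch below, assuming a rippled instance with its JSON-RPC port on 5005; the addresses and `tx_json` fields are placeholders.

```bash
# Hypothetical simulate call; nothing is submitted to the network.
curl -s -X POST http://localhost:5005/ \
  -H 'Content-Type: application/json' \
  -d '{
        "method": "simulate",
        "params": [{
          "tx_json": {
            "TransactionType": "Payment",
            "Account": "rSenderPlaceholder",
            "Destination": "rDestinationPlaceholder",
            "Amount": "1000000"
          }
        }]
      }'
```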

## XRP Ledger server version 2.3.0

[Version 2.3.0](https://github.com/XRPLF/rippled/releases/tag/2.3.0) was released on Nov 25, 2024.

### Breaking changes in 2.3.0

- `book_changes`: If the requested ledger version is not available on this node, a `ledgerNotFound` error is returned and the node does not attempt to acquire the ledger from the p2p network (as with other non-admin RPCs).

Admins can still attempt to retrieve old ledgers with the `ledger_request` RPC.

### Addition in 2.3
### Additions and bugfixes in 2.3.0

- `book_changes`: Returns a `validated` field in its response, which was missing in prior versions.

The following additions are non-breaking (because they are purely additive).
## XRP Ledger server version 2.2.0

[Version 2.2.0](https://github.com/XRPLF/rippled/releases/tag/2.2.0) was released on Jun 5, 2024. The following additions are non-breaking (because they are purely additive):

- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))

## XRP Ledger server version 2.0.0

[Version 2.0.0](https://github.com/XRPLF/rippled/releases/tag/2.0.0) was released on Jan 9, 2024. The following additions are non-breaking (because they are purely additive):

- `server_definitions`: A new RPC that generates a `definitions.json`-like output that can be used in XRPL libraries.
- In `Payment` transactions, `DeliverMax` has been added. This is a replacement for the `Amount` field, which should not be used. Typically, the `delivered_amount` (in transaction metadata) should be used. To ease the transition, `DeliverMax` is present regardless of API version, since adding a field is non-breaking.
- API version 2 has been moved from beta to supported, meaning that it is generally available (regardless of the `beta_rpc_api` setting).

## XRP Ledger server version 2.2.0

The following is a non-breaking addition to the API.

- The `feature` method now has a non-admin mode for users. (It was previously only available to admin connections.) The method returns an updated list of amendments, including their names and other information. ([#4781](https://github.com/XRPLF/rippled/pull/4781))

## XRP Ledger server version 1.12.0

[Version 1.12.0](https://github.com/XRPLF/rippled/releases/tag/1.12.0) was released on Sep 6, 2023. The following additions are non-breaking (because they are purely additive).
BUILD.md
@@ -3,29 +3,29 @@
| These instructions assume you have a C++ development environment ready with Git, Python, Conan, CMake, and a C++ compiler. For help setting one up on Linux, macOS, or Windows, [see this guide](./docs/build/environment.md). |

> These instructions also assume a basic familiarity with Conan and CMake.
> If you are unfamiliar with Conan,
> you can read our [crash course](./docs/build/conan.md)
> or the official [Getting Started][3] walkthrough.
> If you are unfamiliar with Conan, you can read our
> [crash course](./docs/build/conan.md) or the official [Getting Started][3]
> walkthrough.

## Branches

For a stable release, choose the `master` branch or one of the [tagged
releases](https://github.com/ripple/rippled/releases).

```
```bash
git checkout master
```

For the latest release candidate, choose the `release` branch.

```
```bash
git checkout release
```

For the latest set of untested features, or to contribute, choose the `develop`
branch.

```
```bash
git checkout develop
```

@@ -33,176 +33,307 @@ git checkout develop

See [System Requirements](https://xrpl.org/system-requirements.html).

Building rippled generally requires git, Python, Conan, CMake, and a C++ compiler. Some guidance on setting up such a [C++ development environment can be found here](./docs/build/environment.md).
Building rippled generally requires git, Python, Conan, CMake, and a C++
compiler. Some guidance on setting up such a [C++ development environment can be
found here](./docs/build/environment.md).

- [Python 3.7](https://www.python.org/downloads/)
- [Conan 1.60](https://conan.io/downloads.html)[^1]
- [CMake 3.16](https://cmake.org/download/)
- [Python 3.11](https://www.python.org/downloads/), or higher
- [Conan 2.17](https://conan.io/downloads.html)[^1], or higher
- [CMake 3.22](https://cmake.org/download/), or higher
[^1]: It is possible to build with Conan 2.x,
but the instructions are significantly different,
which is why we are not recommending it yet.
Notably, the `conan profile update` command is removed in 2.x.
Profiles must be edited by hand.
[^1]:
    It is possible to build with Conan 1.60+, but the instructions are
    significantly different, which is why we are not recommending it.

`rippled` is written in the C++20 dialect and includes the `<concepts>` header.
The [minimum compiler versions][2] required are:

| Compiler    | Version |
|-------------|---------|
| GCC         | 11      |
| Clang       | 13      |
| Apple Clang | 13.1.6  |
| MSVC        | 19.23   |
| Compiler    | Version   |
| ----------- | --------- |
| GCC         | 12        |
| Clang       | 16        |
| Apple Clang | 16        |
| MSVC        | 19.44[^3] |

### Linux

The Ubuntu operating system has received the highest level of
quality assurance, testing, and support.
The Ubuntu Linux distribution has received the highest level of quality
assurance, testing, and support. We also support Red Hat and use Debian
internally.

Here are [sample instructions for setting up a C++ development environment on Linux](./docs/build/environment.md#linux).
Here are [sample instructions for setting up a C++ development environment on
Linux](./docs/build/environment.md#linux).

### Mac

Many rippled engineers use macOS for development.

Here are [sample instructions for setting up a C++ development environment on macOS](./docs/build/environment.md#macos).
Here are [sample instructions for setting up a C++ development environment on
macOS](./docs/build/environment.md#macos).

### Windows

Windows is not recommended for production use at this time.
Windows is used by some engineers for development only.

- Additionally, 32-bit Windows development is not supported.

[Boost]: https://www.boost.org/
[^3]: Windows is not recommended for production use.
## Steps

### Set Up Conan

After you have a [C++ development environment](./docs/build/environment.md) ready with Git, Python, Conan, CMake, and a C++ compiler, you may need to set up your Conan profile.
After you have a [C++ development environment](./docs/build/environment.md) ready with Git, Python,
Conan, CMake, and a C++ compiler, you may need to set up your Conan profile.

These instructions assume a basic familiarity with Conan and CMake.
These instructions assume a basic familiarity with Conan and CMake. If you are
unfamiliar with Conan, then please read [this crash course](./docs/build/conan.md) or the official
[Getting Started][3] walkthrough.

If you are unfamiliar with Conan, then please read [this crash course](./docs/build/conan.md) or the official [Getting Started][3] walkthrough.
#### Default profile

You'll need at least one Conan profile:
We recommend that you import the provided `conan/profiles/default` profile:

```
conan profile new default --detect
```

Update the compiler settings:

```
conan profile update settings.compiler.cppstd=20 default
```

Configure Conan (1.x only) to use recipe revisions:

```
conan config set general.revisions_enabled=1
```

**Linux** developers will commonly have a default Conan [profile][] that compiles
with GCC and links with libstdc++.
If you are linking with libstdc++ (see profile setting `compiler.libcxx`),
then you will need to choose the `libstdc++11` ABI:

```
conan profile update settings.compiler.libcxx=libstdc++11 default
```

Ensure inter-operability between `boost::string_view` and `std::string_view` types:

```
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_BEAST_USE_STD_STRING_VIEW"]' default
conan profile update 'env.CXXFLAGS="-DBOOST_BEAST_USE_STD_STRING_VIEW"' default
```bash
conan config install conan/profiles/ -tf $(conan config home)/profiles/
```

If you have other flags in the `conf.tools.build` or `env.CXXFLAGS` sections, make sure to retain the existing flags and append the new ones. You can check them with:
```
conan profile show default
You can check your Conan profile by running:

```bash
conan profile show
```
#### Custom profile

**Windows** developers may need to use the x64 native build tools.
An easy way to do that is to run the shortcut "x64 Native Tools Command
Prompt" for the version of Visual Studio that you have installed.
If the default profile does not work for you and you do not yet have a Conan
profile, you can create one by running:

Windows developers must also build `rippled` and its dependencies for the x64
architecture:

```
conan profile update settings.arch=x86_64 default
```

### Multiple compilers

When `/usr/bin/g++` exists on a platform, it is the default cpp compiler. This
default works for some users.

However, if this compiler cannot build rippled or its dependencies, then you can
install another compiler and set Conan and CMake to use it.
Update the `conf.tools.build:compiler_executables` setting in order to set the correct variables (`CMAKE_<LANG>_COMPILER`) in the
generated CMake toolchain file.
For example, on Ubuntu 20, you may have gcc at `/usr/bin/gcc` and g++ at `/usr/bin/g++`; if that is the case, you can select those compilers with:
```
conan profile update 'conf.tools.build:compiler_executables={"c": "/usr/bin/gcc", "cpp": "/usr/bin/g++"}' default
```bash
conan profile detect
```

Replace `/usr/bin/gcc` and `/usr/bin/g++` with paths to the desired compilers.
You may need to make changes to the profile to suit your environment. You can
refer to the provided `conan/profiles/default` profile for inspiration, and you
may also need to apply the required [tweaks](#conan-profile-tweaks) to this
default profile.

It should choose the compiler for dependencies as well,
but not all of them have a Conan recipe that respects this setting (yet).
For the rest, you can set these environment variables.
Replace `<path>` with paths to the desired compilers:
### Patched recipes

- `conan profile update env.CC=<path> default`
- `conan profile update env.CXX=<path> default`
The recipes in Conan Center occasionally need to be patched for compatibility
with the latest version of `rippled`. We maintain a fork of the Conan Center
[here](https://github.com/XRPLF/conan-center-index/) containing the patches.

Export our [Conan recipe for Snappy](./external/snappy).
It does not explicitly link the C++ standard library,
which allows you to statically link it with GCC, if you want.
To ensure our patched recipes are used, you must add our Conan remote at a
higher index than the default Conan Center remote, so it is consulted first. You
can do this by running:

```
# Conan 1.x
conan export external/snappy snappy/1.1.10@
# Conan 2.x
conan export --version 1.1.10 external/snappy
```
```bash
conan remote add --index 0 xrplf https://conan.ripplex.io
```
Export our [Conan recipe for RocksDB](./external/rocksdb).
It does not override paths to dependencies when building with Visual Studio.
Alternatively, you can pull the patched recipes into the repository and use them
locally:

```
# Conan 1.x
conan export external/rocksdb rocksdb/6.29.5@
# Conan 2.x
conan export --version 6.29.5 external/rocksdb
```
```bash
cd external
git init
git remote add origin git@github.com:XRPLF/conan-center-index.git
git sparse-checkout init
git sparse-checkout set recipes/snappy
git sparse-checkout add recipes/soci
git fetch origin master
git checkout master
conan export --version 1.1.10 recipes/snappy/all
conan export --version 4.0.3 recipes/soci/all
rm -rf .git
```

Export our [Conan recipe for SOCI](./external/soci).
It patches their CMake to correctly import its dependencies.
In the case we switch to a newer version of a dependency that still requires a
patch, it will be necessary for you to pull in the changes and re-export the
updated dependencies with the newer version. However, if we switch to a newer
version that no longer requires a patch, no action is required on your part, as
the new recipe will be automatically pulled from the official Conan Center.

```
# Conan 1.x
conan export external/soci soci/4.0.3@
# Conan 2.x
conan export --version 4.0.3 external/soci
```
> [!NOTE]
> You might need to add `--lockfile=""` to your `conan install` command
> to avoid automatic use of the existing `conan.lock` file when you run `conan export` manually on your machine.
Export our [Conan recipe for NuDB](./external/nudb).
It fixes some source files to add missing `#include`s.
### Conan profile tweaks

#### Missing compiler version

```
# Conan 1.x
conan export external/nudb nudb/2.0.8@
# Conan 2.x
conan export --version 2.0.8 external/nudb
```
If you see an error similar to the following after running `conan profile show`:

```bash
ERROR: Invalid setting '17' is not a valid 'settings.compiler.version' value.
Possible values are ['5.0', '5.1', '6.0', '6.1', '7.0', '7.3', '8.0', '8.1',
'9.0', '9.1', '10.0', '11.0', '12.0', '13', '13.0', '13.1', '14', '14.0', '15',
'15.0', '16', '16.0']
Read "http://docs.conan.io/2/knowledge/faq.html#error-invalid-setting"
```

you need to amend the list of compiler versions in
`$(conan config home)/settings.yml`, by appending the required version number(s)
to the `version` array specific for your compiler. For example:

```yaml
apple-clang:
  version:
    [
      "5.0",
      "5.1",
      "6.0",
      "6.1",
      "7.0",
      "7.3",
      "8.0",
      "8.1",
      "9.0",
      "9.1",
      "10.0",
      "11.0",
      "12.0",
      "13",
      "13.0",
      "13.1",
      "14",
      "14.0",
      "15",
      "15.0",
      "16",
      "16.0",
      "17",
      "17.0",
    ]
```
#### Multiple compilers

If you have multiple compilers installed, make sure to select the one to use in
your default Conan configuration **before** running `conan profile detect`, by
setting the `CC` and `CXX` environment variables.

For example, if you are running MacOS and have [homebrew
LLVM@18](https://formulae.brew.sh/formula/llvm@18), and want to use it as a
compiler in the new Conan profile:

```bash
export CC=$(brew --prefix llvm@18)/bin/clang
export CXX=$(brew --prefix llvm@18)/bin/clang++
conan profile detect
```

You should also explicitly set the path to the compiler in the profile file,
which helps to avoid errors when `CC` and/or `CXX` are set and disagree with the
selected Conan profile. For example:

```text
[conf]
tools.build:compiler_executables={'c':'/usr/bin/gcc','cpp':'/usr/bin/g++'}
```

#### Multiple profiles

You can manage multiple Conan profiles in the directory
`$(conan config home)/profiles`, for example renaming `default` to a different
name and then creating a new `default` profile for a different compiler.
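As a sketch of that workflow (assuming the standard Conan home layout; the profile name `clang` is just an illustrative placeholder):

```bash
# Keep the current profile under a new name...
mv "$(conan config home)/profiles/default" "$(conan config home)/profiles/clang"
# ...then detect a fresh default for a different compiler.
CC=/usr/bin/gcc CXX=/usr/bin/g++ conan profile detect
```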

#### Select language

The default profile created by Conan will typically select a different C++
dialect than the C++20 used by this project. You should set `20` in the profile
line starting with `compiler.cppstd=`. For example:

```bash
sed -i.bak -e 's|^compiler\.cppstd=.*$|compiler.cppstd=20|' $(conan config home)/profiles/default
```

#### Select standard library in Linux

**Linux** developers will commonly have a default Conan [profile][] that
compiles with GCC and links with libstdc++. If you are linking with libstdc++
(see profile setting `compiler.libcxx`), then you will need to choose the
`libstdc++11` ABI:

```bash
sed -i.bak -e 's|^compiler\.libcxx=.*$|compiler.libcxx=libstdc++11|' $(conan config home)/profiles/default
```

#### Select architecture and runtime in Windows

**Windows** developers may need to use the x64 native build tools. An easy way
to do that is to run the shortcut "x64 Native Tools Command Prompt" for the
version of Visual Studio that you have installed.

Windows developers must also build `rippled` and its dependencies for the x64
architecture:

```bash
sed -i.bak -e 's|^arch=.*$|arch=x86_64|' $(conan config home)/profiles/default
```

**Windows** developers also must select static runtime:

```bash
sed -i.bak -e 's|^compiler\.runtime=.*$|compiler.runtime=static|' $(conan config home)/profiles/default
```

#### Clang workaround for grpc

If your compiler is clang, version 19 or later, or apple-clang, version 17 or
later, you may encounter a compilation error while building the `grpc`
dependency:

```text
In file included from .../lib/promise/try_seq.h:26:
.../lib/promise/detail/basic_seq.h:499:38: error: a template argument list is expected after a name prefixed by the template keyword [-Wmissing-template-arg-list-after-template-kw]
  499 |         Traits::template CallSeqFactory(f_, *cur_, std::move(arg)));
      |                                      ^
```

The workaround for this error is to add two lines to the profile:

```text
[conf]
tools.build:cxxflags=['-Wno-missing-template-arg-list-after-template-kw']
```

#### Workaround for gcc 12

If your compiler is gcc, version 12, and you have enabled the `werr` option, you
may encounter a compilation error such as:

```text
/usr/include/c++/12/bits/char_traits.h:435:56: error: 'void* __builtin_memcpy(void*, const void*, long unsigned int)' accessing 9223372036854775810 or more bytes at offsets [2, 9223372036854775807] and 1 may overlap up to 9223372036854775813 bytes at offset -3 [-Werror=restrict]
  435 |         return static_cast<char_type*>(__builtin_memcpy(__s1, __s2, __n));
      |                                        ~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
cc1plus: all warnings being treated as errors
```

The workaround for this error is to add two lines to your profile:

```text
[conf]
tools.build:cxxflags=['-Wno-restrict']
```

#### Workaround for clang 16

If your compiler is clang, version 16, you may encounter a compilation error
such as:

```text
In file included from .../boost/beast/websocket/stream.hpp:2857:
.../boost/beast/websocket/impl/read.hpp:695:17: error: call to 'async_teardown' is ambiguous
    async_teardown(impl.role, impl.stream(),
    ^~~~~~~~~~~~~~
```

The workaround for this error is to add two lines to your profile:

```text
[conf]
tools.build:cxxflags=['-DBOOST_ASIO_DISABLE_CONCEPTS']
```

### Build and Test

@@ -222,63 +353,67 @@ It fixes some source files to add missing `#include`s.
   the `install-folder` or `-if` option to every `conan install` command
   in the next step.

2. Generate CMake files for every configuration you want to build.
2. Use conan to generate CMake files for every configuration you want to build:

   ```
   conan install .. --output-folder . --build missing --settings build_type=Release
   conan install .. --output-folder . --build missing --settings build_type=Debug
   ```
   ```
   conan install .. --output-folder . --build missing --settings build_type=Release
   conan install .. --output-folder . --build missing --settings build_type=Debug
   ```

   For a single-configuration generator, e.g. `Unix Makefiles` or `Ninja`,
   you only need to run this command once.
   For a multi-configuration generator, e.g. `Visual Studio`, you may want to
   run it more than once.
   To build Debug, in the next step, be sure to set `-DCMAKE_BUILD_TYPE=Debug`

   Each of these commands should also have a different `build_type` setting.
   A second command with the same `build_type` setting will overwrite the files
   generated by the first. You can pass the build type on the command line with
   `--settings build_type=$BUILD_TYPE` or in the profile itself,
   under the section `[settings]` with the key `build_type`.
   For a single-configuration generator, e.g. `Unix Makefiles` or `Ninja`,
   you only need to run this command once.
   For a multi-configuration generator, e.g. `Visual Studio`, you may want to
   run it more than once.

   If you are using a Microsoft Visual C++ compiler,
   then you will need to ensure consistency between the `build_type` setting
   and the `compiler.runtime` setting.
   Each of these commands should also have a different `build_type` setting.
   A second command with the same `build_type` setting will overwrite the files
   generated by the first. You can pass the build type on the command line with
   `--settings build_type=$BUILD_TYPE` or in the profile itself,
   under the section `[settings]` with the key `build_type`.

   When `build_type` is `Release`, `compiler.runtime` should be `MT`.
   If you are using a Microsoft Visual C++ compiler,
   then you will need to ensure consistency between the `build_type` setting
   and the `compiler.runtime` setting.

   When `build_type` is `Debug`, `compiler.runtime` should be `MTd`.
   When `build_type` is `Release`, `compiler.runtime` should be `MT`.

   ```
   conan install .. --output-folder . --build missing --settings build_type=Release --settings compiler.runtime=MT
   conan install .. --output-folder . --build missing --settings build_type=Debug --settings compiler.runtime=MTd
   ```
   When `build_type` is `Debug`, `compiler.runtime` should be `MTd`.

   ```
   conan install .. --output-folder . --build missing --settings build_type=Release --settings compiler.runtime=MT
   conan install .. --output-folder . --build missing --settings build_type=Debug --settings compiler.runtime=MTd
   ```
3. Configure CMake and pass the toolchain file generated by Conan, located at
   `$OUTPUT_FOLDER/build/generators/conan_toolchain.cmake`.

   Single-config generators:
   Single-config generators:

   ```
   cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dxrpld=ON -Dtests=ON ..
   ```
   Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
   and make sure it matches the one of the `build_type` settings
   you chose in the previous step.

   Pass the CMake variable [`CMAKE_BUILD_TYPE`][build_type]
   and make sure it matches the `build_type` setting you chose in the previous
   step.
   For example, to build Debug, in the next command, replace "Release" with "Debug"

   Multi-config generators:
   ```
   cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release -Dxrpld=ON -Dtests=ON ..
   ```

   ```
   cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -Dxrpld=ON -Dtests=ON ..
   ```
   Multi-config generators:

   **Note:** You can pass build options for `rippled` in this step.
   ```
   cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -Dxrpld=ON -Dtests=ON ..
   ```

   **Note:** You can pass build options for `rippled` in this step.

4. Build `rippled`.

   For a single-configuration generator, it will build whatever configuration
   you passed for `CMAKE_BUILD_TYPE`. For a multi-configuration generator,
   you must pass the option `--config` to select the build configuration.
   you passed for `CMAKE_BUILD_TYPE`. For a multi-configuration generator, you
   must pass the option `--config` to select the build configuration.

   Single-config generators:

@@ -298,19 +433,49 @@ It fixes some source files to add missing `#include`s.

   Single-config generators:

   ```
   ./rippled --unittest
   ./rippled --unittest --unittest-jobs N
   ```

   Multi-config generators:

   ```
   ./Release/rippled --unittest
   ./Debug/rippled --unittest
   ./Release/rippled --unittest --unittest-jobs N
   ./Debug/rippled --unittest --unittest-jobs N
   ```

   The location of `rippled` in your build directory depends on your CMake
   generator. Pass `--help` to see the rest of the command line options.
   Replace the `--unittest-jobs` parameter N with the desired unit tests
   concurrency. Recommended setting is half of the number of available CPU
   cores.
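   For instance, a sketch for Linux (where `nproc` reports the core count; on macOS, `sysctl -n hw.ncpu` would take its place):

   ```bash
   # Run the unit tests with half of the available CPU cores.
   ./rippled --unittest --unittest-jobs $(( $(nproc) / 2 ))
   ```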

   The location of the `rippled` binary in your build directory depends on your
   CMake generator. Pass `--help` to see the rest of the command line options.

#### Conan lockfile

To achieve reproducible dependencies, we use a [Conan lockfile](https://docs.conan.io/2/tutorial/versioning/lockfiles.html).

The `conan.lock` file in the repository contains a "snapshot" of the current dependencies.
It is implicitly used when running `conan` commands; you don't need to specify it.

You have to update this file every time you add a new dependency or change a revision or version of an existing dependency.

> [!NOTE]
> Conan uses the local cache by default when creating a lockfile.
>
> To ensure that lockfile creation works the same way on all developer machines, you should clear the local cache before creating a new lockfile.

To create a new lockfile, run the following commands in the repository root:

```bash
conan remove '*' --confirm
rm conan.lock
# This ensures that the xrplf remote is the first to be consulted
conan remote add --force --index 0 xrplf https://conan.ripplex.io
conan lock create . -o '&:jemalloc=True' -o '&:rocksdb=True'
```

> [!NOTE]
> If some dependencies are exclusive to some OS, you may need to run the last command for them, adding `--profile:all <PROFILE>`.
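For example (a sketch only; the profile name `windows` is a placeholder for whichever profile covers the OS-specific dependencies):

```bash
# Re-run lockfile creation against a hypothetical "windows" profile so that
# its exclusive dependencies are captured in conan.lock as well.
conan lock create . -o '&:jemalloc=True' -o '&:rocksdb=True' --profile:all windows
```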

## Coverage report

@@ -351,7 +516,7 @@ variable in `cmake`. The specific command line used to run the `gcovr` tool will
displayed if the `CODE_COVERAGE_VERBOSE` variable is set.

By default, the code coverage tool runs parallel unit tests with `--unittest-jobs`
set to the number of available CPU cores. This may cause spurious test
set to the number of available CPU cores. This may cause spurious test
errors on Apple. Developers can override the number of unit test jobs with
the `coverage_test_parallelism` variable in `cmake`.

@@ -367,88 +532,74 @@ cmake --build . --target coverage

After the `coverage` target is completed, the generated coverage report will be
stored inside the build directory, as either of:

- file named `coverage.`_extension_ , with a suitable extension for the report format, or
- file named `coverage.`_extension_, with a suitable extension for the report format, or
- directory named `coverage`, with the `index.html` and other files inside, for the `html-details` or `html-nested` report formats.

## Options

| Option | Default Value | Description |
| --- | ---| ---|
| `assert` | OFF | Enable assertions.
| `coverage` | OFF | Prepare the coverage report. |
| `san` | N/A | Enable a sanitizer with Clang. Choices are `thread` and `address`. |
| `tests` | OFF | Build tests. |
| `unity` | ON | Configure a unity build. |
| `xrpld` | OFF | Build the xrpld (`rippled`) application, and not just the libxrpl library. |
| Option     | Default Value | Description                                                                 |
| ---------- | ------------- | --------------------------------------------------------------------------- |
| `assert`   | OFF           | Enable assertions.                                                          |
| `coverage` | OFF           | Prepare the coverage report.                                                |
| `san`      | N/A           | Enable a sanitizer with Clang. Choices are `thread` and `address`.          |
| `tests`    | OFF           | Build tests.                                                                |
| `unity`    | OFF           | Configure a unity build.                                                    |
| `xrpld`    | OFF           | Build the xrpld (`rippled`) application, and not just the libxrpl library.  |
| `werr`     | OFF           | Treat compilation warnings as errors                                        |
| `wextra`   | OFF           | Enable additional compilation warnings                                      |

[Unity builds][5] may be faster for the first build
(at the cost of much more memory) since they concatenate sources into fewer
translation units. Non-unity builds may be faster for incremental builds,
and can be helpful for detecting `#include` omissions.
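For example, a non-unity configure step might look like this sketch (same toolchain file as in the steps above; the other flags are illustrative):

```bash
# Configure a non-unity Release build for faster incremental rebuilds.
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Release -Dunity=OFF -Dxrpld=ON -Dtests=ON ..
```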

## Troubleshooting

### Conan

After any updates or changes to dependencies, you may need to do the following:

1. Remove your build directory.
2. Remove the Conan cache:
2. Remove individual libraries from the Conan cache, e.g.

   ```bash
   conan remove 'grpc/*'
   ```
   rm -rf ~/.conan/data

   **or**

   Remove all libraries from the Conan cache:

   ```bash
   conan remove '*'
   ```
4. Re-run [conan install](#build-and-test).

3. Re-run [conan export](#patched-recipes) if needed.
4. [Regenerate lockfile](#conan-lockfile).
5. Re-run [conan install](#build-and-test).

### no std::result_of
#### ERROR: Package not resolved

If your compiler version is recent enough to have removed `std::result_of` as
part of C++20, e.g. Apple Clang 15.0, then you might need to add a preprocessor
definition to your build.
If you're seeing an error like `ERROR: Package 'snappy/1.1.10' not resolved: Unable to find 'snappy/1.1.10#968fef506ff261592ec30c574d4a7809%1756234314.246' in remotes.`,
please add the `xrplf` remote or re-run `conan export` for [patched recipes](#patched-recipes).
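A minimal fix, following the [patched recipes](#patched-recipes) section, is to put the `xrplf` remote ahead of Conan Center:

```bash
# Make the patched-recipe remote the first one Conan consults.
conan remote add --force --index 0 xrplf https://conan.ripplex.io
```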

### `protobuf/port_def.inc` file not found

If `cmake --build .` results in an error due to a missing protobuf file, then
you might have generated CMake files for a different `build_type` than the
`CMAKE_BUILD_TYPE` you passed to Conan.

```
conan profile update 'options.boost:extra_b2_flags="define=BOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"' default
conan profile update 'conf.tools.build:cflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_HAS_STD_INVOKE_RESULT"]' default
/rippled/.build/pb-xrpl.libpb/xrpl/proto/ripple.pb.h:10:10: fatal error: 'google/protobuf/port_def.inc' file not found
   10 | #include <google/protobuf/port_def.inc>
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
```

For example, if you want to build Debug:

### call to 'async_teardown' is ambiguous

If you are compiling with an early version of Clang 16, then you might hit
a [regression][6] when compiling C++20 that manifests as an [error in a Boost
header][7]. You can work around it by adding this preprocessor definition:

```
conan profile update 'env.CXXFLAGS="-DBOOST_ASIO_DISABLE_CONCEPTS"' default
conan profile update 'conf.tools.build:cxxflags+=["-DBOOST_ASIO_DISABLE_CONCEPTS"]' default
```

### recompile with -fPIC

If you get a linker error suggesting that you recompile Boost with
position-independent code, such as:

```
/usr/bin/ld.gold: error: /home/username/.conan/data/boost/1.77.0/_/_/package/.../lib/libboost_container.a(alloc_lib.o):
requires unsupported dynamic reloc 11; recompile with -fPIC
```

Conan most likely downloaded a bad binary distribution of the dependency.
This seems to be a [bug][1] in Conan just for Boost 1.77.0 compiled with GCC
for Linux. The solution is to build the dependency locally by passing
`--build boost` when calling `conan install`.

```
conan install --build boost ...
```

1. For conan install, pass `--settings build_type=Debug`
2. For cmake, pass `-DCMAKE_BUILD_TYPE=Debug`
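Concretely, keeping the two in sync for a Debug build might look like this sketch (paths as in the Build and Test steps above):

```bash
# Generate Conan files and configure CMake with a matching build type.
conan install .. --output-folder . --build missing --settings build_type=Debug
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake \
  -DCMAKE_BUILD_TYPE=Debug -Dxrpld=ON -Dtests=ON ..
```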

## Add a Dependency

@@ -456,16 +607,15 @@ If you want to experiment with a new package, follow these steps:

1. Search for the package on [Conan Center](https://conan.io/center/).
2. Modify [`conanfile.py`](./conanfile.py):
   - Add a version of the package to the `requires` property.
   - Change any default options for the package by adding them to the
     `default_options` property (with syntax `'$package:$option': $value`).
   - Add a version of the package to the `requires` property.
   - Change any default options for the package by adding them to the
     `default_options` property (with syntax `'$package:$option': $value`).
3. Modify [`CMakeLists.txt`](./CMakeLists.txt):
   - Add a call to `find_package($package REQUIRED)`.
   - Link a library from the package to the target `ripple_libs`
     (search for the existing call to `target_link_libraries(ripple_libs INTERFACE ...)`).
   - Add a call to `find_package($package REQUIRED)`.
   - Link a library from the package to the target `ripple_libs`
     (search for the existing call to `target_link_libraries(ripple_libs INTERFACE ...)`).
4. Start coding! Don't forget to include whatever headers you need from the package.

[1]: https://github.com/conan-io/conan-center-index/issues/13168
[2]: https://en.cppreference.com/w/cpp/compiler_support/20
[3]: https://docs.conan.io/en/latest/getting_started.html
CMakeLists.txt
@@ -16,16 +16,36 @@ set(CMAKE_CXX_EXTENSIONS OFF)
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

if(CMAKE_CXX_COMPILER_ID MATCHES "GNU")
  # GCC-specific fixes
  add_compile_options(-Wno-unknown-pragmas -Wno-subobject-linkage)
  # -Wno-subobject-linkage can be removed when we upgrade GCC version to at least 13.3
elseif(CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  # Clang-specific fixes
  add_compile_options(-Wno-unknown-warning-option) # Ignore unknown warning options
elseif(MSVC)
  # MSVC-specific fixes
  add_compile_options(/wd4068) # Ignore unknown pragmas
endif()

# make GIT_COMMIT_HASH define available to all sources
find_package(Git)
if(Git_FOUND)
  execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git describe --always --abbrev=40
  execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git rev-parse HEAD
    OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE gch)
  if(gch)
    set(GIT_COMMIT_HASH "${gch}")
    message(STATUS gch: ${GIT_COMMIT_HASH})
    add_definitions(-DGIT_COMMIT_HASH="${GIT_COMMIT_HASH}")
  endif()

  execute_process(COMMAND ${GIT_EXECUTABLE} --git-dir=${CMAKE_CURRENT_SOURCE_DIR}/.git rev-parse --abbrev-ref HEAD
    OUTPUT_STRIP_TRAILING_WHITESPACE OUTPUT_VARIABLE gb)
  if(gb)
    set(GIT_BRANCH "${gb}")
    message(STATUS gb: ${GIT_BRANCH})
    add_definitions(-DGIT_BRANCH="${GIT_BRANCH}")
  endif()
endif() #git

if(thread_safety_analysis)
@@ -70,9 +90,15 @@ set_target_properties(OpenSSL::SSL PROPERTIES
  INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2
)
set(SECP256K1_INSTALL TRUE)
set(SECP256K1_BUILD_BENCHMARK FALSE)
set(SECP256K1_BUILD_TESTS FALSE)
set(SECP256K1_BUILD_EXHAUSTIVE_TESTS FALSE)
set(SECP256K1_BUILD_CTIME_TESTS FALSE)
set(SECP256K1_BUILD_EXAMPLES FALSE)
add_subdirectory(external/secp256k1)
add_library(secp256k1::secp256k1 ALIAS secp256k1)
add_subdirectory(external/ed25519-donna)
add_subdirectory(external/antithesis-sdk)
find_package(gRPC REQUIRED)
find_package(lz4 REQUIRED)
# Target names with :: are not allowed in a generator expression.
@@ -123,3 +149,8 @@ set(PROJECT_EXPORT_SET RippleExports)
include(RippledCore)
include(RippledInstall)
include(RippledValidatorKeys)

if(tests)
  include(CTest)
  add_subdirectory(src/tests/libxrpl)
endif()
CONTRIBUTING.md
File diff suppressed because it is too large
LICENSE.md
@@ -1,4 +1,4 @@
ISC License
ISC License

Copyright (c) 2011, Arthur Britto, David Schwartz, Jed McCaleb, Vinnie Falco, Bob Way, Eric Lombrozo, Nikolaos D. Bougalis, Howard Hinnant.
Copyright (c) 2012-2020, the XRP Ledger developers.

@@ -14,4 +14,3 @@ ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
README.md
@@ -1,51 +1,54 @@
[](https://codecov.io/gh/XRPLF/rippled)

# The XRP Ledger

The [XRP Ledger](https://xrpl.org/) is a decentralized cryptographic ledger powered by a network of peer-to-peer nodes. The XRP Ledger uses a novel Byzantine Fault Tolerant consensus algorithm to settle and record transactions in a secure distributed database without a central operator.

## XRP

[XRP](https://xrpl.org/xrp.html) is a public, counterparty-free asset native to the XRP Ledger, and is designed to bridge the many different currencies in use worldwide. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP.

[XRP](https://xrpl.org/xrp.html) is a public, counterparty-free crypto-asset native to the XRP Ledger, and is designed as a gas token for network services and to bridge different currencies. XRP is traded on the open-market and is available for anyone to access. The XRP Ledger was created in 2012 with a finite supply of 100 billion units of XRP.

## rippled

The server software that powers the XRP Ledger is called `rippled` and is available in this repository under the permissive [ISC open-source license](LICENSE.md). The `rippled` server software is written primarily in C++ and runs on a variety of platforms. The `rippled` server software can run in several modes depending on its [configuration](https://xrpl.org/rippled-server-modes.html).

If you are interested in running an **API Server** (including a **Full History Server**), take a look at [Clio](https://github.com/XRPLF/clio). (rippled Reporting Mode has been replaced by Clio.)

### Build from Source

* [Read the build instructions in `BUILD.md`](BUILD.md)
* If you encounter any issues, please [open an issue](https://github.com/XRPLF/rippled/issues)
- [Read the build instructions in `BUILD.md`](BUILD.md)
- If you encounter any issues, please [open an issue](https://github.com/XRPLF/rippled/issues)

## Key Features of the XRP Ledger

- **[Censorship-Resistant Transaction Processing][]:** No single party decides which transactions succeed or fail, and no one can "roll back" a transaction after it completes. As long as those who choose to participate in the network keep it healthy, they can settle transactions in seconds.
- **[Fast, Efficient Consensus Algorithm][]:** The XRP Ledger's consensus algorithm settles transactions in 4 to 5 seconds, processing at a throughput of up to 1500 transactions per second. These properties put XRP at least an order of magnitude ahead of other top digital assets.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction costs.
- **[Responsible Software Governance][]:** A team of full-time, world-class developers at Ripple maintain and continually improve the XRP Ledger's underlying software with contributions from the open-source community. Ripple acts as a steward for the technology and an advocate for its interests, and builds constructive relationships with governments and financial institutions worldwide.
- **[Finite XRP Supply][]:** When the XRP Ledger began, 100 billion XRP were created, and no more XRP will ever be created. The available supply of XRP decreases slowly over time as small amounts are destroyed to pay transaction fees.
- **[Responsible Software Governance][]:** A team of full-time developers at Ripple & other organizations maintain and continually improve the XRP Ledger's underlying software with contributions from the open-source community. Ripple acts as a steward for the technology and an advocate for its interests.
- **[Secure, Adaptable Cryptography][]:** The XRP Ledger relies on industry standard digital signature systems like ECDSA (the same scheme used by Bitcoin) but also supports modern, efficient algorithms like Ed25519. The extensible nature of the XRP Ledger's software makes it possible to add and disable algorithms as the state of the art in cryptography advances.
- **[Modern Features for Smart Contracts][]:** Features like Escrow, Checks, and Payment Channels support cutting-edge financial applications including the [Interledger Protocol](https://interledger.org/). This toolbox of advanced features comes with safety features like a process for amending the network and separate checks against invariant constraints.
- **[Modern Features][]:** Features like Escrow, Checks, and Payment Channels support financial applications atop of the XRP Ledger. This toolbox of advanced features comes with safety features like a process for amending the network and separate checks against invariant constraints.
- **[On-Ledger Decentralized Exchange][]:** In addition to all the features that make XRP useful on its own, the XRP Ledger also has a fully-functional accounting system for tracking and trading obligations denominated in any way users want, and an exchange built into the protocol. The XRP Ledger can settle long, cross-currency payment paths and exchanges of multiple currencies in atomic transactions, bridging gaps of trust with XRP.

[Censorship-Resistant Transaction Processing]: https://xrpl.org/xrp-ledger-overview.html#censorship-resistant-transaction-processing
[Fast, Efficient Consensus Algorithm]: https://xrpl.org/xrp-ledger-overview.html#fast-efficient-consensus-algorithm
[Finite XRP Supply]: https://xrpl.org/xrp-ledger-overview.html#finite-xrp-supply
[Responsible Software Governance]: https://xrpl.org/xrp-ledger-overview.html#responsible-software-governance
[Secure, Adaptable Cryptography]: https://xrpl.org/xrp-ledger-overview.html#secure-adaptable-cryptography
[Modern Features for Smart Contracts]: https://xrpl.org/xrp-ledger-overview.html#modern-features-for-smart-contracts
[On-Ledger Decentralized Exchange]: https://xrpl.org/xrp-ledger-overview.html#on-ledger-decentralized-exchange

[Censorship-Resistant Transaction Processing]: https://xrpl.org/transaction-censorship-detection.html#transaction-censorship-detection
[Fast, Efficient Consensus Algorithm]: https://xrpl.org/consensus-research.html#consensus-research
[Finite XRP Supply]: https://xrpl.org/what-is-xrp.html
[Responsible Software Governance]: https://xrpl.org/contribute-code.html#contribute-code-to-the-xrp-ledger
[Secure, Adaptable Cryptography]: https://xrpl.org/cryptographic-keys.html#cryptographic-keys
[Modern Features]: https://xrpl.org/use-specialized-payment-types.html
[On-Ledger Decentralized Exchange]: https://xrpl.org/decentralized-exchange.html#decentralized-exchange

## Source Code

Here are some good places to start learning the source code:

- Read the markdown files in the source tree: `src/ripple/**/*.md`.
- Read [the levelization document](./Builds/levelization) to get an idea of the internal dependency graph.
- Read [the levelization document](.github/scripts/levelization) to get an idea of the internal dependency graph.
- In the big picture, the `main` function constructs an `ApplicationImp` object, which implements the `Application` virtual interface. Almost every component in the application takes an `Application&` parameter in its constructor, typically named `app` and stored as a member variable `app_`. This allows most components to depend on any other component.

### Repository Contents

| Folder     | Contents                                          |
|:-----------|:-------------------------------------------------|
| :--------- | :----------------------------------------------- |
| `./bin`    | Scripts and data files for Ripple integrators.    |
| `./Builds` | Platform-specific guides for building `rippled`.  |
| `./docs`   | Source documentation files and doxygen config.    |

@@ -55,15 +58,14 @@ Here are some good places to start learning the source code:
Some of the directories under `src` are external repositories included using
git-subtree. See those directories' README files for more details.

## Additional Documentation

* [XRP Ledger Dev Portal](https://xrpl.org/)
* [Setup and Installation](https://xrpl.org/install-rippled.html)
* [Source Documentation (Doxygen)](https://xrplf.github.io/rippled/)
- [XRP Ledger Dev Portal](https://xrpl.org/)
- [Setup and Installation](https://xrpl.org/install-rippled.html)
- [Source Documentation (Doxygen)](https://xrplf.github.io/rippled/)

## See Also

* [Clio API Server for the XRP Ledger](https://github.com/XRPLF/clio)
* [Mailing List for Release Announcements](https://groups.google.com/g/ripple-server)
* [Learn more about the XRP Ledger (YouTube)](https://www.youtube.com/playlist?list=PLJQ55Tj1hIVZtJ_JdTvSum2qMTsedWkNi)
- [Clio API Server for the XRP Ledger](https://github.com/XRPLF/clio)
- [Mailing List for Release Announcements](https://groups.google.com/g/ripple-server)
- [Learn more about the XRP Ledger (YouTube)](https://www.youtube.com/playlist?list=PLJQ55Tj1hIVZtJ_JdTvSum2qMTsedWkNi)
RELEASENOTES.md
File diff suppressed because it is too large
SECURITY.md
@@ -2,7 +2,6 @@
For more details on operating an XRP Ledger server securely, please visit https://xrpl.org/manage-the-rippled-server.html.

# Security Policy

## Supported Versions

@@ -77,13 +76,14 @@ The amount paid varies dramatically. Vulnerabilities that are harmless on their

To report a qualifying bug, please send a detailed report to:

|Email Address|bugs@ripple.com                                       |
|:-----------:|:----------------------------------------------------|
|Short Key ID | `0xC57929BE`                                         |
|Long Key ID  | `0xCD49A0AFC57929BE`                                 |
|Fingerprint  | `24E6 3B02 37E0 FA9C 5E96 8974 CD49 A0AF C579 29BE`  |
| Email Address | bugs@ripple.com                                      |
| :-----------: | :-------------------------------------------------- |
| Short Key ID  | `0xC57929BE`                                         |
| Long Key ID   | `0xCD49A0AFC57929BE`                                 |
| Fingerprint   | `24E6 3B02 37E0 FA9C 5E96 8974 CD49 A0AF C579 29BE`  |

The full PGP key for this address, which is also available on several key servers (e.g. on [keyserver.ubuntu.com](https://keyserver.ubuntu.com)), is:

The full PGP key for this address, which is also available on several key servers (e.g. on [keys.gnupg.net](https://keys.gnupg.net)), is:
```
-----BEGIN PGP PUBLIC KEY BLOCK-----
mQINBFUwGHYBEAC0wpGpBPkd8W1UdQjg9+cEFzeIEJRaoZoeuJD8mofwI5Ejnjdt

bin/browser.js
@@ -1,470 +0,0 @@
#!/usr/bin/node
//
// ledger?l=L
// transaction?h=H
// ledger_entry?l=L&h=H
// account?l=L&a=A
// directory?l=L&dir_root=H&i=I
// directory?l=L&o=A&i=I // owner directory
// offer?l=L&offer=H
// offer?l=L&account=A&i=I
// ripple_state=l=L&a=A&b=A&c=C
// account_lines?l=L&a=A
//
// A=address
// C=currency 3 letter code
// H=hash
// I=index
// L=current | closed | validated | index | hash
//

var async = require("async");
var extend = require("extend");
var http = require("http");
var url = require("url");

var Remote = require("ripple-lib").Remote;

var program = process.argv[1];

var httpd_response = function (res, opts) {
  var self = this;

  res.statusCode = opts.statusCode;
  res.end(
    "<HTML>"
    + "<HEAD><TITLE>Title</TITLE></HEAD>"
    + "<BODY BACKGROUND=\"#FFFFFF\">"
    + "State:" + self.state
    + "<UL>"
    + "<LI><A HREF=\"/\">home</A>"
    + "<LI>" + html_link('r4EM4gBQfr1QgQLXSPF4r7h84qE9mb6iCC')
    // + "<LI><A HREF=\""+test+"\">rHb9CJAWyB4rj91VRWn96DkukG4bwdtyTh</A>"
    + "<LI><A HREF=\"/ledger\">ledger</A>"
    + "</UL>"
    + (opts.body || '')
    + '<HR><PRE>'
    + (opts.url || '')
    + '</PRE>'
    + "</BODY>"
    + "</HTML>"
  );
};

var html_link = function (generic) {
  return '<A HREF="' + build_uri({ type: 'account', account: generic }) + '">' + generic + '</A>';
};

// Build a link to a type.
var build_uri = function (params, opts) {
  var c;

  if (params.type === 'account') {
    c = {
      pathname: 'account',
      query: {
        l: params.ledger,
        a: params.account,
      },
    };
  } else if (params.type === 'ledger') {
    c = {
      pathname: 'ledger',
      query: {
        l: params.ledger,
      },
    };
  } else if (params.type === 'transaction') {
    c = {
      pathname: 'transaction',
      query: {
        h: params.hash,
      },
    };
  } else {
    c = {};
  }

  opts = opts || {};

  c.protocol = "http";
  c.hostname = opts.hostname || self.base.hostname;
  c.port = opts.port || self.base.port;

  return url.format(c);
};

var build_link = function (item, link) {
  console.log(link);
  return "<A HREF=" + link + ">" + item + "</A>";
};

var rewrite_field = function (type, obj, field, opts) {
  if (field in obj) {
    obj[field] = rewrite_type(type, obj[field], opts);
  }
};

var rewrite_type = function (type, obj, opts) {
  if ('amount' === type) {
    if ('string' === typeof obj) {
      // XRP.
      return '<B>' + obj + '</B>';
    } else {
      rewrite_field('address', obj, 'issuer', opts);

      return obj;
    }
    return build_link(
      obj,
      build_uri({
        type: 'account',
        account: obj
      }, opts)
    );
  }
  if ('address' === type) {
    return build_link(
      obj,
      build_uri({
        type: 'account',
        account: obj
      }, opts)
    );
  }
  else if ('ledger' === type) {
    return build_link(
      obj,
      build_uri({
        type: 'ledger',
        ledger: obj,
      }, opts)
    );
  }
  else if ('node' === type) {
    // A node
    if ('PreviousTxnID' in obj)
      obj.PreviousTxnID = rewrite_type('transaction', obj.PreviousTxnID, opts);

    if ('Offer' === obj.LedgerEntryType) {
      if ('NewFields' in obj) {
        if ('TakerGets' in obj.NewFields)
          obj.NewFields.TakerGets = rewrite_type('amount', obj.NewFields.TakerGets, opts);

        if ('TakerPays' in obj.NewFields)
          obj.NewFields.TakerPays = rewrite_type('amount', obj.NewFields.TakerPays, opts);
      }
    }

    obj.LedgerEntryType = '<B>' + obj.LedgerEntryType + '</B>';

    return obj;
  }
  else if ('transaction' === type) {
    // Reference to a transaction.
    return build_link(
      obj,
      build_uri({
        type: 'transaction',
        hash: obj
      }, opts)
|
||||
);
|
||||
}
|
||||
|
||||
return 'ERROR: ' + type;
|
||||
};
|
||||
|
||||
var rewrite_object = function (obj, opts) {
|
||||
var out = extend({}, obj);
|
||||
|
||||
rewrite_field('address', out, 'Account', opts);
|
||||
|
||||
rewrite_field('ledger', out, 'parent_hash', opts);
|
||||
rewrite_field('ledger', out, 'ledger_index', opts);
|
||||
rewrite_field('ledger', out, 'ledger_current_index', opts);
|
||||
rewrite_field('ledger', out, 'ledger_hash', opts);
|
||||
|
||||
if ('ledger' in obj) {
|
||||
// It's a ledger header.
|
||||
out.ledger = rewrite_object(out.ledger, opts);
|
||||
|
||||
if ('ledger_hash' in out.ledger)
|
||||
out.ledger.ledger_hash = '<B>' + out.ledger.ledger_hash + '</B>';
|
||||
|
||||
delete out.ledger.hash;
|
||||
delete out.ledger.totalCoins;
|
||||
}
|
||||
|
||||
if ('TransactionType' in obj) {
|
||||
// It's a transaction.
|
||||
out.TransactionType = '<B>' + obj.TransactionType + '</B>';
|
||||
|
||||
rewrite_field('amount', out, 'TakerGets', opts);
|
||||
rewrite_field('amount', out, 'TakerPays', opts);
|
||||
rewrite_field('ledger', out, 'inLedger', opts);
|
||||
|
||||
out.meta.AffectedNodes = out.meta.AffectedNodes.map(function (node) {
|
||||
var kind = 'CreatedNode' in node
|
||||
? 'CreatedNode'
|
||||
: 'ModifiedNode' in node
|
||||
? 'ModifiedNode'
|
||||
: 'DeletedNode' in node
|
||||
? 'DeletedNode'
|
||||
: undefined;
|
||||
|
||||
if (kind) {
|
||||
node[kind] = rewrite_type('node', node[kind], opts);
|
||||
}
|
||||
return node;
|
||||
});
|
||||
}
|
||||
else if ('node' in obj && 'LedgerEntryType' in obj.node) {
|
||||
// Its a ledger entry.
|
||||
|
||||
if (obj.node.LedgerEntryType === 'AccountRoot') {
|
||||
rewrite_field('address', out.node, 'Account', opts);
|
||||
rewrite_field('transaction', out.node, 'PreviousTxnID', opts);
|
||||
rewrite_field('ledger', out.node, 'PreviousTxnLgrSeq', opts);
|
||||
}
|
||||
|
||||
out.node.LedgerEntryType = '<B>' + out.node.LedgerEntryType + '</B>';
|
||||
}
|
||||
|
||||
return out;
|
||||
};
|
||||
|
||||
var augment_object = function (obj, opts, done) {
|
||||
if (obj.node.LedgerEntryType == 'AccountRoot') {
|
||||
var tx_hash = obj.node.PreviousTxnID;
|
||||
var tx_ledger = obj.node.PreviousTxnLgrSeq;
|
||||
|
||||
obj.history = [];
|
||||
|
||||
async.whilst(
|
||||
function () { return tx_hash; },
|
||||
function (callback) {
|
||||
// console.log("augment_object: request: %s %s", tx_hash, tx_ledger);
|
||||
opts.remote.request_tx(tx_hash)
|
||||
.on('success', function (m) {
|
||||
tx_hash = undefined;
|
||||
tx_ledger = undefined;
|
||||
|
||||
//console.log("augment_object: ", JSON.stringify(m));
|
||||
m.meta.AffectedNodes.filter(function(n) {
|
||||
// console.log("augment_object: ", JSON.stringify(n));
|
||||
// if (n.ModifiedNode)
|
||||
// console.log("augment_object: %s %s %s %s %s %s/%s", 'ModifiedNode' in n, n.ModifiedNode && (n.ModifiedNode.LedgerEntryType === 'AccountRoot'), n.ModifiedNode && n.ModifiedNode.FinalFields && (n.ModifiedNode.FinalFields.Account === obj.node.Account), Object.keys(n)[0], n.ModifiedNode && (n.ModifiedNode.LedgerEntryType), obj.node.Account, n.ModifiedNode && n.ModifiedNode.FinalFields && n.ModifiedNode.FinalFields.Account);
|
||||
// if ('ModifiedNode' in n && n.ModifiedNode.LedgerEntryType === 'AccountRoot')
|
||||
// {
|
||||
// console.log("***: ", JSON.stringify(m));
|
||||
// console.log("***: ", JSON.stringify(n));
|
||||
// }
|
||||
return 'ModifiedNode' in n
|
||||
&& n.ModifiedNode.LedgerEntryType === 'AccountRoot'
|
||||
&& n.ModifiedNode.FinalFields
|
||||
&& n.ModifiedNode.FinalFields.Account === obj.node.Account;
|
||||
})
|
||||
.forEach(function (n) {
|
||||
tx_hash = n.ModifiedNode.PreviousTxnID;
|
||||
tx_ledger = n.ModifiedNode.PreviousTxnLgrSeq;
|
||||
|
||||
obj.history.push({
|
||||
tx_hash: tx_hash,
|
||||
tx_ledger: tx_ledger
|
||||
});
|
||||
console.log("augment_object: next: %s %s", tx_hash, tx_ledger);
|
||||
});
|
||||
|
||||
callback();
|
||||
})
|
||||
.on('error', function (m) {
|
||||
callback(m);
|
||||
})
|
||||
.request();
|
||||
},
|
||||
function (err) {
|
||||
if (err) {
|
||||
done();
|
||||
}
|
||||
else {
|
||||
async.forEach(obj.history, function (o, callback) {
|
||||
opts.remote.request_account_info(obj.node.Account)
|
||||
.ledger_index(o.tx_ledger)
|
||||
.on('success', function (m) {
|
||||
//console.log("augment_object: ", JSON.stringify(m));
|
||||
o.Balance = m.account_data.Balance;
|
||||
// o.account_data = m.account_data;
|
||||
callback();
|
||||
})
|
||||
.on('error', function (m) {
|
||||
o.error = m;
|
||||
callback();
|
||||
})
|
||||
.request();
|
||||
},
|
||||
function (err) {
|
||||
done(err);
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
else {
|
||||
done();
|
||||
}
|
||||
};
|
||||
|
||||
if (process.argv.length < 4 || process.argv.length > 7) {
|
||||
console.log("Usage: %s ws_ip ws_port [<ip> [<port> [<start>]]]", program);
|
||||
}
|
||||
else {
|
||||
var ws_ip = process.argv[2];
|
||||
var ws_port = process.argv[3];
|
||||
var ip = process.argv.length > 4 ? process.argv[4] : "127.0.0.1";
|
||||
var port = process.argv.length > 5 ? process.argv[5] : "8080";
|
||||
|
||||
// console.log("START");
|
||||
var self = this;
|
||||
|
||||
var remote = (new Remote({
|
||||
websocket_ip: ws_ip,
|
||||
websocket_port: ws_port,
|
||||
trace: false
|
||||
}))
|
||||
.on('state', function (m) {
|
||||
console.log("STATE: %s", m);
|
||||
|
||||
self.state = m;
|
||||
})
|
||||
// .once('ledger_closed', callback)
|
||||
.connect()
|
||||
;
|
||||
|
||||
self.base = {
|
||||
hostname: ip,
|
||||
port: port,
|
||||
remote: remote,
|
||||
};
|
||||
|
||||
// console.log("SERVE");
|
||||
var server = http.createServer(function (req, res) {
|
||||
var input = "";
|
||||
|
||||
req.setEncoding();
|
||||
|
||||
req.on('data', function (buffer) {
|
||||
// console.log("DATA: %s", buffer);
|
||||
input = input + buffer;
|
||||
});
|
||||
|
||||
req.on('end', function () {
|
||||
// console.log("URL: %s", req.url);
|
||||
// console.log("HEADERS: %s", JSON.stringify(req.headers, undefined, 2));
|
||||
|
||||
var _parsed = url.parse(req.url, true);
|
||||
var _url = JSON.stringify(_parsed, undefined, 2);
|
||||
|
||||
// console.log("HEADERS: %s", JSON.stringify(_parsed, undefined, 2));
|
||||
if (_parsed.pathname === "/account") {
|
||||
var request = remote
|
||||
.request_ledger_entry('account_root')
|
||||
.ledger_index(-1)
|
||||
.account_root(_parsed.query.a)
|
||||
.on('success', function (m) {
|
||||
// console.log("account_root: %s", JSON.stringify(m, undefined, 2));
|
||||
|
||||
augment_object(m, self.base, function() {
|
||||
httpd_response(res,
|
||||
{
|
||||
statusCode: 200,
|
||||
url: _url,
|
||||
body: "<PRE>"
|
||||
+ JSON.stringify(rewrite_object(m, self.base), undefined, 2)
|
||||
+ "</PRE>"
|
||||
});
|
||||
});
|
||||
})
|
||||
.request();
|
||||
|
||||
} else if (_parsed.pathname === "/ledger") {
|
||||
var request = remote
|
||||
.request_ledger(undefined, { expand: true, transactions: true })
|
||||
.on('success', function (m) {
|
||||
// console.log("Ledger: %s", JSON.stringify(m, undefined, 2));
|
||||
|
||||
httpd_response(res,
|
||||
{
|
||||
statusCode: 200,
|
||||
url: _url,
|
||||
body: "<PRE>"
|
||||
+ JSON.stringify(rewrite_object(m, self.base), undefined, 2)
|
||||
+"</PRE>"
|
||||
});
|
||||
})
|
||||
|
||||
if (_parsed.query.l && _parsed.query.l.length === 64) {
|
||||
request.ledger_hash(_parsed.query.l);
|
||||
}
|
||||
else if (_parsed.query.l) {
|
||||
request.ledger_index(Number(_parsed.query.l));
|
||||
}
|
||||
else {
|
||||
request.ledger_index(-1);
|
||||
}
|
||||
|
||||
request.request();
|
||||
|
||||
} else if (_parsed.pathname === "/transaction") {
|
||||
var request = remote
|
||||
.request_tx(_parsed.query.h)
|
||||
// .request_transaction_entry(_parsed.query.h)
|
||||
// .ledger_select(_parsed.query.l)
|
||||
.on('success', function (m) {
|
||||
// console.log("transaction: %s", JSON.stringify(m, undefined, 2));
|
||||
|
||||
httpd_response(res,
|
||||
{
|
||||
statusCode: 200,
|
||||
url: _url,
|
||||
body: "<PRE>"
|
||||
+ JSON.stringify(rewrite_object(m, self.base), undefined, 2)
|
||||
+"</PRE>"
|
||||
});
|
||||
})
|
||||
.on('error', function (m) {
|
||||
httpd_response(res,
|
||||
{
|
||||
statusCode: 200,
|
||||
url: _url,
|
||||
body: "<PRE>"
|
||||
+ 'ERROR: ' + JSON.stringify(m, undefined, 2)
|
||||
+"</PRE>"
|
||||
});
|
||||
})
|
||||
.request();
|
||||
|
||||
} else {
|
||||
var test = build_uri({
|
||||
type: 'account',
|
||||
ledger: 'closed',
|
||||
account: 'rHb9CJAWyB4rj91VRWn96DkukG4bwdtyTh',
|
||||
}, self.base);
|
||||
|
||||
httpd_response(res,
|
||||
{
|
||||
statusCode: req.url === "/" ? 200 : 404,
|
||||
url: _url,
|
||||
});
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
server.listen(port, ip, undefined,
|
||||
function () {
|
||||
console.log("Listening at: http://%s:%s", ip, port);
|
||||
});
|
||||
}
|
||||
|
||||
// vim:sw=2:sts=2:ts=8:et
|
||||
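For orientation, a hedged sketch of how the routes documented at the top of this deleted script could be exercised; the host and port are the script's own defaults, while the query values are placeholders:

```bash
# Assumes `node bin/browser.js <ws_ip> <ws_port>` is serving on 127.0.0.1:8080.
curl 'http://127.0.0.1:8080/ledger'             # latest validated ledger
curl 'http://127.0.0.1:8080/ledger?l=closed'    # last closed ledger
curl 'http://127.0.0.1:8080/account?l=validated&a=rHb9CJAWyB4rj91VRWn96DkukG4bwdtyTh'
curl 'http://127.0.0.1:8080/transaction?h=<64-character-tx-hash>'
```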
@@ -1,64 +0,0 @@

var ripple = require('ripple-lib');

var v = {
  seed: "snoPBrXtMeMyMHUVTgbuqAfg1SUTb",
  addr: "rHb9CJAWyB4rj91VRWn96DkukG4bwdtyTh"
};

var remote = ripple.Remote.from_config({
  "trusted" : true,
  "websocket_ip" : "127.0.0.1",
  "websocket_port" : 5006,
  "websocket_ssl" : false,
  "local_signing" : true
});

var tx_json = {
  "Account" : v.addr,
  "Amount" : "10000000",
  "Destination" : "rEu2ULPiEQm1BAL8pYzmXnNX1aFX9sCks",
  "Fee" : "10",
  "Flags" : 0,
  "Sequence" : 3,
  "TransactionType" : "Payment"

  // "SigningPubKey": '0396941B22791A448E5877A44CE98434DB217D6FB97D63F0DAD23BE49ED45173C9'
};

remote.on('connected', function () {
  var req = remote.request_sign(v.seed, tx_json);
  req.message.debug_signing = true;
  req.on('success', function (result) {
    console.log("SERVER RESULT");
    console.log(result);

    var sim = {};
    var tx = remote.transaction();
    tx.tx_json = tx_json;
    tx._secret = v.seed;
    tx.complete();
    var unsigned = tx.serialize().to_hex();
    tx.sign();

    sim.tx_blob = tx.serialize().to_hex();
    sim.tx_json = tx.tx_json;
    sim.tx_signing_hash = tx.signing_hash().to_hex();
    sim.tx_unsigned = unsigned;

    console.log("\nLOCAL RESULT");
    console.log(sim);

    remote.connect(false);
  });
  req.on('error', function (err) {
    if (err.error === "remoteError" && err.remote.error === "srcActNotFound") {
      console.log("Please fund account " + v.addr + " to run this test.");
    } else {
      console.log('error', err);
    }
    remote.connect(false);
  });
  req.request();

});
remote.connect();
@@ -1,18 +0,0 @@

#!/usr/bin/node
//
// Returns a Gravatar style hash as per: http://en.gravatar.com/site/implement/hash/
//

if (3 != process.argv.length) {
  process.stderr.write("Usage: " + process.argv[1] + " email_address\n\nReturns gravatar style hash.\n");
  process.exit(1);

} else {
  var md5 = require('crypto').createHash('md5');

  md5.update(process.argv[2].trim().toLowerCase());

  process.stdout.write(md5.digest('hex') + "\n");
}

// vim:sw=2:sts=2:ts=8:et
@@ -1,31 +0,0 @@

#!/usr/bin/node
//
// This program allows IE 9 ripple-clients to make websocket connections to
// rippled using flash. As IE 9 does not have websocket support, this is
// required if you wish to support IE 9 ripple-clients.
//
// http://www.lightsphere.com/dev/articles/flash_socket_policy.html
//
// For better security, be sure to set the Port below to the port of your
// [websocket_public_port].
//

var net = require("net"),
    port = "*",
    domains = ["*:" + port]; // Domain:Port

net.createServer(
  function (socket) {
    socket.write("<?xml version='1.0' ?>\n");
    socket.write("<!DOCTYPE cross-domain-policy SYSTEM 'http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd'>\n");
    socket.write("<cross-domain-policy>\n");
    domains.forEach(
      function (domain) {
        var parts = domain.split(':');
        socket.write("\t<allow-access-from domain='" + parts[0] + "' to-ports='" + parts[1] + "' />\n");
      }
    );
    socket.write("</cross-domain-policy>\n");
    socket.end();
  }
).listen(843);
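A quick smoke test, as a sketch: a Flash policy request is the literal tag below followed by a NUL byte, and `nc` option details vary by variant:

```bash
printf '<policy-file-request/>\0' | nc -w 2 127.0.0.1 843
```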
@@ -1,150 +0,0 @@

#!/usr/bin/env bash

# This script generates information about your rippled installation
# and system. It can be used to help debug issues that you may face
# in your installation. While this script endeavors to not display any
# sensitive information, it is recommended that you read the output
# before sharing with any third parties.

rippled_exe=/opt/ripple/bin/rippled
conf_file=/etc/opt/ripple/rippled.cfg

while getopts ":e:c:" opt; do
    case $opt in
        e)
            rippled_exe=${OPTARG}
            ;;
        c)
            conf_file=${OPTARG}
            ;;
        \?)
            echo "Invalid option: -$OPTARG"
            exit -1
    esac
done

tmp_loc=$(mktemp -d --tmpdir ripple_info.XXXXX)
chmod 751 ${tmp_loc}
awk_prog=${tmp_loc}/cfg.awk
summary_out=${tmp_loc}/rippled_info.md
printf "# rippled report info\n\n> generated at %s\n" "$(date -R)" > ${summary_out}

function log_section {
    printf "\n## %s\n" "$*" >> ${summary_out}

    while read -r l; do
        echo " $l" >> ${summary_out}
    done </dev/stdin
}

function join_by {
    local IFS="$1"; shift; echo "$*";
}

if [[ -f ${conf_file} ]] ; then
    exclude=( ips ips_fixed node_seed validation_seed validator_token )
    cleaned_conf=${tmp_loc}/cleaned_rippled_cfg.txt
    cat << 'EOP' >> ${awk_prog}
BEGIN {FS="[[:space:]]*=[[:space:]]*"; skip=0; db_path=""; print > OUT_FILE; split(exl,exa,"|")}
/^#/ {next}
save==2 && /^[[:space:]]*$/ {next}
/^\[.+\]$/ {
    section=tolower(gensub(/^\[[[:space:]]*([a-zA-Z_]+)[[:space:]]*\]$/, "\\1", "g"))
    skip = 0
    for (i in exa) {
        if (section == exa[i])
            skip = 1
    }
    if (section == "database_path")
        save = 1
}
skip==1 {next}
save==2 {save=0; db_path=$0}
save==1 {save=2}
$1 ~ /password/ {$0=$1"=<redacted>"}
{print >> OUT_FILE}
END {print db_path}
EOP

    db=$(\
        sed -r -e 's/\<s[[:alnum:]]{28}\>/<redactedsecret>/g;s/^[[:space:]]*//;s/[[:space:]]*$//' ${conf_file} |\
        awk -v OUT_FILE=${cleaned_conf} -v exl="$(join_by '|' "${exclude[@]}")" -f ${awk_prog})
    rm ${awk_prog}
    cat ${cleaned_conf} | log_section "cleaned config file"
    rm ${cleaned_conf}
    echo "${db}" | log_section "database path"
    df ${db} | log_section "df: database"
fi

# Send output from this script to a log file
## this captures any messages
## or errors from the script itself

log_file=${tmp_loc}/get_info.log
exec 3>&1 1>>${log_file} 2>&1

## Send all stdout files to /tmp

if [[ -x ${rippled_exe} ]] ; then
    pgrep rippled && \
        ${rippled_exe} --conf ${conf_file} \
        -- server_info | log_section "server info"
fi

cat /proc/meminfo | log_section "meminfo"
cat /proc/swaps | log_section "swap space"
ulimit -a | log_section "ulimit"

if command -v lshw >/dev/null 2>&1 ; then
    lshw 2>/dev/null | log_section "hardware info"
else
    lscpu > ${tmp_loc}/hw_info.txt
    hwinfo >> ${tmp_loc}/hw_info.txt
    lspci >> ${tmp_loc}/hw_info.txt
    lsblk >> ${tmp_loc}/hw_info.txt
    cat ${tmp_loc}/hw_info.txt | log_section "hardware info"
    rm ${tmp_loc}/hw_info.txt
fi

if command -v iostat >/dev/null 2>&1 ; then
    iostat -t -d -x 2 6 | log_section "iostat"
fi

df -h | log_section "free disk space"
drives=($(df | awk '$1 ~ /^\/dev\// {print $1}' | xargs -n 1 basename))
block_devs=($(ls /sys/block/))
for d in "${drives[@]}"; do
    for dev in "${block_devs[@]}"; do
        # echo "D: [$d], DEV: [$dev]"
        if [[ $d =~ $dev ]]; then
            # this file (if it exists) has 0 for SSD and 1 for HDD
            if [[ "$(cat /sys/block/${dev}/queue/rotational 2>/dev/null)" == 0 ]] ; then
                echo "${d} : SSD" >> ${tmp_loc}/is_ssd.txt
            else
                echo "${d} : NO SSD" >> ${tmp_loc}/is_ssd.txt
            fi
        fi
    done
done

if [[ -f ${tmp_loc}/is_ssd.txt ]] ; then
    cat ${tmp_loc}/is_ssd.txt | log_section "SSD"
    rm ${tmp_loc}/is_ssd.txt
fi

cat ${log_file} | log_section "script log"

cat << MSG | tee /dev/fd/3
####################################################
rippled info has been gathered. Please copy the
contents of ${summary_out}
to a github gist at https://gist.github.com/

PLEASE REVIEW THIS FILE FOR ANY SENSITIVE DATA
BEFORE POSTING! We have tried our best to omit
any sensitive information from this file, but you
should verify before posting.
####################################################
MSG
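A usage sketch; the script's filename was elided from this diff, so `ripple_info.sh` below is a hypothetical name:

```bash
# Both flags are optional and default to the stock install paths.
bash ripple_info.sh -e /opt/ripple/bin/rippled -c /etc/opt/ripple/rippled.cfg
# Review the generated Markdown summary for sensitive data before sharing it.
```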
85 bin/git/setup-upstreams.sh Executable file

@@ -0,0 +1,85 @@
#!/bin/bash

if [[ $# -ne 1 || "$1" == "--help" || "$1" == "-h" ]]
then
  name=$( basename $0 )
  cat <<- USAGE
Usage: $name <username>

Where <username> is the Github username of the upstream repo. e.g. XRPLF
USAGE
  exit 0
fi

# Assumed helper: the calls below rely on a _run function that this hunk does
# not show; a minimal definition that echoes each command before running it.
_run() {
  echo "\$ $*" >&2
  "$@"
}

# Create upstream remotes based on origin
user="$1"
shift
# Get the origin URL. Expect it to be an SSH-style URL
origin=$( git remote get-url origin )
if [[ "${origin}" == "" ]]
then
  echo Invalid origin remote >&2
  exit 1
fi
# echo "Origin: ${origin}"
# Parse the origin
ifs_orig="${IFS}"
IFS=':' read remote originpath <<< "${origin}"
# echo "Remote: ${remote}, Originpath: ${originpath}"
IFS='@' read sshuser server <<< "${remote}"
# echo "SSHUser: ${sshuser}, Server: ${server}"
IFS='/' read originuser repo <<< "${originpath}"
# echo "Originuser: ${originuser}, Repo: ${repo}"
if [[ "${sshuser}" == "" || "${server}" == "" || "${originuser}" == ""
    || "${repo}" == "" ]]
then
  echo "Can't parse origin URL: ${origin}" >&2
  exit 1
fi
upstream="https://${server}/${user}/${repo}"
upstreampush="${remote}:${user}/${repo}"
upstreamgroup="upstream upstream-push"
current=$( git remote get-url upstream 2>/dev/null )
currentpush=$( git remote get-url upstream-push 2>/dev/null )
currentgroup=$( git config remotes.upstreams )
if [[ "${current}" == "${upstream}" ]]
then
  echo "Upstream already set up correctly. Skip"
elif [[ -n "${current}" && "${current}" != "${upstream}" &&
    "${current}" != "${upstreampush}" ]]
then
  echo "Upstream already set up as: ${current}. Skip"
else
  if [[ "${current}" == "${upstreampush}" ]]
  then
    echo "Upstream set to dangerous push URL. Update."
    _run git remote rename upstream upstream-push || \
      _run git remote remove upstream
    currentpush=$( git remote get-url upstream-push 2>/dev/null )
  fi
  _run git remote add upstream "${upstream}"
fi

if [[ "${currentpush}" == "${upstreampush}" ]]
then
  echo "upstream-push already set up correctly. Skip"
elif [[ -n "${currentpush}" && "${currentpush}" != "${upstreampush}" ]]
then
  echo "upstream-push already set up as: ${currentpush}. Skip"
else
  _run git remote add upstream-push "${upstreampush}"
fi

if [[ "${currentgroup}" == "${upstreamgroup}" ]]
then
  echo "Upstreams group already set up correctly. Skip"
elif [[ -n "${currentgroup}" && "${currentgroup}" != "${upstreamgroup}" ]]
then
  echo "Upstreams group already set up as: ${currentgroup}. Skip"
else
  _run git config --add remotes.upstreams "${upstreamgroup}"
fi

_run git fetch --jobs=$(nproc) upstreams

exit 0
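A brief usage sketch; the expected remotes are inferred from the script's logic, not from captured output:

```bash
bin/git/setup-upstreams.sh XRPLF
git remote -v   # should now list 'upstream' (fetch) and 'upstream-push' (push)
```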
68 bin/git/squash-branches.sh Executable file

@@ -0,0 +1,68 @@
#!/bin/bash

if [[ $# -lt 3 || "$1" == "--help" || "$1" == "-h" ]]
then
  name=$( basename $0 )
  cat <<- USAGE
Usage: $name workbranch base/branch user/branch [user/branch [...]]

* workbranch will be created locally from base/branch
* base/branch and user/branch may be specified as user:branch to allow
  easy copying from Github PRs
* Remotes for each user must already be set up
USAGE
  exit 0
fi

work="$1"
shift

branches=( $( echo "${@}" | sed "s/:/\//" ) )
base="${branches[0]}"
unset branches[0]

set -e

users=()
for b in "${branches[@]}"
do
  users+=( $( echo $b | cut -d/ -f1 ) )
done

users=( $( printf '%s\n' "${users[@]}" | sort -u ) )

git fetch --multiple upstreams "${users[@]}"
git checkout -B "$work" --no-track "$base"

for b in "${branches[@]}"
do
  git merge --squash "${b}"
  git commit -S # Use the commit message provided on the PR
done

# Make sure the commits look right
git log --show-signature "$base..HEAD"

parts=( $( echo $base | sed "s/\// /" ) )
repo="${parts[0]}"
b="${parts[1]}"
push=$repo
if [[ "$push" == "upstream" ]]
then
  push="upstream-push"
fi
if [[ "$repo" == "upstream" ]]
then
  repo="upstreams"
fi
cat << PUSH

-------------------------------------------------------------------
This script will not push. Verify everything is correct, then push
to your repo, and create a PR if necessary. Once the PR is approved,
run:

git push $push HEAD:$b
git fetch $repo
-------------------------------------------------------------------
PUSH
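A usage sketch with hypothetical branch and user names:

```bash
# Squash two contributors' branches onto the upstream base in a new work branch.
bin/git/squash-branches.sh my-work XRPLF:develop alice:feature-a bob:feature-b
```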
58 bin/git/update-version.sh Executable file

@@ -0,0 +1,58 @@
#!/bin/bash

if [[ $# -ne 3 || "$1" == "--help" || "$1" == "-h" ]]
then
  name=$( basename $0 )
  cat <<- USAGE
Usage: $name workbranch base/branch version

* workbranch will be created locally from base/branch. If it exists,
  it will be reused, so make sure you don't overwrite any work.
* base/branch may be specified as user:branch to allow easy copying
  from Github PRs.
USAGE
  exit 0
fi

work="$1"
shift

base=$( echo "$1" | sed "s/:/\//" )
shift

version=$1
shift

set -e

git fetch upstreams

git checkout -B "${work}" --no-track "${base}"

push=$( git rev-parse --abbrev-ref --symbolic-full-name '@{push}' \
  2>/dev/null ) || true
if [[ "${push}" != "" ]]
then
  echo "Warning: ${push} may already exist."
fi

build=$( find -name BuildInfo.cpp )
sed 's/\(^.*versionString =\).*$/\1 "'${version}'"/' ${build} > version.cpp && \
  diff "${build}" version.cpp && exit 1 || \
  mv -vi version.cpp ${build}

git diff

git add ${build}

git commit -S -m "Set version to ${version}"

git log --oneline --first-parent ${base}^..

cat << PUSH

-------------------------------------------------------------------
This script will not push. Verify everything is correct, then push
to your repo, and create a PR as described in CONTRIBUTING.md.
-------------------------------------------------------------------
PUSH
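A usage sketch with a hypothetical work branch and version:

```bash
bin/git/update-version.sh version-2.3.0 XRPLF:release 2.3.0
```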
@@ -1,23 +0,0 @@

#!/usr/bin/node
//
// Returns hex of lowercasing a string.
//

var stringToHex = function (s) {
  return Array.prototype.map.call(s, function (c) {
    var b = c.charCodeAt(0);

    return b < 16 ? "0" + b.toString(16) : b.toString(16);
  }).join("");
};

if (3 != process.argv.length) {
  process.stderr.write("Usage: " + process.argv[1] + " string\n\nReturns hex of lowercasing string.\n");
  process.exit(1);

} else {
  process.stdout.write(stringToHex(process.argv[2].toLowerCase()) + "\n");
}

// vim:sw=2:sts=2:ts=8:et
@@ -1,42 +0,0 @@

#!/usr/bin/node
//
// This is a tool to issue JSON-RPC requests from the command line.
//
// This can be used to test a JSON-RPC server.
//
// Requires: npm simple-jsonrpc
//

var jsonrpc = require('simple-jsonrpc');

var program = process.argv[1];

if (5 !== process.argv.length) {
  console.log("Usage: %s <URL> <method> <json>", program);
}
else {
  var url = process.argv[2];
  var method = process.argv[3];
  var json_raw = process.argv[4];
  var json;

  try {
    json = JSON.parse(json_raw);
  }
  catch (e) {
    console.log("JSON parse error: %s", e.message);
    throw e;
  }

  var client = jsonrpc.client(url);

  client.call(method, json,
    function (result) {
      console.log(JSON.stringify(result, undefined, 2));
    },
    function (error) {
      console.log(JSON.stringify(error, undefined, 2));
    });
}

// vim:sw=2:sts=2:ts=8:et
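A usage sketch; the script's filename was elided from this diff, so `jsonrpc.js` is a hypothetical name, and 5005 is rippled's conventional local admin HTTP port:

```bash
node jsonrpc.js http://127.0.0.1:5005 server_info '{}'
```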
@@ -1,68 +0,0 @@

#!/usr/bin/node
//
// This is a tool to listen for JSON-RPC requests at an IP and port.
//
// This will report the request to console and echo back the request as the response.
//

var http = require("http");

var program = process.argv[1];

if (4 !== process.argv.length) {
  console.log("Usage: %s <ip> <port>", program);
}
else {
  var ip = process.argv[2];
  var port = process.argv[3];

  var server = http.createServer(function (req, res) {
    console.log("CONNECT");
    var input = "";

    req.setEncoding();

    req.on('data', function (buffer) {
      // console.log("DATA: %s", buffer);
      input = input + buffer;
    });

    req.on('end', function () {
      // console.log("END");

      var json_req;

      console.log("URL: %s", req.url);
      console.log("HEADERS: %s", JSON.stringify(req.headers, undefined, 2));

      try {
        json_req = JSON.parse(input);

        console.log("REQ: %s", JSON.stringify(json_req, undefined, 2));
      }
      catch (e) {
        console.log("BAD JSON: %s", e.message);

        json_req = { error: e.message };
      }

      res.statusCode = 200;
      res.end(JSON.stringify({
        jsonrpc: "2.0",
        result: { request: json_req },
        id: req.id
      }));
    });

    req.on('close', function () {
      console.log("CLOSE");
    });
  });

  server.listen(port, ip, undefined,
    function () {
      console.log("Listening at: %s:%s", ip, port);
    });
}

// vim:sw=2:sts=2:ts=8:et
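A usage sketch pairing the listener with a request from another shell; the filename `jsonrpc-listen.js` is hypothetical:

```bash
node jsonrpc-listen.js 127.0.0.1 8080 &
curl -s -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"ping","id":1}' http://127.0.0.1:8080
```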
218 bin/physical.sh

@@ -1,218 +0,0 @@
#!/bin/bash

set -o errexit

marker_base=985c80fbc6131f3a8cedd0da7e8af98dfceb13c7
marker_commit=${1:-${marker_base}}

if [ $(git merge-base ${marker_commit} ${marker_base}) != ${marker_base} ]; then
  echo "first marker commit not an ancestor: ${marker_commit}"
  exit 1
fi

if [ $(git merge-base ${marker_commit} HEAD) != $(git rev-parse --verify ${marker_commit}) ]; then
  echo "given marker commit not an ancestor: ${marker_commit}"
  exit 1
fi

if [ -e Builds/CMake ]; then
  echo move CMake
  git mv Builds/CMake cmake
  git add --update .
  git commit -m 'Move CMake directory' --author 'Pretty Printer <cpp@ripple.com>'
fi

if [ -e src/ripple ]; then

  echo move protocol buffers
  mkdir -p include/xrpl
  if [ -e src/ripple/proto ]; then
    git mv src/ripple/proto include/xrpl
  fi

  extract_list() {
    git show ${marker_commit}:Builds/CMake/RippledCore.cmake | \
      awk "/END ${1}/ { p = 0 } p && /src\/ripple/; /BEGIN ${1}/ { p = 1 }" | \
      sed -e 's#src/ripple/##' -e 's#[^a-z]\+$##'
  }

  move_files() {
    oldroot="$1"; shift
    newroot="$1"; shift
    detail="$1"; shift
    files=("$@")
    for file in ${files[@]}; do
      if [ ! -e ${oldroot}/${file} ]; then
        continue
      fi
      dir=$(dirname ${file})
      if [ $(basename ${dir}) == 'details' ]; then
        dir=$(dirname ${dir})
      fi
      if [ $(basename ${dir}) == 'impl' ]; then
        dir="$(dirname ${dir})/${detail}"
      fi
      mkdir -p ${newroot}/${dir}
      git mv ${oldroot}/${file} ${newroot}/${dir}
    done
  }

  echo move libxrpl headers
  files=$(extract_list 'LIBXRPL HEADERS')
  files+=(
    basics/SlabAllocator.h

    beast/asio/io_latency_probe.h
    beast/container/aged_container.h
    beast/container/aged_container_utility.h
    beast/container/aged_map.h
    beast/container/aged_multimap.h
    beast/container/aged_multiset.h
    beast/container/aged_set.h
    beast/container/aged_unordered_map.h
    beast/container/aged_unordered_multimap.h
    beast/container/aged_unordered_multiset.h
    beast/container/aged_unordered_set.h
    beast/container/detail/aged_associative_container.h
    beast/container/detail/aged_container_iterator.h
    beast/container/detail/aged_ordered_container.h
    beast/container/detail/aged_unordered_container.h
    beast/container/detail/empty_base_optimization.h
    beast/core/LockFreeStack.h
    beast/insight/Collector.h
    beast/insight/Counter.h
    beast/insight/CounterImpl.h
    beast/insight/Event.h
    beast/insight/EventImpl.h
    beast/insight/Gauge.h
    beast/insight/GaugeImpl.h
    beast/insight/Group.h
    beast/insight/Groups.h
    beast/insight/Hook.h
    beast/insight/HookImpl.h
    beast/insight/Insight.h
    beast/insight/Meter.h
    beast/insight/MeterImpl.h
    beast/insight/NullCollector.h
    beast/insight/StatsDCollector.h
    beast/test/fail_counter.h
    beast/test/fail_stream.h
    beast/test/pipe_stream.h
    beast/test/sig_wait.h
    beast/test/string_iostream.h
    beast/test/string_istream.h
    beast/test/string_ostream.h
    beast/test/test_allocator.h
    beast/test/yield_to.h
    beast/utility/hash_pair.h
    beast/utility/maybe_const.h
    beast/utility/temp_dir.h

    # included by only json/impl/json_assert.h
    json/json_errors.h

    protocol/PayChan.h
    protocol/RippleLedgerHash.h
    protocol/messages.h
    protocol/st.h
  )
  files+=(
    basics/README.md
    crypto/README.md
    json/README.md
    protocol/README.md
    resource/README.md
  )
  move_files src/ripple include/xrpl detail ${files[@]}

  echo move libxrpl sources
  files=$(extract_list 'LIBXRPL SOURCES')
  move_files src/ripple src/libxrpl "" ${files[@]}

  echo check leftovers
  dirs=$(cd include/xrpl; ls -d */)
  dirs=$(cd src/ripple; ls -d ${dirs} 2>/dev/null || true)
  files="$(cd src/ripple; find ${dirs} -type f)"
  if [ -n "${files}" ]; then
    echo "leftover files:"
    echo ${files}
    exit
  fi

  echo remove empty directories
  empty_dirs="$(cd src/ripple; find ${dirs} -depth -type d)"
  for dir in ${empty_dirs[@]}; do
    if [ -e ${dir} ]; then
      rmdir ${dir}
    fi
  done

  echo move xrpld sources
  files=$(
    extract_list 'XRPLD SOURCES'
    cd src/ripple
    find * -regex '.*\.\(h\|ipp\|md\|pu\|uml\|png\)'
  )
  move_files src/ripple src/xrpld detail ${files[@]}

  files="$(cd src/ripple; find . -type f)"
  if [ -n "${files}" ]; then
    echo "leftover files:"
    echo ${files}
    exit
  fi

fi

rm -rf src/ripple

echo rename .hpp to .h
find include src -name '*.hpp' -exec bash -c 'f="{}"; git mv "${f}" "${f%hpp}h"' \;

echo move PerfLog.h
if [ -e include/xrpl/basics/PerfLog.h ]; then
  git mv include/xrpl/basics/PerfLog.h src/xrpld/perflog
fi

# Make sure all protobuf includes have the correct prefix.
protobuf_replace='s:^#include\s*["<].*org/xrpl\([^">]\+\)[">]:#include <xrpl/proto/org/xrpl\1>:'
# Make sure first-party includes use angle brackets and .h extension.
ripple_replace='s:include\s*["<]ripple/\(.*\)\.h\(pp\)\?[">]:include <ripple/\1.h>:'
beast_replace='s:include\s*<beast/:include <xrpl/beast/:'
# Rename impl directories to detail.
impl_rename='s:\(<xrpl.*\)/impl\(/details\)\?/:\1/detail/:'

echo rewrite includes in libxrpl
find include/xrpl src/libxrpl -type f -exec sed -i \
  -e "${protobuf_replace}" \
  -e "${ripple_replace}" \
  -e "${beast_replace}" \
  -e 's:^#include <ripple/:#include <xrpl/:' \
  -e "${impl_rename}" \
  {} +

echo rewrite includes in xrpld
# https://www.baeldung.com/linux/join-multiple-lines
libxrpl_dirs="$(cd include/xrpl; ls -d1 */ | sed 's:/$::')"
# libxrpl_dirs='a\nb\nc\n'
readarray -t libxrpl_dirs <<< "${libxrpl_dirs}"
# libxrpl_dirs=(a b c)
libxrpl_dirs=$(printf -v txt '%s\\|' "${libxrpl_dirs[@]}"; echo "${txt%\\|}")
# libxrpl_dirs='a\|b\|c'
find src/xrpld src/test -type f -exec sed -i \
  -e "${protobuf_replace}" \
  -e "${ripple_replace}" \
  -e "${beast_replace}" \
  -e "s:^#include <ripple/basics/PerfLog.h>:#include <xrpld/perflog/PerfLog.h>:" \
  -e "s:^#include <ripple/\(${libxrpl_dirs}\)/:#include <xrpl/\1/:" \
  -e 's:^#include <ripple/:#include <xrpld/:' \
  -e "${impl_rename}" \
  {} +

git commit -m 'Rearrange sources' --author 'Pretty Printer <cpp@ripple.com>'
find include src -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -exec clang-format-10 -i {} +
git add --update .
git commit -m 'Rewrite includes' --author 'Pretty Printer <cpp@ripple.com>'
./Builds/levelization/levelization.sh
git add --update .
git commit -m 'Recompute loops' --author 'Pretty Printer <cpp@ripple.com>'
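A usage sketch; by default the script operates against the hard-coded marker commit, and any argument must be a descendant of it:

```bash
# Run from the repository root on a clean work tree.
bash bin/physical.sh 985c80fbc6131f3a8cedd0da7e8af98dfceb13c7
```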
252 bin/rlint.js

@@ -1,252 +0,0 @@
#!/usr/bin/node

var async = require('async');
var Remote = require('ripple-lib').Remote;
var Transaction = require('ripple-lib').Transaction;
var UInt160 = require('ripple-lib').UInt160;
var Amount = require('ripple-lib').Amount;

var book_key = function (book) {
  return book.taker_pays.currency
    + ":" + book.taker_pays.issuer
    + ":" + book.taker_gets.currency
    + ":" + book.taker_gets.issuer;
};

var book_key_cross = function (book) {
  return book.taker_gets.currency
    + ":" + book.taker_gets.issuer
    + ":" + book.taker_pays.currency
    + ":" + book.taker_pays.issuer;
};

var ledger_verify = function (ledger) {
  var dir_nodes = ledger.accountState.filter(function (entry) {
    return entry.LedgerEntryType === 'DirectoryNode' // Only directories
      && entry.index === entry.RootIndex             // Only root nodes
      && 'TakerGetsCurrency' in entry;               // Only offer directories
  });

  var books = {};

  dir_nodes.forEach(function (node) {
    var book = {
      taker_gets: {
        currency: UInt160.from_generic(node.TakerGetsCurrency).to_json(),
        issuer: UInt160.from_generic(node.TakerGetsIssuer).to_json()
      },
      taker_pays: {
        currency: UInt160.from_generic(node.TakerPaysCurrency).to_json(),
        issuer: UInt160.from_generic(node.TakerPaysIssuer).to_json()
      },
      quality: Amount.from_quality(node.RootIndex),
      index: node.RootIndex
    };

    books[book_key(book)] = book;

    // console.log(JSON.stringify(node, undefined, 2));
  });

  // console.log(JSON.stringify(dir_entry, undefined, 2));
  console.log("#%s books: %s", ledger.ledger_index, Object.keys(books).length);

  Object.keys(books).forEach(function (key) {
    var book = books[key];
    var key_cross = book_key_cross(book);
    var book_cross = books[key_cross];

    if (book && book_cross && !book_cross.done)
    {
      var book_cross_quality_inverted = Amount.from_json("1.0/1/1").divide(book_cross.quality);

      if (book_cross_quality_inverted.compareTo(book.quality) >= 0)
      {
        // Crossing books
        console.log("crossing: #%s :: %s :: %s :: %s :: %s :: %s :: %s", ledger.ledger_index, key, book.quality.to_text(), book_cross.quality.to_text(), book_cross_quality_inverted.to_text(),
          book.index, book_cross.index);
      }

      book_cross.done = true;
    }
  });

  var ripple_selfs = {};

  var accounts = {};
  var counts = {};

  ledger.accountState.forEach(function (entry) {
    if (entry.LedgerEntryType === 'Offer')
    {
      counts[entry.Account] = (counts[entry.Account] || 0) + 1;
    }
    else if (entry.LedgerEntryType === 'RippleState')
    {
      if (entry.Flags & (0x10000 | 0x40000))
      {
        counts[entry.LowLimit.issuer] = (counts[entry.LowLimit.issuer] || 0) + 1;
      }

      if (entry.Flags & (0x20000 | 0x80000))
      {
        counts[entry.HighLimit.issuer] = (counts[entry.HighLimit.issuer] || 0) + 1;
      }

      if (entry.HighLimit.issuer === entry.LowLimit.issuer)
        ripple_selfs[entry.Account] = entry;
    }
    else if (entry.LedgerEntryType == 'AccountRoot')
    {
      accounts[entry.Account] = entry;
    }
  });

  var low = 0;              // Accounts with too low a count.
  var high = 0;
  var missing_accounts = 0; // Objects with no referencing account.
  var missing_objects = 0;  // Accounts specifying an object but having none.

  Object.keys(counts).forEach(function (account) {
    if (account in accounts)
    {
      if (counts[account] !== accounts[account].OwnerCount)
      {
        if (counts[account] < accounts[account].OwnerCount)
        {
          high += 1;
          console.log("%s: high count %s/%s", account, counts[account], accounts[account].OwnerCount);
        }
        else
        {
          low += 1;
          console.log("%s: low count %s/%s", account, counts[account], accounts[account].OwnerCount);
        }
      }
    }
    else
    {
      missing_accounts += 1;

      console.log("%s: missing : count %s", account, counts[account]);
    }
  });

  Object.keys(accounts).forEach(function (account) {
    if (!('OwnerCount' in accounts[account]))
    {
      console.log("%s: bad entry : %s", account, JSON.stringify(accounts[account], undefined, 2));
    }
    else if (!(account in counts) && accounts[account].OwnerCount)
    {
      missing_objects += 1;

      console.log("%s: no objects : %s/%s", account, 0, accounts[account].OwnerCount);
    }
  });

  if (low)
    console.log("counts too low = %s", low);

  if (high)
    console.log("counts too high = %s", high);

  if (missing_objects)
    console.log("missing_objects = %s", missing_objects);

  if (missing_accounts)
    console.log("missing_accounts = %s", missing_accounts);

  if (Object.keys(ripple_selfs).length)
    console.log("RippleState selfs = %s", Object.keys(ripple_selfs).length);

};

var ledger_request = function (remote, ledger_index, done) {
  remote.request_ledger(undefined, {
      accounts: true,
      expand: true,
    })
    .ledger_index(ledger_index)
    .on('success', function (m) {
      // console.log("ledger: ", ledger_index);
      // console.log("ledger: ", JSON.stringify(m, undefined, 2));
      done(m.ledger);
    })
    .on('error', function (m) {
      console.log("error");
      done();
    })
    .request();
};

var usage = function () {
  console.log("rlint.js _websocket_ip_ _websocket_port_");
};

var finish = function (remote) {
  remote.disconnect();

  // XXX Because remote.disconnect() doesn't work:
  process.exit();
};

console.log("args: ", process.argv.length);
console.log("args: ", process.argv);

if (process.argv.length < 4) {
  usage();
}
else {
  var remote = Remote.from_config({
      websocket_ip: process.argv[2],
      websocket_port: process.argv[3],
    })
    .once('ledger_closed', function (m) {
      console.log("ledger_closed: ", JSON.stringify(m, undefined, 2));

      if (process.argv.length === 5) {
        var ledger_index = process.argv[4];

        ledger_request(remote, ledger_index, function (l) {
          if (l) {
            ledger_verify(l);
          }

          finish(remote);
        });

      } else if (process.argv.length === 6) {
        var ledger_start = Number(process.argv[4]);
        var ledger_end = Number(process.argv[5]);
        var ledger_cursor = ledger_end;

        async.whilst(
          function () {
            return ledger_start <= ledger_cursor && ledger_cursor <= ledger_end;
          },
          function (callback) {
            // console.log(ledger_cursor);

            ledger_request(remote, ledger_cursor, function (l) {
              if (l) {
                ledger_verify(l);
              }

              --ledger_cursor;

              callback();
            });
          },
          function (error) {
            finish(remote);
          });

      } else {
        finish(remote);
      }
    })
    .connect();
}

// vim:sw=2:sts=2:ts=8:et
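A usage sketch; the ledger indexes below are placeholders:

```bash
node bin/rlint.js 127.0.0.1 6006 80000000            # verify one ledger
node bin/rlint.js 127.0.0.1 6006 80000000 80000010   # verify a range
```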
@@ -1,51 +0,0 @@

#!/usr/bin/env bash
set -exu

: ${TRAVIS_BUILD_DIR:=""}
: ${VCPKG_DIR:=".vcpkg"}
export VCPKG_ROOT=${VCPKG_DIR}
: ${VCPKG_DEFAULT_TRIPLET:="x64-windows-static"}

export VCPKG_DEFAULT_TRIPLET

EXE="vcpkg"
if [[ -z ${COMSPEC:-} ]]; then
  EXE="${EXE}.exe"
fi

if [[ -d "${VCPKG_DIR}" && -x "${VCPKG_DIR}/${EXE}" && -d "${VCPKG_DIR}/installed" ]] ; then
  echo "Using cached vcpkg at ${VCPKG_DIR}"
  ${VCPKG_DIR}/${EXE} list
else
  if [[ -d "${VCPKG_DIR}" ]] ; then
    rm -rf "${VCPKG_DIR}"
  fi
  git clone --branch 2021.04.30 https://github.com/Microsoft/vcpkg.git ${VCPKG_DIR}
  pushd ${VCPKG_DIR}
  BSARGS=()
  if [[ "$(uname)" == "Darwin" ]] ; then
    BSARGS+=(--allowAppleClang)
  fi
  if [[ -z ${COMSPEC:-} ]]; then
    chmod +x ./bootstrap-vcpkg.sh
    time ./bootstrap-vcpkg.sh "${BSARGS[@]}"
  else
    time ./bootstrap-vcpkg.bat
  fi
  popd
fi

# TODO: bring boost in this way as well ?
# NOTE: can pin specific ports to a commit/version like this:
#       git checkout <SOME COMMIT HASH> ports/boost
if [ $# -eq 0 ]; then
  echo "No extra packages specified..."
  PKGS=()
else
  PKGS=( "$@" )
fi
for LIB in "${PKGS[@]}"; do
  time ${VCPKG_DIR}/${EXE} --clean-after-build install ${LIB}
done
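A usage sketch; the package names are examples, and `install_vcpkg.sh` is a hypothetical name since the script's path was elided from this diff:

```bash
# Bootstrap (or reuse) the cached vcpkg tree, then install extra ports.
bash install_vcpkg.sh openssl grpc
```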

@@ -1,40 +0,0 @@

# NOTE: must be sourced from a shell so it can export vars

cat << BATCH > ./getenv.bat
CALL %*
SET
BATCH

while read line ; do
  IFS='"' read x path arg <<<"${line}"
  if [ -f "${path}" ] ; then
    echo "FOUND: $path"
    export VCINSTALLDIR=$(./getenv.bat "${path}" ${arg} | grep "^VCINSTALLDIR=" | sed -E "s/^VCINSTALLDIR=//g")
    if [ "${VCINSTALLDIR}" != "" ] ; then
      echo "USING ${VCINSTALLDIR}"
      export LIB=$(./getenv.bat "${path}" ${arg} | grep "^LIB=" | sed -E "s/^LIB=//g")
      export LIBPATH=$(./getenv.bat "${path}" ${arg} | grep "^LIBPATH=" | sed -E "s/^LIBPATH=//g")
      export INCLUDE=$(./getenv.bat "${path}" ${arg} | grep "^INCLUDE=" | sed -E "s/^INCLUDE=//g")
      ADDPATH=$(./getenv.bat "${path}" ${arg} | grep "^PATH=" | sed -E "s/^PATH=//g")
      export PATH="${ADDPATH}:${PATH}"
      break
    fi
  fi
done <<EOL
"C:/Program Files (x86)/Microsoft Visual Studio/2019/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2017/BuildTools/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio/2017/Community/VC/Auxiliary/Build/vcvarsall.bat" x86_amd64
"C:/Program Files (x86)/Microsoft Visual Studio 15.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 13.0/VC/vcvarsall.bat" amd64
"C:/Program Files (x86)/Microsoft Visual Studio 12.0/VC/vcvarsall.bat" amd64
EOL
# TODO: update the list above as needed to support newer versions of msvc tools

rm -f getenv.bat

if [ "${VCINSTALLDIR}" = "" ] ; then
  echo "No compatible visual studio found!"
fi
@@ -1,246 +0,0 @@

#!/usr/bin/env python
"""A script to test rippled in an infinite loop of start-sync-stop.

- Requires Python 3.7+.
- Can be stopped with SIGINT.
- Has no dependencies outside the standard library.
"""

import sys

assert sys.version_info.major == 3 and sys.version_info.minor >= 7

import argparse
import asyncio
import configparser
import contextlib
import json
import logging
import os
from pathlib import Path
import platform
import subprocess
import time
import urllib.error
import urllib.request

# Enable asynchronous subprocesses on Windows. The default changed in 3.8.
# https://docs.python.org/3.7/library/asyncio-platforms.html#subprocess-support-on-windows
if (platform.system() == 'Windows' and sys.version_info.major == 3
        and sys.version_info.minor < 8):
    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())

DEFAULT_EXE = 'rippled'
DEFAULT_CONFIGURATION_FILE = 'rippled.cfg'
# Number of seconds to wait before forcefully terminating.
PATIENCE = 120
# Number of contiguous seconds in a sync state to be considered synced.
DEFAULT_SYNC_DURATION = 60
# Number of seconds between polls of state.
DEFAULT_POLL_INTERVAL = 5
SYNC_STATES = ('full', 'validating', 'proposing')


def read_config(config_file):
    # strict = False: Allow duplicate keys, e.g. [rpc_startup].
    # allow_no_value = True: Allow keys with no values. Generally, these
    # instances use the "key" as the value, and the section name is the key,
    # e.g. [debug_logfile].
    # delimiters = ('='): Allow ':' as a character in Windows paths. Some of
    # our "keys" are actually values, and we don't want to split them on ':'.
    config = configparser.ConfigParser(
        strict=False,
        allow_no_value=True,
        delimiters=('='),
    )
    config.read(config_file)
    return config


def to_list(value, separator=','):
    """Parse a list from a delimited string value."""
    return [s.strip() for s in value.split(separator) if s]


def find_log_file(config_file):
    """Try to figure out what log file the user has chosen. Raises all kinds
    of exceptions if there is any possibility of ambiguity."""
    config = read_config(config_file)
    values = list(config['debug_logfile'].keys())
    if len(values) < 1:
        raise ValueError(
            f'no [debug_logfile] in configuration file: {config_file}')
    if len(values) > 1:
        raise ValueError(
            f'too many [debug_logfile] in configuration file: {config_file}')
    return values[0]


def find_http_port(config_file):
    config = read_config(config_file)
    names = list(config['server'].keys())
    for name in names:
        server = config[name]
        if 'http' in to_list(server.get('protocol', '')):
            return int(server['port'])
    raise ValueError(f'no server in [server] for "http" protocol')


@contextlib.asynccontextmanager
async def rippled(exe=DEFAULT_EXE, config_file=DEFAULT_CONFIGURATION_FILE):
    """A context manager for a rippled process."""
    # Start the server.
    process = await asyncio.create_subprocess_exec(
        str(exe),
        '--conf',
        str(config_file),
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    logging.info(f'rippled started with pid {process.pid}')
    try:
        yield process
    finally:
        # Ask it to stop.
        logging.info(f'asking rippled (pid: {process.pid}) to stop')
        start = time.time()
        process.terminate()

        # Wait nicely.
        try:
            await asyncio.wait_for(process.wait(), PATIENCE)
        except asyncio.TimeoutError:
            # Ask the operating system to kill it.
            logging.warning(f'killing rippled ({process.pid})')
            try:
                process.kill()
            except ProcessLookupError:
                pass

        code = await process.wait()
        end = time.time()
        logging.info(
            f'rippled stopped after {end - start:.1f} seconds with code {code}'
        )


async def sync(
    port,
    *,
    duration=DEFAULT_SYNC_DURATION,
    interval=DEFAULT_POLL_INTERVAL,
):
    """Poll rippled on an interval until it has been synced for a duration."""
    start = time.perf_counter()
    while (time.perf_counter() - start) < duration:
        await asyncio.sleep(interval)

        request = urllib.request.Request(
            f'http://127.0.0.1:{port}',
            data=json.dumps({
                'method': 'server_state'
            }).encode(),
            headers={'Content-Type': 'application/json'},
        )
        with urllib.request.urlopen(request) as response:
            try:
                body = json.loads(response.read())
            except urllib.error.HTTPError as cause:
                logging.warning(f'server_state returned not JSON: {cause}')
                start = time.perf_counter()
                continue

        try:
            state = body['result']['state']['server_state']
        except KeyError as cause:
            logging.warning(f'server_state response missing key: {cause}')
            start = time.perf_counter()
            continue
        logging.info(f'server_state: {state}')
        if state not in SYNC_STATES:
            # Require a contiguous sync state.
            start = time.perf_counter()


async def loop(test,
               *,
               exe=DEFAULT_EXE,
               config_file=DEFAULT_CONFIGURATION_FILE):
    """
    Start-test-stop rippled in an infinite loop.

    Moves log to a different file after each iteration.
    """
    log_file = find_log_file(config_file)
    id = 0
    while True:
        logging.info(f'iteration: {id}')
        async with rippled(exe, config_file) as process:
            start = time.perf_counter()
            exited = asyncio.create_task(process.wait())
            tested = asyncio.create_task(test())
            # Try to sync as long as the process is running.
            done, pending = await asyncio.wait(
                {exited, tested},
                return_when=asyncio.FIRST_COMPLETED,
            )
            if done == {exited}:
                code = exited.result()
                logging.warning(
                    f'server halted for unknown reason with code {code}')
            else:
                assert done == {tested}
                assert tested.exception() is None
            end = time.perf_counter()
            logging.info(f'synced after {end - start:.0f} seconds')
        os.replace(log_file, f'debug.{id}.log')
        id += 1


logging.basicConfig(
    format='%(asctime)s %(levelname)-8s %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S',
)

parser = argparse.ArgumentParser(
    formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument(
    'rippled',
    type=Path,
    nargs='?',
    default=DEFAULT_EXE,
    help='Path to rippled.',
)
parser.add_argument(
    '--conf',
    type=Path,
    default=DEFAULT_CONFIGURATION_FILE,
    help='Path to configuration file.',
)
parser.add_argument(
    '--duration',
    type=int,
    default=DEFAULT_SYNC_DURATION,
    help='Number of contiguous seconds required in a synchronized state.',
)
parser.add_argument(
    '--interval',
    type=int,
    default=DEFAULT_POLL_INTERVAL,
    help='Number of seconds to wait between polls of state.',
)
args = parser.parse_args()

port = find_http_port(args.conf)


def test():
    return sync(port, duration=args.duration, interval=args.interval)


try:
    asyncio.run(loop(test, exe=args.rippled, config_file=args.conf))
except KeyboardInterrupt:
    # Squelch the message. This is a normal mode of exit.
    pass
bin/stop-test.js
@@ -1,133 +0,0 @@
/* -------------------------------- REQUIRES -------------------------------- */

var child = require("child_process");
var assert = require("assert");

/* --------------------------------- CONFIG --------------------------------- */

if (process.argv[2] == null) {
  [
   'Usage: ',
   '',
   '  `node bin/stop-test.js i,j [rippled_path] [rippled_conf]`',
   '',
   '  Launch rippled and stop it after n seconds for all n in [i, j)',
   '  For all even values of n launch rippled with `--fg`',
   '  For values of n where n % 3 == 0 launch rippled with `--net`\n',
   'Examples: ',
   '',
   '  $ node bin/stop-test.js 5,10',
   ('  $ node bin/stop-test.js 1,4 ' +
    'build/clang.debug/rippled $HOME/.confs/rippled.cfg')
  ]
  .forEach(function(l){console.log(l)});

  process.exit();
} else {
  var testRange = process.argv[2].split(',').map(Number);
  var rippledPath = process.argv[3] || 'build/rippled'
  var rippledConf = process.argv[4] || 'rippled.cfg'
}

var options = {
  env: process.env,
  stdio: 'ignore' // we could dump the child io when it fails abnormally
};

// default args
var conf_args = ['--conf='+rippledConf];
var start_args = conf_args.concat([/*'--net'*/])
var stop_args = conf_args.concat(['stop']);

/* --------------------------------- HELPERS -------------------------------- */

function start(args) {
  return child.spawn(rippledPath, args, options);
}
function stop(rippled) { child.execFile(rippledPath, stop_args, options)}
function secs_l8r(ms, f) {setTimeout(f, ms * 1000); }

function show_results_and_exit(results) {
  console.log(JSON.stringify(results, undefined, 2));
  process.exit();
}

var timeTakes = function (range) {
  function sumRange(n) {return (n+1) * n /2}
  var ret = sumRange(range[1]);
  if (range[0] > 1) {
    ret = ret - sumRange(range[0] - 1)
  }
  var stopping = (range[1] - range[0]) * 0.5;
  return ret + stopping;
}

/* ---------------------------------- TEST ---------------------------------- */

console.log("Test will take ~%s seconds", timeTakes(testRange));

(function oneTest(n /* seconds */, results) {
  if (n >= testRange[1]) {
    // show_results_and_exit(results);
    console.log(JSON.stringify(results, undefined, 2));
    oneTest(testRange[0], []);
    return;
  }

  var args = start_args;
  if (n % 2 == 0) {args = args.concat(['--fg'])}
  if (n % 3 == 0) {args = args.concat(['--net'])}

  var result = {args: args, alive_for: n};
  results.push(result);

  console.log("\nLaunching `%s` with `%s` for %d seconds",
              rippledPath, JSON.stringify(args), n);

  rippled = start(args);
  console.log("Rippled pid: %d", rippled.pid);

  // defaults
  var b4StopSent = false;
  var stopSent = false;
  var stop_took = null;

  rippled.once('exit', function(){
    if (!stopSent && !b4StopSent) {
      console.warn('\nRippled exited itself b4 stop issued');
      process.exit();
    };

    // The io handles close AFTER exit, may have implications for
    // `stdio:'inherit'` option to `child.spawn`.
    rippled.once('close', function() {
      result.stop_took = (+new Date() - stop_took) / 1000; // seconds
      console.log("Stopping after %d seconds took %s seconds",
                  n, result.stop_took);
      oneTest(n+1, results);
    });
  });

  secs_l8r(n, function(){
    console.log("Stopping rippled after %d seconds", n);

    // possible race here ?
    // seems highly unlikely, but I was having issues at one point
    b4StopSent=true;
    stop_took = (+new Date());
    // when does `exit` actually get sent?
    stop();
    stopSent=true;

    // Sometimes we want to attach with a debugger.
    if (process.env.ABORT_TESTS_ON_STALL != null) {
      // We wait 30 seconds, and if it hasn't stopped, we abort the process
      secs_l8r(30, function() {
        if (result.stop_took == null) {
          console.log("rippled has stalled");
          process.exit();
        };
      });
    }
  })
}(testRange[0], []));
@@ -1,119 +0,0 @@
/**
 * bin/update_bintypes.js
 *
 * This unholy abomination of a script generates the JavaScript file
 * src/js/bintypes.js from various parts of the C++ source code.
 *
 * This should *NOT* be part of any automatic build process unless the C++
 * source data are brought into a more easily parseable format. Until then,
 * simply run this script manually and fix as needed.
 */

// XXX: Process LedgerFormats.(h|cpp) as well.

var filenameProto = __dirname + '/../src/cpp/ripple/SerializeProto.h',
    filenameTxFormatsH = __dirname + '/../src/cpp/ripple/TransactionFormats.h',
    filenameTxFormats = __dirname + '/../src/cpp/ripple/TransactionFormats.cpp';

var fs = require('fs');

var output = [];

// Stage 1: Get the field types and codes from SerializeProto.h
var types = {},
    fields = {};
String(fs.readFileSync(filenameProto)).split('\n').forEach(function (line) {
  line = line.replace(/^\s+|\s+$/g, '').replace(/\s+/g, '');
  if (!line.length || line.slice(0, 2) === '//' || line.slice(-1) !== ')') return;

  var tmp = line.slice(0, -1).split('('),
      type = tmp[0],
      opts = tmp[1].split(',');

  if (type === 'TYPE') types[opts[1]] = [opts[0], +opts[2]];
  else if (type === 'FIELD') fields[opts[0]] = [types[opts[1]][0], +opts[2]];
});

output.push('var ST = require("./serializedtypes");');
output.push('');
output.push('var REQUIRED = exports.REQUIRED = 0,');
output.push('    OPTIONAL = exports.OPTIONAL = 1,');
output.push('    DEFAULT  = exports.DEFAULT  = 2;');
output.push('');

function pad(s, n) { while (s.length < n) s += ' '; return s; }
function padl(s, n) { while (s.length < n) s = ' '+s; return s; }

Object.keys(types).forEach(function (type) {
  output.push(pad('ST.'+types[type][0]+'.id', 25) + ' = '+types[type][1]+';');
});
output.push('');

// Stage 2: Get the transaction type IDs from TransactionFormats.h
var ttConsts = {};
String(fs.readFileSync(filenameTxFormatsH)).split('\n').forEach(function (line) {
  var regex = /tt([A-Z_]+)\s+=\s+([0-9-]+)/;
  var match = line.match(regex);
  if (match) ttConsts[match[1]] = +match[2];
});

// Stage 3: Get the transaction formats from TransactionFormats.cpp
var base = [],
    sections = [],
    current = base;
String(fs.readFileSync(filenameTxFormats)).split('\n').forEach(function (line) {
  line = line.replace(/^\s+|\s+$/g, '').replace(/\s+/g, '');

  var d_regex = /DECLARE_TF\(([A-Za-z]+),tt([A-Z_]+)/;
  var d_match = line.match(d_regex);

  var s_regex = /SOElement\(sf([a-z]+),SOE_(REQUIRED|OPTIONAL|DEFAULT)/i;
  var s_match = line.match(s_regex);

  if (d_match) sections.push(current = [d_match[1], ttConsts[d_match[2]]]);
  else if (s_match) current.push([s_match[1], s_match[2]]);
});

function removeFinalComma(arr) {
  arr[arr.length-1] = arr[arr.length-1].slice(0, -1);
}

output.push('var base = [');
base.forEach(function (field) {
  var spec = fields[field[0]];
  output.push('  [ '+
              pad("'"+field[0]+"'", 21)+', '+
              pad(field[1], 8)+', '+
              padl(""+spec[1], 2)+', '+
              'ST.'+pad(spec[0], 3)+
              ' ],');
});
removeFinalComma(output);
output.push('];');
output.push('');


output.push('exports.tx = {');
sections.forEach(function (section) {
  var name = section.shift(),
      ttid = section.shift();

  output.push('  '+name+': ['+ttid+'].concat(base, [');
  section.forEach(function (field) {
    var spec = fields[field[0]];
    output.push('    [ '+
                pad("'"+field[0]+"'", 21)+', '+
                pad(field[1], 8)+', '+
                padl(""+spec[1], 2)+', '+
                'ST.'+pad(spec[0], 3)+
                ' ],');
  });
  removeFinalComma(output);
  output.push('  ]),');
});
removeFinalComma(output);
output.push('};');
output.push('');

console.log(output.join('\n'));
@@ -396,8 +396,8 @@
#       true - enables compression
#       false - disables compression [default].
#
#   The rippled server can save bandwidth by compressing its peer-to-peer communications,
#   at a cost of greater CPU usage. If you enable link compression,
#   the server automatically compresses communications with peer servers
#   that also have link compression enabled.
#   https://xrpl.org/enable-link-compression.html
@@ -410,14 +410,17 @@
#   starter list is included in the code and used if no other hostnames are
#   available.
#
#   One address or domain name per line is allowed. A port may be specified
#   after adding a space to the address. If a port is not specified, the default
#   port of 2459 will be used. Many servers still use the legacy port of 51235.
#   To connect to such servers, you must specify the port number. The ordering
#   of entries does not generally matter.
#
#   The default list of entries is:
#   - r.ripple.com 51235
#   - sahyadri.isrdc.in 51235
#   - hubs.xrpkuwait.com 51235
#   - hub.xrpl-commons.org 51235
#
#   Examples:
#
@@ -972,6 +975,47 @@
#       number of ledger records online. Must be greater
#       than or equal to ledger_history.
#
#   Optional keys for NuDB only:
#
#   nudb_block_size     EXPERIMENTAL: Block size in bytes for NuDB storage.
#       Must be a power of 2 between 4096 and 32768. Default is 4096.
#
#       This parameter controls the fundamental storage unit
#       size for NuDB's internal data structures. The choice
#       of block size can significantly impact performance
#       depending on your storage hardware and filesystem:
#
#       - 4096 bytes: Optimal for most standard SSDs and
#         traditional filesystems (ext4, NTFS, HFS+).
#         Provides good balance of performance and storage
#         efficiency. Recommended for most deployments.
#         Minimizes memory footprint and provides consistent
#         low-latency access patterns across diverse hardware.
#
#       - 8192-16384 bytes: May improve performance on
#         high-end NVMe SSDs and copy-on-write filesystems
#         like ZFS or Btrfs that benefit from larger block
#         alignment. Can reduce metadata overhead for large
#         databases. Offers better sequential throughput and
#         reduced I/O operations at the cost of higher memory
#         usage per operation.
#
#       - 32768 bytes (32K): Maximum supported block size
#         for high-performance scenarios with very fast
#         storage. May increase memory usage and reduce
#         efficiency for smaller databases. Best suited for
#         enterprise environments with abundant RAM.
#
#       Performance testing is recommended before deploying
#       any non-default block size in production environments.
#
#       Note: This setting cannot be changed after database
#       creation without rebuilding the entire database.
#       Choose carefully based on your hardware and expected
#       database size.
#
#       Example: nudb_block_size=4096
#
#   These keys modify the behavior of online_delete, and thus are only
#   relevant if online_delete is defined and non-zero:
#
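The nudb_block_size constraint above (a power of 2 between 4096 and 32768) is easy to check mechanically. A minimal Python sketch, offered as a hypothetical helper for illustration rather than as part of rippled or its configuration parser:

def is_valid_nudb_block_size(size: int) -> bool:
    # A power of 2 has exactly one bit set, so size & (size - 1) == 0.
    return 4096 <= size <= 32768 and (size & (size - 1)) == 0

assert is_valid_nudb_block_size(4096)        # the default
assert is_valid_nudb_block_size(32768)       # the maximum
assert not is_valid_nudb_block_size(12288)   # in range, but not a power of 2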
@@ -1008,7 +1052,7 @@
#       that rippled is still in sync with the network,
#       and that the validated ledger is less than
#       'age_threshold_seconds' old. If not, then continue
#       sleeping for this number of seconds and
#       checking until healthy.
#       Default is 5.
#
@@ -1110,7 +1154,7 @@
#   page_size       Valid values: integer (MUST be power of 2 between 512 and 65536)
#       The default is 4096 bytes. This setting determines
#       the size of a page in the transaction.db file.
#       See https://www.sqlite.org/pragma.html#pragma_page_size
#       for more details about the available options.
#
#   journal_size_limit      Valid values: integer
@@ -1423,6 +1467,9 @@ admin = 127.0.0.1
protocol = http

[port_peer]
# Many servers still use the legacy port of 51235, so for backward-compatibility
# we maintain that port number here. However, for new servers we recommend
# changing this to the default port of 2459.
port = 51235
ip = 0.0.0.0
# alternatively, to accept connections on IPv4 + IPv6, use:
@@ -1465,6 +1512,7 @@ secure_gateway = 127.0.0.1
[node_db]
type=NuDB
path=/var/lib/rippled/db/nudb
nudb_block_size=4096
online_delete=512
advisory_delete=0
@@ -26,7 +26,7 @@
#
#   Examples:
#    https://vl.ripple.com
#    https://vl.xrplf.org
#    https://unl.xrplf.org
#    http://127.0.0.1:8000
#    file:///etc/opt/ripple/vl.txt
#
@@ -54,13 +54,13 @@

[validator_list_sites]
https://vl.ripple.com
https://vl.xrplf.org
https://unl.xrplf.org

[validator_list_keys]
# vl.ripple.com
ED2677ABFFD1B33AC6FBC3062B71F1E8397C1505E1C42C64D11AD1B28FF73F4734
# vl.xrplf.org
ED45D1840EE724BE327ABE9146503D5848EFD5F38B6D5FEDE71E80ACCE5E6E738B
# unl.xrplf.org
ED42AEC58B701EEBB77356FFFEC26F83C1F0407263530F068C7C73D392C7E06FD1

# To use the test network (see https://xrpl.org/connect-your-rippled-to-the-xrp-test-net.html),
# use the following configuration instead:
@@ -70,3 +70,21 @@ ED45D1840EE724BE327ABE9146503D5848EFD5F38B6D5FEDE71E80ACCE5E6E738B
#
# [validator_list_keys]
# ED264807102805220DA0F312E71FC2C69E1552C9C5790F6C25E3729DEB573D5860


# [validator_list_threshold]
#
# Minimum number of validator lists on which a validator must be listed in
# order to be used.
#
# This can be set explicitly to any positive integer number not greater than
# the size of [validator_list_keys]. If it is not set, or set to 0, the
# value will be calculated at startup from the size of [validator_list_keys],
# where the calculation is:
#
# threshold = size(validator_list_keys) < 3
#             ? 1
#             : floor(size(validator_list_keys) / 2) + 1

[validator_list_threshold]
0
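For clarity, the default calculation described above can be mirrored in a short Python sketch (a hypothetical helper, not part of rippled):

import math

def default_validator_list_threshold(num_keys: int) -> int:
    # Mirrors: threshold = size < 3 ? 1 : floor(size / 2) + 1
    return 1 if num_keys < 3 else math.floor(num_keys / 2) + 1

# With the three [validator_list_keys] entries configured above:
assert default_validator_list_threshold(3) == 2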
@@ -98,6 +98,17 @@
#  2024-04-03, Bronek Kozicki
#  - add support for output formats: jacoco, clover, lcov
#
#  2025-05-12, Jingchen Wu
#  - add -fprofile-update=atomic to ensure atomic profile generation
#
#  2025-08-28, Bronek Kozicki
#  - fix "At least one COMMAND must be given" CMake warning from policy CMP0175
#
#  2025-09-03, Jingchen Wu
#  - remove the unused functions append_coverage_compiler_flags and append_coverage_compiler_flags_to_target
#  - add a new function add_code_coverage_to_target
#  - remove some unused code
#
# USAGE:
#
# 1. Copy this file into your cmake modules path.
@@ -106,10 +117,8 @@
#    using a CMake option() to enable it just optionally):
#      include(CodeCoverage)
#
# 3. Append necessary compiler flags and linker flags for all supported source files:
#      add_code_coverage_to_target(<target> <PRIVATE|PUBLIC|INTERFACE>)
#
# 3.a (OPTIONAL) Set appropriate optimization flags, e.g. -O0, -O1 or -Og
#
@@ -198,55 +207,69 @@ endforeach()

set(COVERAGE_COMPILER_FLAGS "-g --coverage"
    CACHE INTERNAL "")

set(COVERAGE_CXX_COMPILER_FLAGS "")
set(COVERAGE_C_COMPILER_FLAGS "")
set(COVERAGE_CXX_LINKER_FLAGS "")
set(COVERAGE_C_LINKER_FLAGS "")

if(CMAKE_CXX_COMPILER_ID MATCHES "(GNU|Clang)")
    include(CheckCXXCompilerFlag)
    include(CheckCCompilerFlag)
    include(CheckLinkerFlag)

    set(COVERAGE_CXX_COMPILER_FLAGS ${COVERAGE_COMPILER_FLAGS})
    set(COVERAGE_C_COMPILER_FLAGS ${COVERAGE_COMPILER_FLAGS})
    set(COVERAGE_CXX_LINKER_FLAGS ${COVERAGE_COMPILER_FLAGS})
    set(COVERAGE_C_LINKER_FLAGS ${COVERAGE_COMPILER_FLAGS})

    check_cxx_compiler_flag(-fprofile-abs-path HAVE_cxx_fprofile_abs_path)
    if(HAVE_cxx_fprofile_abs_path)
        set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_CXX_COMPILER_FLAGS} -fprofile-abs-path")
    endif()

    check_c_compiler_flag(-fprofile-abs-path HAVE_c_fprofile_abs_path)
    if(HAVE_c_fprofile_abs_path)
        set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_C_COMPILER_FLAGS} -fprofile-abs-path")
    endif()

    check_linker_flag(CXX -fprofile-abs-path HAVE_cxx_linker_fprofile_abs_path)
    if(HAVE_cxx_linker_fprofile_abs_path)
        set(COVERAGE_CXX_LINKER_FLAGS "${COVERAGE_CXX_LINKER_FLAGS} -fprofile-abs-path")
    endif()

    check_linker_flag(C -fprofile-abs-path HAVE_c_linker_fprofile_abs_path)
    if(HAVE_c_linker_fprofile_abs_path)
        set(COVERAGE_C_LINKER_FLAGS "${COVERAGE_C_LINKER_FLAGS} -fprofile-abs-path")
    endif()

    check_cxx_compiler_flag(-fprofile-update=atomic HAVE_cxx_fprofile_update)
    if(HAVE_cxx_fprofile_update)
        set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_CXX_COMPILER_FLAGS} -fprofile-update=atomic")
    endif()

    check_c_compiler_flag(-fprofile-update=atomic HAVE_c_fprofile_update)
    if(HAVE_c_fprofile_update)
        set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_C_COMPILER_FLAGS} -fprofile-update=atomic")
    endif()

    check_linker_flag(CXX -fprofile-update=atomic HAVE_cxx_linker_fprofile_update)
    if(HAVE_cxx_linker_fprofile_update)
        set(COVERAGE_CXX_LINKER_FLAGS "${COVERAGE_CXX_LINKER_FLAGS} -fprofile-update=atomic")
    endif()

    check_linker_flag(C -fprofile-update=atomic HAVE_c_linker_fprofile_update)
    if(HAVE_c_linker_fprofile_update)
        set(COVERAGE_C_LINKER_FLAGS "${COVERAGE_C_LINKER_FLAGS} -fprofile-update=atomic")
    endif()

endif()

get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
if(NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG))
    message(WARNING "Code coverage results with an optimised (non-Debug) build may be misleading")
endif() # NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG)

if(CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
    link_libraries(gcov)
endif()

# Defines a target for running and collecting code coverage information
# Builds dependencies, runs the given executable and outputs reports.
# NOTE! The executable should always have a ZERO as exit code otherwise
@@ -431,23 +454,24 @@ function(setup_target_for_coverage_gcovr)

    # Show info where to find the report
    add_custom_command(TARGET ${Coverage_NAME} POST_BUILD
        COMMAND echo
        COMMENT "Code coverage report saved in ${GCOVR_OUTPUT_FILE} formatted as ${Coverage_FORMAT}"
    )
endfunction() # setup_target_for_coverage_gcovr

function(add_code_coverage_to_target name scope)
    separate_arguments(COVERAGE_CXX_COMPILER_FLAGS NATIVE_COMMAND "${COVERAGE_CXX_COMPILER_FLAGS}")
    separate_arguments(COVERAGE_C_COMPILER_FLAGS NATIVE_COMMAND "${COVERAGE_C_COMPILER_FLAGS}")
    separate_arguments(COVERAGE_CXX_LINKER_FLAGS NATIVE_COMMAND "${COVERAGE_CXX_LINKER_FLAGS}")
    separate_arguments(COVERAGE_C_LINKER_FLAGS NATIVE_COMMAND "${COVERAGE_C_LINKER_FLAGS}")

    # Add compiler options to the target
    target_compile_options(${name} ${scope}
        $<$<COMPILE_LANGUAGE:CXX>:${COVERAGE_CXX_COMPILER_FLAGS}>
        $<$<COMPILE_LANGUAGE:C>:${COVERAGE_C_COMPILER_FLAGS}>)

    target_link_libraries (${name} ${scope}
        $<$<LINK_LANGUAGE:CXX>:${COVERAGE_CXX_LINKER_FLAGS} gcov>
        $<$<LINK_LANGUAGE:C>:${COVERAGE_C_LINKER_FLAGS} gcov>
    )
endfunction() # add_code_coverage_to_target
@@ -16,13 +16,16 @@ set(CMAKE_CXX_EXTENSIONS OFF)
target_compile_definitions (common
  INTERFACE
    $<$<CONFIG:Debug>:DEBUG _DEBUG>
    #[===[
    NOTE: CMAKE release builds already have NDEBUG defined, so no need to add it
    explicitly except for the special case of (profile ON) and (assert OFF).
    Presumably this is because we don't want profile builds asserting unless
    asserts were specifically requested.
    ]===]
    $<$<AND:$<BOOL:${profile}>,$<NOT:$<BOOL:${assert}>>>:NDEBUG>
    # TODO: Remove once we have migrated functions from OpenSSL 1.x to 3.x.
    OPENSSL_SUPPRESS_DEPRECATED
  )

if (MSVC)
  # remove existing exception flag since we set it to -EHa
@@ -90,28 +93,16 @@
      -errorreport:none
      -machine:X64)
else ()
  # HACK : because these need to come first, before any warning demotion
  string (APPEND CMAKE_CXX_FLAGS " -Wall -Wdeprecated")
  if (wextra)
    string (APPEND CMAKE_CXX_FLAGS " -Wextra -Wno-unused-parameter")
  endif ()
  # not MSVC
  target_compile_options (common
    INTERFACE
      -Wall
      -Wdeprecated
      $<$<BOOL:${is_clang}>:-Wno-deprecated-declarations>
      $<$<BOOL:${wextra}>:-Wextra -Wno-unused-parameter>
      $<$<BOOL:${werr}>:-Werror>
      $<$<COMPILE_LANGUAGE:CXX>:
        -frtti
        -Wnon-virtual-dtor
      >
      -Wno-sign-compare
      -Wno-char-subscripts
      -Wno-format
      -Wno-unused-local-typedefs
      -fstack-protector
      $<$<BOOL:${is_gcc}>:
        -Wno-unused-but-set-variable
        -Wno-deprecated
      >
      -Wno-sign-compare
      -Wno-unused-but-set-variable
      $<$<NOT:$<CONFIG:Debug>>:-fno-strict-aliasing>
      # tweak gcc optimization for debug
      $<$<AND:$<BOOL:${is_gcc}>,$<CONFIG:Debug>>:-O0>
@@ -120,7 +111,7 @@ else ()
  target_link_libraries (common
    INTERFACE
      -rdynamic
      $<$<BOOL:${is_linux}>:-Wl,-z,relro,-z,now>
      $<$<BOOL:${is_linux}>:-Wl,-z,relro,-z,now,--build-id>
      # link to static libc/c++ iff:
      #  * static option set and
      #  * NOT APPLE (AppleClang does not support static libc/c++) and
@@ -131,6 +122,17 @@ else ()
    >)
endif ()

# Antithesis instrumentation will only be built and deployed using machines running Linux.
if (voidstar)
  if (NOT CMAKE_BUILD_TYPE STREQUAL "Debug")
    message(FATAL_ERROR "Antithesis instrumentation requires Debug build type, aborting...")
  elseif (NOT is_linux)
    message(FATAL_ERROR "Antithesis instrumentation requires Linux, aborting...")
  elseif (NOT (is_clang AND CMAKE_CXX_COMPILER_VERSION VERSION_GREATER_EQUAL 16.0))
    message(FATAL_ERROR "Antithesis instrumentation requires Clang version 16 or later, aborting...")
  endif ()
endif ()

if (use_mold)
  # use mold linker if available
  execute_process (
@@ -9,6 +9,7 @@ include(target_protobuf_sources)
# define a bunch of `static const` variables with the same names,
# so we just build them as a separate library.
add_library(xrpl.libpb)
set_target_properties(xrpl.libpb PROPERTIES UNITY_BUILD OFF)
target_protobuf_sources(xrpl.libpb xrpl/proto
  LANGUAGE cpp
  IMPORT_DIRS include/xrpl/proto
@@ -47,37 +48,11 @@ target_link_libraries(xrpl.libpb
  gRPC::grpc++
)

add_library(xrpl.libxrpl)
set_target_properties(xrpl.libxrpl PROPERTIES OUTPUT_NAME xrpl)
if(unity)
  set_target_properties(xrpl.libxrpl PROPERTIES UNITY_BUILD ON)
endif()
# TODO: Clean up the number of library targets later.
add_library(xrpl.imports.main INTERFACE)

add_library(xrpl::libxrpl ALIAS xrpl.libxrpl)

file(GLOB_RECURSE sources CONFIGURE_DEPENDS
  "${CMAKE_CURRENT_SOURCE_DIR}/src/libxrpl/*.cpp"
)
target_sources(xrpl.libxrpl PRIVATE ${sources})

target_include_directories(xrpl.libxrpl
  PUBLIC
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
    $<INSTALL_INTERFACE:include>)

target_compile_definitions(xrpl.libxrpl
  PUBLIC
    BOOST_ASIO_USE_TS_EXECUTOR_AS_DEFAULT
    BOOST_CONTAINER_FWD_BAD_DEQUE
    HAS_UNCAUGHT_EXCEPTIONS=1)

target_compile_options(xrpl.libxrpl
  PUBLIC
    $<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
)

target_link_libraries(xrpl.libxrpl
  PUBLIC
target_link_libraries(xrpl.imports.main
  INTERFACE
    LibArchive::LibArchive
    OpenSSL::Crypto
    Ripple::boost
@@ -89,15 +64,98 @@ target_link_libraries(xrpl.libxrpl
    secp256k1::secp256k1
    xrpl.libpb
    xxHash::xxhash
    $<$<BOOL:${voidstar}>:antithesis-sdk-cpp>
)

include(add_module)
include(target_link_modules)

# Level 01
add_module(xrpl beast)
target_link_libraries(xrpl.libxrpl.beast PUBLIC
  xrpl.imports.main
  xrpl.libpb
)

# Level 02
add_module(xrpl basics)
target_link_libraries(xrpl.libxrpl.basics PUBLIC xrpl.libxrpl.beast)

# Level 03
add_module(xrpl json)
target_link_libraries(xrpl.libxrpl.json PUBLIC xrpl.libxrpl.basics)

add_module(xrpl crypto)
target_link_libraries(xrpl.libxrpl.crypto PUBLIC xrpl.libxrpl.basics)

# Level 04
add_module(xrpl protocol)
target_link_libraries(xrpl.libxrpl.protocol PUBLIC
  xrpl.libxrpl.crypto
  xrpl.libxrpl.json
)

# Level 05
add_module(xrpl resource)
target_link_libraries(xrpl.libxrpl.resource PUBLIC xrpl.libxrpl.protocol)

# Level 06
add_module(xrpl net)
target_link_libraries(xrpl.libxrpl.net PUBLIC
  xrpl.libxrpl.basics
  xrpl.libxrpl.json
  xrpl.libxrpl.protocol
  xrpl.libxrpl.resource
)

add_module(xrpl server)
target_link_libraries(xrpl.libxrpl.server PUBLIC xrpl.libxrpl.protocol)

add_module(xrpl ledger)
target_link_libraries(xrpl.libxrpl.ledger PUBLIC
  xrpl.libxrpl.basics
  xrpl.libxrpl.json
  xrpl.libxrpl.protocol
)

add_library(xrpl.libxrpl)
set_target_properties(xrpl.libxrpl PROPERTIES OUTPUT_NAME xrpl)

add_library(xrpl::libxrpl ALIAS xrpl.libxrpl)

file(GLOB_RECURSE sources CONFIGURE_DEPENDS
  "${CMAKE_CURRENT_SOURCE_DIR}/src/libxrpl/*.cpp"
)
target_sources(xrpl.libxrpl PRIVATE ${sources})

target_link_modules(xrpl PUBLIC
  basics
  beast
  crypto
  json
  protocol
  resource
  server
  net
  ledger
)

# All headers in libxrpl are in modules.
# Uncomment this stanza if you have not yet moved new headers into a module.
# target_include_directories(xrpl.libxrpl
#   PRIVATE
#     $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/src>
#   PUBLIC
#     $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
#     $<INSTALL_INTERFACE:include>)

if(xrpld)
  add_executable(rippled)
  if(unity)
    set_target_properties(rippled PROPERTIES UNITY_BUILD ON)
  endif()
  if(tests)
    target_compile_definitions(rippled PUBLIC ENABLE_TESTS)
    target_compile_definitions(rippled PRIVATE
      UNIT_TEST_REFERENCE_FEE=${UNIT_TEST_REFERENCE_FEE}
    )
  endif()
  target_include_directories(rippled
    PRIVATE
@@ -129,6 +187,19 @@ if(xrpld)
    target_compile_definitions(rippled PRIVATE RIPPLED_RUNNING_IN_CI)
  endif ()

  if(voidstar)
    target_compile_options(rippled
      PRIVATE
        -fsanitize-coverage=trace-pc-guard
    )
    # rippled requires access to antithesis-sdk-cpp implementation file
    # antithesis_instrumentation.h, which is not exported as INTERFACE
    target_include_directories(rippled
      PRIVATE
        ${CMAKE_SOURCE_DIR}/external/antithesis-sdk
    )
  endif()

  # any files that don't play well with unity should be added here
  if(tests)
    set_source_files_properties(
@@ -33,6 +33,8 @@ setup_target_for_coverage_gcovr(
  FORMAT ${coverage_format}
  EXECUTABLE rippled
  EXECUTABLE_ARGS --unittest$<$<BOOL:${coverage_test}>:=${coverage_test}> --unittest-jobs ${coverage_test_parallelism} --quiet --unittest-log
  EXCLUDE "src/test" "include/xrpl/beast/test" "include/xrpl/beast/unit_test" "${CMAKE_BINARY_DIR}/pb-xrpl.libpb"
  EXCLUDE "src/test" "src/tests" "include/xrpl/beast/test" "include/xrpl/beast/unit_test" "${CMAKE_BINARY_DIR}/pb-xrpl.libpb"
  DEPENDENCIES rippled
)

add_code_coverage_to_target(opts INTERFACE)

@@ -53,9 +53,9 @@ set(download_script "${CMAKE_BINARY_DIR}/docs/download-cppreference.cmake")
file(WRITE
  "${download_script}"
  "file(DOWNLOAD \
    http://upload.cppreference.com/mwiki/images/b/b2/html_book_20190607.zip \
    https://github.com/PeterFeicht/cppreference-doc/releases/download/v20250209/html-book-20250209.zip \
    ${CMAKE_BINARY_DIR}/docs/cppreference.zip \
    EXPECTED_HASH MD5=82b3a612d7d35a83e3cb1195a63689ab \
    EXPECTED_HASH MD5=bda585f72fbca4b817b29a3d5746567b \
  )\n \
  execute_process( \
    COMMAND \"${CMAKE_COMMAND}\" -E tar -xf cppreference.zip \
@@ -2,14 +2,27 @@
   install stuff
#]===================================================================]

include(create_symbolic_link)

install (
  TARGETS
    common
    opts
    ripple_syslibs
    ripple_boost
    xrpl.imports.main
    xrpl.libpb
    xrpl.libxrpl.basics
    xrpl.libxrpl.beast
    xrpl.libxrpl.crypto
    xrpl.libxrpl.json
    xrpl.libxrpl.protocol
    xrpl.libxrpl.resource
    xrpl.libxrpl.ledger
    xrpl.libxrpl.server
    xrpl.libxrpl.net
    xrpl.libxrpl
    antithesis-sdk-cpp
  EXPORT RippleExports
  LIBRARY DESTINATION lib
  ARCHIVE DESTINATION lib
@@ -21,12 +34,12 @@ install(
  DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}"
)

if(NOT WIN32)
  install(
    CODE "file(CREATE_LINK xrpl \
      \${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple SYMBOLIC)"
  )
endif()
install(CODE "
  set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
  include(create_symbolic_link)
  create_symbolic_link(xrpl \
    \$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}/ripple)
")

install (EXPORT RippleExports
  FILE RippleTargets.cmake
@@ -55,12 +68,12 @@ if (is_root_project AND TARGET rippled)
    copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/rippled-example.cfg\" etc rippled.cfg)
    copy_if_not_exists(\"${CMAKE_CURRENT_SOURCE_DIR}/cfg/validators-example.txt\" etc validators.txt)
  ")
  if(NOT WIN32)
    install(
      CODE "file(CREATE_LINK rippled${suffix} \
        \${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix} SYMBOLIC)"
    )
  endif()
  install(CODE "
    set(CMAKE_MODULE_PATH \"${CMAKE_MODULE_PATH}\")
    include(create_symbolic_link)
    create_symbolic_link(rippled${suffix} \
      \$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_BINDIR}/xrpld${suffix})
  ")
endif ()

install (
@@ -7,6 +7,9 @@ add_library (Ripple::opts ALIAS opts)
target_compile_definitions (opts
  INTERFACE
    BOOST_ASIO_DISABLE_HANDLER_TYPE_REQUIREMENTS
    BOOST_ASIO_USE_TS_EXECUTOR_AS_DEFAULT
    BOOST_CONTAINER_FWD_BAD_DEQUE
    HAS_UNCAUGHT_EXCEPTIONS=1
    $<$<BOOL:${boost_show_deprecated}>:
      BOOST_ASIO_NO_DEPRECATED
      BOOST_FILESYSTEM_NO_DEPRECATED
@@ -18,20 +21,18 @@ target_compile_definitions (opts
    >
    $<$<BOOL:${beast_no_unit_test_inline}>:BEAST_NO_UNIT_TEST_INLINE=1>
    $<$<BOOL:${beast_disable_autolink}>:BEAST_DONT_AUTOLINK_TO_WIN32_LIBRARIES=1>
    $<$<BOOL:${single_io_service_thread}>:RIPPLE_SINGLE_IO_SERVICE_THREAD=1>
    $<$<BOOL:${voidstar}>:ENABLE_VOIDSTAR>)
target_compile_options (opts
  INTERFACE
    $<$<AND:$<BOOL:${is_gcc}>,$<COMPILE_LANGUAGE:CXX>>:-Wsuggest-override>
    $<$<BOOL:${is_gcc}>:-Wno-maybe-uninitialized>
    $<$<BOOL:${perf}>:-fno-omit-frame-pointer>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
    $<$<BOOL:${profile}>:-pg>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)

target_link_libraries (opts
  INTERFACE
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${coverage}>>:-g --coverage -fprofile-abs-path>
    $<$<AND:$<BOOL:${is_clang}>,$<BOOL:${coverage}>>:-g --coverage>
    $<$<BOOL:${profile}>:-pg>
    $<$<AND:$<BOOL:${is_gcc}>,$<BOOL:${profile}>>:-p>)
@@ -2,16 +2,6 @@
   convenience variables and sanity checks
#]===================================================================]

include(ProcessorCount)

if (NOT ep_procs)
  ProcessorCount(ep_procs)
  if (ep_procs GREATER 1)
    # never use more than half of cores for EP builds
    math (EXPR ep_procs "${ep_procs} / 2")
    message (STATUS "Using ${ep_procs} cores for ExternalProject builds.")
  endif ()
endif ()
get_property(is_multiconfig GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)

set (CMAKE_CONFIGURATION_TYPES "Debug;Release" CACHE STRING "" FORCE)
@@ -11,12 +11,22 @@ option(assert "Enables asserts, even in release builds" OFF)
option(xrpld "Build xrpld" ON)

option(tests "Build tests" ON)
if(tests)
  # This setting allows making a separate workflow to test fees other than default 10
  if(NOT UNIT_TEST_REFERENCE_FEE)
    set(UNIT_TEST_REFERENCE_FEE "10" CACHE STRING "")
  endif()
endif()

option(unity "Creates a build using UNITY support in cmake. This is the default" ON)
option(unity "Creates a build using UNITY support in cmake." OFF)
if(unity)
  if(NOT is_ci)
    set(CMAKE_UNITY_BUILD_BATCH_SIZE 15 CACHE STRING "")
  endif()
  set(CMAKE_UNITY_BUILD ON CACHE BOOL "Do a unity build")
endif()
if(is_clang AND is_linux)
  option(voidstar "Enable Antithesis instrumentation." OFF)
endif()
if(is_gcc OR is_clang)
  option(coverage "Generates coverage info." OFF)
@@ -108,7 +118,7 @@ option(beast_no_unit_test_inline
  "Prevents unit test definitions from being inserted into global table"
  OFF)
option(single_io_service_thread
  "Restricts the number of threads calling io_service::run to one. \
  "Restricts the number of threads calling io_context::run to one. \
   This can be useful when debugging."
  OFF)
option(boost_show_deprecated
@@ -1,4 +1,4 @@
option (validator_keys "Enables building of validator-keys-tool as a separate target (imported via FetchContent)" OFF)
option (validator_keys "Enables building of validator-keys tool as a separate target (imported via FetchContent)" OFF)

if (validator_keys)
  git_branch (current_branch)
@@ -6,17 +6,15 @@ if (validator_keys)
  if (NOT (current_branch STREQUAL "release"))
    set (current_branch "master")
  endif ()
  message (STATUS "Tracking ValidatorKeys branch: ${current_branch}")

  FetchContent_Declare (
    validator_keys_src
    validator_keys
    GIT_REPOSITORY https://github.com/ripple/validator-keys-tool.git
    GIT_TAG "${current_branch}"
  )
  FetchContent_GetProperties (validator_keys_src)
  if (NOT validator_keys_src_POPULATED)
    message (STATUS "Pausing to download ValidatorKeys...")
    FetchContent_Populate (validator_keys_src)
  endif ()
  add_subdirectory (${validator_keys_src_SOURCE_DIR} ${CMAKE_BINARY_DIR}/validator-keys)
  FetchContent_MakeAvailable(validator_keys)
  set_target_properties(validator-keys PROPERTIES RUNTIME_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}")
  install(TARGETS validator-keys RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})

endif ()
cmake/add_module.cmake
Normal file
@@ -0,0 +1,37 @@
include(isolate_headers)

# Create an OBJECT library target named
#
#   ${PROJECT_NAME}.lib${parent}.${name}
#
# with sources in src/lib${parent}/${name}
# and headers in include/${parent}/${name}
# that cannot include headers from other directories in include/
# unless they come through linked libraries.
#
# add_module(parent a)
# add_module(parent b)
# target_link_libraries(project.libparent.b PUBLIC project.libparent.a)
function(add_module parent name)
  set(target ${PROJECT_NAME}.lib${parent}.${name})
  add_library(${target} OBJECT)
  file(GLOB_RECURSE sources CONFIGURE_DEPENDS
    "${CMAKE_CURRENT_SOURCE_DIR}/src/lib${parent}/${name}/*.cpp"
  )
  target_sources(${target} PRIVATE ${sources})
  target_include_directories(${target} PUBLIC
    "$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>"
  )
  isolate_headers(
    ${target}
    "${CMAKE_CURRENT_SOURCE_DIR}/include"
    "${CMAKE_CURRENT_SOURCE_DIR}/include/${parent}/${name}"
    PUBLIC
  )
  isolate_headers(
    ${target}
    "${CMAKE_CURRENT_SOURCE_DIR}/src"
    "${CMAKE_CURRENT_SOURCE_DIR}/src/lib${parent}/${name}"
    PRIVATE
  )
endfunction()
cmake/create_symbolic_link.cmake
Normal file
@@ -0,0 +1,20 @@
# file(CREATE_SYMLINK) only works on Windows with administrator privileges.
# https://stackoverflow.com/a/61244115/618906
function(create_symbolic_link target link)
  if(WIN32)
    if(NOT IS_SYMLINK "${link}")
      if(NOT IS_ABSOLUTE "${target}")
        # Relative links do not work on Windows.
        set(target "${link}/../${target}")
      endif()
      file(TO_NATIVE_PATH "${target}" target)
      file(TO_NATIVE_PATH "${link}" link)
      execute_process(COMMAND cmd.exe /c mklink /J "${link}" "${target}")
    endif()
  else()
    file(CREATE_LINK "${target}" "${link}" SYMBOLIC)
  endif()
  if(NOT IS_SYMLINK "${link}")
    message(ERROR "failed to create symlink: <${link}>")
  endif()
endfunction()
@@ -2,7 +2,6 @@ find_package(Boost 1.82 REQUIRED
  COMPONENTS
    chrono
    container
    context
    coroutine
    date_time
    filesystem
@@ -15,22 +14,17 @@ find_package(Boost 1.82 REQUIRED

add_library(ripple_boost INTERFACE)
add_library(Ripple::boost ALIAS ripple_boost)
if(XCODE)
  target_include_directories(ripple_boost BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
  target_compile_options(ripple_boost INTERFACE --system-header-prefix="boost/")
else()
  target_include_directories(ripple_boost SYSTEM BEFORE INTERFACE ${Boost_INCLUDE_DIRS})
endif()

target_link_libraries(ripple_boost
  INTERFACE
    Boost::boost
    Boost::headers
    Boost::chrono
    Boost::container
    Boost::coroutine
    Boost::date_time
    Boost::filesystem
    Boost::json
    Boost::process
    Boost::program_options
    Boost::regex
    Boost::system
cmake/isolate_headers.cmake
Normal file
@@ -0,0 +1,48 @@
include(create_symbolic_link)

# Consider include directory B nested under prefix A:
#
#   /path/to/A/then/to/B/...
#
# Call C the relative path from A to B.
# C is what we want to write in `#include` directives:
#
#   #include <then/to/B/...>
#
# Examples, all from the `jobqueue` module:
#
# - Library public headers:
#   B = /include/xrpl/jobqueue
#   A = /include/
#   C = xrpl/jobqueue
#
# - Library private headers:
#   B = /src/libxrpl/jobqueue
#   A = /src/
#   C = libxrpl/jobqueue
#
# - Test private headers:
#   B = /tests/jobqueue
#   A = /
#   C = tests/jobqueue
#
# To isolate headers from each other,
# we want to create a symlink Y that points to B,
# within a subdirectory X of the `CMAKE_BINARY_DIR`,
# that has the same relative path C between X and Y,
# and then add X as an include directory of the target,
# sometimes `PUBLIC` and sometimes `PRIVATE`.
# The Cs are all guaranteed to be unique.
# We can guarantee a unique X per target by using
# `${CMAKE_CURRENT_BINARY_DIR}/include/${target}`.
#
# isolate_headers(target A B scope)
function(isolate_headers target A B scope)
  file(RELATIVE_PATH C "${A}" "${B}")
  set(X "${CMAKE_CURRENT_BINARY_DIR}/modules/${target}")
  set(Y "${X}/${C}")
  cmake_path(GET Y PARENT_PATH parent)
  file(MAKE_DIRECTORY "${parent}")
  create_symbolic_link("${B}" "${Y}")
  target_include_directories(${target} ${scope} "$<BUILD_INTERFACE:${X}>")
endfunction()
cmake/target_link_modules.cmake
Normal file
@@ -0,0 +1,24 @@
# Link a library to its modules (see: `add_module`)
# and remove the module sources from the library's sources.
#
# add_module(parent a)
# add_module(parent b)
# target_link_libraries(project.libparent.b PUBLIC project.libparent.a)
# add_library(project.libparent)
# target_link_modules(parent PUBLIC a b)
function(target_link_modules parent scope)
  set(library ${PROJECT_NAME}.lib${parent})
  foreach(name ${ARGN})
    set(module ${library}.${name})
    get_target_property(sources ${library} SOURCES)
    list(LENGTH sources before)
    get_target_property(dupes ${module} SOURCES)
    list(LENGTH dupes expected)
    list(REMOVE_ITEM sources ${dupes})
    list(LENGTH sources after)
    math(EXPR actual "${before} - ${after}")
    message(STATUS "${module} with ${expected} sources took ${actual} sources from ${library}")
    set_target_properties(${library} PROPERTIES SOURCES "${sources}")
    target_link_libraries(${library} ${scope} ${module})
  endforeach()
endfunction()
cmake/xrpl_add_test.cmake
Normal file
@@ -0,0 +1,41 @@
include(isolate_headers)

function(xrpl_add_test name)
  set(target ${PROJECT_NAME}.test.${name})

  file(GLOB_RECURSE sources CONFIGURE_DEPENDS
    "${CMAKE_CURRENT_SOURCE_DIR}/${name}/*.cpp"
    "${CMAKE_CURRENT_SOURCE_DIR}/${name}.cpp"
  )
  add_executable(${target} ${ARGN} ${sources})

  isolate_headers(
    ${target}
    "${CMAKE_SOURCE_DIR}"
    "${CMAKE_SOURCE_DIR}/tests/${name}"
    PRIVATE
  )

  # Make sure the test isn't optimized away in unity builds
  set_target_properties(${target} PROPERTIES
    UNITY_BUILD_MODE GROUP
    UNITY_BUILD_BATCH_SIZE 0)  # Adjust as needed

  add_test(NAME ${target} COMMAND ${target})
  set_tests_properties(
    ${target} PROPERTIES
    FIXTURES_REQUIRED ${target}_fixture
  )

  add_test(
    NAME ${target}.build
    COMMAND
      ${CMAKE_COMMAND}
      --build ${CMAKE_BINARY_DIR}
      --config $<CONFIG>
      --target ${target}
  )
  set_tests_properties(${target}.build PROPERTIES
    FIXTURES_SETUP ${target}_fixture
  )
endfunction()
conan.lock
Normal file
@@ -0,0 +1,56 @@
{
  "version": "0.5",
  "requires": [
    "zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
    "xxhash/0.8.3#681d36a0a6111fc56e5e45ea182c19cc%1756234289.683",
    "sqlite3/3.49.1#8631739a4c9b93bd3d6b753bac548a63%1756234266.869",
    "soci/4.0.3#a9f8d773cd33e356b5879a4b0564f287%1756234262.318",
    "snappy/1.1.10#968fef506ff261592ec30c574d4a7809%1756234314.246",
    "rocksdb/10.0.1#85537f46e538974d67da0c3977de48ac%1756234304.347",
    "re2/20230301#dfd6e2bf050eb90ddd8729cfb4c844a4%1756234257.976",
    "protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
    "openssl/3.5.4#a1d5835cc6ed5c5b8f3cd5b9b5d24205%1759746684.671",
    "nudb/2.0.9#c62cfd501e57055a7e0d8ee3d5e5427d%1756234237.107",
    "lz4/1.10.0#59fc63cac7f10fbe8e05c7e62c2f3504%1756234228.999",
    "libiconv/1.17#1e65319e945f2d31941a9d28cc13c058%1756223727.64",
    "libbacktrace/cci.20210118#a7691bfccd8caaf66309df196790a5a1%1756230911.03",
    "libarchive/3.8.1#5cf685686322e906cb42706ab7e099a8%1756234256.696",
    "jemalloc/5.3.0#e951da9cf599e956cebc117880d2d9f8%1729241615.244",
    "grpc/1.50.1#02291451d1e17200293a409410d1c4e1%1756234248.958",
    "doctest/2.4.11#a4211dfc329a16ba9f280f9574025659%1756234220.819",
    "date/3.0.4#f74bbba5a08fa388256688743136cb6f%1756234217.493",
    "c-ares/1.34.5#b78b91e7cfb1f11ce777a285bbf169c6%1756234217.915",
    "bzip2/1.0.8#00b4a4658791c1f06914e087f0e792f5%1756234261.716",
    "boost/1.88.0#8852c0b72ce8271fb8ff7c53456d4983%1756223752.326",
    "abseil/20230802.1#f0f91485b111dc9837a68972cb19ca7b%1756234220.907"
  ],
  "build_requires": [
    "zlib/1.3.1#b8bc2603263cf7eccbd6e17e66b0ed76%1756234269.497",
    "strawberryperl/5.32.1.1#707032463aa0620fa17ec0d887f5fe41%1756234281.733",
    "protobuf/3.21.12#d927114e28de9f4691a6bbcdd9a529d1%1756234251.614",
    "nasm/2.16.01#31e26f2ee3c4346ecd347911bd126904%1756234232.901",
    "msys2/cci.latest#5b73b10144f73cc5bfe0572ed9be39e1%1751977009.857",
    "m4/1.4.19#b38ced39a01e31fef5435bc634461fd2%1700758725.451",
    "cmake/3.31.8#dde3bde00bb843687e55aea5afa0e220%1756234232.89",
    "b2/5.3.3#107c15377719889654eb9a162a673975%1756234226.28",
    "automake/1.16.5#b91b7c384c3deaa9d535be02da14d04f%1755524470.56",
    "autoconf/2.71#51077f068e61700d65bb05541ea1e4b0%1731054366.86"
  ],
  "python_requires": [],
  "overrides": {
    "protobuf/3.21.12": [
      null,
      "protobuf/3.21.12"
    ],
    "lz4/1.9.4": [
      "lz4/1.10.0"
    ],
    "boost/1.83.0": [
      "boost/1.88.0"
    ],
    "sqlite3/3.44.2": [
      "sqlite3/3.49.1"
    ]
  },
  "config_requires": []
}
conan/global.conf
Normal file
@@ -0,0 +1,6 @@
# Global configuration for Conan. This is used to set the number of parallel
# downloads, uploads, and build jobs.
core:non_interactive=True
core.download:parallel={{ os.cpu_count() }}
core.upload:parallel={{ os.cpu_count() }}
tools.build:jobs={{ os.cpu_count() - 1 }}
conan/profiles/default
Normal file
@@ -0,0 +1,31 @@
{% set os = detect_api.detect_os() %}
{% set arch = detect_api.detect_arch() %}
{% set compiler, version, compiler_exe = detect_api.detect_default_compiler() %}
{% set compiler_version = version %}
{% if os == "Linux" %}
{% set compiler_version = detect_api.default_compiler_version(compiler, version) %}
{% endif %}

[settings]
os={{ os }}
arch={{ arch }}
build_type=Debug
compiler={{compiler}}
compiler.version={{ compiler_version }}
compiler.cppstd=20
{% if os == "Windows" %}
compiler.runtime=static
{% else %}
compiler.libcxx={{detect_api.detect_libcxx(compiler, version, compiler_exe)}}
{% endif %}

[conf]
{% if compiler == "clang" and compiler_version >= 19 %}
grpc/1.50.1:tools.build:cxxflags+=['-Wno-missing-template-arg-list-after-template-kw']
{% endif %}
{% if compiler == "apple-clang" and compiler_version >= 17 %}
grpc/1.50.1:tools.build:cxxflags+=['-Wno-missing-template-arg-list-after-template-kw']
{% endif %}
{% if compiler == "gcc" and compiler_version < 13 %}
tools.build:cxxflags+=['-Wno-restrict']
{% endif %}
conanfile.py
@@ -1,4 +1,4 @@
|
||||
from conan import ConanFile
|
||||
from conan import ConanFile, __version__ as conan_version
|
||||
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
|
||||
import re
|
||||
|
||||
@@ -24,18 +24,20 @@ class Xrpl(ConanFile):
|
||||
}
|
||||
|
||||
requires = [
|
||||
'date/3.0.1',
|
||||
'grpc/1.50.1',
|
||||
'libarchive/3.6.2',
|
||||
'nudb/2.0.8',
|
||||
'openssl/1.1.1u',
|
||||
'libarchive/3.8.1',
|
||||
'nudb/2.0.9',
|
||||
'openssl/3.5.4',
|
||||
'soci/4.0.3',
|
||||
'xxhash/0.8.2',
|
||||
'zlib/1.2.13',
|
||||
'zlib/1.3.1',
|
||||
]
|
||||
|
||||
test_requires = [
|
||||
'doctest/2.4.11',
|
||||
]
|
||||
|
||||
tool_requires = [
|
||||
'protobuf/3.21.9',
|
||||
'protobuf/3.21.12',
|
||||
]
|
||||
|
||||
default_options = {
|
||||
@@ -87,30 +89,36 @@ class Xrpl(ConanFile):
|
||||
}
|
||||
|
||||
def set_version(self):
|
||||
path = f'{self.recipe_folder}/src/libxrpl/protocol/BuildInfo.cpp'
|
||||
regex = r'versionString\s?=\s?\"(.*)\"'
|
||||
with open(path, 'r') as file:
|
||||
matches = (re.search(regex, line) for line in file)
|
||||
match = next(m for m in matches if m)
|
||||
self.version = match.group(1)
|
||||
if self.version is None:
|
||||
path = f'{self.recipe_folder}/src/libxrpl/protocol/BuildInfo.cpp'
|
||||
regex = r'versionString\s?=\s?\"(.*)\"'
|
||||
with open(path, encoding='utf-8') as file:
|
||||
matches = (re.search(regex, line) for line in file)
|
||||
match = next(m for m in matches if m)
|
||||
self.version = match.group(1)

    def configure(self):
        if self.settings.compiler == 'apple-clang':
            self.options['boost'].visibility = 'global'
        if self.settings.compiler in ['clang', 'gcc']:
            self.options['boost'].without_cobalt = True

    def requirements(self):
        self.requires('boost/1.82.0', force=True)
        self.requires('lz4/1.9.3', force=True)
        self.requires('protobuf/3.21.9', force=True)
        self.requires('sqlite3/3.42.0', force=True)
        # Conan 2 requires transitive headers to be specified
        transitive_headers_opt = {'transitive_headers': True} if conan_version.split('.')[0] == '2' else {}
        self.requires('boost/1.88.0', force=True, **transitive_headers_opt)
        self.requires('date/3.0.4', **transitive_headers_opt)
        self.requires('lz4/1.10.0', force=True)
        self.requires('protobuf/3.21.12', force=True)
        self.requires('sqlite3/3.49.1', force=True)
        if self.options.jemalloc:
            self.requires('jemalloc/5.3.0')
        if self.options.rocksdb:
            self.requires('rocksdb/6.29.5')
            self.requires('rocksdb/10.0.1')
        self.requires('xxhash/0.8.3', **transitive_headers_opt)
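
The `transitive_headers_opt` line is the compatibility shim here: on Conan 2 the option is passed, on Conan 1 it is silently dropped. A standalone sketch of that check (the version strings are illustrative):

```python
def opt_for(conan_version: str) -> dict:
    # Conan 2 requires transitive_headers for dependencies whose headers
    # appear in libxrpl's public interface; Conan 1 does not know the option.
    return {'transitive_headers': True} if conan_version.split('.')[0] == '2' else {}

print(opt_for('2.4.1'))   # -> {'transitive_headers': True}
print(opt_for('1.66.0'))  # -> {}
```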

    exports_sources = (
        'CMakeLists.txt',
        'bin/getRippledInfo',
        'cfg/*',
        'cmake/*',
        'external/*',
@@ -161,7 +169,18 @@ class Xrpl(ConanFile):
        # `include/`, not `include/ripple/proto/`.
        libxrpl.includedirs = ['include', 'include/ripple/proto']
        libxrpl.requires = [
            'boost::boost',
            'boost::headers',
            'boost::chrono',
            'boost::container',
            'boost::coroutine',
            'boost::date_time',
            'boost::filesystem',
            'boost::json',
            'boost::program_options',
            'boost::process',
            'boost::regex',
            'boost::system',
            'boost::thread',
            'date::date',
            'grpc::grpc++',
            'libarchive::libarchive',
@@ -30,7 +30,7 @@ the ledger (so the entire network has the same view). This will help the network
see which validators are **currently** unreliable, and adjust their quorum
calculation accordingly.

*Improving the liveness of the network is the main motivation for the negative UNL.*
_Improving the liveness of the network is the main motivation for the negative UNL._

### Targeted Faults

@@ -53,16 +53,17 @@ even if the number of remaining validators gets to 60%. Say we have a network
with 10 validators on the UNL and everything is operating correctly. The quorum
required for this network would be 8 (80% of 10). When validators fail, the
quorum required would be as low as 6 (60% of 10), which is the absolute
***minimum quorum***. We need the absolute minimum quorum to be strictly greater
**_minimum quorum_**. We need the absolute minimum quorum to be strictly greater
than 50% of the original UNL so that there cannot be two partitions of
well-behaved nodes headed in different directions. We arbitrarily choose 60% as
the minimum quorum to give a margin of safety.

Consider these events in the absence of negative UNL:

1. 1:00pm - validator1 fails, votes vs. quorum: 9 >= 8, we have quorum
1. 3:00pm - validator2 fails, votes vs. quorum: 8 >= 8, we have quorum
1. 5:00pm - validator3 fails, votes vs. quorum: 7 < 8, we don’t have quorum
   * **network cannot validate new ledgers with 3 failed validators**
   - **network cannot validate new ledgers with 3 failed validators**

We're below 80% agreement, so new ledgers cannot be validated. This is how the
XRP Ledger operates today, but if the negative UNL were enabled, the events would
@@ -70,18 +71,20 @@ happen as follows. (Please note that the events below are from a simplified
version of our protocol.)

1. 1:00pm - validator1 fails, votes vs. quorum: 9 >= 8, we have quorum
1. 1:40pm - network adds validator1 to negative UNL, quorum changes to ceil(9 * 0.8), or 8
1. 1:40pm - network adds validator1 to negative UNL, quorum changes to ceil(9 \* 0.8), or 8
1. 3:00pm - validator2 fails, votes vs. quorum: 8 >= 8, we have quorum
1. 3:40pm - network adds validator2 to negative UNL, quorum changes to ceil(8 * 0.8), or 7
1. 3:40pm - network adds validator2 to negative UNL, quorum changes to ceil(8 \* 0.8), or 7
1. 5:00pm - validator3 fails, votes vs. quorum: 7 >= 7, we have quorum
1. 5:40pm - network adds validator3 to negative UNL, quorum changes to ceil(7 * 0.8), or 6
1. 5:40pm - network adds validator3 to negative UNL, quorum changes to ceil(7 \* 0.8), or 6
1. 7:00pm - validator4 fails, votes vs. quorum: 6 >= 6, we have quorum
   * **network can still validate new ledgers with 4 failed validators**
   - **network can still validate new ledgers with 4 failed validators**
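
For readers who want to check the arithmetic, a small sketch that replays the timeline above, with quorum tracking `ceil(0.8 * effective UNL)` and each failed validator negative-listed 40 minutes later:

```python
import math

unl = 10
for failed in range(1, 5):                  # validator1..validator4 fail in turn
    effective = unl - (failed - 1)          # earlier failures are already negative-listed
    quorum = math.ceil(0.8 * effective)
    alive = unl - failed
    print(f"{failed} failed: votes {alive} vs quorum {quorum} -> "
          f"{'have' if alive >= quorum else 'no'} quorum")
# 1 failed: votes 9 vs quorum 8 -> have quorum
# 2 failed: votes 8 vs quorum 8 -> have quorum
# 3 failed: votes 7 vs quorum 7 -> have quorum
# 4 failed: votes 6 vs quorum 6 -> have quorum
```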

## External Interactions

### Message Format Changes

This proposal will:

1. add a new pseudo-transaction type
1. add the negative UNL to the ledger data structure.

@@ -89,19 +92,20 @@ Any tools or systems that rely on the format of this data will have to be
updated.

### Amendment

This feature **will** need an amendment to activate.

## Design

This section discusses the following topics about the Negative UNL design:

* [Negative UNL protocol overview](#Negative-UNL-Protocol-Overview)
* [Validator reliability measurement](#Validator-Reliability-Measurement)
* [Format Changes](#Format-Changes)
* [Negative UNL maintenance](#Negative-UNL-Maintenance)
* [Quorum size calculation](#Quorum-Size-Calculation)
* [Filter validation messages](#Filter-Validation-Messages)
* [High level sequence diagram of code
- [Negative UNL protocol overview](#Negative-UNL-Protocol-Overview)
- [Validator reliability measurement](#Validator-Reliability-Measurement)
- [Format Changes](#Format-Changes)
- [Negative UNL maintenance](#Negative-UNL-Maintenance)
- [Quorum size calculation](#Quorum-Size-Calculation)
- [Filter validation messages](#Filter-Validation-Messages)
- [High level sequence diagram of code
  changes](#High-Level-Sequence-Diagram-of-Code-Changes)

### Negative UNL Protocol Overview

@@ -114,9 +118,9 @@ with V in their UNL adjust the quorum and V’s validation message is not counte
when verifying if a ledger is fully validated. V’s flow of messages and network
interactions, however, will remain the same.

We define the ***effective UNL** = original UNL - negative UNL*, and the
***effective quorum*** as the quorum of the *effective UNL*. And we set
*effective quorum = Ceiling(80% * effective UNL)*.
We define the **\*effective UNL** = original UNL - negative UNL\*, and the
**_effective quorum_** as the quorum of the _effective UNL_. And we set
_effective quorum = Ceiling(80% _ effective UNL)\*.

### Validator Reliability Measurement

@@ -126,16 +130,16 @@ measure about its validators, but we have chosen ledger validation messages.
This is because every validator shall send one and only one signed validation
message per ledger. This keeps the measurement simple and removes
timing/clock-sync issues. A node will measure the percentage of agreeing
validation messages (*PAV*) received from each validator on the node's UNL. Note
validation messages (_PAV_) received from each validator on the node's UNL. Note
that the node will only count the validation messages that agree with its own
validations.

We define the **PAV** as the **P**ercentage of **A**greed **V**alidation
messages received for the last N ledgers, where N = 256 by default.

When the PAV drops below the ***low-water mark***, the validator is considered
When the PAV drops below the **_low-water mark_**, the validator is considered
unreliable, and is a candidate to be disabled by being added to the negative
UNL. A validator must have a PAV higher than the ***high-water mark*** to be
UNL. A validator must have a PAV higher than the **_high-water mark_** to be
re-enabled. The validator is re-enabled by removing it from the negative UNL. In
the implementation, we plan to set the low-water mark as 50% and the high-water
mark as 80%.
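
A compact sketch of these definitions (N and the two water marks are the values stated above):

```python
N = 256                        # measurement window, in ledgers
LOW_WATER, HIGH_WATER = 0.5, 0.8

def pav(agreed: int, n: int = N) -> float:
    """Fraction of the node's own last-n validations this validator agreed with."""
    return agreed / n

print(pav(120))               # 0.46875 -> below low-water: candidate to disable
print(pav(210) > HIGH_WATER)  # True    -> above high-water: candidate to re-enable
```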

@@ -143,22 +147,24 @@ mark as 80%.
### Format Changes

The negative UNL component in a ledger contains three fields.
* ***NegativeUNL***: The current negative UNL, a list of unreliable validators.
* ***ToDisable***: The validator to be added to the negative UNL on the next

- **_NegativeUNL_**: The current negative UNL, a list of unreliable validators.
- **_ToDisable_**: The validator to be added to the negative UNL on the next
  flag ledger.
* ***ToReEnable***: The validator to be removed from the negative UNL on the
- **_ToReEnable_**: The validator to be removed from the negative UNL on the
  next flag ledger.

All three fields are optional. When the *ToReEnable* field exists, the
*NegativeUNL* field cannot be empty.
All three fields are optional. When the _ToReEnable_ field exists, the
_NegativeUNL_ field cannot be empty.

A new pseudo-transaction, ***UNLModify***, is added. It has three fields:
* ***Disabling***: A flag indicating whether the modification is to disable or
A new pseudo-transaction, **_UNLModify_**, is added. It has three fields:

- **_Disabling_**: A flag indicating whether the modification is to disable or
  to re-enable a validator.
* ***Seq***: The ledger sequence number.
* ***Validator***: The validator to be disabled or re-enabled.
- **_Seq_**: The ledger sequence number.
- **_Validator_**: The validator to be disabled or re-enabled.

There would be at most one *disable* `UNLModify` and one *re-enable* `UNLModify`
There would be at most one _disable_ `UNLModify` and one _re-enable_ `UNLModify`
transaction per flag ledger. The full machinery is described further on.

### Negative UNL Maintenance

@@ -167,19 +173,19 @@ The negative UNL can only be modified on the flag ledgers. If a validator's
reliability status changes, it takes two flag ledgers to modify the negative
UNL. Let's see an example of the algorithm:

* Ledger seq = 100: A validator V goes offline.
* Ledger seq = 256: This is a flag ledger, and V's reliability measurement *PAV*
- Ledger seq = 100: A validator V goes offline.
- Ledger seq = 256: This is a flag ledger, and V's reliability measurement _PAV_
  is lower than the low-water mark. Other validators add `UNLModify`
  pseudo-transactions `{true, 256, V}` to the transaction set which goes through
  the consensus. Then the pseudo-transaction is applied to the negative UNL
  ledger component by setting `ToDisable = V`.
* Ledger seq = 257 ~ 511: The negative UNL ledger component is copied from the
- Ledger seq = 257 ~ 511: The negative UNL ledger component is copied from the
  parent ledger.
* Ledger seq=512: This is a flag ledger, and the negative UNL is updated
- Ledger seq=512: This is a flag ledger, and the negative UNL is updated
  `NegativeUNL = NegativeUNL + ToDisable`.

The negative UNL may have up to `MaxNegativeListed = floor(original UNL * 25%)`
validators. The 25% is because of 75% * 80% = 60%, where 75% = 100% - 25%, 80%
validators. The 25% is because of 75% \* 80% = 60%, where 75% = 100% - 25%, 80%
is the quorum of the effective UNL, and 60% is the absolute minimum quorum of
the original UNL. Adding more than 25% validators to the negative UNL does not
improve the liveness of the network, because adding more validators to the
@@ -187,52 +193,43 @@ negative UNL cannot lower the effective quorum.

The following is the detailed algorithm:

* **If** the ledger seq = x is a flag ledger
- **If** the ledger seq = x is a flag ledger
  1. Compute `NegativeUNL = NegativeUNL + ToDisable - ToReEnable` if they
     exist in the parent ledger

  1. Compute `NegativeUNL = NegativeUNL + ToDisable - ToReEnable` if they
     exist in the parent ledger
  1. Try to find a candidate to disable if `sizeof NegativeUNL < MaxNegativeListed`

  1. Try to find a candidate to disable if `sizeof NegativeUNL < MaxNegativeListed`
     1. Find a validator V that has a _PAV_ lower than the low-water
        mark, but is not in `NegativeUNL`.

     1. Find a validator V that has a *PAV* lower than the low-water
        mark, but is not in `NegativeUNL`.
     1. If two or more are found, their public keys are XORed with the hash
        of the parent ledger and the one with the lowest XOR result is chosen.
     1. If V is found, create a `UNLModify` pseudo-transaction
        `TxDisableValidator = {true, x, V}`
  1. Try to find a candidate to re-enable if `sizeof NegativeUNL > 0`:
     1. Find a validator U that is in `NegativeUNL` and has a _PAV_ higher
        than the high-water mark.
     1. If U is not found, try to find one in `NegativeUNL` but not in the
        local _UNL_.
     1. If two or more are found, their public keys are XORed with the hash
        of the parent ledger and the one with the lowest XOR result is chosen.
     1. If U is found, create a `UNLModify` pseudo-transaction
        `TxReEnableValidator = {false, x, U}`

     1. If two or more are found, their public keys are XORed with the hash
        of the parent ledger and the one with the lowest XOR result is chosen.

     1. If V is found, create a `UNLModify` pseudo-transaction
        `TxDisableValidator = {true, x, V}`

  1. Try to find a candidate to re-enable if `sizeof NegativeUNL > 0`:

     1. Find a validator U that is in `NegativeUNL` and has a *PAV* higher
        than the high-water mark.

     1. If U is not found, try to find one in `NegativeUNL` but not in the
        local *UNL*.

     1. If two or more are found, their public keys are XORed with the hash
        of the parent ledger and the one with the lowest XOR result is chosen.

     1. If U is found, create a `UNLModify` pseudo-transaction
        `TxReEnableValidator = {false, x, U}`

  1. If any `UNLModify` pseudo-transactions are created, add them to the
     transaction set. The transaction set goes through the consensus algorithm.

  1. If they have enough support, the `UNLModify` pseudo-transactions remain in the
     transaction set agreed by the validators. Then the pseudo-transactions are
     applied to the ledger:

     1. If there is a `TxDisableValidator`, set `ToDisable=TxDisableValidator.V`.
        Else clear `ToDisable`.

     1. If there is a `TxReEnableValidator`, set
        `ToReEnable=TxReEnableValidator.U`. Else clear `ToReEnable`.

* **Else** (not a flag ledger)
  1. If any `UNLModify` pseudo-transactions are created, add them to the
     transaction set. The transaction set goes through the consensus algorithm.
  1. If they have enough support, the `UNLModify` pseudo-transactions remain in the
     transaction set agreed by the validators. Then the pseudo-transactions are
     applied to the ledger:

  1. Copy the negative UNL ledger component from the parent ledger
     1. If there is a `TxDisableValidator`, set `ToDisable=TxDisableValidator.V`.
        Else clear `ToDisable`.

     1. If there is a `TxReEnableValidator`, set
        `ToReEnable=TxReEnableValidator.U`. Else clear `ToReEnable`.

- **Else** (not a flag ledger)
  1. Copy the negative UNL ledger component from the parent ledger
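
The XOR tie-break above (used whenever two or more candidates qualify) can be sketched as follows. This is an illustration of the idea only; in particular, how the public key is reduced to the ledger-hash length here is an assumption, not the production rule:

```python
import hashlib

def pick_candidate(candidate_pubkeys: list[bytes], parent_ledger_hash: bytes) -> bytes:
    """Pick the candidate whose key XORed with the parent ledger hash is lowest."""
    def score(pubkey: bytes) -> bytes:
        digest = hashlib.sha256(pubkey).digest()  # size-matching step: assumed, illustrative
        return bytes(a ^ b for a, b in zip(digest, parent_ledger_hash))
    return min(candidate_pubkeys, key=score)
```

Because every honest node sees the same parent ledger hash and the same candidate set, they all deterministically pick the same validator, without an extra round of coordination.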

The negative UNL is stored on each ledger because we don't know when a validator
may reconnect to the network. If the negative UNL were stored only on every flag
@@ -273,31 +270,26 @@ not counted when checking if the ledger is fully validated.
The diagram below is the sequence of one round of consensus. Classes and
components with non-trivial changes are colored green.

* The `ValidatorList` class is modified to compute the quorum of the effective
- The `ValidatorList` class is modified to compute the quorum of the effective
  UNL.

* The `Validations` class provides an interface for querying the validation
- The `Validations` class provides an interface for querying the validation
  messages from trusted validators.

* The `ConsensusAdaptor` component:

  * The `RCLConsensus::Adaptor` class is modified for creating `UNLModify`
    Pseudo-Transactions.

  * The `Change` class is modified for applying `UNLModify`
    Pseudo-Transactions.

  * The `Ledger` class is modified for creating and adjusting the negative UNL
    ledger component.

  * The `LedgerMaster` class is modified for filtering out validation messages
    from negative UNL validators when verifying if a ledger is fully
    validated.
- The `ConsensusAdaptor` component:
  - The `RCLConsensus::Adaptor` class is modified for creating `UNLModify`
    Pseudo-Transactions.
  - The `Change` class is modified for applying `UNLModify`
    Pseudo-Transactions.
  - The `Ledger` class is modified for creating and adjusting the negative UNL
    ledger component.
  - The `LedgerMaster` class is modified for filtering out validation messages
    from negative UNL validators when verifying if a ledger is fully
    validated.



## Roads Not Taken

### Use a Mechanism Like Fee Voting to Process UNLModify Pseudo-Transactions
@@ -311,7 +303,7 @@ and different quorums for the same ledger. As a result, the network's safety is
impacted.

This updated version does not impact safety, though it operates a bit more slowly.
The negative UNL modifications in the *UNLModify* pseudo-transaction approved by
The negative UNL modifications in the _UNLModify_ pseudo-transaction approved by
the consensus will take effect at the next flag ledger. The extra time of the
256 ledgers should be enough for nodes to be in sync with the negative UNL
modifications.
@@ -334,29 +326,28 @@ expiration approach cannot be simply applied.
### Validator Reliability Measurement and Flag Ledger Frequency

If the ledger time is about 4.5 seconds and the low-water mark is 50%, then in
the worst case, it takes 48 minutes *((0.5 * 256 + 256 + 256) * 4.5 / 60 = 48)*
the worst case, it takes 48 minutes _((0.5 _ 256 + 256 + 256) _ 4.5 / 60 = 48)_
to put an offline validator on the negative UNL. We considered lowering the flag
ledger frequency so that the negative UNL can be more responsive. We also
considered decoupling the reliability measurement and flag ledger frequency to
be more flexible. In practice, however, their benefits are not clear.
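
The 48-minute figure checks out: half a measurement window for the PAV to fall below 50%, plus up to two flag-ledger periods:

```python
ledgers = 0.5 * 256 + 256 + 256   # = 640 ledgers in the worst case
print(ledgers * 4.5 / 60)         # -> 48.0 minutes at ~4.5 s per ledger
```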


## New Attack Vectors

A group of malicious validators may try to frame a reliable validator and put it
on the negative UNL. But they cannot succeed, because:

1. A reliable validator sends a signed validation message every ledger. A
sufficient peer-to-peer network will propagate the validation messages to other
validators. The validators will decide if another validator is reliable or not
only by its local observation of the validation messages received. So an honest
validator’s vote on another validator’s reliability is accurate.
   sufficient peer-to-peer network will propagate the validation messages to other
   validators. The validators will decide if another validator is reliable or not
   only by its local observation of the validation messages received. So an honest
   validator’s vote on another validator’s reliability is accurate.

1. Given the votes are accurate, and one vote per validator, an honest validator
will not create a UNLModify transaction of a reliable validator.
   will not create a UNLModify transaction of a reliable validator.

1. A validator can be added to a negative UNL only through a UNLModify
transaction.
   transaction.

Assuming the group of malicious validators is less than the quorum, they cannot
frame a reliable validator.
@@ -365,32 +356,32 @@ frame a reliable validator.

The bullet points below briefly summarize the current proposal:

* The motivation of the negative UNL is to improve the liveness of the network.
- The motivation of the negative UNL is to improve the liveness of the network.

* The targeted faults are the ones frequently observed in the production
- The targeted faults are the ones frequently observed in the production
  network.

* Validators propose negative UNL candidates based on their local measurements.
- Validators propose negative UNL candidates based on their local measurements.

* The absolute minimum quorum is 60% of the original UNL.
- The absolute minimum quorum is 60% of the original UNL.

* The format of the ledger is changed, and a new *UNLModify* pseudo-transaction
- The format of the ledger is changed, and a new _UNLModify_ pseudo-transaction
  is added. Any tools or systems that rely on the format of these data will have
  to be updated.

* The negative UNL can only be modified on the flag ledgers.
- The negative UNL can only be modified on the flag ledgers.

* At most one validator can be added to the negative UNL at a flag ledger.
- At most one validator can be added to the negative UNL at a flag ledger.

* At most one validator can be removed from the negative UNL at a flag ledger.
- At most one validator can be removed from the negative UNL at a flag ledger.

* If a validator's reliability status changes, it takes two flag ledgers to
- If a validator's reliability status changes, it takes two flag ledgers to
  modify the negative UNL.

* The quorum is the larger of 80% of the effective UNL and 60% of the original
- The quorum is the larger of 80% of the effective UNL and 60% of the original
  UNL.

* If a validator is on the negative UNL, its validation messages are ignored
- If a validator is on the negative UNL, its validation messages are ignored
  when the local node verifies if a ledger is fully validated.
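
The quorum rule in the summary, as a one-function sketch:

```python
import math

def quorum(original_unl: int, negative_listed: int) -> int:
    """Larger of 80% of the effective UNL and 60% of the original UNL."""
    effective = original_unl - negative_listed
    return max(math.ceil(0.8 * effective), math.ceil(0.6 * original_unl))

print(quorum(10, 0))  # -> 8
print(quorum(10, 2))  # -> 7  (the 25% listing cap stops here for a 10-validator UNL)
print(quorum(10, 4))  # -> 6  (beyond the cap, the 60% floor dominates)
```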

## FAQ
@@ -415,7 +406,7 @@ lower quorum size while keeping the network safe.
validator removed from the negative UNL? </h3>

A validator’s reliability is measured by other validators. If a validator
becomes unreliable, at a flag ledger, other validators propose *UNLModify*
becomes unreliable, at a flag ledger, other validators propose _UNLModify_
pseudo-transactions which vote to add the validator to the negative UNL during
the consensus session. If agreed, the validator is added to the negative UNL at
the next flag ledger. The mechanism of removing a validator from the negative
@@ -423,32 +414,32 @@ UNL is the same.

### Question: Given a negative UNL, what happens if the UNL changes?

Answer: Let’s consider the cases:
Answer: Let’s consider the cases:

1. A validator is added to the UNL, and it is already in the negative UNL. This
case could happen when not all the nodes have the same UNL. Note that the
negative UNL on the ledger lists unreliable nodes that are not necessarily the
validators for everyone.
1. A validator is added to the UNL, and it is already in the negative UNL. This
   case could happen when not all the nodes have the same UNL. Note that the
   negative UNL on the ledger lists unreliable nodes that are not necessarily the
   validators for everyone.

   In this case, the liveness is affected negatively. Because the minimum
   quorum could be larger but the usable validators are not increased.
   In this case, the liveness is affected negatively. Because the minimum
   quorum could be larger but the usable validators are not increased.

1. A validator is removed from the UNL, and it is in the negative UNL.
1. A validator is removed from the UNL, and it is in the negative UNL.

   In this case, the liveness is affected positively. Because the quorum could
   be smaller but the usable validators are not reduced.

1. A validator is added to the UNL, and it is not in the negative UNL.
1. A validator is removed from the UNL, and it is not in the negative UNL.

1. A validator is added to the UNL, and it is not in the negative UNL.
1. A validator is removed from the UNL, and it is not in the negative UNL.

Cases 3 and 4 are not affected by the negative UNL protocol.

### Question: Can we simply lower the quorum to 60% without the negative UNL?
### Question: Can we simply lower the quorum to 60% without the negative UNL?

Answer: No, because the negative UNL approach is safer.

First let’s compare the two approaches intuitively, (1) the *negative UNL*
approach, and (2) *lower quorum*: simply lowering the quorum from 80% to 60%
First let’s compare the two approaches intuitively, (1) the _negative UNL_
approach, and (2) _lower quorum_: simply lowering the quorum from 80% to 60%
without the negative UNL. The negative UNL approach uses consensus to come up
with a list of unreliable validators, which are then removed from the effective
UNL temporarily. With this approach, the list of unreliable validators is agreed
@@ -462,75 +453,75 @@ Next we compare the two approaches quantitatively with examples, and apply
Theorem 8 of [Analysis of the XRP Ledger Consensus
Protocol](https://arxiv.org/abs/1802.07242) paper:

*XRP LCP guarantees fork safety if **O<sub>i,j</sub> > n<sub>j</sub> / 2 +
_XRP LCP guarantees fork safety if **O<sub>i,j</sub> > n<sub>j</sub> / 2 +
n<sub>i</sub> − q<sub>i</sub> + t<sub>i,j</sub>** for every pair of nodes
P<sub>i</sub>, P<sub>j</sub>,*
P<sub>i</sub>, P<sub>j</sub>,_

where *O<sub>i,j</sub>* is the overlapping requirement, n<sub>j</sub> and
where _O<sub>i,j</sub>_ is the overlapping requirement, n<sub>j</sub> and
n<sub>i</sub> are UNL sizes, q<sub>i</sub> is the quorum size of P<sub>i</sub>,
*t<sub>i,j</sub> = min(t<sub>i</sub>, t<sub>j</sub>, O<sub>i,j</sub>)*, and
_t<sub>i,j</sub> = min(t<sub>i</sub>, t<sub>j</sub>, O<sub>i,j</sub>)_, and
t<sub>i</sub> and t<sub>j</sub> are the number of faults that can be tolerated by
P<sub>i</sub> and P<sub>j</sub>.

We denote *UNL<sub>i</sub>* as *P<sub>i</sub>'s UNL*, and *|UNL<sub>i</sub>|* as
the size of *P<sub>i</sub>'s UNL*.
We denote _UNL<sub>i</sub>_ as _P<sub>i</sub>'s UNL_, and _|UNL<sub>i</sub>|_ as
the size of _P<sub>i</sub>'s UNL_.
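
Before walking through the cases, note that the bound is easy to evaluate numerically once both UNL sizes are normalized to 1; a sketch:

```python
def min_overlap(quorum: float, faults: float) -> float:
    """Theorem 8 right-hand side: n/2 + (n - q) + t, with n normalized to 1."""
    n = 1.0
    return n / 2 + (n - quorum) + faults

print(min_overlap(0.80, 0.20))  # -> 0.9: 80% quorum with 20% faults needs > 90% overlap
print(min_overlap(0.60, 0.00))  # -> 0.9: at 60% quorum, 90% overlap tolerates no faults
```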

Assuming *|UNL<sub>i</sub>| = |UNL<sub>j</sub>|*, let's consider the following
Assuming _|UNL<sub>i</sub>| = |UNL<sub>j</sub>|_, let's consider the following
three cases:

1. With 80% quorum and 20% faults, *O<sub>i,j</sub> > 100% / 2 + 100% - 80% +
20% = 90%*. I.e. fork safety requires > 90% UNL overlaps. This is one of the
results in the analysis paper.
1. With 80% quorum and 20% faults, _O<sub>i,j</sub> > 100% / 2 + 100% - 80% +
   20% = 90%_. I.e. fork safety requires > 90% UNL overlaps. This is one of the
   results in the analysis paper.

1. If the quorum is 60%, the relationship between the overlapping requirement
and the faults that can be tolerated is *O<sub>i,j</sub> > 90% +
t<sub>i,j</sub>*. Under the same overlapping condition (i.e. 90%), to guarantee
the fork safety, the network cannot tolerate any faults. So under the same
overlapping condition, if the quorum is simply lowered, the network can tolerate
fewer faults.
1. If the quorum is 60%, the relationship between the overlapping requirement
   and the faults that can be tolerated is _O<sub>i,j</sub> > 90% +
   t<sub>i,j</sub>_. Under the same overlapping condition (i.e. 90%), to guarantee
   the fork safety, the network cannot tolerate any faults. So under the same
   overlapping condition, if the quorum is simply lowered, the network can tolerate
   fewer faults.

1. With the negative UNL approach, we want to argue that the inequation
*O<sub>i,j</sub> > n<sub>j</sub> / 2 + n<sub>i</sub> − q<sub>i</sub> +
t<sub>i,j</sub>* is always true to guarantee fork safety, while the negative UNL
protocol runs, i.e. the effective quorum is lowered without weakening the
network's fault tolerance. To make the discussion easier, we rewrite the
inequation as *O<sub>i,j</sub> > n<sub>j</sub> / 2 + (n<sub>i</sub> −
q<sub>i</sub>) + min(t<sub>i</sub>, t<sub>j</sub>)*, where O<sub>i,j</sub> is
dropped from the definition of t<sub>i,j</sub> because *O<sub>i,j</sub> >
min(t<sub>i</sub>, t<sub>j</sub>)* always holds under the parameters we will
use. Assuming a validator V is added to the negative UNL, now let's consider the
4 cases:
1. With the negative UNL approach, we want to argue that the inequation
   _O<sub>i,j</sub> > n<sub>j</sub> / 2 + n<sub>i</sub> − q<sub>i</sub> +
   t<sub>i,j</sub>_ is always true to guarantee fork safety, while the negative UNL
   protocol runs, i.e. the effective quorum is lowered without weakening the
   network's fault tolerance. To make the discussion easier, we rewrite the
   inequation as _O<sub>i,j</sub> > n<sub>j</sub> / 2 + (n<sub>i</sub> −
   q<sub>i</sub>) + min(t<sub>i</sub>, t<sub>j</sub>)_, where O<sub>i,j</sub> is
   dropped from the definition of t<sub>i,j</sub> because _O<sub>i,j</sub> >
   min(t<sub>i</sub>, t<sub>j</sub>)_ always holds under the parameters we will
   use. Assuming a validator V is added to the negative UNL, now let's consider the
   4 cases:

   1. V is not on UNL<sub>i</sub> nor UNL<sub>j</sub>
   1. V is not on UNL<sub>i</sub> nor UNL<sub>j</sub>

      The inequation holds because none of the variables change.
      The inequation holds because none of the variables change.

   1. V is on UNL<sub>i</sub> but not on UNL<sub>j</sub>
   1. V is on UNL<sub>i</sub> but not on UNL<sub>j</sub>

      The value of *(n<sub>i</sub> − q<sub>i</sub>)* is smaller. The value of
      *min(t<sub>i</sub>, t<sub>j</sub>)* could be smaller too. Other
      variables do not change. Overall, the left side of the inequation does
      not change, but the right side is smaller. So the inequation holds.

   1. V is not on UNL<sub>i</sub> but on UNL<sub>j</sub>
      The value of *(n<sub>i</sub> − q<sub>i</sub>)* is smaller. The value of
      *min(t<sub>i</sub>, t<sub>j</sub>)* could be smaller too. Other
      variables do not change. Overall, the left side of the inequation does
      not change, but the right side is smaller. So the inequation holds.

      The value of *n<sub>j</sub> / 2* is smaller. The value of
      *min(t<sub>i</sub>, t<sub>j</sub>)* could be smaller too. Other
      variables do not change. Overall, the left side of the inequation does
      not change, but the right side is smaller. So the inequation holds.

   1. V is on both UNL<sub>i</sub> and UNL<sub>j</sub>
   1. V is not on UNL<sub>i</sub> but on UNL<sub>j</sub>

      The value of *O<sub>i,j</sub>* is reduced by 1. The values of
      *n<sub>j</sub> / 2*, *(n<sub>i</sub> − q<sub>i</sub>)*, and
      *min(t<sub>i</sub>, t<sub>j</sub>)* are reduced by 0.5, 0.2, and 1
      respectively. The right side is reduced by 1.7. Overall, the left side
      of the inequation is reduced by 1, and the right side is reduced by 1.7.
      So the inequation holds.
      The value of *n<sub>j</sub> / 2* is smaller. The value of
      *min(t<sub>i</sub>, t<sub>j</sub>)* could be smaller too. Other
      variables do not change. Overall, the left side of the inequation does
      not change, but the right side is smaller. So the inequation holds.

   The inequation holds for all the cases. So with the negative UNL approach,
   the network's fork safety is preserved, while the quorum is lowered, which
   increases the network's liveness.
   1. V is on both UNL<sub>i</sub> and UNL<sub>j</sub>

      The value of *O<sub>i,j</sub>* is reduced by 1. The values of
      *n<sub>j</sub> / 2*, *(n<sub>i</sub> − q<sub>i</sub>)*, and
      *min(t<sub>i</sub>, t<sub>j</sub>)* are reduced by 0.5, 0.2, and 1
      respectively. The right side is reduced by 1.7. Overall, the left side
      of the inequation is reduced by 1, and the right side is reduced by 1.7.
      So the inequation holds.

   The inequation holds for all the cases. So with the negative UNL approach,
   the network's fork safety is preserved, while the quorum is lowered, which
   increases the network's liveness.

<h3> Question: We have observed that occasionally a validator wanders off on its
own chain. How is this case handled by the negative UNL algorithm? </h3>
@@ -565,11 +556,11 @@ will be used after that. We want to see the test cases still pass with real
network delay. A test case specifies:

1. a UNL with a different number of validators for different test cases,
1. a network with zero or more non-validator nodes,
1. a network with zero or more non-validator nodes,
1. a sequence of validator reliability change events (by killing/restarting
   nodes, or by running modified rippled that does not send all validation
   messages),
1. the correct outcomes.
1. the correct outcomes.

For all the test cases, the correct outcomes are verified by examining logs. We
will grep the log to see if the correct negative UNLs are generated, and whether
@@ -579,6 +570,7 @@ timing parameters of rippled will be changed to have faster ledger time. Most if
not all test cases do not need client transactions.

For example, the test cases for the prototype:

1. A 10-validator UNL.
1. The network does not have other nodes.
1. The validators will be started from the genesis. Once they start to produce
@@ -587,11 +579,11 @@ For example, the test cases for the prototype:
1. A sequence of events (or the lack of events) such as a killed validator is
   added to the negative UNL.

#### Roads Not Taken: Test with Extended CSF
#### Roads Not Taken: Test with Extended CSF

We considered testing with the current unit test framework, specifically the
[Consensus Simulation
Framework](https://github.com/ripple/rippled/blob/develop/src/test/csf/README.md)
(CSF). However, the CSF currently can only test the generic consensus algorithm
as in the paper: [Analysis of the XRP Ledger Consensus
Protocol](https://arxiv.org/abs/1802.07242).
Protocol](https://arxiv.org/abs/1802.07242).
@@ -5,8 +5,8 @@ skinparam roundcorner 20
skinparam maxmessagesize 160

actor "Rippled Start" as RS
participant "Timer" as T
participant "NetworkOPs" as NOP
participant "Timer" as T
participant "NetworkOPs" as NOP
participant "ValidatorList" as VL #lightgreen
participant "Consensus" as GC
participant "ConsensusAdaptor" as CA #lightgreen
@@ -20,7 +20,7 @@ VL -> NOP
NOP -> VL: update trusted validators
activate VL
VL -> VL: re-calculate quorum
hnote over VL#lightgreen: ignore negative listed validators\nwhen calculating quorum
hnote over VL#lightgreen: ignore negative listed validators\nwhen calculating quorum
VL -> NOP
deactivate VL
NOP -> GC: start round
@@ -36,14 +36,14 @@ activate GC
end

alt phase == OPEN
alt should close ledger
alt should close ledger
GC -> GC: phase = ESTABLISH
GC -> CA: onClose
activate CA
alt sqn%256==0
alt sqn%256==0
CA -[#green]> RM: <font color=green>getValidations
CA -[#green]> CA: <font color=green>create UNLModify Tx
hnote over CA#lightgreen: use validations of the last 256 ledgers\nto figure out UNLModify Tx candidates.\nIf any, create UNLModify Tx, and add to TxSet.
CA -[#green]> CA: <font color=green>create UNLModify Tx
hnote over CA#lightgreen: use validations of the last 256 ledgers\nto figure out UNLModify Tx candidates.\nIf any, create UNLModify Tx, and add to TxSet.
end
CA -> GC
GC -> CA: propose
@@ -61,14 +61,14 @@ else phase == ESTABLISH
CA -> CA : build LCL
hnote over CA #lightgreen: copy negative UNL from parent ledger
alt sqn%256==0
CA -[#green]> CA: <font color=green>Adjust negative UNL
CA -[#green]> CA: <font color=green>Adjust negative UNL
CA -[#green]> CA: <font color=green>apply UNLModify Tx
end
CA -> CA : validate and send validation message
activate NOP
CA -> NOP : end consensus and\n<b>begin next consensus round
deactivate NOP
deactivate CA
deactivate CA
hnote over RM: receive validations
end
else phase == ACCEPTED
@@ -76,4 +76,4 @@ else phase == ACCEPTED
end
deactivate GC

@enduml
@enduml
@@ -82,7 +82,9 @@ pattern and the way coroutines are implemented, where every yield saves the spot
in the code where it left off and every resume jumps back to that spot.
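
A minimal illustration of that save/resume behavior (in Python for brevity; the rippled implementation is C++):

```python
def coroutine():
    print("start")
    yield            # spot 1 is saved here
    print("resumed at spot 1")
    yield            # spot 2 is saved here
    print("resumed at spot 2")

c = coroutine()
next(c)  # runs until spot 1, prints "start"
next(c)  # jumps back to spot 1, runs to spot 2, prints "resumed at spot 1"
```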

### Sequence Diagram

![sequence diagram](./ledger_replay_sequence.png?raw=true "A successful ledger replay")

### Class Diagram

![class diagram](./ledger_replay_classes.png?raw=true "The classes for ledger replay")

@@ -4,7 +4,7 @@ class TimeoutCounter {
#app_ : Application&
}

TimeoutCounter o-- "1" Application
TimeoutCounter o-- "1" Application
': app_

Stoppable <.. Application
@@ -14,13 +14,13 @@ class Application {
-m_inboundLedgers : uptr<InboundLedgers>
}

Application *-- "1" LedgerReplayer
Application *-- "1" LedgerReplayer
': m_ledgerReplayer
Application *-- "1" InboundLedgers
Application *-- "1" InboundLedgers
': m_inboundLedgers

Stoppable <.. InboundLedgers
Application "1" --o InboundLedgers
Application "1" --o InboundLedgers
': app_

class InboundLedgers {
@@ -28,9 +28,9 @@ class InboundLedgers {
}

Stoppable <.. LedgerReplayer
InboundLedgers "1" --o LedgerReplayer
InboundLedgers "1" --o LedgerReplayer
': inboundLedgers_
Application "1" --o LedgerReplayer
Application "1" --o LedgerReplayer
': app_

class LedgerReplayer {
@@ -42,17 +42,17 @@ class LedgerReplayer {
-skipLists_ : hash_map<u256, wptr<SkipListAcquire>>
}

LedgerReplayer *-- LedgerReplayTask
LedgerReplayer *-- LedgerReplayTask
': tasks_
LedgerReplayer o-- LedgerDeltaAcquire
LedgerReplayer o-- LedgerDeltaAcquire
': deltas_
LedgerReplayer o-- SkipListAcquire
LedgerReplayer o-- SkipListAcquire
': skipLists_

TimeoutCounter <.. LedgerReplayTask
InboundLedgers "1" --o LedgerReplayTask
InboundLedgers "1" --o LedgerReplayTask
': inboundLedgers_
LedgerReplayer "1" --o LedgerReplayTask
LedgerReplayer "1" --o LedgerReplayTask
': replayer_

class LedgerReplayTask {
@@ -63,15 +63,15 @@ class LedgerReplayTask {
+addDelta(sptr<LedgerDeltaAcquire>)
}

LedgerReplayTask *-- "1" SkipListAcquire
LedgerReplayTask *-- "1" SkipListAcquire
': skipListAcquirer_
LedgerReplayTask *-- LedgerDeltaAcquire
LedgerReplayTask *-- LedgerDeltaAcquire
': deltas_

TimeoutCounter <.. SkipListAcquire
InboundLedgers "1" --o SkipListAcquire
InboundLedgers "1" --o SkipListAcquire
': inboundLedgers_
LedgerReplayer "1" --o SkipListAcquire
LedgerReplayer "1" --o SkipListAcquire
': replayer_
LedgerReplayTask --o SkipListAcquire : implicit via callback

@@ -83,9 +83,9 @@ class SkipListAcquire {
}

TimeoutCounter <.. LedgerDeltaAcquire
InboundLedgers "1" --o LedgerDeltaAcquire
InboundLedgers "1" --o LedgerDeltaAcquire
': inboundLedgers_
LedgerReplayer "1" --o LedgerDeltaAcquire
LedgerReplayer "1" --o LedgerDeltaAcquire
': replayer_
LedgerReplayTask --o LedgerDeltaAcquire : implicit via callback

@@ -95,4 +95,4 @@ class LedgerDeltaAcquire {
-replayer_ : LedgerReplayer&
-dataReadyCallbacks_ : vector<callback>
}
@enduml
@enduml
@@ -38,7 +38,7 @@ deactivate lr
loop
lr -> lda : make_shared(ledgerId, ledgerSeq)
return delta
lr -> lrt : addDelta(delta)
lr -> lrt : addDelta(delta)
lrt -> lda : addDataCallback(callback)
return
return
@@ -62,7 +62,7 @@ deactivate peer
lr -> lda : processData(ledgerHeader, txns)
lda -> lda : notify()
note over lda: call the callbacks added by\naddDataCallback(callback).
lda -> lrt : callback(ledgerId)
lda -> lrt : callback(ledgerId)
lrt -> lrt : deltaReady(ledgerId)
lrt -> lrt : tryAdvance()
loop as long as child can be built
@@ -82,4 +82,4 @@ deactivate peer
deactivate peer


@enduml
@enduml
@@ -16,5 +16,5 @@
## Function

- Minimize external dependencies
  * Pass options in the ctor instead of using theConfig
  * Use as few other classes as possible
  - Pass options in the ctor instead of using theConfig
  - Use as few other classes as possible
@@ -1,18 +1,18 @@
# Coding Standards

Coding standards used here gradually evolve and propagate through
Coding standards used here gradually evolve and propagate through
code reviews. Some aspects are enforced more strictly than others.

## Rules

These rules only apply to our own code. We can't enforce any sort of
These rules only apply to our own code. We can't enforce any sort of
style on the external repositories and libraries we include. The best
guideline is to maintain the standards that are used in those libraries.

* Tab inserts 4 spaces. No tab characters.
* Braces are indented in the [Allman style][1].
* Modern C++ principles. No naked ```new``` or ```delete```.
* Line lengths limited to 80 characters. Exceptions limited to data and tables.
- Tab inserts 4 spaces. No tab characters.
- Braces are indented in the [Allman style][1].
- Modern C++ principles. No naked `new` or `delete`.
- Line lengths limited to 80 characters. Exceptions limited to data and tables.

## Guidelines

@@ -21,17 +21,17 @@ why you're doing it. Think, use common sense, and consider that this
your changes will probably need to be maintained long after you've
moved on to other projects.

* Use white space and blank lines to guide the eye and keep your intent clear.
* Put private data members at the top of a class, and the 6 public special
  members immediately after, in the following order:
  * Destructor
  * Default constructor
  * Copy constructor
  * Copy assignment
  * Move constructor
  * Move assignment
* Don't over-inline by defining large functions within the class
  declaration, not even for template classes.
- Use white space and blank lines to guide the eye and keep your intent clear.
- Put private data members at the top of a class, and the 6 public special
  members immediately after, in the following order:
  - Destructor
  - Default constructor
  - Copy constructor
  - Copy assignment
  - Move constructor
  - Move assignment
- Don't over-inline by defining large functions within the class
  declaration, not even for template classes.
## Formatting

@@ -39,44 +39,44 @@ The goal of source code formatting should always be to make things as easy to
read as possible. White space is used to guide the eye so that details are not
overlooked. Blank lines are used to separate code into "paragraphs."

* Always place a space before and after all binary operators,
- Always place a space before and after all binary operators,
  especially assignments (`operator=`).
* The `!` operator should be preceded by a space, but not followed by one.
* The `~` operator should be preceded by a space, but not followed by one.
* The `++` and `--` operators should have no spaces between the operator and
- The `!` operator should be preceded by a space, but not followed by one.
- The `~` operator should be preceded by a space, but not followed by one.
- The `++` and `--` operators should have no spaces between the operator and
  the operand.
* A space never appears before a comma, and always appears after a comma.
* Don't put spaces after a parenthesis. A typical member function call might
- A space never appears before a comma, and always appears after a comma.
- Don't put spaces after a parenthesis. A typical member function call might
  look like this: `foobar (1, 2, 3);`
* In general, leave a blank line before an `if` statement.
* In general, leave a blank line after a closing brace `}`.
* Do not place code on the same line as any opening or
- In general, leave a blank line before an `if` statement.
- In general, leave a blank line after a closing brace `}`.
- Do not place code on the same line as any opening or
  closing brace.
* Do not write `if` statements all-on-one-line. The exception to this is when
- Do not write `if` statements all-on-one-line. The exception to this is when
  you've got a sequence of similar `if` statements, and are aligning them all
  vertically to highlight their similarities.
* In an `if-else` statement, if you surround one half of the statement with
- In an `if-else` statement, if you surround one half of the statement with
  braces, you also need to put braces around the other half, to match.
* When writing a pointer type, use this spacing: `SomeObject* myObject`.
- When writing a pointer type, use this spacing: `SomeObject* myObject`.
  Technically, a more correct spacing would be `SomeObject *myObject`, but
  it makes more sense for the asterisk to be grouped with the type name,
  since being a pointer is part of the type, not the variable name. The only
  time that this can lead to any problems is when you're declaring multiple
  pointers of the same type in the same statement - which leads on to the next
  rule:
* When declaring multiple pointers, never do so in a single statement, e.g.
- When declaring multiple pointers, never do so in a single statement, e.g.
  `SomeObject* p1, *p2;` - instead, always split them out onto separate lines
  and write the type name again, to make it quite clear what's going on, and
  avoid the danger of missing out any vital asterisks.
* The previous point also applies to references, so always put the `&` next to
- The previous point also applies to references, so always put the `&` next to
  the type rather than the variable, e.g. `void foo (Thing const& thing)`. And
  don't put a space on both sides of the `*` or `&` - always put a space after
  it, but never before it.
* The word `const` should be placed to the right of the thing that it modifies,
- The word `const` should be placed to the right of the thing that it modifies,
  for consistency. For example `int const` refers to an int which is const.
  `int const*` is a pointer to an int which is const. `int *const` is a const
  pointer to an int.
* Always place a space in between the template angle brackets and the type
- Always place a space in between the template angle brackets and the type
  name. Template code is already hard enough to read!

[1]: http://en.wikipedia.org/wiki/Indent_style#Allman_style
@@ -31,7 +31,7 @@ and header under /opt/local/include:

    $ scons clang profile-jemalloc=/opt/local

----------------------
---

## Using the jemalloc library from within the code

@@ -60,4 +60,3 @@ Linking against the jemalloc library will override
the system's default `malloc()` and related functions with jemalloc's
implementation. This is the case even if the code is not instrumented
to use jemalloc's specific API.
@@ -7,7 +7,6 @@ Install these dependencies:
- [Doxygen](http://www.doxygen.nl): All major platforms have [official binary
  distributions](http://www.doxygen.nl/download.html#srcbin), or you can
  build from [source](http://www.doxygen.nl/download.html#srcbin).

  - MacOS: We recommend installing via Homebrew: `brew install doxygen`.
    The executable will be installed in `/usr/local/bin` which is already
    in the default `PATH`.
@@ -21,18 +20,15 @@ Install these dependencies:
    $ ln -s /Applications/Doxygen.app/Contents/Resources/doxygen /usr/local/bin/doxygen
    ```

- [PlantUML](http://plantuml.com):

- [PlantUML](http://plantuml.com):
  1. Install a functioning Java runtime, if you don't already have one.
  2. Download [`plantuml.jar`](http://sourceforge.net/projects/plantuml/files/plantuml.jar/download).

- [Graphviz](https://www.graphviz.org):

  - Linux: Install from your package manager.
  - Windows: Use an [official installer](https://graphviz.gitlab.io/_pages/Download/Download_windows.html).
  - MacOS: Install via Homebrew: `brew install graphviz`.


## Docker

Instead of installing the above dependencies locally, you can use the official
@@ -40,14 +36,16 @@ build environment Docker image, which has all of them installed already.

1. Install [Docker](https://docs.docker.com/engine/installation/)
2. Pull the image:
```
sudo docker pull rippleci/rippled-ci-builder:2944b78d22db
```
3. Run the image from the project folder:
```
sudo docker run -v $PWD:/opt/rippled --rm rippleci/rippled-ci-builder:2944b78d22db
```

   ```
   sudo docker pull rippleci/rippled-ci-builder:2944b78d22db
   ```

3. Run the image from the project folder:

   ```
   sudo docker run -v $PWD:/opt/rippled --rm rippleci/rippled-ci-builder:2944b78d22db
   ```

## Build

14 docs/build/conan.md vendored

@@ -5,7 +5,6 @@ we should first understand _why_ we use Conan,
and to understand that,
we need to understand how we use CMake.


### CMake

Technically, you don't need CMake to build this project.
@@ -33,9 +32,9 @@ Parameters include:

- where to find the compiler and linker
- where to find dependencies, e.g. libraries and headers
- how to link dependencies, e.g. any special compiler or linker flags that
need to be used with them, including preprocessor definitions
  need to be used with them, including preprocessor definitions
- how to compile translation units, e.g. with optimizations, debug symbols,
position-independent code, etc.
  position-independent code, etc.
- on Windows, which runtime library to link with

For some of these parameters, like the build system and compiler,
@@ -54,7 +53,6 @@ Most humans prefer to put them into a configuration file, once, that
CMake can read every time it is configured.
For CMake, that file is a [toolchain file][toolchain].


### Conan

These next few paragraphs on Conan are going to read much like the ones above
@@ -79,10 +77,10 @@ Those files include:

- A single toolchain file.
- For every dependency, a CMake [package configuration file][pcf],
[package version file][pvf], and for every build type, a package
targets file.
Together, these files implement version checking and define `IMPORTED`
targets for the dependencies.
  [package version file][pvf], and for every build type, a package
  targets file.
  Together, these files implement version checking and define `IMPORTED`
  targets for the dependencies.

The toolchain file itself amends the search path
([`CMAKE_PREFIX_PATH`][prefix_path]) so that [`find_package()`][find_package]
Some files were not shown because too many files have changed in this diff.