mirror of
https://github.com/XRPLF/clio.git
synced 2025-11-04 11:55:51 +00:00
Compare commits
272 Commits
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
7b84fab076 | ||
|
|
74455f5b99 | ||
|
|
d47f3b71bd | ||
|
|
1842f26826 | ||
|
|
ee8a9f5ed0 | ||
|
|
8dbdb9d8e3 | ||
|
|
644a1fdb43 | ||
|
|
58a1833cf2 | ||
|
|
dc8d1658e3 | ||
|
|
73d427c1cb | ||
|
|
c7b637b3f3 | ||
|
|
51150d8474 | ||
|
|
a74970b81e | ||
|
|
b3e63b2491 | ||
|
|
a7f61c3e68 | ||
|
|
862fc48924 | ||
|
|
98ebc92bff | ||
|
|
e98e74d768 | ||
|
|
c94f55b7eb | ||
|
|
0f5da4414c | ||
|
|
33700e3305 | ||
|
|
a7a1a724e2 | ||
|
|
656ab286b6 | ||
|
|
190b5c6a37 | ||
|
|
27fe35a2d1 | ||
|
|
62f55a7dce | ||
|
|
26d663c0be | ||
|
|
9b0dab602f | ||
|
|
97a63db51d | ||
|
|
75c6ad5c8d | ||
|
|
52d6d2c54f | ||
|
|
b89cdb26f2 | ||
|
|
cce695c570 | ||
|
|
cea9c41a88 | ||
|
|
8575f786a8 | ||
|
|
08b02c64cb | ||
|
|
b358649cf9 | ||
|
|
6bd72355db | ||
|
|
a1699d7484 | ||
|
|
957aadd25a | ||
|
|
8f89a5913d | ||
|
|
ecfe5e84e5 | ||
|
|
03c0940649 | ||
|
|
dc5aacfe39 | ||
|
|
3fda74e3f7 | ||
|
|
df27c4e629 | ||
|
|
37ee74c293 | ||
|
|
ec335176bb | ||
|
|
ab33b26ec4 | ||
|
|
28c8fa2a9a | ||
|
|
12bbed194c | ||
|
|
1fa09006f8 | ||
|
|
e3b6fc4bd4 | ||
|
|
34594ff8c0 | ||
|
|
40eeb57920 | ||
|
|
3eb36c049c | ||
|
|
81602e8ae7 | ||
|
|
0cef9e0620 | ||
|
|
81d1b30607 | ||
|
|
923d021c83 | ||
|
|
cd2b09ffb7 | ||
|
|
3c62a1f42c | ||
|
|
f97e0690c8 | ||
|
|
eeaccbabd9 | ||
|
|
13d2d4e2ca | ||
|
|
350a45e7e2 | ||
|
|
ce86572274 | ||
|
|
ac97788db8 | ||
|
|
2893492569 | ||
|
|
b63e98bda0 | ||
|
|
f4df5c2185 | ||
|
|
93d5c12b14 | ||
|
|
2514b7986e | ||
|
|
d30e63d49a | ||
|
|
61f1e0853d | ||
|
|
eb1831c489 | ||
|
|
07bd4b0760 | ||
|
|
e26a1e37b5 | ||
|
|
e89640bcfb | ||
|
|
ae135759ef | ||
|
|
28188aa0f9 | ||
|
|
af485a0634 | ||
|
|
b609298870 | ||
|
|
d077093a8d | ||
|
|
781f3b3c48 | ||
|
|
a8bae96ad4 | ||
|
|
fe9649d872 | ||
|
|
431b5f5ab8 | ||
|
|
b1dc2775fb | ||
|
|
dd35a7cfd2 | ||
|
|
a9d685d5c0 | ||
|
|
6065d324b5 | ||
|
|
fe7b5fe18f | ||
|
|
1c663988f5 | ||
|
|
d11d566121 | ||
|
|
a467cb2526 | ||
|
|
f62e36dc94 | ||
|
|
d933ce2a29 | ||
|
|
db751e3807 | ||
|
|
3c4a8f0cfb | ||
|
|
397ce97175 | ||
|
|
ac6ad13f6c | ||
|
|
7d1d1749bc | ||
|
|
acf359d631 | ||
|
|
a34e107b86 | ||
|
|
b886586de3 | ||
|
|
a57abb15a3 | ||
|
|
c87586a265 | ||
|
|
8172670c93 | ||
|
|
3fdcd3315b | ||
|
|
dd018f1c5e | ||
|
|
c2b462da75 | ||
|
|
252920ec57 | ||
|
|
9ef6801c55 | ||
|
|
24c562fa2a | ||
|
|
35f119a268 | ||
|
|
1be368dcaf | ||
|
|
a5fbb01299 | ||
|
|
3b75d88a35 | ||
|
|
f0224581a5 | ||
|
|
b998473673 | ||
|
|
8ebe2d6a80 | ||
|
|
3bab90ca7a | ||
|
|
74660aebf1 | ||
|
|
db08de466a | ||
|
|
1bacad9e49 | ||
|
|
ca16858878 | ||
|
|
feae85782c | ||
|
|
b016c1d7ba | ||
|
|
0597a9d685 | ||
|
|
05bea6a971 | ||
|
|
fa660ef400 | ||
|
|
25d9e3cc36 | ||
|
|
58f13e1660 | ||
|
|
a16b680a7a | ||
|
|
320ebaa5d2 | ||
|
|
058df4d12a | ||
|
|
5145d07693 | ||
|
|
5e9e5f6f65 | ||
|
|
1ce7bcbc28 | ||
|
|
243858df12 | ||
|
|
b363cc93af | ||
|
|
200d97f0de | ||
|
|
1ec5d3e5a3 | ||
|
|
e062121917 | ||
|
|
1aab2b94b1 | ||
|
|
5de87b9ef8 | ||
|
|
398db13f4d | ||
|
|
5e8ffb66b4 | ||
|
|
939740494b | ||
|
|
ff3d2b5600 | ||
|
|
7080b4d549 | ||
|
|
8d783ecd6a | ||
|
|
5e6682ddc7 | ||
|
|
fca29694a0 | ||
|
|
a541e6d00e | ||
|
|
9bd38dd290 | ||
|
|
f683b25f76 | ||
|
|
91ad1ffc3b | ||
|
|
64b4a908da | ||
|
|
ac752c656e | ||
|
|
4fe868aaeb | ||
|
|
59eb40a1f2 | ||
|
|
0b5f667e4a | ||
|
|
fa42c5c900 | ||
|
|
0818b6ce5b | ||
|
|
e2cc56d25a | ||
|
|
caaa01bf0f | ||
|
|
4b53bef1f5 | ||
|
|
69f5025a29 | ||
|
|
d1c41a8bb7 | ||
|
|
207ba51461 | ||
|
|
ebe7688ccb | ||
|
|
6d9f8a7ead | ||
|
|
6ca777ea96 | ||
|
|
963685dd31 | ||
|
|
e36545058d | ||
|
|
44527140f0 | ||
|
|
0eaaa1fb31 | ||
|
|
1846f629a5 | ||
|
|
83af5af3c6 | ||
|
|
418a0ddbf2 | ||
|
|
6cfbfda014 | ||
|
|
91648f98ad | ||
|
|
71e1637c5f | ||
|
|
59cd2ce5aa | ||
|
|
d783edd57a | ||
|
|
1ce8a58167 | ||
|
|
92e5c4792b | ||
|
|
d7f36733bc | ||
|
|
435d56e7c5 | ||
|
|
bf3b24867c | ||
|
|
ec70127050 | ||
|
|
547cb340bd | ||
|
|
c20b14494a | ||
|
|
696b1a585c | ||
|
|
23442ff1a7 | ||
|
|
db4046e02a | ||
|
|
fc1b5ae4da | ||
|
|
5411fd7497 | ||
|
|
f6488f7024 | ||
|
|
e3ada6c5da | ||
|
|
d61d702ccd | ||
|
|
4d42cb3cdb | ||
|
|
111b55b397 | ||
|
|
c90bc15959 | ||
|
|
1804e3e9c0 | ||
|
|
24f69acd9e | ||
|
|
98d0a963dc | ||
|
|
665890d410 | ||
|
|
545886561f | ||
|
|
68eec01dbc | ||
|
|
02621fe02e | ||
|
|
6ad72446d1 | ||
|
|
1d0a43669b | ||
|
|
71aabc8c29 | ||
|
|
6b98579bfb | ||
|
|
375ac2ffa6 | ||
|
|
c6ca650767 | ||
|
|
2336148d0d | ||
|
|
12178abf4d | ||
|
|
b8705ae086 | ||
|
|
b83d7478ef | ||
|
|
4fd6d51d21 | ||
|
|
d195bdb66d | ||
|
|
50dbb51627 | ||
|
|
2f369e175c | ||
|
|
47e03a7da3 | ||
|
|
d7b84a2e7a | ||
|
|
e79425bc21 | ||
|
|
7710468f37 | ||
|
|
210d7fdbc8 | ||
|
|
ba8e7188ca | ||
|
|
271323b0f4 | ||
|
|
7b306f3ba0 | ||
|
|
73805d44ad | ||
|
|
f19772907d | ||
|
|
616f0176c9 | ||
|
|
9f4f5d319e | ||
|
|
dcbc4577c2 | ||
|
|
f4d8e18bf7 | ||
|
|
b3e001ebfb | ||
|
|
524821c0b0 | ||
|
|
a292a607c2 | ||
|
|
81894c0a90 | ||
|
|
0a7def18cd | ||
|
|
1e969ba13b | ||
|
|
ef62718a27 | ||
|
|
aadd9e50f0 | ||
|
|
d9e89746a4 | ||
|
|
557ea5d7f6 | ||
|
|
4cc3b3ec0f | ||
|
|
a960471ef4 | ||
|
|
871d43c85f | ||
|
|
5ce3fff788 | ||
|
|
a76194d299 | ||
|
|
14f9f98cf2 | ||
|
|
01e4eed130 | ||
|
|
893315c50d | ||
|
|
b83d206ced | ||
|
|
9d28e64383 | ||
|
|
b873af2d43 | ||
|
|
435db339df | ||
|
|
9836e4ceaf | ||
|
|
5d2c079f1a | ||
|
|
244337c5b6 | ||
|
|
b07fbb14dc | ||
|
|
7e8569b03a | ||
|
|
fc0c93b2ee | ||
|
|
8ba7388d58 | ||
|
|
0bbb539d0b | ||
|
|
c50174235f |
@@ -1,7 +1,7 @@
|
||||
---
|
||||
Language: Cpp
|
||||
AccessModifierOffset: -4
|
||||
AlignAfterOpenBracket: AlwaysBreak
|
||||
AlignAfterOpenBracket: BlockIndent
|
||||
AlignConsecutiveAssignments: false
|
||||
AlignConsecutiveDeclarations: false
|
||||
AlignEscapedNewlinesLeft: true
|
||||
@@ -18,20 +18,8 @@ AlwaysBreakBeforeMultilineStrings: true
|
||||
AlwaysBreakTemplateDeclarations: true
|
||||
BinPackArguments: false
|
||||
BinPackParameters: false
|
||||
BraceWrapping:
|
||||
AfterClass: true
|
||||
AfterControlStatement: true
|
||||
AfterEnum: false
|
||||
AfterFunction: true
|
||||
AfterNamespace: false
|
||||
AfterObjCDeclaration: true
|
||||
AfterStruct: true
|
||||
AfterUnion: true
|
||||
BeforeCatch: true
|
||||
BeforeElse: true
|
||||
IndentBraces: false
|
||||
BreakBeforeBinaryOperators: false
|
||||
BreakBeforeBraces: Custom
|
||||
BreakBeforeBraces: WebKit
|
||||
BreakBeforeTernaryOperators: true
|
||||
BreakConstructorInitializersBeforeComma: true
|
||||
ColumnLimit: 120
|
||||
@@ -43,13 +31,15 @@ Cpp11BracedListStyle: true
|
||||
DerivePointerAlignment: false
|
||||
DisableFormat: false
|
||||
ExperimentalAutoDetectBinPacking: false
|
||||
FixNamespaceComments: true
|
||||
ForEachMacros: [ Q_FOREACH, BOOST_FOREACH ]
|
||||
IncludeBlocks: Regroup
|
||||
IncludeCategories:
|
||||
- Regex: '^<(BeastConfig)'
|
||||
Priority: 0
|
||||
- Regex: '^<(ripple)/'
|
||||
- Regex: '^".*"$'
|
||||
Priority: 1
|
||||
- Regex: '^<.*\.(h|hpp)>$'
|
||||
Priority: 2
|
||||
- Regex: '^<(boost)/'
|
||||
- Regex: '^<.*>$'
|
||||
Priority: 3
|
||||
- Regex: '.*'
|
||||
Priority: 4
|
||||
@@ -58,6 +48,8 @@ IndentCaseLabels: true
|
||||
IndentFunctionDeclarationAfterType: false
|
||||
IndentWidth: 4
|
||||
IndentWrappedFunctionNames: false
|
||||
IndentRequiresClause: true
|
||||
RequiresClausePosition: OwnLine
|
||||
KeepEmptyLinesAtTheStartOfBlocks: false
|
||||
MaxEmptyLinesToKeep: 1
|
||||
NamespaceIndentation: None
|
||||
@@ -70,6 +62,7 @@ PenaltyBreakString: 1000
|
||||
PenaltyExcessCharacter: 1000000
|
||||
PenaltyReturnTypeOnItsOwnLine: 200
|
||||
PointerAlignment: Left
|
||||
QualifierAlignment: Right
|
||||
ReflowComments: true
|
||||
SortIncludes: true
|
||||
SpaceAfterCStyleCast: false
|
||||
|
||||
132
.clang-tidy
Normal file
132
.clang-tidy
Normal file
@@ -0,0 +1,132 @@
|
||||
---
|
||||
Checks: '-*,
|
||||
bugprone-argument-comment,
|
||||
bugprone-assert-side-effect,
|
||||
bugprone-bad-signal-to-kill-thread,
|
||||
bugprone-bool-pointer-implicit-conversion,
|
||||
bugprone-copy-constructor-init,
|
||||
bugprone-dangling-handle,
|
||||
bugprone-dynamic-static-initializers,
|
||||
bugprone-empty-catch,
|
||||
bugprone-fold-init-type,
|
||||
bugprone-forward-declaration-namespace,
|
||||
bugprone-inaccurate-erase,
|
||||
bugprone-incorrect-roundings,
|
||||
bugprone-infinite-loop,
|
||||
bugprone-integer-division,
|
||||
bugprone-lambda-function-name,
|
||||
bugprone-macro-parentheses,
|
||||
bugprone-macro-repeated-side-effects,
|
||||
bugprone-misplaced-operator-in-strlen-in-alloc,
|
||||
bugprone-misplaced-pointer-arithmetic-in-alloc,
|
||||
bugprone-misplaced-widening-cast,
|
||||
bugprone-move-forwarding-reference,
|
||||
bugprone-multiple-new-in-one-expression,
|
||||
bugprone-multiple-statement-macro,
|
||||
bugprone-no-escape,
|
||||
bugprone-non-zero-enum-to-bool-conversion,
|
||||
bugprone-parent-virtual-call,
|
||||
bugprone-posix-return,
|
||||
bugprone-redundant-branch-condition,
|
||||
bugprone-reserved-identifier,
|
||||
bugprone-unused-return-value,
|
||||
bugprone-shared-ptr-array-mismatch,
|
||||
bugprone-signal-handler,
|
||||
bugprone-signed-char-misuse,
|
||||
bugprone-sizeof-container,
|
||||
bugprone-sizeof-expression,
|
||||
bugprone-spuriously-wake-up-functions,
|
||||
bugprone-standalone-empty,
|
||||
bugprone-string-constructor,
|
||||
bugprone-string-integer-assignment,
|
||||
bugprone-string-literal-with-embedded-nul,
|
||||
bugprone-stringview-nullptr,
|
||||
bugprone-suspicious-enum-usage,
|
||||
bugprone-suspicious-include,
|
||||
bugprone-suspicious-memory-comparison,
|
||||
bugprone-suspicious-memset-usage,
|
||||
bugprone-suspicious-missing-comma,
|
||||
bugprone-suspicious-realloc-usage,
|
||||
bugprone-suspicious-semicolon,
|
||||
bugprone-suspicious-string-compare,
|
||||
bugprone-swapped-arguments,
|
||||
bugprone-switch-missing-default-case,
|
||||
bugprone-terminating-continue,
|
||||
bugprone-throw-keyword-missing,
|
||||
bugprone-too-small-loop-variable,
|
||||
bugprone-undefined-memory-manipulation,
|
||||
bugprone-undelegated-constructor,
|
||||
bugprone-unhandled-exception-at-new,
|
||||
bugprone-unhandled-self-assignment,
|
||||
bugprone-unique-ptr-array-mismatch,
|
||||
bugprone-unsafe-functions,
|
||||
bugprone-unused-raii,
|
||||
bugprone-use-after-move,
|
||||
bugprone-virtual-near-miss,
|
||||
cppcoreguidelines-init-variables,
|
||||
cppcoreguidelines-misleading-capture-default-by-value,
|
||||
cppcoreguidelines-pro-type-member-init,
|
||||
cppcoreguidelines-pro-type-static-cast-downcast,
|
||||
cppcoreguidelines-rvalue-reference-param-not-moved,
|
||||
cppcoreguidelines-use-default-member-init,
|
||||
cppcoreguidelines-virtual-class-destructor,
|
||||
llvm-namespace-comment,
|
||||
misc-const-correctness,
|
||||
misc-definitions-in-headers,
|
||||
misc-header-include-cycle,
|
||||
misc-include-cleaner,
|
||||
misc-misplaced-const,
|
||||
misc-redundant-expression,
|
||||
misc-static-assert,
|
||||
misc-throw-by-value-catch-by-reference,
|
||||
misc-unused-alias-decls,
|
||||
misc-unused-using-decls,
|
||||
modernize-concat-nested-namespaces,
|
||||
modernize-deprecated-headers,
|
||||
modernize-make-shared,
|
||||
modernize-make-unique,
|
||||
modernize-pass-by-value,
|
||||
modernize-type-traits,
|
||||
modernize-use-emplace,
|
||||
modernize-use-equals-default,
|
||||
modernize-use-equals-delete,
|
||||
modernize-use-override,
|
||||
modernize-use-using,
|
||||
performance-faster-string-find,
|
||||
performance-for-range-copy,
|
||||
performance-implicit-conversion-in-loop,
|
||||
performance-inefficient-vector-operation,
|
||||
performance-move-const-arg,
|
||||
performance-move-constructor-init,
|
||||
performance-no-automatic-move,
|
||||
performance-trivially-destructible,
|
||||
readability-braces-around-statements,
|
||||
readability-const-return-type,
|
||||
readability-container-contains,
|
||||
readability-container-size-empty,
|
||||
readability-convert-member-functions-to-static,
|
||||
readability-duplicate-include,
|
||||
readability-else-after-return,
|
||||
readability-implicit-bool-conversion,
|
||||
readability-inconsistent-declaration-parameter-name,
|
||||
readability-make-member-function-const,
|
||||
readability-misleading-indentation,
|
||||
readability-non-const-parameter,
|
||||
readability-redundant-declaration,
|
||||
readability-redundant-member-init,
|
||||
readability-redundant-string-init,
|
||||
readability-simplify-boolean-expr,
|
||||
readability-static-accessed-through-instance,
|
||||
readability-static-definition-in-anonymous-namespace,
|
||||
readability-suspicious-call-argument
|
||||
'
|
||||
|
||||
CheckOptions:
|
||||
readability-braces-around-statements.ShortStatementLines: 2
|
||||
bugprone-unsafe-functions.ReportMoreUnsafeFunctions: true
|
||||
bugprone-unused-return-value.CheckedReturnTypes: ::std::error_code;::std::error_condition;::std::errc;::std::expected
|
||||
misc-include-cleaner.IgnoreHeaders: '.*/(detail|impl)/.*'
|
||||
|
||||
HeaderFilterRegex: '^.*/(src|unittests)/.*\.(h|hpp)$'
|
||||
WarningsAsErrors: '*'
|
||||
|
||||
5
.clangd
Normal file
5
.clangd
Normal file
@@ -0,0 +1,5 @@
|
||||
Diagnostics:
|
||||
UnusedIncludes: Strict
|
||||
MissingIncludes: Strict
|
||||
Includes:
|
||||
IgnoreHeader: ".*/(detail|impl)/.*"
|
||||
245
.cmake-format.yaml
Normal file
245
.cmake-format.yaml
Normal file
@@ -0,0 +1,245 @@
|
||||
_help_parse: Options affecting listfile parsing
|
||||
parse:
|
||||
_help_additional_commands:
|
||||
- Specify structure for custom cmake functions
|
||||
additional_commands:
|
||||
foo:
|
||||
flags:
|
||||
- BAR
|
||||
- BAZ
|
||||
kwargs:
|
||||
HEADERS: '*'
|
||||
SOURCES: '*'
|
||||
DEPENDS: '*'
|
||||
_help_override_spec:
|
||||
- Override configurations per-command where available
|
||||
override_spec: {}
|
||||
_help_vartags:
|
||||
- Specify variable tags.
|
||||
vartags: []
|
||||
_help_proptags:
|
||||
- Specify property tags.
|
||||
proptags: []
|
||||
_help_format: Options affecting formatting.
|
||||
format:
|
||||
_help_disable:
|
||||
- Disable formatting entirely, making cmake-format a no-op
|
||||
disable: false
|
||||
_help_line_width:
|
||||
- How wide to allow formatted cmake files
|
||||
line_width: 120
|
||||
_help_tab_size:
|
||||
- How many spaces to tab for indent
|
||||
tab_size: 2
|
||||
_help_use_tabchars:
|
||||
- If true, lines are indented using tab characters (utf-8
|
||||
- 0x09) instead of <tab_size> space characters (utf-8 0x20).
|
||||
- In cases where the layout would require a fractional tab
|
||||
- character, the behavior of the fractional indentation is
|
||||
- governed by <fractional_tab_policy>
|
||||
use_tabchars: false
|
||||
_help_fractional_tab_policy:
|
||||
- If <use_tabchars> is True, then the value of this variable
|
||||
- indicates how fractional indentions are handled during
|
||||
- whitespace replacement. If set to 'use-space', fractional
|
||||
- indentation is left as spaces (utf-8 0x20). If set to
|
||||
- '`round-up` fractional indentation is replaced with a single'
|
||||
- tab character (utf-8 0x09) effectively shifting the column
|
||||
- to the next tabstop
|
||||
fractional_tab_policy: use-space
|
||||
_help_max_subgroups_hwrap:
|
||||
- If an argument group contains more than this many sub-groups
|
||||
- (parg or kwarg groups) then force it to a vertical layout.
|
||||
max_subgroups_hwrap: 4
|
||||
_help_max_pargs_hwrap:
|
||||
- If a positional argument group contains more than this many
|
||||
- arguments, then force it to a vertical layout.
|
||||
max_pargs_hwrap: 6
|
||||
_help_max_rows_cmdline:
|
||||
- If a cmdline positional group consumes more than this many
|
||||
- lines without nesting, then invalidate the layout (and nest)
|
||||
max_rows_cmdline: 2
|
||||
_help_separate_ctrl_name_with_space:
|
||||
- If true, separate flow control names from their parentheses
|
||||
- with a space
|
||||
separate_ctrl_name_with_space: true
|
||||
_help_separate_fn_name_with_space:
|
||||
- If true, separate function names from parentheses with a
|
||||
- space
|
||||
separate_fn_name_with_space: false
|
||||
_help_dangle_parens:
|
||||
- If a statement is wrapped to more than one line, than dangle
|
||||
- the closing parenthesis on its own line.
|
||||
dangle_parens: true
|
||||
_help_dangle_align:
|
||||
- If the trailing parenthesis must be 'dangled' on its on
|
||||
- 'line, then align it to this reference: `prefix`: the start'
|
||||
- 'of the statement, `prefix-indent`: the start of the'
|
||||
- 'statement, plus one indentation level, `child`: align to'
|
||||
- the column of the arguments
|
||||
dangle_align: prefix
|
||||
_help_min_prefix_chars:
|
||||
- If the statement spelling length (including space and
|
||||
- parenthesis) is smaller than this amount, then force reject
|
||||
- nested layouts.
|
||||
min_prefix_chars: 4
|
||||
_help_max_prefix_chars:
|
||||
- If the statement spelling length (including space and
|
||||
- parenthesis) is larger than the tab width by more than this
|
||||
- amount, then force reject un-nested layouts.
|
||||
max_prefix_chars: 10
|
||||
_help_max_lines_hwrap:
|
||||
- If a candidate layout is wrapped horizontally but it exceeds
|
||||
- this many lines, then reject the layout.
|
||||
max_lines_hwrap: 2
|
||||
_help_line_ending:
|
||||
- What style line endings to use in the output.
|
||||
line_ending: unix
|
||||
_help_command_case:
|
||||
- Format command names consistently as 'lower' or 'upper' case
|
||||
command_case: canonical
|
||||
_help_keyword_case:
|
||||
- Format keywords consistently as 'lower' or 'upper' case
|
||||
keyword_case: unchanged
|
||||
_help_always_wrap:
|
||||
- A list of command names which should always be wrapped
|
||||
always_wrap: []
|
||||
_help_enable_sort:
|
||||
- If true, the argument lists which are known to be sortable
|
||||
- will be sorted lexicographicall
|
||||
enable_sort: true
|
||||
_help_autosort:
|
||||
- If true, the parsers may infer whether or not an argument
|
||||
- list is sortable (without annotation).
|
||||
autosort: true
|
||||
_help_require_valid_layout:
|
||||
- By default, if cmake-format cannot successfully fit
|
||||
- everything into the desired linewidth it will apply the
|
||||
- last, most agressive attempt that it made. If this flag is
|
||||
- True, however, cmake-format will print error, exit with non-
|
||||
- zero status code, and write-out nothing
|
||||
require_valid_layout: false
|
||||
_help_layout_passes:
|
||||
- A dictionary mapping layout nodes to a list of wrap
|
||||
- decisions. See the documentation for more information.
|
||||
layout_passes: {}
|
||||
_help_markup: Options affecting comment reflow and formatting.
|
||||
markup:
|
||||
_help_bullet_char:
|
||||
- What character to use for bulleted lists
|
||||
bullet_char: '*'
|
||||
_help_enum_char:
|
||||
- What character to use as punctuation after numerals in an
|
||||
- enumerated list
|
||||
enum_char: .
|
||||
_help_first_comment_is_literal:
|
||||
- If comment markup is enabled, don't reflow the first comment
|
||||
- block in each listfile. Use this to preserve formatting of
|
||||
- your copyright/license statements.
|
||||
first_comment_is_literal: false
|
||||
_help_literal_comment_pattern:
|
||||
- If comment markup is enabled, don't reflow any comment block
|
||||
- which matches this (regex) pattern. Default is `None`
|
||||
- (disabled).
|
||||
literal_comment_pattern: null
|
||||
_help_fence_pattern:
|
||||
- Regular expression to match preformat fences in comments
|
||||
- default= ``r'^\s*([`~]{3}[`~]*)(.*)$'``
|
||||
fence_pattern: ^\s*([`~]{3}[`~]*)(.*)$
|
||||
_help_ruler_pattern:
|
||||
- Regular expression to match rulers in comments default=
|
||||
- '``r''^\s*[^\w\s]{3}.*[^\w\s]{3}$''``'
|
||||
ruler_pattern: ^\s*[^\w\s]{3}.*[^\w\s]{3}$
|
||||
_help_explicit_trailing_pattern:
|
||||
- If a comment line matches starts with this pattern then it
|
||||
- is explicitly a trailing comment for the preceeding
|
||||
- argument. Default is '#<'
|
||||
explicit_trailing_pattern: '#<'
|
||||
_help_hashruler_min_length:
|
||||
- If a comment line starts with at least this many consecutive
|
||||
- hash characters, then don't lstrip() them off. This allows
|
||||
- for lazy hash rulers where the first hash char is not
|
||||
- separated by space
|
||||
hashruler_min_length: 10
|
||||
_help_canonicalize_hashrulers:
|
||||
- If true, then insert a space between the first hash char and
|
||||
- remaining hash chars in a hash ruler, and normalize its
|
||||
- length to fill the column
|
||||
canonicalize_hashrulers: true
|
||||
_help_enable_markup:
|
||||
- enable comment markup parsing and reflow
|
||||
enable_markup: true
|
||||
_help_lint: Options affecting the linter
|
||||
lint:
|
||||
_help_disabled_codes:
|
||||
- a list of lint codes to disable
|
||||
disabled_codes: []
|
||||
_help_function_pattern:
|
||||
- regular expression pattern describing valid function names
|
||||
function_pattern: '[0-9a-z_]+'
|
||||
_help_macro_pattern:
|
||||
- regular expression pattern describing valid macro names
|
||||
macro_pattern: '[0-9A-Z_]+'
|
||||
_help_global_var_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- variables with global (cache) scope
|
||||
global_var_pattern: '[A-Z][0-9A-Z_]+'
|
||||
_help_internal_var_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- variables with global scope (but internal semantic)
|
||||
internal_var_pattern: _[A-Z][0-9A-Z_]+
|
||||
_help_local_var_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- variables with local scope
|
||||
local_var_pattern: '[a-z][a-z0-9_]+'
|
||||
_help_private_var_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- privatedirectory variables
|
||||
private_var_pattern: _[0-9a-z_]+
|
||||
_help_public_var_pattern:
|
||||
- regular expression pattern describing valid names for public
|
||||
- directory variables
|
||||
public_var_pattern: '[A-Z][0-9A-Z_]+'
|
||||
_help_argument_var_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- function/macro arguments and loop variables.
|
||||
argument_var_pattern: '[a-z][a-z0-9_]+'
|
||||
_help_keyword_pattern:
|
||||
- regular expression pattern describing valid names for
|
||||
- keywords used in functions or macros
|
||||
keyword_pattern: '[A-Z][0-9A-Z_]+'
|
||||
_help_max_conditionals_custom_parser:
|
||||
- In the heuristic for C0201, how many conditionals to match
|
||||
- within a loop in before considering the loop a parser.
|
||||
max_conditionals_custom_parser: 2
|
||||
_help_min_statement_spacing:
|
||||
- Require at least this many newlines between statements
|
||||
min_statement_spacing: 1
|
||||
_help_max_statement_spacing:
|
||||
- Require no more than this many newlines between statements
|
||||
max_statement_spacing: 2
|
||||
max_returns: 6
|
||||
max_branches: 12
|
||||
max_arguments: 5
|
||||
max_localvars: 15
|
||||
max_statements: 50
|
||||
_help_encode: Options affecting file encoding
|
||||
encode:
|
||||
_help_emit_byteorder_mark:
|
||||
- If true, emit the unicode byte-order mark (BOM) at the start
|
||||
- of the file
|
||||
emit_byteorder_mark: false
|
||||
_help_input_encoding:
|
||||
- Specify the encoding of the input file. Defaults to utf-8
|
||||
input_encoding: utf-8
|
||||
_help_output_encoding:
|
||||
- Specify the encoding of the output file. Defaults to utf-8.
|
||||
- Note that cmake only claims to support utf-8 so be careful
|
||||
- when using anything else
|
||||
output_encoding: utf-8
|
||||
_help_misc: Miscellaneous configurations options.
|
||||
misc:
|
||||
_help_per_command:
|
||||
- A dictionary containing any per-command configuration
|
||||
- overrides. Currently only `command_case` is supported.
|
||||
per_command: {}
|
||||
11
.codecov.yml
Normal file
11
.codecov.yml
Normal file
@@ -0,0 +1,11 @@
|
||||
coverage:
|
||||
status:
|
||||
project:
|
||||
default:
|
||||
target: 50%
|
||||
threshold: 2%
|
||||
|
||||
patch:
|
||||
default:
|
||||
target: 20% # Need to bump this number https://docs.codecov.com/docs/commit-status#patch-status
|
||||
threshold: 2%
|
||||
62
.githooks/check-docs
Executable file
62
.githooks/check-docs
Executable file
@@ -0,0 +1,62 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Note: This script is intended to be run from the root of the repository.
|
||||
#
|
||||
# Not really a hook but should be used to check the completness of documentation for added code, otherwise CI will come for you.
|
||||
# It's good to have /tmp as the output so that consecutive runs are fast but no clutter in the repository.
|
||||
|
||||
echo "+ Checking documentation..."
|
||||
|
||||
ROOT=$(pwd)
|
||||
DOXYGEN=$(command -v doxygen)
|
||||
TMPDIR=${ROOT}/.cache/doxygen
|
||||
TMPFILE=${TMPDIR}/docs.log
|
||||
DOCDIR=${TMPDIR}/out
|
||||
|
||||
if [ -z "$DOXYGEN" ]; then
|
||||
# No hard error if doxygen is not installed yet
|
||||
cat <<EOF
|
||||
|
||||
WARNING
|
||||
-----------------------------------------------------------------------------
|
||||
'doxygen' is required to check documentation.
|
||||
Please install it for next time. For the time being it's on CI.
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 0
|
||||
fi
|
||||
|
||||
mkdir -p ${DOCDIR} > /dev/null 2>&1
|
||||
pushd ${DOCDIR} > /dev/null 2>&1
|
||||
|
||||
cat ${ROOT}/docs/Doxyfile | \
|
||||
sed \
|
||||
-e "s/\${LINT}/YES/" \
|
||||
-e "s!\${SOURCE}!${ROOT}!" \
|
||||
-e "s/\${USE_DOT}/NO/" \
|
||||
-e "s/\${EXCLUDES}/impl/" \
|
||||
| ${DOXYGEN} - 2> ${TMPFILE} 1> /dev/null
|
||||
|
||||
# We don't want to check for default values and typedefs as well as for member variables
|
||||
OUT=$(cat ${TMPFILE} \
|
||||
| grep -v "=default" \
|
||||
| grep -v "\(variable\)" \
|
||||
| grep -v "\(typedef\)")
|
||||
|
||||
rm -rf ${TMPFILE} > /dev/null 2>&1
|
||||
popd > /dev/null 2>&1
|
||||
|
||||
if [[ ! -z "$OUT" ]]; then
|
||||
cat <<EOF
|
||||
|
||||
ERROR
|
||||
-----------------------------------------------------------------------------
|
||||
Found issues with documentation:
|
||||
|
||||
$OUT
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 2
|
||||
fi
|
||||
99
.githooks/check-format
Executable file
99
.githooks/check-format
Executable file
@@ -0,0 +1,99 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Note: This script is intended to be run from the root of the repository.
|
||||
#
|
||||
# This script checks the format of the code and cmake files.
|
||||
# In many cases it will automatically fix the issues and abort the commit.
|
||||
|
||||
echo "+ Checking code format..."
|
||||
|
||||
# paths to check and re-format
|
||||
sources="src unittests"
|
||||
formatter="clang-format -i"
|
||||
version=$($formatter --version | grep -o '[0-9\.]*')
|
||||
|
||||
if [[ "17.0.0" > "$version" ]]; then
|
||||
cat <<EOF
|
||||
|
||||
ERROR
|
||||
-----------------------------------------------------------------------------
|
||||
A minimum of version 17 of `which clang-format` is required.
|
||||
Your version is $version.
|
||||
Please fix paths and run again.
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 3
|
||||
fi
|
||||
|
||||
# check there is no .h headers, only .hpp
|
||||
wrong_headers=$(find $sources -name "*.h" | sed 's/^/ - /')
|
||||
if [[ ! -z "$wrong_headers" ]]; then
|
||||
cat <<EOF
|
||||
|
||||
ERROR
|
||||
-----------------------------------------------------------------------------
|
||||
Found .h headers in the source code. Please rename them to .hpp:
|
||||
|
||||
$wrong_headers
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 2
|
||||
fi
|
||||
|
||||
if ! command -v cmake-format &> /dev/null; then
|
||||
cat <<EOF
|
||||
|
||||
ERROR
|
||||
-----------------------------------------------------------------------------
|
||||
'cmake-format' is required to run this script.
|
||||
Please install it and run again.
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 3
|
||||
fi
|
||||
|
||||
function grep_code {
|
||||
grep -l "${1}" ${sources} -r --include \*.hpp --include \*.cpp
|
||||
}
|
||||
|
||||
if [[ "$OSTYPE" == "darwin"* ]]; then
|
||||
# make all includes to be <...> style
|
||||
grep_code '#include ".*"' | xargs sed -i '' -E 's|#include "(.*)"|#include <\1>|g'
|
||||
|
||||
# make local includes to be "..." style
|
||||
main_src_dirs=$(find ./src -maxdepth 1 -type d -exec basename {} \; | tr '\n' '|' | sed 's/|$//' | sed 's/|/\\|/g')
|
||||
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i '' -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
|
||||
else
|
||||
# make all includes to be <...> style
|
||||
grep_code '#include ".*"' | xargs sed -i -E 's|#include "(.*)"|#include <\1>|g'
|
||||
|
||||
# make local includes to be "..." style
|
||||
main_src_dirs=$(find ./src -type d -maxdepth 1 -exec basename {} \; | paste -sd '|' | sed 's/|/\\|/g')
|
||||
grep_code "#include <\($main_src_dirs\)/.*>" | xargs sed -i -E "s|#include <(($main_src_dirs)/.*)>|#include \"\1\"|g"
|
||||
fi
|
||||
|
||||
cmake_dirs=$(echo cmake $sources)
|
||||
cmake_files=$(find $cmake_dirs -type f \( -name "CMakeLists.txt" -o -name "*.cmake" \))
|
||||
cmake_files=$(echo $cmake_files ./CMakeLists.txt)
|
||||
|
||||
first=$(git diff $sources $cmake_files)
|
||||
find $sources -type f \( -name '*.cpp' -o -name '*.hpp' -o -name '*.ipp' \) -print0 | xargs -0 $formatter
|
||||
cmake-format -i $cmake_files
|
||||
second=$(git diff $sources $cmake_files)
|
||||
changes=$(diff <(echo "$first") <(echo "$second") | wc -l | sed -e 's/^[[:space:]]*//')
|
||||
|
||||
if [ "$changes" != "0" ]; then
|
||||
cat <<\EOF
|
||||
|
||||
WARNING
|
||||
-----------------------------------------------------------------------------
|
||||
Automatically re-formatted code with 'clang-format' - commit was aborted.
|
||||
Please manually add any updated files and commit again.
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 1
|
||||
fi
|
||||
@@ -1,20 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Pushing a release branch requires an annotated tag at the released commit
|
||||
branch=$(git rev-parse --abbrev-ref HEAD)
|
||||
|
||||
if [[ $branch =~ master ]]; then
|
||||
# check if HEAD commit is tagged
|
||||
if ! git describe --exact-match HEAD; then
|
||||
echo "Commits to master must be tagged"
|
||||
exit 1
|
||||
fi
|
||||
elif [[ $branch =~ release/* ]]; then
|
||||
IFS=/ read -r branch rel_ver <<< ${branch}
|
||||
tag=$(git describe --tags --abbrev=0)
|
||||
if [[ "${rel_ver}" != "${tag}" ]]; then
|
||||
echo "release/${rel_ver} branches must have annotated tag ${rel_ver}"
|
||||
echo "git tag -am\"${rel_ver}\" ${rel_ver}"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
@@ -1,27 +1,6 @@
|
||||
#!/bin/bash
|
||||
|
||||
exec 1>&2
|
||||
# This script is intended to be run from the root of the repository.
|
||||
|
||||
# paths to check and re-format
|
||||
sources="src unittests"
|
||||
formatter="clang-format-11 -i"
|
||||
|
||||
first=$(git diff $sources)
|
||||
find $sources -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 $formatter
|
||||
second=$(git diff $sources)
|
||||
changes=$(diff <(echo "$first") <(echo "$second") | wc -l | sed -e 's/^[[:space:]]*//')
|
||||
|
||||
if [ "$changes" != "0" ]; then
|
||||
cat <<\EOF
|
||||
|
||||
WARNING
|
||||
-----------------------------------------------------------------------------
|
||||
Automatically re-formatted code with `clang-format` - commit was aborted.
|
||||
Please manually add any updated files and commit again.
|
||||
-----------------------------------------------------------------------------
|
||||
|
||||
EOF
|
||||
exit 1
|
||||
fi
|
||||
|
||||
.githooks/ensure_release_tag
|
||||
source .githooks/check-format
|
||||
source .githooks/check-docs
|
||||
|
||||
30
.github/ISSUE_TEMPLATE/bug_report.md
vendored
Normal file
30
.github/ISSUE_TEMPLATE/bug_report.md
vendored
Normal file
@@ -0,0 +1,30 @@
|
||||
---
|
||||
name: Bug report
|
||||
about: Create a report to help us improve
|
||||
title: "[Title with short description] (Version: [Clio version])"
|
||||
labels: bug
|
||||
assignees: ''
|
||||
|
||||
---
|
||||
|
||||
<!-- Please search existing issues to avoid creating duplicates. -->
|
||||
|
||||
## Issue Description
|
||||
<!-- Provide a summary for your issue/bug. -->
|
||||
|
||||
## Steps to Reproduce
|
||||
<!-- List in detail the exact steps to reproduce the unexpected behavior of the software. -->
|
||||
|
||||
## Expected Result
|
||||
<!-- Explain in detail what behavior you expected to happen. -->
|
||||
|
||||
## Actual Result
|
||||
<!-- Explain in detail what behavior actually happened. -->
|
||||
|
||||
## Environment
|
||||
<!-- Please describe your environment setup (such as Ubuntu 20.04.2 with Boost 1.82). -->
|
||||
<!-- Please use the version returned by './clio_server --version' as the version number -->
|
||||
|
||||
## Supporting Files
|
||||
<!-- If you have supporting files such as a log, feel free to post a link here using Github Gist. -->
|
||||
<!-- Consider adding configuration files with private information removed via Github Gist. -->
|
||||
22
.github/ISSUE_TEMPLATE/feature_request.md
vendored
Normal file
22
.github/ISSUE_TEMPLATE/feature_request.md
vendored
Normal file
@@ -0,0 +1,22 @@
|
||||
---
|
||||
name: Feature request
|
||||
about: Suggest an idea for this project
|
||||
title: "[Title with short description] (Version: [Clio version])"
|
||||
labels: enhancement
|
||||
assignees: ''
|
||||
|
||||
---
|
||||
|
||||
<!-- Please search existing issues to avoid creating duplicates. -->
|
||||
|
||||
## Summary
|
||||
<!-- Provide a summary to the feature request -->
|
||||
|
||||
## Motivation
|
||||
<!-- Why do we need this feature? -->
|
||||
|
||||
## Solution
|
||||
<!-- What is the solution? -->
|
||||
|
||||
## Paths Not Taken
|
||||
<!-- What other alternatives have been considered? -->
|
||||
17
.github/ISSUE_TEMPLATE/question.md
vendored
Normal file
17
.github/ISSUE_TEMPLATE/question.md
vendored
Normal file
@@ -0,0 +1,17 @@
|
||||
---
|
||||
name: Question
|
||||
about: A question in form of an issue
|
||||
title: "[Title with short description] (Version: Clio version)"
|
||||
labels: question
|
||||
assignees: ''
|
||||
|
||||
---
|
||||
|
||||
<!-- Please search existing issues to avoid creating duplicates. -->
|
||||
<!-- Consider starting a [discussion](https://github.com/XRPLF/clio/discussions) instead. -->
|
||||
|
||||
## Question
|
||||
<!-- Your question -->
|
||||
|
||||
## Paths Not Taken
|
||||
<!-- If applicable, what other alternatives have been considered? -->
|
||||
18
.github/actions/build_clio/action.yml
vendored
Normal file
18
.github/actions/build_clio/action.yml
vendored
Normal file
@@ -0,0 +1,18 @@
|
||||
name: Build clio
|
||||
description: Build clio in build directory
|
||||
inputs:
|
||||
target:
|
||||
description: Build target name
|
||||
default: all
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Get number of threads
|
||||
uses: ./.github/actions/get_number_of_threads
|
||||
id: number_of_threads
|
||||
|
||||
- name: Build Clio
|
||||
shell: bash
|
||||
run: |
|
||||
cd build
|
||||
cmake --build . --parallel ${{ steps.number_of_threads.outputs.threads_number }} --target ${{ inputs.target }}
|
||||
21
.github/actions/code_coverage/action.yml
vendored
Normal file
21
.github/actions/code_coverage/action.yml
vendored
Normal file
@@ -0,0 +1,21 @@
|
||||
name: Generate code coverage report
|
||||
description: Run tests, generate code coverage report and upload it to codecov.io
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Run tests
|
||||
shell: bash
|
||||
run: |
|
||||
build/clio_tests --backend_host=scylladb
|
||||
|
||||
- name: Run gcovr
|
||||
shell: bash
|
||||
run: |
|
||||
gcovr -e unittests --xml build/coverage_report.xml -j8 --exclude-throw-branches
|
||||
|
||||
- name: Archive coverage report
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: coverage-report.xml
|
||||
path: build/coverage_report.xml
|
||||
retention-days: 30
|
||||
41
.github/actions/generate/action.yml
vendored
Normal file
41
.github/actions/generate/action.yml
vendored
Normal file
@@ -0,0 +1,41 @@
|
||||
name: Run conan and cmake
|
||||
description: Run conan and cmake
|
||||
inputs:
|
||||
conan_profile:
|
||||
description: Conan profile name
|
||||
required: true
|
||||
conan_cache_hit:
|
||||
description: Whether conan cache has been downloaded
|
||||
required: true
|
||||
default: 'false'
|
||||
build_type:
|
||||
description: Build type for third-party libraries and clio. Could be 'Release', 'Debug'
|
||||
required: true
|
||||
default: 'Release'
|
||||
code_coverage:
|
||||
description: Whether conan's coverage option should be on or not
|
||||
required: true
|
||||
default: 'false'
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Create build directory
|
||||
shell: bash
|
||||
run: mkdir -p build
|
||||
|
||||
- name: Run conan
|
||||
shell: bash
|
||||
env:
|
||||
BUILD_OPTION: "${{ inputs.conan_cache_hit == 'true' && 'missing' || '' }}"
|
||||
CODE_COVERAGE: "${{ inputs.code_coverage == 'true' && 'True' || 'False' }}"
|
||||
run: |
|
||||
cd build
|
||||
conan install .. -of . -b $BUILD_OPTION -s build_type=${{ inputs.build_type }} -o clio:tests=True -o clio:lint=False -o clio:coverage="${CODE_COVERAGE}" --profile ${{ inputs.conan_profile }}
|
||||
|
||||
- name: Run cmake
|
||||
shell: bash
|
||||
env:
|
||||
BUILD_TYPE: "${{ inputs.build_type }}"
|
||||
run: |
|
||||
cd build
|
||||
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=${{ inputs.build_type }} ${{ inputs.extra_cmake_args }} .. -G Ninja
|
||||
26
.github/actions/get_number_of_threads/action.yml
vendored
Normal file
26
.github/actions/get_number_of_threads/action.yml
vendored
Normal file
@@ -0,0 +1,26 @@
|
||||
name: Get number of threads
|
||||
description: Determines number of threads to use on macOS and Linux
|
||||
outputs:
|
||||
threads_number:
|
||||
description: Number of threads to use
|
||||
value: ${{ steps.number_of_threads_export.outputs.num }}
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Get number of threads on mac
|
||||
id: mac_threads
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
shell: bash
|
||||
run: echo "num=$(($(sysctl -n hw.logicalcpu) - 2))" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Get number of threads on Linux
|
||||
id: linux_threads
|
||||
if: ${{ runner.os == 'Linux' }}
|
||||
shell: bash
|
||||
run: echo "num=$(($(nproc) - 2))" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Export output variable
|
||||
shell: bash
|
||||
id: number_of_threads_export
|
||||
run: |
|
||||
echo "num=${{ steps.mac_threads.outputs.num || steps.linux_threads.outputs.num }}" >> $GITHUB_OUTPUT
|
||||
14
.github/actions/git_common_ancestor/action.yml
vendored
Normal file
14
.github/actions/git_common_ancestor/action.yml
vendored
Normal file
@@ -0,0 +1,14 @@
|
||||
name: Git common ancestor
|
||||
description: Find the closest common commit
|
||||
outputs:
|
||||
commit:
|
||||
description: Hash of commit
|
||||
value: ${{ steps.find_common_ancestor.outputs.commit }}
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Find common git ancestor
|
||||
id: find_common_ancestor
|
||||
shell: bash
|
||||
run: |
|
||||
echo "commit=$(git merge-base --fork-point origin/develop)" >> $GITHUB_OUTPUT
|
||||
13
.github/actions/lint/action.yml
vendored
13
.github/actions/lint/action.yml
vendored
@@ -1,13 +0,0 @@
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
# Github's ubuntu-20.04 image already has clang-format-11 installed
|
||||
- run: |
|
||||
find src unittests -type f \( -name '*.cpp' -o -name '*.h' -o -name '*.ipp' \) -print0 | xargs -0 clang-format-11 -i
|
||||
shell: bash
|
||||
|
||||
- name: Check for differences
|
||||
id: assert
|
||||
shell: bash
|
||||
run: |
|
||||
git diff --color --exit-code | tee "clang-format.patch"
|
||||
50
.github/actions/prepare_runner/action.yml
vendored
Normal file
50
.github/actions/prepare_runner/action.yml
vendored
Normal file
@@ -0,0 +1,50 @@
|
||||
name: Prepare runner
|
||||
description: Install packages, set environment variables, create directories
|
||||
inputs:
|
||||
disable_ccache:
|
||||
description: Whether ccache should be disabled
|
||||
required: true
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Install packages on mac
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
shell: bash
|
||||
run: |
|
||||
brew install llvm@14 pkg-config ninja bison cmake ccache jq gh conan@1
|
||||
if ! command -v conan &> /dev/null; then
|
||||
echo "/opt/homebrew/opt/conan@1/bin" >> $GITHUB_PATH
|
||||
fi
|
||||
|
||||
- name: Fix git permissions on Linux
|
||||
if: ${{ runner.os == 'Linux' }}
|
||||
shell: bash
|
||||
run: git config --global --add safe.directory $PWD
|
||||
|
||||
- name: Set env variables for macOS
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
shell: bash
|
||||
run: |
|
||||
echo "CCACHE_DIR=${{ github.workspace }}/.ccache" >> $GITHUB_ENV
|
||||
echo "CONAN_USER_HOME=${{ github.workspace }}" >> $GITHUB_ENV
|
||||
|
||||
- name: Set env variables for Linux
|
||||
if: ${{ runner.os == 'Linux' }}
|
||||
shell: bash
|
||||
run: |
|
||||
echo "CCACHE_DIR=/root/.ccache" >> $GITHUB_ENV
|
||||
echo "CONAN_USER_HOME=/root/" >> $GITHUB_ENV
|
||||
|
||||
- name: Set CCACHE_DISABLE=1
|
||||
if: ${{ inputs.disable_ccache == 'true' }}
|
||||
shell: bash
|
||||
run: |
|
||||
echo "CCACHE_DISABLE=1" >> $GITHUB_ENV
|
||||
|
||||
- name: Create directories
|
||||
shell: bash
|
||||
run: |
|
||||
mkdir -p $CCACHE_DIR
|
||||
mkdir -p $CONAN_USER_HOME/.conan
|
||||
|
||||
|
||||
59
.github/actions/restore_cache/action.yml
vendored
Normal file
59
.github/actions/restore_cache/action.yml
vendored
Normal file
@@ -0,0 +1,59 @@
|
||||
name: Restore cache
|
||||
description: Find and restores conan and ccache cache
|
||||
inputs:
|
||||
conan_dir:
|
||||
description: Path to .conan directory
|
||||
required: true
|
||||
ccache_dir:
|
||||
description: Path to .ccache directory
|
||||
required: true
|
||||
build_type:
|
||||
description: Current build type (e.g. Release, Debug)
|
||||
required: true
|
||||
default: Release
|
||||
code_coverage:
|
||||
description: Whether code coverage is on
|
||||
required: true
|
||||
default: 'false'
|
||||
outputs:
|
||||
conan_hash:
|
||||
description: Hash to use as a part of conan cache key
|
||||
value: ${{ steps.conan_hash.outputs.hash }}
|
||||
conan_cache_hit:
|
||||
description: True if conan cache has been downloaded
|
||||
value: ${{ steps.conan_cache.outputs.cache-hit }}
|
||||
ccache_cache_hit:
|
||||
description: True if ccache cache has been downloaded
|
||||
value: ${{ steps.ccache_cache.outputs.cache-hit }}
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Find common commit
|
||||
id: git_common_ancestor
|
||||
uses: ./.github/actions/git_common_ancestor
|
||||
|
||||
- name: Calculate conan hash
|
||||
id: conan_hash
|
||||
shell: bash
|
||||
run: |
|
||||
conan info . -j info.json -o clio:tests=True
|
||||
packages_info=$(cat info.json | jq '.[] | "\(.display_name): \(.id)"' | grep -v 'clio')
|
||||
echo "$packages_info"
|
||||
hash=$(echo "$packages_info" | shasum -a 256 | cut -d ' ' -f 1)
|
||||
rm info.json
|
||||
echo "hash=$hash" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Restore conan cache
|
||||
uses: actions/cache/restore@v3
|
||||
id: conan_cache
|
||||
with:
|
||||
path: ${{ inputs.conan_dir }}/data
|
||||
key: clio-conan_data-${{ runner.os }}-${{ inputs.build_type }}-develop-${{ steps.conan_hash.outputs.hash }}
|
||||
|
||||
- name: Restore ccache cache
|
||||
uses: actions/cache/restore@v3
|
||||
id: ccache_cache
|
||||
if: ${{ env.CCACHE_DISABLE != '1' }}
|
||||
with:
|
||||
path: ${{ inputs.ccache_dir }}
|
||||
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-develop-${{ steps.git_common_ancestor.outputs.commit }}
|
||||
56
.github/actions/save_cache/action.yml
vendored
Normal file
56
.github/actions/save_cache/action.yml
vendored
Normal file
@@ -0,0 +1,56 @@
|
||||
name: Save cache
|
||||
description: Save conan and ccache cache for develop branch
|
||||
inputs:
|
||||
conan_dir:
|
||||
description: Path to .conan directory
|
||||
required: true
|
||||
conan_hash:
|
||||
description: Hash to use as a part of conan cache key
|
||||
required: true
|
||||
conan_cache_hit:
|
||||
description: Whether conan cache has been downloaded
|
||||
required: true
|
||||
ccache_dir:
|
||||
description: Path to .ccache directory
|
||||
required: true
|
||||
ccache_cache_hit:
|
||||
description: Whether conan cache has been downloaded
|
||||
required: true
|
||||
ccache_cache_miss_rate:
|
||||
description: How many cache misses happened
|
||||
build_type:
|
||||
description: Current build type (e.g. Release, Debug)
|
||||
required: true
|
||||
default: Release
|
||||
code_coverage:
|
||||
description: Whether code coverage is on
|
||||
required: true
|
||||
default: 'false'
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Find common commit
|
||||
id: git_common_ancestor
|
||||
uses: ./.github/actions/git_common_ancestor
|
||||
|
||||
- name: Cleanup conan directory from extra data
|
||||
if: ${{ inputs.conan_cache_hit != 'true' }}
|
||||
shell: bash
|
||||
run: |
|
||||
conan remove "*" -s -b -f
|
||||
|
||||
- name: Save conan cache
|
||||
if: ${{ inputs.conan_cache_hit != 'true' }}
|
||||
uses: actions/cache/save@v3
|
||||
with:
|
||||
path: ${{ inputs.conan_dir }}/data
|
||||
key: clio-conan_data-${{ runner.os }}-${{ inputs.build_type }}-develop-${{ inputs.conan_hash }}
|
||||
|
||||
- name: Save ccache cache
|
||||
if: ${{ inputs.ccache_cache_hit != 'true' || inputs.ccache_cache_miss_rate == '100.0' }}
|
||||
uses: actions/cache/save@v3
|
||||
with:
|
||||
path: ${{ inputs.ccache_dir }}
|
||||
key: clio-ccache-${{ runner.os }}-${{ inputs.build_type }}${{ inputs.code_coverage == 'true' && '-code_coverage' || '' }}-develop-${{ steps.git_common_ancestor.outputs.commit }}
|
||||
|
||||
|
||||
48
.github/actions/setup_conan/action.yml
vendored
Normal file
48
.github/actions/setup_conan/action.yml
vendored
Normal file
@@ -0,0 +1,48 @@
|
||||
name: Setup conan
|
||||
description: Setup conan profile and artifactory
|
||||
outputs:
|
||||
conan_profile:
|
||||
description: Created conan profile name
|
||||
value: ${{ steps.conan_export_output.outputs.conan_profile }}
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: On mac
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
shell: bash
|
||||
env:
|
||||
CONAN_PROFILE: clio_apple_clang_15
|
||||
id: conan_setup_mac
|
||||
run: |
|
||||
echo "Creating $CONAN_PROFILE conan profile";
|
||||
conan profile new $CONAN_PROFILE --detect --force
|
||||
conan profile update settings.compiler.libcxx=libc++ $CONAN_PROFILE
|
||||
conan profile update settings.compiler.cppstd=20 $CONAN_PROFILE
|
||||
conan profile update env.CXXFLAGS=-DBOOST_ASIO_DISABLE_CONCEPTS $CONAN_PROFILE
|
||||
conan profile update "conf.tools.build:cxxflags+=[\"-DBOOST_ASIO_DISABLE_CONCEPTS\"]" $CONAN_PROFILE
|
||||
echo "created_conan_profile=$CONAN_PROFILE" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: On linux
|
||||
if: ${{ runner.os == 'Linux' }}
|
||||
shell: bash
|
||||
id: conan_setup_linux
|
||||
run: |
|
||||
echo "created_conan_profile=default" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Export output variable
|
||||
shell: bash
|
||||
id: conan_export_output
|
||||
run: |
|
||||
echo "conan_profile=${{ steps.conan_setup_mac.outputs.created_conan_profile || steps.conan_setup_linux.outputs.created_conan_profile }}" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Add conan-non-prod artifactory
|
||||
shell: bash
|
||||
run: |
|
||||
if [[ -z $(conan remote list | grep conan-non-prod) ]]; then
|
||||
echo "Adding conan-non-prod"
|
||||
conan remote add --insert 0 conan-non-prod http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
|
||||
else
|
||||
echo "Conan-non-prod is available"
|
||||
fi
|
||||
|
||||
|
||||
318
.github/workflows/build.yml
vendored
318
.github/workflows/build.yml
vendored
@@ -1,212 +1,184 @@
|
||||
name: Build Clio
|
||||
name: Build
|
||||
on:
|
||||
push:
|
||||
branches: [master, release/*, develop, develop-next]
|
||||
branches: [master, release/*, develop]
|
||||
pull_request:
|
||||
branches: [master, release/*, develop, develop-next]
|
||||
branches: [master, release/*, develop]
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
lint:
|
||||
name: Lint
|
||||
check_format:
|
||||
name: Check format
|
||||
runs-on: ubuntu-20.04
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
- name: Run clang-format
|
||||
uses: ./.github/actions/lint
|
||||
- uses: actions/checkout@v4
|
||||
- name: Run formatters
|
||||
id: run_formatters
|
||||
run: |
|
||||
./.githooks/check-format
|
||||
shell: bash
|
||||
|
||||
check_docs:
|
||||
name: Check documentation
|
||||
runs-on: ubuntu-20.04
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- name: Run linter
|
||||
id: run_linter
|
||||
run: |
|
||||
./.githooks/check-docs
|
||||
shell: bash
|
||||
|
||||
build_clio:
|
||||
name: Build Clio
|
||||
runs-on: [self-hosted, heavy]
|
||||
needs: lint
|
||||
build:
|
||||
name: Build
|
||||
needs:
|
||||
- check_format
|
||||
- check_docs
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
type:
|
||||
- suffix: deb
|
||||
image: rippleci/clio-dpkg-builder:2022-09-17
|
||||
script: dpkg
|
||||
- suffix: rpm
|
||||
image: rippleci/clio-rpm-builder:2022-09-17
|
||||
script: rpm
|
||||
include:
|
||||
- os: heavy
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
build_type: Release
|
||||
code_coverage: false
|
||||
- os: heavy
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
build_type: Debug
|
||||
code_coverage: true
|
||||
- os: macos14
|
||||
build_type: Release
|
||||
code_coverage: false
|
||||
runs-on: [self-hosted, "${{ matrix.os }}"]
|
||||
container: ${{ matrix.container }}
|
||||
|
||||
services:
|
||||
scylladb:
|
||||
image: ${{ (matrix.code_coverage) && 'scylladb/scylla' || '' }}
|
||||
options: >-
|
||||
--health-cmd "cqlsh -e 'describe cluster'"
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
|
||||
container:
|
||||
image: ${{ matrix.type.image }}
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
- name: Clean workdir
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
uses: kuznetsss/workspace-cleanup@1.0
|
||||
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
path: clio
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Clone Clio packaging repo
|
||||
uses: actions/checkout@v3
|
||||
- name: Prepare runner
|
||||
uses: ./.github/actions/prepare_runner
|
||||
with:
|
||||
path: clio-packages
|
||||
repository: XRPLF/clio-packages
|
||||
ref: main
|
||||
disable_ccache: false
|
||||
|
||||
- name: Build
|
||||
- name: Setup conan
|
||||
uses: ./.github/actions/setup_conan
|
||||
id: conan
|
||||
|
||||
- name: Restore cache
|
||||
uses: ./.github/actions/restore_cache
|
||||
id: restore_cache
|
||||
with:
|
||||
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
|
||||
ccache_dir: ${{ env.CCACHE_DIR }}
|
||||
build_type: ${{ matrix.build_type }}
|
||||
code_coverage: ${{ matrix.code_coverage }}
|
||||
|
||||
- name: Run conan and cmake
|
||||
uses: ./.github/actions/generate
|
||||
with:
|
||||
conan_profile: ${{ steps.conan.outputs.conan_profile }}
|
||||
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
|
||||
build_type: ${{ matrix.build_type }}
|
||||
code_coverage: ${{ matrix.code_coverage }}
|
||||
|
||||
- name: Build Clio
|
||||
uses: ./.github/actions/build_clio
|
||||
|
||||
- name: Show ccache's statistics
|
||||
shell: bash
|
||||
id: ccache_stats
|
||||
run: |
|
||||
export CLIO_ROOT=$(realpath clio)
|
||||
if [ ${{ matrix.type.suffix }} == "rpm" ]; then
|
||||
source /opt/rh/devtoolset-11/enable
|
||||
fi
|
||||
cmake -S clio-packages -B clio-packages/build -DCLIO_ROOT=$CLIO_ROOT
|
||||
cmake --build clio-packages/build --parallel $(nproc)
|
||||
cp ./clio-packages/build/clio-prefix/src/clio-build/clio_tests .
|
||||
mv ./clio-packages/build/*.${{ matrix.type.suffix }} .
|
||||
- name: Artifact packages
|
||||
ccache -s > /tmp/ccache.stats
|
||||
miss_rate=$(cat /tmp/ccache.stats | grep 'Misses' | head -n1 | sed 's/.*(\(.*\)%).*/\1/')
|
||||
echo "miss_rate=${miss_rate}" >> $GITHUB_OUTPUT
|
||||
cat /tmp/ccache.stats
|
||||
|
||||
- name: Strip tests
|
||||
if: ${{ !matrix.code_coverage }}
|
||||
run: strip build/clio_tests
|
||||
|
||||
- name: Upload clio_server
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: clio_${{ matrix.type.suffix }}_packages
|
||||
path: ${{ github.workspace }}/*.${{ matrix.type.suffix }}
|
||||
name: clio_server_${{ runner.os }}_${{ matrix.build_type }}
|
||||
path: build/clio_server
|
||||
|
||||
- name: Artifact clio_tests
|
||||
- name: Upload clio_tests
|
||||
if: ${{ !matrix.code_coverage }}
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: clio_tests-${{ matrix.type.suffix }}
|
||||
path: ${{ github.workspace }}/clio_tests
|
||||
name: clio_tests_${{ runner.os }}
|
||||
path: build/clio_tests
|
||||
|
||||
build_dev:
|
||||
name: Build on Mac/Clang14 and run tests
|
||||
needs: lint
|
||||
continue-on-error: false
|
||||
runs-on: [self-hosted, macOS]
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
- name: Save cache
|
||||
uses: ./.github/actions/save_cache
|
||||
with:
|
||||
path: clio
|
||||
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
|
||||
conan_hash: ${{ steps.restore_cache.outputs.conan_hash }}
|
||||
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
|
||||
ccache_dir: ${{ env.CCACHE_DIR }}
|
||||
ccache_cache_hit: ${{ steps.restore_cache.outputs.ccache_cache_hit }}
|
||||
ccache_cache_miss_rate: ${{ steps.ccache_stats.outputs.miss_rate }}
|
||||
build_type: ${{ matrix.build_type }}
|
||||
code_coverage: ${{ matrix.code_coverage }}
|
||||
|
||||
- name: Check Boost cache
|
||||
id: boost
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: boost_1_77_0
|
||||
key: ${{ runner.os }}-boost
|
||||
# TODO: This is not a part of build process but it is the easiest way to do it here.
|
||||
# It will be refactored in https://github.com/XRPLF/clio/issues/1075
|
||||
- name: Run code coverage
|
||||
if: ${{ matrix.code_coverage }}
|
||||
uses: ./.github/actions/code_coverage
|
||||
|
||||
- name: Build Boost
|
||||
if: ${{ steps.boost.outputs.cache-hit != 'true' }}
|
||||
run: |
|
||||
rm -rf boost_1_77_0.tar.gz boost_1_77_0 # cleanup if needed first
|
||||
curl -s -fOJL "https://boostorg.jfrog.io/artifactory/main/release/1.77.0/source/boost_1_77_0.tar.gz"
|
||||
tar zxf boost_1_77_0.tar.gz
|
||||
cd boost_1_77_0
|
||||
./bootstrap.sh
|
||||
./b2 define=BOOST_ASIO_HAS_STD_INVOKE_RESULT cxxflags="-std=c++20"
|
||||
upload_coverage_report:
|
||||
name: Codecov
|
||||
needs: build
|
||||
uses: ./.github/workflows/upload_coverage_report.yml
|
||||
secrets:
|
||||
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
brew install llvm@14 pkg-config protobuf openssl ninja cassandra-cpp-driver bison cmake
|
||||
|
||||
- name: Setup environment for llvm-14
|
||||
run: |
|
||||
export PATH="/usr/local/opt/llvm@14/bin:$PATH"
|
||||
export LDFLAGS="-L/usr/local/opt/llvm@14/lib -L/usr/local/opt/llvm@14/lib/c++ -Wl,-rpath,/usr/local/opt/llvm@14/lib/c++"
|
||||
export CPPFLAGS="-I/usr/local/opt/llvm@14/include"
|
||||
|
||||
- name: Build clio
|
||||
run: |
|
||||
export BOOST_ROOT=$(pwd)/boost_1_77_0
|
||||
cd clio
|
||||
cmake -B build -DCMAKE_C_COMPILER='/usr/local/opt/llvm@14/bin/clang' -DCMAKE_CXX_COMPILER='/usr/local/opt/llvm@14/bin/clang++'
|
||||
if ! cmake --build build -j; then
|
||||
echo '# 🔥🔥 MacOS AppleClang build failed!💥' >> $GITHUB_STEP_SUMMARY
|
||||
exit 1
|
||||
fi
|
||||
- name: Run Test
|
||||
run: |
|
||||
cd clio/build
|
||||
./clio_tests --gtest_filter="-BackendTest*:BackendCassandraBaseTest*:BackendCassandraTest*"
|
||||
|
||||
test_clio:
|
||||
name: Test Clio
|
||||
runs-on: [self-hosted, Linux]
|
||||
needs: build_clio
|
||||
test:
|
||||
name: Run Tests
|
||||
needs: build
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
suffix: [rpm, deb]
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
|
||||
- name: Get clio_tests artifact
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: clio_tests-${{ matrix.suffix }}
|
||||
|
||||
- name: Run tests
|
||||
timeout-minutes: 10
|
||||
uses: ./.github/actions/test
|
||||
|
||||
code_coverage:
|
||||
name: Build on Linux and code coverage
|
||||
needs: lint
|
||||
continue-on-error: false
|
||||
runs-on: ubuntu-22.04
|
||||
include:
|
||||
- os: heavy
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
- os: macos14
|
||||
runs-on: [self-hosted, "${{ matrix.os }}"]
|
||||
container: ${{ matrix.container }}
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
with:
|
||||
path: clio
|
||||
- name: Clean workdir
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
uses: kuznetsss/workspace-cleanup@1.0
|
||||
|
||||
- name: Check Boost cache
|
||||
id: boost
|
||||
uses: actions/cache@v3
|
||||
- uses: actions/download-artifact@v3
|
||||
with:
|
||||
path: boost
|
||||
key: ${{ runner.os }}-boost
|
||||
|
||||
- name: Build boost
|
||||
if: steps.boost.outputs.cache-hit != 'true'
|
||||
name: clio_tests_${{ runner.os }}
|
||||
- name: Run clio_tests
|
||||
run: |
|
||||
curl -s -OJL "https://boostorg.jfrog.io/artifactory/main/release/1.77.0/source/boost_1_77_0.tar.gz"
|
||||
tar zxf boost_1_77_0.tar.gz
|
||||
mv boost_1_77_0 boost
|
||||
cd boost
|
||||
./bootstrap.sh
|
||||
./b2
|
||||
- name: install deps
|
||||
run: |
|
||||
sudo apt-get -y install git pkg-config protobuf-compiler libprotobuf-dev libssl-dev wget build-essential doxygen bison flex autoconf clang-format gcovr
|
||||
- name: Build clio
|
||||
run: |
|
||||
export BOOST_ROOT=$(pwd)/boost
|
||||
cd clio
|
||||
cmake -B build -DCODE_COVERAGE=on -DTEST_PARAMETER='--gtest_filter="-BackendTest*:BackendCassandraBaseTest*:BackendCassandraTest*"'
|
||||
if ! cmake --build build -j$(nproc); then
|
||||
echo '# 🔥Ubuntu build🔥 failed!💥' >> $GITHUB_STEP_SUMMARY
|
||||
exit 1
|
||||
fi
|
||||
cd build
|
||||
make clio_tests-ccov
|
||||
- name: Code Coverage Summary Report
|
||||
uses: irongut/CodeCoverageSummary@v1.2.0
|
||||
with:
|
||||
filename: clio/build/clio_tests-gcc-cov/out.xml
|
||||
badge: true
|
||||
output: both
|
||||
format: markdown
|
||||
|
||||
- name: Save PR number and ccov report
|
||||
run: |
|
||||
mkdir -p ./UnitTestCoverage
|
||||
echo ${{ github.event.number }} > ./UnitTestCoverage/NR
|
||||
cp clio/build/clio_tests-gcc-cov/report.html ./UnitTestCoverage/report.html
|
||||
cp code-coverage-results.md ./UnitTestCoverage/out.md
|
||||
cat code-coverage-results.md > $GITHUB_STEP_SUMMARY
|
||||
- name: Upload coverage reports to Codecov
|
||||
uses: codecov/codecov-action@v3
|
||||
with:
|
||||
files: clio/build/clio_tests-gcc-cov/out.xml
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: UnitTestCoverage
|
||||
path: UnitTestCoverage/
|
||||
|
||||
- uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: code_coverage_report
|
||||
path: clio/build/clio_tests-gcc-cov/out.xml
|
||||
chmod +x ./clio_tests
|
||||
./clio_tests --gtest_filter="-BackendCassandraBaseTest*:BackendCassandraTest*:BackendCassandraFactoryTestWithDB*"
|
||||
|
||||
117
.github/workflows/clang-tidy.yml
vendored
Normal file
@@ -0,0 +1,117 @@
|
||||
name: Clang-tidy check
|
||||
on:
|
||||
schedule:
|
||||
- cron: "0 6 * * 1-5"
|
||||
workflow_dispatch:
|
||||
pull_request:
|
||||
branches: [develop]
|
||||
paths:
|
||||
- .clang_tidy
|
||||
- .github/workflows/clang-tidy.yml
|
||||
workflow_call:
|
||||
|
||||
jobs:
|
||||
clang_tidy:
|
||||
runs-on: [self-hosted, Linux]
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
permissions:
|
||||
contents: write
|
||||
issues: write
|
||||
pull-requests: write
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Prepare runner
|
||||
uses: ./.github/actions/prepare_runner
|
||||
with:
|
||||
disable_ccache: true
|
||||
|
||||
- name: Setup conan
|
||||
uses: ./.github/actions/setup_conan
|
||||
id: conan
|
||||
|
||||
- name: Restore cache
|
||||
uses: ./.github/actions/restore_cache
|
||||
id: restore_cache
|
||||
with:
|
||||
conan_dir: ${{ env.CONAN_USER_HOME }}/.conan
|
||||
ccache_dir: ${{ env.CCACHE_DIR }}
|
||||
|
||||
- name: Run conan and cmake
|
||||
uses: ./.github/actions/generate
|
||||
with:
|
||||
conan_profile: ${{ steps.conan.outputs.conan_profile }}
|
||||
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
|
||||
build_type: Release
|
||||
|
||||
- name: Get number of threads
|
||||
uses: ./.github/actions/get_number_of_threads
|
||||
id: number_of_threads
|
||||
|
||||
- name: Run clang-tidy
|
||||
continue-on-error: true
|
||||
shell: bash
|
||||
id: run_clang_tidy
|
||||
run: |
|
||||
run-clang-tidy-17 -p build -j ${{ steps.number_of_threads.outputs.threads_number }} -fix -quiet 1>output.txt
|
||||
|
||||
- name: Check format
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
continue-on-error: true
|
||||
shell: bash
|
||||
run: ./.githooks/check-format
|
||||
|
||||
- name: Print issues found
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
shell: bash
|
||||
run: |
|
||||
sed -i '/error\||/!d' ./output.txt
|
||||
cat output.txt
|
||||
rm output.txt
|
||||
|
||||
- name: Create an issue
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
id: create_issue
|
||||
shell: bash
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
run: |
|
||||
echo -e 'Clang-tidy found issues in the code:\n' > issue.md
|
||||
echo -e "List of the issues found: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/" >> issue.md
|
||||
gh issue create --assignee 'cindyyan317,godexsoft,kuznetsss' --label bug --title 'Clang-tidy found bugs in code🐛' --body-file ./issue.md > create_issue.log
|
||||
created_issue=$(cat create_issue.log | sed 's|.*/||')
|
||||
echo "created_issue=$created_issue" >> $GITHUB_OUTPUT
|
||||
rm create_issue.log issue.md
|
||||
|
||||
- uses: crazy-max/ghaction-import-gpg@v5
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
with:
|
||||
gpg_private_key: ${{ secrets.ACTIONS_GPG_PRIVATE_KEY }}
|
||||
passphrase: ${{ secrets.ACTIONS_GPG_PASSPHRASE }}
|
||||
git_user_signingkey: true
|
||||
git_commit_gpgsign: true
|
||||
|
||||
- name: Create PR with fixes
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
uses: peter-evans/create-pull-request@v5
|
||||
env:
|
||||
GH_REPO: ${{ github.repository }}
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
with:
|
||||
commit-message: "[CI] clang-tidy auto fixes"
|
||||
committer: Clio CI <skuznetsov@ripple.com>
|
||||
branch: "clang_tidy/autofix"
|
||||
branch-suffix: timestamp
|
||||
delete-branch: true
|
||||
title: "[CI] clang-tidy auto fixes"
|
||||
body: "Fixes #${{ steps.create_issue.outputs.created_issue }}. Please review and commit clang-tidy fixes."
|
||||
reviewers: "cindyyan317,godexsoft,kuznetsss"
|
||||
|
||||
- name: Fail the job
|
||||
if: ${{ steps.run_clang_tidy.outcome != 'success' }}
|
||||
shell: bash
|
||||
run: exit 1
|
||||
29
.github/workflows/clang-tidy_on_fix_merged.yml
vendored
Normal file
@@ -0,0 +1,29 @@
|
||||
name: Restart clang-tidy workflow
|
||||
on:
|
||||
push:
|
||||
branches: [develop]
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
restart_clang_tidy:
|
||||
runs-on: ubuntu-20.04
|
||||
|
||||
permissions:
|
||||
actions: write
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- name: Check last commit matches clang-tidy auto fixes
|
||||
id: check
|
||||
shell: bash
|
||||
run: |
|
||||
passed=$(if [[ $(git log -1 --pretty=format:%s | grep '\[CI\] clang-tidy auto fixes') ]]; then echo 'true' ; else echo 'false' ; fi)
|
||||
echo "passed=$passed" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Run clang-tidy workflow
|
||||
if: ${{ contains(steps.check.outputs.passed, 'true') }}
|
||||
shell: bash
|
||||
env:
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
GH_REPO: ${{ github.repository }}
|
||||
run: gh workflow run clang-tidy.yml
|
||||
47
.github/workflows/docs.yml
vendored
Normal file
@@ -0,0 +1,47 @@
|
||||
name: Documentation
|
||||
on:
|
||||
push:
|
||||
branches: [release/*, develop]
|
||||
workflow_dispatch:
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
pages: write
|
||||
id-token: write
|
||||
|
||||
concurrency:
|
||||
group: "pages"
|
||||
cancel-in-progress: true
|
||||
|
||||
jobs:
|
||||
deploy:
|
||||
environment:
|
||||
name: github-pages
|
||||
url: ${{ steps.deployment.outputs.page_url }}
|
||||
runs-on: ubuntu-20.04
|
||||
continue-on-error: true
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Build docs
|
||||
run: |
|
||||
mkdir -p build_docs && cd build_docs
|
||||
cmake ../docs && cmake --build . --target docs
|
||||
|
||||
- name: Setup Pages
|
||||
uses: actions/configure-pages@v3
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-pages-artifact@v3
|
||||
with:
|
||||
path: build_docs/html
|
||||
name: docs-develop # TODO: use x.y.z for `release/x.y.z` branches and `develop` for latest dev docs
|
||||
|
||||
- name: Deploy to GitHub Pages
|
||||
id: deployment
|
||||
uses: actions/deploy-pages@v4
|
||||
with:
|
||||
artifact_name: docs-develop
|
||||
146
.github/workflows/nightly.yml
vendored
Normal file
@@ -0,0 +1,146 @@
|
||||
name: Nightly release
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 5 * * 1-5'
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
build:
|
||||
name: Build clio
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
include:
|
||||
- os: macos14
|
||||
build_type: Release
|
||||
- os: heavy
|
||||
build_type: Release
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
- os: heavy
|
||||
build_type: Debug
|
||||
container:
|
||||
image: rippleci/clio_ci:latest
|
||||
runs-on: [self-hosted, "${{ matrix.os }}"]
|
||||
container: ${{ matrix.container }}
|
||||
|
||||
steps:
|
||||
- name: Clean workdir
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
uses: kuznetsss/workspace-cleanup@1.0
|
||||
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Prepare runner
|
||||
uses: ./.github/actions/prepare_runner
|
||||
with:
|
||||
disable_ccache: true
|
||||
|
||||
- name: Setup conan
|
||||
uses: ./.github/actions/setup_conan
|
||||
id: conan
|
||||
|
||||
- name: Run conan and cmake
|
||||
uses: ./.github/actions/generate
|
||||
with:
|
||||
conan_profile: ${{ steps.conan.outputs.conan_profile }}
|
||||
conan_cache_hit: ${{ steps.restore_cache.outputs.conan_cache_hit }}
|
||||
build_type: ${{ matrix.build_type }}
|
||||
|
||||
- name: Build Clio
|
||||
uses: ./.github/actions/build_clio
|
||||
|
||||
- name: Strip tests
|
||||
run: strip build/clio_tests
|
||||
|
||||
- name: Upload clio_tests
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}
|
||||
path: build/clio_tests
|
||||
|
||||
- name: Compress clio_server
|
||||
shell: bash
|
||||
run: |
|
||||
cd build
|
||||
tar czf ./clio_server_${{ runner.os }}_${{ matrix.build_type }}.tar.gz ./clio_server
|
||||
|
||||
- name: Upload clio_server
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: clio_server_${{ runner.os }}_${{ matrix.build_type }}
|
||||
path: build/clio_server_${{ runner.os }}_${{ matrix.build_type }}.tar.gz
|
||||
|
||||
|
||||
run_tests:
|
||||
needs: build
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
include:
|
||||
- os: macos14
|
||||
build_type: Release
|
||||
- os: heavy
|
||||
build_type: Release
|
||||
- os: heavy
|
||||
build_type: Debug
|
||||
runs-on: [self-hosted, "${{ matrix.os }}"]
|
||||
|
||||
steps:
|
||||
- name: Clean workdir
|
||||
if: ${{ runner.os == 'macOS' }}
|
||||
uses: kuznetsss/workspace-cleanup@1.0
|
||||
|
||||
- uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: clio_tests_${{ runner.os }}_${{ matrix.build_type }}
|
||||
|
||||
- name: Run clio_tests
|
||||
run: |
|
||||
chmod +x ./clio_tests
|
||||
./clio_tests --gtest_filter="-BackendCassandraBaseTest*:BackendCassandraTest*:BackendCassandraFactoryTestWithDB*"
|
||||
|
||||
nightly_release:
|
||||
needs: run_tests
|
||||
runs-on: ubuntu-20.04
|
||||
env:
|
||||
GH_REPO: ${{ github.repository }}
|
||||
GH_TOKEN: ${{ github.token }}
|
||||
permissions:
|
||||
contents: write
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- uses: actions/download-artifact@v3
|
||||
with:
|
||||
path: nightly_release
|
||||
|
||||
- name: Prepare files
|
||||
shell: bash
|
||||
run: |
|
||||
cp ${{ github.workspace }}/.github/workflows/nightly_notes.md "${RUNNER_TEMP}/nightly_notes.md"
|
||||
cd nightly_release
|
||||
rm -r clio_tests*
|
||||
for d in $(ls); do
|
||||
archive_name=$(ls $d)
|
||||
mv ${d}/${archive_name} ./
|
||||
rm -r $d
|
||||
sha256sum ./$archive_name > ./${archive_name}.sha256sum
|
||||
cat ./$archive_name.sha256sum >> "${RUNNER_TEMP}/nightly_notes.md"
|
||||
done
|
||||
echo '```' >> "${RUNNER_TEMP}/nightly_notes.md"
|
||||
|
||||
- name: Remove current nightly release and nightly tag
|
||||
shell: bash
|
||||
run: |
|
||||
gh release delete nightly --yes || true
|
||||
git push origin :nightly || true
|
||||
|
||||
- name: Publish nightly release
|
||||
shell: bash
|
||||
run: |
|
||||
gh release create nightly --prerelease --title "Clio development (nightly) build" \
|
||||
--target $GITHUB_SHA --notes-file "${RUNNER_TEMP}/nightly_notes.md" \
|
||||
./nightly_release/clio_server*
|
||||
6
.github/workflows/nightly_notes.md
vendored
Normal file
@@ -0,0 +1,6 @@
> **Note:** Please remember that this is a development release and it is not recommended for production use.

Changelog (including previous releases): https://github.com/XRPLF/clio/commits/nightly

## SHA256 checksums
```
40
.github/workflows/update_docker_ci.yml
vendored
Normal file
@@ -0,0 +1,40 @@
|
||||
name: Update CI docker image
|
||||
on:
|
||||
push:
|
||||
branches: [develop]
|
||||
paths:
|
||||
- 'docker/ci/**'
|
||||
- .github/workflows/update_docker_ci.yml
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
build_and_push:
|
||||
name: Build and push docker image
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
- name: Login to DockerHub
|
||||
uses: docker/login-action@v3
|
||||
with:
|
||||
username: ${{ secrets.DOCKERHUB_USER }}
|
||||
password: ${{ secrets.DOCKERHUB_PW }}
|
||||
|
||||
- uses: actions/checkout@v4
|
||||
- uses: docker/setup-qemu-action@v3
|
||||
- uses: docker/setup-buildx-action@v3
|
||||
- uses: docker/metadata-action@v5
|
||||
id: meta
|
||||
with:
|
||||
images: rippleci/clio_ci
|
||||
tags: |
|
||||
type=raw,value=latest
|
||||
type=raw,value=gcc_11
|
||||
type=raw,value=${{ env.GITHUB_SHA }}
|
||||
|
||||
- name: Build and push
|
||||
uses: docker/build-push-action@v5
|
||||
with:
|
||||
context: ${{ github.workspace }}/docker/ci
|
||||
platforms: linux/amd64,linux/arm64
|
||||
push: true
|
||||
tags: ${{ steps.meta.outputs.tags }}
|
||||
|
||||
35
.github/workflows/upload_coverage_report.yml
vendored
Normal file
@@ -0,0 +1,35 @@
|
||||
name: Upload report
|
||||
on:
|
||||
workflow_dispatch:
|
||||
workflow_call:
|
||||
secrets:
|
||||
CODECOV_TOKEN:
|
||||
required: true
|
||||
|
||||
jobs:
|
||||
upload_report:
|
||||
name: Upload report
|
||||
runs-on: ubuntu-20.04
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Download report artifact
|
||||
uses: actions/download-artifact@v3
|
||||
with:
|
||||
name: coverage-report.xml
|
||||
path: build
|
||||
|
||||
- name: Upload coverage report
|
||||
if: ${{ hashFiles('build/coverage_report.xml') != '' }}
|
||||
uses: wandalen/wretry.action@v1.3.0
|
||||
with:
|
||||
action: codecov/codecov-action@v3
|
||||
with: |
|
||||
files: build/coverage_report.xml
|
||||
fail_ci_if_error: false
|
||||
verbose: true
|
||||
token: ${{ secrets.CODECOV_TOKEN }}
|
||||
attempt_limit: 5
|
||||
attempt_delay: 10000
|
||||
7
.gitignore
vendored
@@ -1,6 +1,11 @@
*clio*.log
build*/
/build*/
.devcontainer
.build
.cache
.vscode
.python-version
.DS_Store
CMakeUserPresets.json
config.json
src/main/impl/Build.cpp
@@ -1,33 +0,0 @@
|
||||
#[===================================================================[
|
||||
write version to source
|
||||
#]===================================================================]
|
||||
|
||||
find_package(Git REQUIRED)
|
||||
|
||||
set(GIT_COMMAND rev-parse --short HEAD)
|
||||
execute_process(COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} OUTPUT_VARIABLE REV OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
|
||||
set(GIT_COMMAND branch --show-current)
|
||||
execute_process(COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} OUTPUT_VARIABLE BRANCH OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
|
||||
if(BRANCH STREQUAL "")
|
||||
set(BRANCH "dev")
|
||||
endif()
|
||||
|
||||
if(NOT (BRANCH MATCHES master OR BRANCH MATCHES release/*)) # for develop and any other branch name YYYYMMDDHMS-<branch>-<git-ref>
|
||||
execute_process(COMMAND date +%Y%m%d%H%M%S OUTPUT_VARIABLE DATE OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
set(VERSION "${DATE}-${BRANCH}-${REV}")
|
||||
else()
|
||||
set(GIT_COMMAND describe --tags)
|
||||
execute_process(COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} OUTPUT_VARIABLE TAG_VERSION OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
set(VERSION "${TAG_VERSION}-${REV}")
|
||||
endif()
|
||||
|
||||
if(CMAKE_BUILD_TYPE MATCHES Debug)
|
||||
set(VERSION "${VERSION}+DEBUG")
|
||||
endif()
|
||||
|
||||
message(STATUS "Build version: ${VERSION}")
|
||||
set(clio_version "${VERSION}")
|
||||
|
||||
configure_file(CMake/Build.cpp.in ${CMAKE_SOURCE_DIR}/src/main/impl/Build.cpp)
|
||||
@@ -1,126 +0,0 @@
|
||||
# call add_converage(module_name) to add coverage targets for the given module
|
||||
function(add_converage module)
|
||||
if("${CMAKE_C_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang"
|
||||
OR "${CMAKE_CXX_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
|
||||
message("[Coverage] Building with llvm Code Coverage Tools")
|
||||
# Using llvm gcov ; llvm install by xcode
|
||||
set(LLVM_COV_PATH /Library/Developer/CommandLineTools/usr/bin)
|
||||
if(NOT EXISTS ${LLVM_COV_PATH}/llvm-cov)
|
||||
message(FATAL_ERROR "llvm-cov not found! Aborting.")
|
||||
endif()
|
||||
|
||||
# set Flags
|
||||
target_compile_options(${module} PRIVATE -fprofile-instr-generate
|
||||
-fcoverage-mapping)
|
||||
target_link_options(${module} PUBLIC -fprofile-instr-generate
|
||||
-fcoverage-mapping)
|
||||
|
||||
target_compile_options(clio PRIVATE -fprofile-instr-generate
|
||||
-fcoverage-mapping)
|
||||
target_link_options(clio PUBLIC -fprofile-instr-generate
|
||||
-fcoverage-mapping)
|
||||
|
||||
# llvm-cov
|
||||
add_custom_target(
|
||||
${module}-ccov-preprocessing
|
||||
COMMAND LLVM_PROFILE_FILE=${module}.profraw $<TARGET_FILE:${module}>
|
||||
COMMAND ${LLVM_COV_PATH}/llvm-profdata merge -sparse ${module}.profraw -o
|
||||
${module}.profdata
|
||||
DEPENDS ${module})
|
||||
|
||||
add_custom_target(
|
||||
${module}-ccov-show
|
||||
COMMAND ${LLVM_COV_PATH}/llvm-cov show $<TARGET_FILE:${module}>
|
||||
-instr-profile=${module}.profdata -show-line-counts-or-regions
|
||||
DEPENDS ${module}-ccov-preprocessing)
|
||||
|
||||
# add summary for CI parse
|
||||
add_custom_target(
|
||||
${module}-ccov-report
|
||||
COMMAND
|
||||
${LLVM_COV_PATH}/llvm-cov report $<TARGET_FILE:${module}>
|
||||
-instr-profile=${module}.profdata
|
||||
-ignore-filename-regex=".*_makefiles|.*unittests|.*_deps"
|
||||
-show-region-summary=false
|
||||
DEPENDS ${module}-ccov-preprocessing)
|
||||
|
||||
# exclude libs and unittests self
|
||||
add_custom_target(
|
||||
${module}-ccov
|
||||
COMMAND
|
||||
${LLVM_COV_PATH}/llvm-cov show $<TARGET_FILE:${module}>
|
||||
-instr-profile=${module}.profdata -show-line-counts-or-regions
|
||||
-output-dir=${module}-llvm-cov -format="html"
|
||||
-ignore-filename-regex=".*_makefiles|.*unittests|.*_deps" > /dev/null 2>&1
|
||||
DEPENDS ${module}-ccov-preprocessing)
|
||||
|
||||
add_custom_command(
|
||||
TARGET ${module}-ccov
|
||||
POST_BUILD
|
||||
COMMENT
|
||||
"Open ${module}-llvm-cov/index.html in your browser to view the coverage report."
|
||||
)
|
||||
elseif("${CMAKE_C_COMPILER_ID}" MATCHES "GNU" OR "${CMAKE_CXX_COMPILER_ID}"
|
||||
MATCHES "GNU")
|
||||
message("[Coverage] Building with Gcc Code Coverage Tools")
|
||||
|
||||
find_program(GCOV_PATH gcov)
|
||||
if(NOT GCOV_PATH)
|
||||
message(FATAL_ERROR "gcov not found! Aborting...")
|
||||
endif() # NOT GCOV_PATH
|
||||
find_program(GCOVR_PATH gcovr)
|
||||
if(NOT GCOVR_PATH)
|
||||
message(FATAL_ERROR "gcovr not found! Aborting...")
|
||||
endif() # NOT GCOVR_PATH
|
||||
|
||||
set(COV_OUTPUT_PATH ${module}-gcc-cov)
|
||||
target_compile_options(${module} PRIVATE -fprofile-arcs -ftest-coverage
|
||||
-fPIC)
|
||||
target_link_libraries(${module} PRIVATE gcov)
|
||||
|
||||
target_compile_options(clio PRIVATE -fprofile-arcs -ftest-coverage
|
||||
-fPIC)
|
||||
target_link_libraries(clio PRIVATE gcov)
|
||||
# this target is used for CI as well generate the summary out.xml will send
|
||||
# to github action to generate markdown, we can paste it to comments or
|
||||
# readme
|
||||
add_custom_target(
|
||||
${module}-ccov
|
||||
COMMAND ${module} ${TEST_PARAMETER}
|
||||
COMMAND rm -rf ${COV_OUTPUT_PATH}
|
||||
COMMAND mkdir ${COV_OUTPUT_PATH}
|
||||
COMMAND
|
||||
gcovr -r ${CMAKE_SOURCE_DIR} --object-directory=${PROJECT_BINARY_DIR} -x
|
||||
${COV_OUTPUT_PATH}/out.xml --exclude='${CMAKE_SOURCE_DIR}/unittests/'
|
||||
--exclude='${PROJECT_BINARY_DIR}/'
|
||||
COMMAND
|
||||
gcovr -r ${CMAKE_SOURCE_DIR} --object-directory=${PROJECT_BINARY_DIR}
|
||||
--html ${COV_OUTPUT_PATH}/report.html
|
||||
--exclude='${CMAKE_SOURCE_DIR}/unittests/'
|
||||
--exclude='${PROJECT_BINARY_DIR}/'
|
||||
WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
|
||||
COMMENT "Running gcovr to produce Cobertura code coverage report.")
|
||||
|
||||
# generate the detail report
|
||||
add_custom_target(
|
||||
${module}-ccov-report
|
||||
COMMAND ${module} ${TEST_PARAMETER}
|
||||
COMMAND rm -rf ${COV_OUTPUT_PATH}
|
||||
COMMAND mkdir ${COV_OUTPUT_PATH}
|
||||
COMMAND
|
||||
gcovr -r ${CMAKE_SOURCE_DIR} --object-directory=${PROJECT_BINARY_DIR}
|
||||
--html-details ${COV_OUTPUT_PATH}/index.html
|
||||
--exclude='${CMAKE_SOURCE_DIR}/unittests/'
|
||||
--exclude='${PROJECT_BINARY_DIR}/'
|
||||
WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
|
||||
COMMENT "Running gcovr to produce Cobertura code coverage report.")
|
||||
add_custom_command(
|
||||
TARGET ${module}-ccov-report
|
||||
POST_BUILD
|
||||
COMMENT
|
||||
"Open ${COV_OUTPUT_PATH}/index.html in your browser to view the coverage report."
|
||||
)
|
||||
else()
|
||||
message(FATAL_ERROR "Complier not support yet")
|
||||
endif()
|
||||
endfunction()
|
||||
@@ -1,6 +0,0 @@
|
||||
set(Boost_USE_STATIC_LIBS ON)
|
||||
set(Boost_USE_STATIC_RUNTIME ON)
|
||||
|
||||
find_package(Boost 1.75 COMPONENTS filesystem log_setup log thread system REQUIRED)
|
||||
|
||||
target_link_libraries(clio PUBLIC ${Boost_LIBRARIES})
|
||||
@@ -1,24 +0,0 @@
|
||||
From 5cd9d09d960fa489a0c4379880cd7615b1c16e55 Mon Sep 17 00:00:00 2001
|
||||
From: CJ Cobb <ccobb@ripple.com>
|
||||
Date: Wed, 10 Aug 2022 12:30:01 -0400
|
||||
Subject: [PATCH] Remove bitset operator !=
|
||||
|
||||
---
|
||||
src/ripple/protocol/Feature.h | 1 -
|
||||
1 file changed, 1 deletion(-)
|
||||
|
||||
diff --git a/src/ripple/protocol/Feature.h b/src/ripple/protocol/Feature.h
|
||||
index b3ecb099b..6424be411 100644
|
||||
--- a/src/ripple/protocol/Feature.h
|
||||
+++ b/src/ripple/protocol/Feature.h
|
||||
@@ -126,7 +126,6 @@ class FeatureBitset : private std::bitset<detail::numFeatures>
|
||||
public:
|
||||
using base::bitset;
|
||||
using base::operator==;
|
||||
- using base::operator!=;
|
||||
|
||||
using base::all;
|
||||
using base::any;
|
||||
--
|
||||
2.32.0
|
||||
|
||||
@@ -1,11 +0,0 @@
|
||||
include(CheckIncludeFileCXX)
|
||||
|
||||
check_include_file_cxx("source_location" SOURCE_LOCATION_AVAILABLE)
|
||||
if(SOURCE_LOCATION_AVAILABLE)
|
||||
target_compile_definitions(clio PUBLIC "HAS_SOURCE_LOCATION")
|
||||
endif()
|
||||
|
||||
check_include_file_cxx("experimental/source_location" EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
|
||||
if(EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
|
||||
target_compile_definitions(clio PUBLIC "HAS_EXPERIMENTAL_SOURCE_LOCATION")
|
||||
endif()
|
||||
@@ -1,153 +0,0 @@
|
||||
find_package(ZLIB REQUIRED)
|
||||
|
||||
find_library(cassandra NAMES cassandra)
|
||||
if(NOT cassandra)
|
||||
message("System installed Cassandra cpp driver not found. Will build")
|
||||
find_library(zlib NAMES zlib1g-dev zlib-devel zlib z)
|
||||
if(NOT zlib)
|
||||
message("zlib not found. will build")
|
||||
add_library(zlib STATIC IMPORTED GLOBAL)
|
||||
ExternalProject_Add(zlib_src
|
||||
PREFIX ${nih_cache_path}
|
||||
GIT_REPOSITORY https://github.com/madler/zlib.git
|
||||
GIT_TAG v1.2.12
|
||||
INSTALL_COMMAND ""
|
||||
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}z.a
|
||||
)
|
||||
ExternalProject_Get_Property (zlib_src SOURCE_DIR)
|
||||
ExternalProject_Get_Property (zlib_src BINARY_DIR)
|
||||
set (zlib_src_SOURCE_DIR "${SOURCE_DIR}")
|
||||
file (MAKE_DIRECTORY ${zlib_src_SOURCE_DIR}/include)
|
||||
set_target_properties (zlib PROPERTIES
|
||||
IMPORTED_LOCATION
|
||||
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}z.a
|
||||
INTERFACE_INCLUDE_DIRECTORIES
|
||||
${SOURCE_DIR}/include)
|
||||
add_dependencies(zlib zlib_src)
|
||||
file(TO_CMAKE_PATH "${zlib_src_SOURCE_DIR}" zlib_src_SOURCE_DIR)
|
||||
endif()
|
||||
find_library(krb5 NAMES krb5-dev libkrb5-dev)
|
||||
if(NOT krb5)
|
||||
message("krb5 not found. will build")
|
||||
add_library(krb5 STATIC IMPORTED GLOBAL)
|
||||
ExternalProject_Add(krb5_src
|
||||
PREFIX ${nih_cache_path}
|
||||
GIT_REPOSITORY https://github.com/krb5/krb5.git
|
||||
GIT_TAG krb5-1.20
|
||||
UPDATE_COMMAND ""
|
||||
CONFIGURE_COMMAND autoreconf src && CFLAGS=-fcommon ./src/configure --enable-static --disable-shared
|
||||
BUILD_IN_SOURCE 1
|
||||
BUILD_COMMAND make
|
||||
INSTALL_COMMAND ""
|
||||
BUILD_BYPRODUCTS <SOURCE_DIR>/lib/${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a
|
||||
)
|
||||
message(${ep_lib_prefix}/krb5.a)
|
||||
message(${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a)
|
||||
ExternalProject_Get_Property (krb5_src SOURCE_DIR)
|
||||
ExternalProject_Get_Property (krb5_src BINARY_DIR)
|
||||
set (krb5_src_SOURCE_DIR "${SOURCE_DIR}")
|
||||
file (MAKE_DIRECTORY ${krb5_src_SOURCE_DIR}/include)
|
||||
set_target_properties (krb5 PROPERTIES
|
||||
IMPORTED_LOCATION
|
||||
${SOURCE_DIR}/lib/${CMAKE_STATIC_LIBRARY_PREFIX}krb5.a
|
||||
INTERFACE_INCLUDE_DIRECTORIES
|
||||
${SOURCE_DIR}/include)
|
||||
add_dependencies(krb5 krb5_src)
|
||||
file(TO_CMAKE_PATH "${krb5_src_SOURCE_DIR}" krb5_src_SOURCE_DIR)
|
||||
endif()
|
||||
|
||||
|
||||
find_library(libuv1 NAMES uv1 libuv1 liubuv1-dev libuv1:amd64)
|
||||
|
||||
|
||||
if(NOT libuv1)
|
||||
message("libuv1 not found, will build")
|
||||
add_library(libuv1 STATIC IMPORTED GLOBAL)
|
||||
ExternalProject_Add(libuv_src
|
||||
PREFIX ${nih_cache_path}
|
||||
GIT_REPOSITORY https://github.com/libuv/libuv.git
|
||||
GIT_TAG v1.44.1
|
||||
INSTALL_COMMAND ""
|
||||
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}uv_a.a
|
||||
)
|
||||
|
||||
ExternalProject_Get_Property (libuv_src SOURCE_DIR)
|
||||
ExternalProject_Get_Property (libuv_src BINARY_DIR)
|
||||
set (libuv_src_SOURCE_DIR "${SOURCE_DIR}")
|
||||
file (MAKE_DIRECTORY ${libuv_src_SOURCE_DIR}/include)
|
||||
|
||||
set_target_properties (libuv1 PROPERTIES
|
||||
IMPORTED_LOCATION
|
||||
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}uv_a.a
|
||||
INTERFACE_INCLUDE_DIRECTORIES
|
||||
${SOURCE_DIR}/include)
|
||||
add_dependencies(libuv1 libuv_src)
|
||||
|
||||
file(TO_CMAKE_PATH "${libuv_src_SOURCE_DIR}" libuv_src_SOURCE_DIR)
|
||||
endif()
|
||||
add_library (cassandra STATIC IMPORTED GLOBAL)
|
||||
ExternalProject_Add(cassandra_src
|
||||
PREFIX ${nih_cache_path}
|
||||
GIT_REPOSITORY https://github.com/datastax/cpp-driver.git
|
||||
GIT_TAG 2.16.2
|
||||
CMAKE_ARGS
|
||||
-DLIBUV_ROOT_DIR=${BINARY_DIR}
|
||||
-DLIBUV_INCLUDE_DIR=${SOURCE_DIR}/include
|
||||
-DCASS_BUILD_STATIC=ON
|
||||
-DCASS_BUILD_SHARED=OFF
|
||||
INSTALL_COMMAND ""
|
||||
BUILD_BYPRODUCTS <BINARY_DIR>/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a
|
||||
)
|
||||
|
||||
ExternalProject_Get_Property (cassandra_src SOURCE_DIR)
|
||||
ExternalProject_Get_Property (cassandra_src BINARY_DIR)
|
||||
set (cassandra_src_SOURCE_DIR "${SOURCE_DIR}")
|
||||
file (MAKE_DIRECTORY ${cassandra_src_SOURCE_DIR}/include)
|
||||
|
||||
set_target_properties (cassandra PROPERTIES
|
||||
IMPORTED_LOCATION
|
||||
${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a
|
||||
INTERFACE_INCLUDE_DIRECTORIES
|
||||
${SOURCE_DIR}/include)
|
||||
message("cass dirs")
|
||||
message(${BINARY_DIR})
|
||||
message(${SOURCE_DIR})
|
||||
message(${BINARY_DIR}/${CMAKE_STATIC_LIBRARY_PREFIX}cassandra_static.a)
|
||||
add_dependencies(cassandra cassandra_src)
|
||||
|
||||
if(NOT libuv1)
|
||||
ExternalProject_Add_StepDependencies(cassandra_src build libuv1)
|
||||
target_link_libraries(cassandra INTERFACE libuv1)
|
||||
else()
|
||||
target_link_libraries(cassandra INTERFACE ${libuv1})
|
||||
endif()
|
||||
if(NOT krb5)
|
||||
|
||||
ExternalProject_Add_StepDependencies(cassandra_src build krb5)
|
||||
target_link_libraries(cassandra INTERFACE krb5)
|
||||
else()
|
||||
target_link_libraries(cassandra INTERFACE ${krb5})
|
||||
endif()
|
||||
|
||||
if(NOT zlib)
|
||||
ExternalProject_Add_StepDependencies(cassandra_src build zlib)
|
||||
target_link_libraries(cassandra INTERFACE zlib)
|
||||
else()
|
||||
target_link_libraries(cassandra INTERFACE ${zlib})
|
||||
endif()
|
||||
set(OPENSSL_USE_STATIC_LIBS TRUE)
|
||||
find_package(OpenSSL REQUIRED)
|
||||
target_link_libraries(cassandra INTERFACE OpenSSL::SSL)
|
||||
|
||||
file(TO_CMAKE_PATH "${cassandra_src_SOURCE_DIR}" cassandra_src_SOURCE_DIR)
|
||||
target_link_libraries(clio PUBLIC cassandra)
|
||||
else()
|
||||
message("Found system installed cassandra cpp driver")
|
||||
message(${cassandra})
|
||||
find_path(cassandra_includes NAMES cassandra.h REQUIRED)
|
||||
message(${cassandra_includes})
|
||||
get_filename_component(CASSANDRA_HEADER ${cassandra_includes}/cassandra.h REALPATH)
|
||||
get_filename_component(CASSANDRA_HEADER_DIR ${CASSANDRA_HEADER} DIRECTORY)
|
||||
target_link_libraries (clio PUBLIC ${cassandra})
|
||||
target_include_directories(clio PUBLIC ${CASSANDRA_HEADER_DIR})
|
||||
endif()
|
||||
@@ -1,20 +0,0 @@
|
||||
FetchContent_Declare(
|
||||
googletest
|
||||
URL https://github.com/google/googletest/archive/609281088cfefc76f9d0ce82e1ff6c30cc3591e5.zip
|
||||
)
|
||||
|
||||
FetchContent_GetProperties(googletest)
|
||||
|
||||
if(NOT googletest_POPULATED)
|
||||
FetchContent_Populate(googletest)
|
||||
add_subdirectory(${googletest_SOURCE_DIR} ${googletest_BINARY_DIR} EXCLUDE_FROM_ALL)
|
||||
endif()
|
||||
|
||||
target_link_libraries(clio_tests PUBLIC clio gmock_main)
|
||||
target_include_directories(clio_tests PRIVATE unittests)
|
||||
|
||||
enable_testing()
|
||||
|
||||
include(GoogleTest)
|
||||
|
||||
gtest_discover_tests(clio_tests)
|
||||
@@ -1,14 +0,0 @@
|
||||
FetchContent_Declare(
|
||||
libfmt
|
||||
URL https://github.com/fmtlib/fmt/releases/download/9.1.0/fmt-9.1.0.zip
|
||||
)
|
||||
|
||||
FetchContent_GetProperties(libfmt)
|
||||
|
||||
if(NOT libfmt_POPULATED)
|
||||
FetchContent_Populate(libfmt)
|
||||
add_subdirectory(${libfmt_SOURCE_DIR} ${libfmt_BINARY_DIR} EXCLUDE_FROM_ALL)
|
||||
endif()
|
||||
|
||||
target_link_libraries(clio PUBLIC fmt)
|
||||
|
||||
@@ -1,20 +0,0 @@
|
||||
set(RIPPLED_REPO "https://github.com/ripple/rippled.git")
|
||||
set(RIPPLED_BRANCH "1.9.2")
|
||||
set(NIH_CACHE_ROOT "${CMAKE_CURRENT_BINARY_DIR}" CACHE INTERNAL "")
|
||||
set(patch_command ! grep operator!= src/ripple/protocol/Feature.h || git apply < ${CMAKE_CURRENT_SOURCE_DIR}/CMake/deps/Remove-bitset-operator.patch)
|
||||
message(STATUS "Cloning ${RIPPLED_REPO} branch ${RIPPLED_BRANCH}")
|
||||
FetchContent_Declare(rippled
|
||||
GIT_REPOSITORY "${RIPPLED_REPO}"
|
||||
GIT_TAG "${RIPPLED_BRANCH}"
|
||||
GIT_SHALLOW ON
|
||||
PATCH_COMMAND "${patch_command}"
|
||||
)
|
||||
|
||||
FetchContent_GetProperties(rippled)
|
||||
if(NOT rippled_POPULATED)
|
||||
FetchContent_Populate(rippled)
|
||||
add_subdirectory(${rippled_SOURCE_DIR} ${rippled_BINARY_DIR} EXCLUDE_FROM_ALL)
|
||||
endif()
|
||||
|
||||
target_link_libraries(clio PUBLIC xrpl_core grpc_pbufs)
|
||||
target_include_directories(clio PUBLIC ${rippled_SOURCE_DIR}/src ) # TODO: Seems like this shouldn't be needed?
|
||||
@@ -1,6 +0,0 @@
|
||||
target_compile_options(clio
|
||||
PUBLIC -Wall
|
||||
-Werror
|
||||
-Wno-narrowing
|
||||
-Wno-deprecated-declarations
|
||||
-Wno-dangling-else)
|
||||
239
CMakeLists.txt
@@ -1,175 +1,96 @@
|
||||
cmake_minimum_required(VERSION 3.16.3)
|
||||
|
||||
project(clio)
|
||||
set(CMAKE_PROJECT_INCLUDE_BEFORE ${CMAKE_CURRENT_SOURCE_DIR}/cmake/ClioVersion.cmake)
|
||||
|
||||
if(CMAKE_CXX_COMPILER_VERSION VERSION_LESS 11)
|
||||
message(FATAL_ERROR "GCC 11+ required for building clio")
|
||||
endif()
|
||||
project(clio VERSION ${CLIO_VERSION} HOMEPAGE_URL "https://github.com/XRPLF/clio"
|
||||
DESCRIPTION "An XRP Ledger API Server"
|
||||
)
|
||||
|
||||
option(BUILD_TESTS "Build tests" TRUE)
|
||||
# =========================== Options ====================================== #
|
||||
option(verbose "Verbose build" FALSE)
|
||||
option(tests "Build tests" FALSE)
|
||||
option(benchmark "Build benchmarks" FALSE)
|
||||
option(docs "Generate doxygen docs" FALSE)
|
||||
option(coverage "Build test coverage report" FALSE)
|
||||
option(packaging "Create distribution packages" FALSE)
|
||||
option(lint "Run clang-tidy checks during compilation" FALSE)
|
||||
# ========================================================================== #
|
||||
set(san "" CACHE STRING "Add sanitizer instrumentation")
|
||||
set(CMAKE_EXPORT_COMPILE_COMMANDS TRUE)
|
||||
set_property(CACHE san PROPERTY STRINGS ";undefined;memory;address;thread")
|
||||
# ========================================================================== #
|
||||
|
||||
option(VERBOSE "Verbose build" TRUE)
|
||||
if(VERBOSE)
|
||||
set(CMAKE_MODULE_PATH "${PROJECT_SOURCE_DIR}/cmake" ${CMAKE_MODULE_PATH})
|
||||
|
||||
# Include required modules
|
||||
include(Ccache)
|
||||
include(CheckCXXCompilerFlag)
|
||||
include(ClangTidy)
|
||||
|
||||
add_library(clio_options INTERFACE)
|
||||
target_include_directories(clio_options INTERFACE ${CMAKE_SOURCE_DIR}/src)
|
||||
|
||||
# Set coverage build options
|
||||
if (coverage)
|
||||
if (NOT tests)
|
||||
message(FATAL_ERROR "Coverage requires tests to be enabled")
|
||||
endif ()
|
||||
include(CodeCoverage)
|
||||
append_coverage_compiler_flags_to_target(clio_options INTERFACE)
|
||||
endif ()
|
||||
|
||||
if (verbose)
|
||||
set(CMAKE_VERBOSE_MAKEFILE TRUE)
|
||||
set(FETCHCONTENT_QUIET FALSE CACHE STRING "Verbose FetchContent()")
|
||||
endif()
|
||||
endif ()
|
||||
|
||||
|
||||
if(PACKAGING)
|
||||
if (packaging)
|
||||
add_definitions(-DPKG=1)
|
||||
endif()
|
||||
target_compile_definitions(clio_options INTERFACE PKG=1)
|
||||
endif ()
|
||||
|
||||
#c++20 removed std::result_of but boost 1.75 is still using it.
|
||||
add_definitions(-DBOOST_ASIO_HAS_STD_INVOKE_RESULT=1)
|
||||
# Clio tweaks and checks
|
||||
include(CheckCompiler)
|
||||
include(Settings)
|
||||
include(SourceLocation)
|
||||
|
||||
add_library(clio)
|
||||
target_compile_features(clio PUBLIC cxx_std_20)
|
||||
target_include_directories(clio PUBLIC src)
|
||||
# Clio deps
|
||||
include(deps/libxrpl)
|
||||
include(deps/Boost)
|
||||
include(deps/OpenSSL)
|
||||
include(deps/Threads)
|
||||
include(deps/libfmt)
|
||||
include(deps/cassandra)
|
||||
include(deps/libbacktrace)
|
||||
|
||||
include(FetchContent)
|
||||
include(ExternalProject)
|
||||
include(CMake/settings.cmake)
|
||||
include(CMake/ClioVersion.cmake)
|
||||
include(CMake/deps/rippled.cmake)
|
||||
include(CMake/deps/libfmt.cmake)
|
||||
include(CMake/deps/Boost.cmake)
|
||||
include(CMake/deps/cassandra.cmake)
|
||||
include(CMake/deps/SourceLocation.cmake)
|
||||
add_subdirectory(src)
|
||||
|
||||
target_sources(clio PRIVATE
|
||||
## Main
|
||||
src/main/impl/Build.cpp
|
||||
## Backend
|
||||
src/backend/BackendInterface.cpp
|
||||
src/backend/CassandraBackend.cpp
|
||||
src/backend/SimpleCache.cpp
|
||||
## NextGen Backend
|
||||
src/backend/cassandra/impl/Future.cpp
|
||||
src/backend/cassandra/impl/Cluster.cpp
|
||||
src/backend/cassandra/impl/Batch.cpp
|
||||
src/backend/cassandra/impl/Result.cpp
|
||||
src/backend/cassandra/impl/Tuple.cpp
|
||||
src/backend/cassandra/impl/SslContext.cpp
|
||||
src/backend/cassandra/Handle.cpp
|
||||
src/backend/cassandra/SettingsProvider.cpp
|
||||
## ETL
|
||||
src/etl/ETLSource.cpp
|
||||
src/etl/ProbingETLSource.cpp
|
||||
src/etl/NFTHelpers.cpp
|
||||
src/etl/ReportingETL.cpp
|
||||
## Subscriptions
|
||||
src/subscriptions/SubscriptionManager.cpp
|
||||
## RPC
|
||||
src/rpc/Errors.cpp
|
||||
src/rpc/Factories.cpp
|
||||
src/rpc/RPCHelpers.cpp
|
||||
src/rpc/Counters.cpp
|
||||
src/rpc/WorkQueue.cpp
|
||||
src/rpc/common/Specs.cpp
|
||||
src/rpc/common/Validators.cpp
|
||||
# RPC impl
|
||||
src/rpc/common/impl/HandlerProvider.cpp
|
||||
## RPC handler
|
||||
src/rpc/handlers/AccountChannels.cpp
|
||||
src/rpc/handlers/AccountCurrencies.cpp
|
||||
src/rpc/handlers/AccountInfo.cpp
|
||||
src/rpc/handlers/AccountLines.cpp
|
||||
src/rpc/handlers/AccountNFTs.cpp
|
||||
src/rpc/handlers/AccountObjects.cpp
|
||||
src/rpc/handlers/AccountOffers.cpp
|
||||
src/rpc/handlers/AccountTx.cpp
|
||||
src/rpc/handlers/BookChanges.cpp
|
||||
src/rpc/handlers/BookOffers.cpp
|
||||
src/rpc/handlers/GatewayBalances.cpp
|
||||
src/rpc/handlers/Ledger.cpp
|
||||
src/rpc/handlers/LedgerData.cpp
|
||||
src/rpc/handlers/LedgerEntry.cpp
|
||||
src/rpc/handlers/LedgerRange.cpp
|
||||
src/rpc/handlers/NFTBuyOffers.cpp
|
||||
src/rpc/handlers/NFTHistory.cpp
|
||||
src/rpc/handlers/NFTInfo.cpp
|
||||
src/rpc/handlers/NFTOffersCommon.cpp
|
||||
src/rpc/handlers/NFTSellOffers.cpp
|
||||
src/rpc/handlers/NoRippleCheck.cpp
|
||||
src/rpc/handlers/Random.cpp
|
||||
src/rpc/handlers/TransactionEntry.cpp
|
||||
src/rpc/handlers/Tx.cpp
|
||||
## Util
|
||||
src/config/Config.cpp
|
||||
src/log/Logger.cpp
|
||||
src/util/Taggable.cpp)
|
||||
if (tests)
|
||||
add_subdirectory(unittests)
|
||||
endif ()
|
||||
|
||||
add_executable(clio_server src/main/main.cpp)
|
||||
target_link_libraries(clio_server PUBLIC clio)
|
||||
if (benchmark)
|
||||
add_subdirectory(benchmarks)
|
||||
endif ()
|
||||
|
||||
if(BUILD_TESTS)
|
||||
set(TEST_TARGET clio_tests)
|
||||
add_executable(${TEST_TARGET}
|
||||
unittests/Playground.cpp
|
||||
unittests/Backend.cpp
|
||||
unittests/Logger.cpp
|
||||
unittests/Config.cpp
|
||||
unittests/ProfilerTest.cpp
|
||||
unittests/DOSGuard.cpp
|
||||
unittests/SubscriptionTest.cpp
|
||||
unittests/SubscriptionManagerTest.cpp
|
||||
unittests/util/TestObject.cpp
|
||||
# RPC
|
||||
unittests/rpc/ErrorTests.cpp
|
||||
unittests/rpc/BaseTests.cpp
|
||||
unittests/rpc/RPCHelpersTest.cpp
|
||||
unittests/rpc/CountersTest.cpp
|
||||
unittests/rpc/AdminVerificationTest.cpp
|
||||
## RPC handlers
|
||||
unittests/rpc/handlers/DefaultProcessorTests.cpp
|
||||
unittests/rpc/handlers/TestHandlerTests.cpp
|
||||
unittests/rpc/handlers/AccountCurrenciesTest.cpp
|
||||
unittests/rpc/handlers/AccountLinesTest.cpp
|
||||
unittests/rpc/handlers/AccountTxTest.cpp
|
||||
unittests/rpc/handlers/AccountOffersTest.cpp
|
||||
unittests/rpc/handlers/AccountInfoTest.cpp
|
||||
unittests/rpc/handlers/AccountChannelsTest.cpp
|
||||
unittests/rpc/handlers/AccountNFTsTest.cpp
|
||||
unittests/rpc/handlers/BookOffersTest.cpp
|
||||
unittests/rpc/handlers/GatewayBalancesTest.cpp
|
||||
unittests/rpc/handlers/TxTest.cpp
|
||||
unittests/rpc/handlers/TransactionEntryTest.cpp
|
||||
unittests/rpc/handlers/LedgerEntryTest.cpp
|
||||
unittests/rpc/handlers/LedgerRangeTest.cpp
|
||||
unittests/rpc/handlers/NoRippleCheckTest.cpp
|
||||
unittests/rpc/handlers/ServerInfoTest.cpp
|
||||
unittests/rpc/handlers/PingTest.cpp
|
||||
unittests/rpc/handlers/RandomTest.cpp
|
||||
unittests/rpc/handlers/NFTInfoTest.cpp
|
||||
unittests/rpc/handlers/NFTBuyOffersTest.cpp
|
||||
unittests/rpc/handlers/NFTSellOffersTest.cpp
|
||||
unittests/rpc/handlers/NFTHistoryTest.cpp
|
||||
unittests/rpc/handlers/SubscribeTest.cpp
|
||||
unittests/rpc/handlers/UnsubscribeTest.cpp
|
||||
unittests/rpc/handlers/LedgerDataTest.cpp
|
||||
unittests/rpc/handlers/AccountObjectsTest.cpp
|
||||
unittests/rpc/handlers/BookChangesTest.cpp
|
||||
unittests/rpc/handlers/LedgerTest.cpp
|
||||
# Backend
|
||||
unittests/backend/cassandra/BaseTests.cpp
|
||||
unittests/backend/cassandra/BackendTests.cpp
|
||||
unittests/backend/cassandra/RetryPolicyTests.cpp
|
||||
unittests/backend/cassandra/SettingsProviderTests.cpp
|
||||
unittests/backend/cassandra/ExecutionStrategyTests.cpp
|
||||
unittests/backend/cassandra/AsyncExecutorTests.cpp)
|
||||
include(CMake/deps/gtest.cmake)
|
||||
# Enable selected sanitizer if enabled via `san`
|
||||
if (san)
|
||||
target_compile_options(
|
||||
clio PUBLIC # Sanitizers recommend minimum of -O1 for reasonable performance
|
||||
$<$<CONFIG:Debug>:-O1> ${SAN_FLAG} -fno-omit-frame-pointer
|
||||
)
|
||||
target_compile_definitions(
|
||||
clio PUBLIC $<$<STREQUAL:${san},address>:SANITIZER=ASAN> $<$<STREQUAL:${san},thread>:SANITIZER=TSAN>
|
||||
$<$<STREQUAL:${san},memory>:SANITIZER=MSAN> $<$<STREQUAL:${san},undefined>:SANITIZER=UBSAN>
|
||||
)
|
||||
target_link_libraries(clio INTERFACE ${SAN_FLAG} ${SAN_LIB})
|
||||
endif ()
|
||||
|
||||
# test for dwarf5 bug on ci
|
||||
target_compile_options(clio PUBLIC -gdwarf-4)
|
||||
# Generate `docs` target for doxygen documentation if enabled Note: use `make docs` to generate the documentation
|
||||
if (docs)
|
||||
add_subdirectory(docs)
|
||||
endif ()
|
||||
|
||||
# if CODE_COVERAGE enable, add clio_test-ccov
|
||||
if(CODE_COVERAGE)
|
||||
include(CMake/coverage.cmake)
|
||||
add_converage(${TEST_TARGET})
|
||||
endif()
|
||||
endif()
|
||||
|
||||
include(CMake/install/install.cmake)
|
||||
if(PACKAGING)
|
||||
include(CMake/packaging.cmake)
|
||||
endif()
|
||||
include(install/install)
|
||||
if (packaging)
|
||||
include(cmake/packaging.cmake) # This file exists only in build runner
|
||||
endif ()
|
||||
|
||||
@@ -20,6 +20,12 @@ Please run the following command in order to use git hooks that are helpful for
|
||||
git config --local core.hooksPath .githooks
|
||||
```
|
||||
|
||||
## Git hooks dependencies
|
||||
The pre-commit hook requires `clang-format >= 17.0.0` and `cmake-format` to be installed on your machine.
|
||||
`clang-format` can be installed using `brew` on macOS and the default package manager on Linux.
|
||||
`cmake-format` can be installed using `pip`.
|
||||
The hook will also attempt to automatically use `doxygen` to verify that everything public in the codebase is covered by doc comments. If `doxygen` is not installed, the hook will raise a warning suggesting to install `doxygen` for future commits.
|
||||
|
||||
## Git commands
|
||||
This section offers a detailed look at the git commands you will need to use to get your PR submitted.
Please note that there is more than one way to do this; these commands are provided for your convenience.
|
||||
@@ -62,6 +68,11 @@ git commit --amend -S
|
||||
git push --force
|
||||
```
|
||||
|
||||
## Use ccache (optional)
|
||||
Clio uses `ccache` to speed up compilation. If you want to use it, please make sure it is installed on your machine.
|
||||
CMake will automatically detect it and use it if it is available.
|
||||
|
||||
|
||||
## Fixing issues found during code review
|
||||
While your code is in review, it's possible that some changes will be requested by reviewer(s).
|
||||
This section describes the process of adding your fixes.
|
||||
@@ -91,8 +102,14 @@ The button for that is near the bottom of the PR's page on GitHub.
|
||||
This is a non-exhaustive list of recommended style guidelines. These are not always strictly enforced and serve as a way to keep the codebase coherent.
|
||||
|
||||
## Formatting
|
||||
Code must conform to `clang-format` version 10, unless the result would be unreasonably difficult to read or maintain.
|
||||
To change your code to conform use `clang-format -i <your changed files>`.
|
||||
Code must conform to `clang-format` version 17, unless the result would be unreasonably difficult to read or maintain.
|
||||
In most cases the pre-commit hook will take care of formatting and will fix any issues automatically.
|
||||
To manually format your code, use `clang-format -i <your changed files>` for C++ files and `cmake-format -i <your changed files>` for CMake files.
|
||||
|
||||
## Documentation
|
||||
All public namespaces, classes and functions must be covered by doc (`doxygen`) comments. Everything that is not within a nested `impl` namespace is considered public.
|
||||
|
||||
> **Note:** Keep in mind that this is enforced by Clio's CI and your build will fail if newly added public code lacks documentation.
|
||||
|
||||
## Avoid
|
||||
* Proliferation of nearly identical code.
|
||||
@@ -126,6 +143,7 @@ Existing maintainers can resign, or be subject to a vote for removal at the behe
|
||||
|
||||
* [cindyyan317](https://github.com/cindyyan317) (Ripple)
|
||||
* [godexsoft](https://github.com/godexsoft) (Ripple)
|
||||
* [kuznetsss](https://github.com/kuznetsss) (Ripple)
|
||||
* [legleux](https://github.com/legleux) (Ripple)
|
||||
|
||||
## Honorable ex-Maintainers
|
||||
|
||||
252
README.md
@@ -1,238 +1,48 @@
|
||||
# Clio
|
||||
Clio is an XRP Ledger API server. Clio is optimized for RPC calls, over WebSocket or JSON-RPC. Validated
|
||||
historical ledger and transaction data are stored in a more space-efficient format,
|
||||
using up to 4 times less space than rippled. Clio can be configured to store data in Apache Cassandra or ScyllaDB,
|
||||
allowing for scalable read throughput. Multiple Clio nodes can share
|
||||
access to the same dataset, allowing for a highly available cluster of Clio nodes,
|
||||
without the need for redundant data storage or computation.
|
||||
# <img src='./docs/img/xrpl-logo.svg' width='40' valign="top" /> Clio
|
||||
|
||||
Clio offers the full rippled API, with the caveat that Clio by default only returns validated data.
|
||||
This means that `ledger_index` defaults to `validated` instead of `current` for all requests.
|
||||
Other non-validated data is also not returned, such as information about queued transactions.
|
||||
For requests that require access to the p2p network, such as `fee` or `submit`, Clio automatically forwards the request to a rippled node and propagates the response back to the client. To access non-validated data for *any* request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to rippled.
|
||||
[](https://github.com/XRPLF/clio/actions/workflows/build.yml?query=branch%3Adevelop)
|
||||
[](https://github.com/XRPLF/clio/actions/workflows/nightly.yml?query=branch%3Adevelop)
|
||||
[](https://github.com/XRPLF/clio/actions/workflows/clang-tidy.yml?query=branch%3Adevelop)
|
||||
[](https://app.codecov.io/gh/XRPLF/clio)
|
||||
|
||||
Clio does not connect to the peer-to-peer network. Instead, Clio extracts data from a group of specified rippled nodes. Running Clio requires access to at least one rippled node
|
||||
from which data can be extracted. The rippled node does not need to be running on the same machine as Clio.
|
||||
Clio is an XRP Ledger API server optimized for RPC calls over WebSocket or JSON-RPC.
|
||||
It stores validated historical ledger and transaction data in a more space efficient format, and uses up to 4 times less space than [rippled](https://github.com/XRPLF/rippled).
|
||||
|
||||
Clio can be configured to store data in [Apache Cassandra](https://cassandra.apache.org/_/index.html) or [ScyllaDB](https://www.scylladb.com/), enabling scalable read throughput.
|
||||
Multiple Clio nodes can share access to the same dataset, which allows for a highly available cluster of Clio nodes without the need for redundant data storage or computation.
|
||||
|
||||
## Requirements
|
||||
1. Access to a Cassandra cluster or ScyllaDB cluster. Can be local or remote.
|
||||
## 📡 Clio and `rippled`
|
||||
|
||||
2. Access to one or more rippled nodes. Can be local or remote.
|
||||
Clio offers the full `rippled` API, with the caveat that Clio by default only returns validated data. This means that `ledger_index` defaults to `validated` instead of `current` for all requests. Other non-validated data, such as information about queued transactions, is also not returned.
|
||||
|
||||
## Building
|
||||
Clio retrieves data from a designated group of `rippled` nodes instead of connecting to the peer-to-peer network.
|
||||
For requests that require access to the peer-to-peer network, such as `fee` or `submit`, Clio automatically forwards the request to a `rippled` node and propagates the response back to the client. To access non-validated data for *any* request, simply add `ledger_index: "current"` to the request, and Clio will forward the request to `rippled`.
|
||||
|
||||
Clio is built with CMake. Clio requires at least GCC-11/clang-14.0.0 (C++20), and Boost 1.75.0.
|
||||
> [!NOTE]
|
||||
> Clio requires access to at least one `rippled` node, which can run on the same machine as Clio or separately.
|
||||
|
||||
Use these instructions to build a Clio executable from the source. These instructions were tested on Ubuntu 20.04 LTS.
|
||||
## 📚 Learn more about Clio
|
||||
|
||||
```sh
|
||||
# Install dependencies
|
||||
sudo apt-get -y install git pkg-config protobuf-compiler libprotobuf-dev libssl-dev wget build-essential bison flex autoconf cmake clang-format
|
||||
# Install gcovr to run code coverage
|
||||
sudo apt-get -y install gcovr
|
||||
Below are some useful docs to learn more about Clio.
|
||||
|
||||
# Compile Boost
|
||||
wget -O $HOME/boost_1_75_0.tar.gz https://boostorg.jfrog.io/artifactory/main/release/1.75.0/source/boost_1_75_0.tar.gz
|
||||
tar xvzf $HOME/boost_1_75_0.tar.gz
|
||||
cd $HOME/boost_1_75_0
|
||||
./bootstrap.sh
|
||||
./b2 -j$(nproc)
|
||||
echo "export BOOST_ROOT=$HOME/boost_1_75_0" >> $HOME/.profile && source $HOME/.profile
|
||||
**For Developers**:
|
||||
|
||||
# Clone the Clio Git repository & build Clio
|
||||
cd $HOME
|
||||
git clone https://github.com/XRPLF/clio.git
|
||||
cd $HOME/clio
|
||||
cmake -B build && cmake --build build --parallel $(nproc)
|
||||
```
|
||||
- [How to build Clio](./docs/build-clio.md)
|
||||
- [Metrics and static analysis](./docs/metrics-and-static-analysis.md)
|
||||
- [Coverage report](./docs/coverage-report.md)
|
||||
|
||||
## Running
|
||||
```sh
|
||||
./clio_server config.json
|
||||
```
|
||||
**For Operators**:
|
||||
|
||||
Clio needs access to a rippled server. The config files of rippled and Clio need
|
||||
to match in a certain sense.
|
||||
Clio needs to know (see the config sketch after these lists):
|
||||
- the IP of rippled
|
||||
- the port on which rippled is accepting unencrypted WebSocket connections
|
||||
- the port on which rippled is handling gRPC requests
|
||||
- [How to configure Clio and rippled](./docs/configure-clio.md)
|
||||
- [How to run Clio](./docs/run-clio.md)
|
||||
- [Logging](./docs/logging.md)
|
||||
|
||||
rippled needs to open:
|
||||
- a port to accept unencrypted websocket connections
|
||||
- a port to handle gRPC requests, with the IP(s) of Clio specified in the `secure_gateway` entry
|
||||
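As a rough illustration, the snippet below sketches how these values might appear in Clio's config. Only `etl_sources` and `ip` are named in the text above; the `ws_port` and `grpc_port` key names, the `_note` field, and the port values are illustrative assumptions, so check the example config shipped with Clio for the authoritative names.

```json
{
  "etl_sources": [
    {
      "_note": "key names below are illustrative; see Clio's example config",
      "ip": "127.0.0.1",
      "ws_port": "6006",
      "grpc_port": "50051"
    }
  ]
}
```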
**General reference material:**
|
||||
|
||||
The example configs of rippled and Clio are set up so that minimal changes are
|
||||
required. When running locally, the only change needed is to uncomment the `port_grpc`
|
||||
section of the rippled config. When running Clio and rippled on separate machines,
|
||||
in addition to uncommenting the `port_grpc` section, a few other steps must be taken:
|
||||
1. change the `ip` of the first entry of `etl_sources` to the IP where your rippled
|
||||
server is running
|
||||
2. open a public, unencrypted WebSocket port on your rippled server
|
||||
3. change the IP specified in `secure_gateway` of `port_grpc` section of the rippled config
|
||||
to the IP of your Clio server. This entry can take the form of a comma-separated list if
|
||||
you are running multiple Clio nodes.
|
||||
- [API reference](https://xrpl.org/http-websocket-apis.html)
|
||||
- [Clio documentation](https://xrpl.org/the-clio-server.html#the-clio-server)
|
||||
|
||||
## 🆘 Help
|
||||
|
||||
In addition, the parameter `start_sequence` can be included and configured within the top level of the config file. This parameter specifies the sequence of the first ledger to extract if the database is empty. Note that ETL extracts ledgers in order and that no backfilling functionality currently exists, meaning Clio will not retroactively learn ledgers older than the one you specify. Specifying this or not yields the following behavior:
|
||||
- If this setting is absent and the database is empty, ETL will start with the next ledger validated by the network.
|
||||
- If this setting is present and the database is not empty, an exception is thrown.
|
||||
|
||||
In addition, the optional parameter `finish_sequence` can be added to the json file as well, specifying the last ledger to extract.
|
||||
|
||||
To add `start_sequence` and/or `finish_sequence` to the config.json file, place them at the same top level as other parameters (such as `database`, `etl_sources`, `read_only`, etc.) and specify each as an integer. Here is an example snippet from the config file:
|
||||
|
||||
```json
|
||||
"start_sequence": 12345,
|
||||
"finish_sequence": 54321
|
||||
```
|
||||
|
||||
The parameters `ssl_cert_file` and `ssl_key_file` can also be added at the top level of the Clio config. `ssl_cert_file` specifies the filepath for your SSL cert, while `ssl_key_file` specifies the filepath for your SSL key. It is up to you how to change ownership of these files for your designated Clio user. Your options include:
|
||||
- Copying the two files as root somewhere that's accessible by the Clio user, then running `sudo chown` to your user
|
||||
- Changing the permissions directly so it's readable by your Clio user
|
||||
- Running Clio as root (strongly discouraged)
|
||||
|
||||
An example of how to specify `ssl_cert_file` and `ssl_key_file` in the config:
|
||||
|
||||
```json
|
||||
"server":{
|
||||
"ip": "0.0.0.0",
|
||||
"port": 51233
|
||||
},
|
||||
"ssl_cert_file" : "/full/path/to/cert.file",
|
||||
"ssl_key_file" : "/full/path/to/key.file"
|
||||
```
|
||||
|
||||

Once your config files are ready, start rippled and Clio. It doesn't matter which you start first, and it's fine to stop one or the other and restart at any given time.

Clio will wait for rippled to sync before extracting any ledgers. If there is already data in Clio's database, Clio will begin extraction with the ledger whose sequence is one greater than the greatest sequence currently in the database. Clio will wait for this ledger to be available. Be aware that the behavior of rippled is to sync to the most recent ledger on the network, and then backfill. If Clio is extracting ledgers from rippled, and then rippled is stopped for a significant amount of time and then restarted, rippled will take time to backfill to the next ledger that Clio wants. The time it takes is proportional to how long rippled was offline. Also be aware that the amount rippled backfills is dependent on the `online_delete` and `ledger_history` config values; if these values are small, and rippled is stopped for a significant amount of time, rippled may never backfill to the ledger that Clio wants. To avoid this situation, it is advised to keep history proportional to the amount of time that you expect rippled to be offline. For example, if you expect rippled to be offline for a few days from time to time, you should keep at least a few days of history. If you expect rippled to never be offline, then you can keep a very small amount of history.

Clio can use multiple rippled servers as a data source. Simply add more entries to the `etl_sources` section, and Clio will load balance requests across the servers specified in this list. As long as one rippled server is up and synced, Clio will continue extracting ledgers (see the sketch below).
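
As a rough sketch, two sources could be listed as follows (same field layout as the entry shown earlier; IPs and ports are placeholders):

```json
"etl_sources": [
  { "ip": "10.0.0.5", "ws_port": "6006", "grpc_port": "50051" },
  { "ip": "10.0.0.6", "ws_port": "6006", "grpc_port": "50051" }
]
```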

In contrast to rippled, Clio will answer RPC requests for the data already in the database as soon as the server starts. Clio doesn't wait to sync to the network, or for rippled to sync.

When starting Clio with a fresh database, Clio needs to download a ledger in full. This can take some time, and depends on database throughput. With a moderately fast database, this should take less than 10 minutes. If you did not properly set `secure_gateway` in the `port_grpc` section of rippled, this step will fail. Once the first ledger is fully downloaded, Clio only needs to extract the changed data for each ledger, so extraction is much faster and Clio can keep up with rippled in real time. Even under intense load, Clio should not lag behind the network, as Clio is not processing the data, but simply writing it to a database. The throughput of Clio is dependent on the throughput of your database, but a standard Cassandra or Scylla deployment can handle the write load of the XRP Ledger without any trouble. Generally the performance considerations come on the read side, and depend on the number of RPC requests your Clio nodes are serving. Be aware that very heavy read traffic can impact write throughput. Again, this is on the database side, so if you are seeing this, upgrade your database.

It is possible to run multiple Clio nodes that share access to the same database. The Clio nodes don't need to know about each other. You can simply spin up more Clio nodes pointing to the same database and shut them down as you wish. On startup, each Clio node queries the database for the latest ledger. If this latest ledger does not change for some time, the Clio node begins extracting ledgers and writing to the database. If a Clio node detects that a ledger it is trying to write has already been written, it will back off and stop writing. If that node later sees no ledger written for some time, it will start writing again. This algorithm ensures that at any given time, one and only one Clio node is writing to the database.

It is possible to force Clio to only read data, and never become a writer. To do this, set `read_only: true` in the config (sketched below). One common setup is to have a small number of writer nodes that are inaccessible to clients, with several read-only nodes handling client requests. The number of read-only nodes can be scaled up or down in response to request volume.
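
A minimal sketch of a read-only node's config, reusing the `server` block shown earlier (the flag sits at the top level of the config; a writer-capable node sets it to `false`, as in the example config):

```json
"read_only": true,
"server": {
  "ip": "0.0.0.0",
  "port": 51233
}
```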

When using multiple rippled servers as data sources and multiple Clio nodes, each Clio node should use the same set of rippled servers as sources; the order doesn't matter. The only reason not to do this is if you are running servers in different regions and you want the Clio nodes to extract from servers in their own region. However, if you are doing this, be aware that database traffic will be flowing across regions, which can cause high latencies. A possible alternative is to deploy a database in each region and have the Clio nodes in each region use their region's database; this is effectively two separate systems.

## Developing against `rippled` in standalone mode

If you wish to develop against a `rippled` instance running in standalone mode, there are a few quirks of both Clio and rippled to keep in mind. You must:

1. Advance the `rippled` ledger to at least ledger 256.
2. Wait 10 minutes before first starting Clio against this standalone node.

## Logging

Clio provides several logging options, all of which are configurable via the config file and detailed below.

`log_level`: The minimum severity at which log messages are output by default.
Severity options are `trace`, `debug`, `info`, `warning`, `error`, `fatal`. Defaults to `info`.

`log_format`: The format of log lines produced by Clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.
Each of the variables expands as follows:

- `TimeStamp`: The full date and time of the log entry
- `SourceLocation`: A partial path to the C++ file and the line number in said file (`source/file/path:linenumber`)
- `ThreadID`: The ID of the thread the log entry is written from
- `Channel`: The channel that this log entry was sent to
- `Severity`: The severity (aka log level) the entry was sent at
- `Message`: The actual log message

`log_channels`: An array of JSON objects, each overriding properties for a logging `channel`.
At the time of writing, only `log_level` can be overridden using this mechanism.

Each object is of this format:

```json
{
  "channel": "Backend",
  "log_level": "fatal"
}
```

If no override is present for a given channel, that channel will log at the severity specified by the global `log_level`.
Overridable log channels: `Backend`, `WebServer`, `Subscriptions`, `RPC`, `ETL` and `Performance`.

> **Note:** See `example-config.json` for more details.
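
For illustration, a sketch of overriding two channels while the rest inherit the global `log_level` (channel names are taken from the list above):

```json
"log_channels": [
  {
    "channel": "Backend",
    "log_level": "fatal"
  },
  {
    "channel": "RPC",
    "log_level": "debug"
  }
]
```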

`log_to_console`: Enable/disable log output to console. Options are `true`/`false`. Defaults to `true`.

`log_directory`: Path to the directory where log files are stored. If the directory doesn't exist, Clio will create it. If not specified, logs are not written to a file.

`log_rotation_size`: The max size of the log file in **megabytes** before it is rotated into a new file. Defaults to 2GB.

`log_directory_max_size`: The max size of the log directory in **megabytes** before old log files are deleted to free up space. Defaults to 50GB.

`log_rotation_hour_interval`: The time interval in **hours** after the last log rotation at which the current log file is automatically rotated. Defaults to 12 hours.

Note that time-based log rotation depends on size-based log rotation: whenever a size-based rotation occurs, the timer for the time-based rotation is reset.

`log_tag_style`: Tag implementation to use. Must be one of:

- `uint`: Lock-free and thread-safe, but outputs just a simple unsigned integer
- `uuid`: Thread-safe and outputs a UUID tag
- `none`: Don't use tagging at all
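
Putting these together, a hedged sketch of the logging options at the top level of the config (values are illustrative, not recommendations; the directory path simply mirrors the one used by the packaged install config):

```json
"log_level": "info",
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%",
"log_to_console": true,
"log_directory": "/var/log/clio",
"log_rotation_size": 2048,
"log_directory_max_size": 51200,
"log_rotation_hour_interval": 12,
"log_tag_style": "uint"
```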

## Cassandra / Scylla Administration

Since Clio relies on either Cassandra or Scylla for its database backend, here are some important considerations:

- Scylla, by default, will reserve all free RAM on a machine for itself. If you are running `rippled` or other services on the same machine, restrict its memory usage using the `--memory` argument: https://docs.scylladb.com/getting-started/scylla-in-a-shared-environment/

## 🆘 Help

Feel free to open an [issue](https://github.com/XRPLF/clio/issues) if you have a feature request or something doesn't work as expected.
If you have any questions about building, running, contributing to, or using Clio, you can always start a new [discussion](https://github.com/XRPLF/clio/discussions).
@@ -1,23 +0,0 @@

# Release Notes

This document contains the release notes for `clio_server`, an XRP Ledger API server.

To build and run `clio_server`, follow the instructions in [README.md](https://github.com/XRPLF/clio).

If you find issues or have a new idea, please open [an issue](https://github.com/XRPLF/clio/issues).

# Releases

## 0.1.0

Clio is an XRP Ledger API server. Clio is optimized for RPC calls over WebSocket or JSON-RPC. Validated historical ledger and transaction data is stored in a more space-efficient format, using up to 4 times less space than rippled.

Clio uses Cassandra or ScyllaDB, allowing for scalable read throughput. Multiple Clio nodes can share access to the same dataset, allowing for a highly available cluster of Clio nodes, without the need for redundant data storage or computation.

**0.1.0** is the first beta of Project Clio. It contains:

- `./src/backend` is the BackendInterface. This provides an abstraction for reading and writing information to a database.
- `./src/etl` is the ReportingETL. The classes in this folder are used to extract information from the P2P network and write it to a database, either locally or over the network.
- `./src/rpc` contains RPC handlers that are called by clients. These handlers should expose the same API as rippled.
- `./src/subscriptions` contains the SubscriptionManager. This manages publishing to clients subscribing to streams or accounts.
- `./src/webserver` contains a flex server that handles both http/s and ws/s traffic on a single port.
- `./unittests` contains simple unit tests that write to and read from a database to verify that the ETL works.
16
benchmarks/CMakeLists.txt
Normal file
@@ -0,0 +1,16 @@
|
||||
add_executable(clio_benchmark)
|
||||
|
||||
target_sources(
|
||||
clio_benchmark
|
||||
PRIVATE # Common
|
||||
Main.cpp
|
||||
Playground.cpp
|
||||
# ExecutionContext
|
||||
util/async/ExecutionContextBenchmarks.cpp
|
||||
)
|
||||
|
||||
include(deps/gbench)
|
||||
|
||||
target_include_directories(clio_benchmark PRIVATE .)
|
||||
target_link_libraries(clio_benchmark PUBLIC clio benchmark::benchmark_main)
|
||||
set_target_properties(clio_benchmark PROPERTIES RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR})
|
||||
22
benchmarks/Main.cpp
Normal file
@@ -0,0 +1,22 @@
|
||||
//------------------------------------------------------------------------------
|
||||
/*
|
||||
This file is part of clio: https://github.com/XRPLF/clio
|
||||
Copyright (c) 2023, the clio developers.
|
||||
|
||||
Permission to use, copy, modify, and distribute this software for any
|
||||
purpose with or without fee is hereby granted, provided that the above
|
||||
copyright notice and this permission notice appear in all copies.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
|
||||
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
|
||||
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
|
||||
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
|
||||
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
*/
|
||||
//==============================================================================
|
||||
|
||||
#include <benchmark/benchmark.h>
|
||||
|
||||
BENCHMARK_MAIN();
|
||||
45
benchmarks/Playground.cpp
Normal file
@@ -0,0 +1,45 @@
|
||||
//------------------------------------------------------------------------------
|
||||
/*
|
||||
This file is part of clio: https://github.com/XRPLF/clio
|
||||
Copyright (c) 2023, the clio developers.
|
||||
|
||||
Permission to use, copy, modify, and distribute this software for any
|
||||
purpose with or without fee is hereby granted, provided that the above
|
||||
copyright notice and this permission notice appear in all copies.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
|
||||
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
|
||||
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
|
||||
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
|
||||
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
*/
|
||||
//==============================================================================
|
||||
|
||||
/*
|
||||
* Use this file for temporary benchmarks and implementations.
|
||||
* Usage example:
|
||||
* ```
|
||||
* ./clio_benchmarks
|
||||
* --benchmark_time_unit=ms
|
||||
* --benchmark_repetitions=10
|
||||
* --benchmark_display_aggregates_only=true
|
||||
* --benchmark_min_time=1x
|
||||
* --benchmark_filter="Playground"
|
||||
* ```
|
||||
*
|
||||
* Note: Please don't push your temporary work to the repo.
|
||||
*/
|
||||
|
||||
// #include <benchmark/benchmark.h>
|
||||
|
||||
// static void
|
||||
// benchmarkPlaygroundTest1(benchmark::State& state)
|
||||
// {
|
||||
// for (auto _ : state) {
|
||||
// // ...
|
||||
// }
|
||||
// }
|
||||
|
||||
// BENCHMARK(benchmarkPlaygroundTest1);
|
||||
268
benchmarks/util/async/ExecutionContextBenchmarks.cpp
Normal file
@@ -0,0 +1,268 @@
|
||||
//------------------------------------------------------------------------------
|
||||
/*
|
||||
This file is part of clio: https://github.com/XRPLF/clio
|
||||
Copyright (c) 2024, the clio developers.
|
||||
|
||||
Permission to use, copy, modify, and distribute this software for any
|
||||
purpose with or without fee is hereby granted, provided that the above
|
||||
copyright notice and this permission notice appear in all copies.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
|
||||
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
|
||||
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
|
||||
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
|
||||
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
|
||||
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
|
||||
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
|
||||
*/
|
||||
//==============================================================================
|
||||
|
||||
#include "etl/ETLHelpers.hpp"
|
||||
#include "util/Random.hpp"
|
||||
#include "util/async/AnyExecutionContext.hpp"
|
||||
#include "util/async/AnyOperation.hpp"
|
||||
#include "util/async/context/BasicExecutionContext.hpp"
|
||||
#include "util/async/context/SyncExecutionContext.hpp"
|
||||
|
||||
#include <benchmark/benchmark.h>
|
||||
|
||||
#include <chrono>
|
||||
#include <cstddef>
|
||||
#include <cstdint>
|
||||
#include <latch>
|
||||
#include <optional>
|
||||
#include <stdexcept>
|
||||
#include <thread>
|
||||
#include <vector>
|
||||
|
||||
using namespace util;
|
||||
using namespace util::async;
|
||||
|
||||
class TestThread {
|
||||
std::vector<std::thread> threads_;
|
||||
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
|
||||
etl::ThreadSafeQueue<uint64_t> res_;
|
||||
|
||||
public:
|
||||
TestThread(std::vector<uint64_t> const& data) : q_(data.size()), res_(data.size())
|
||||
{
|
||||
for (auto el : data)
|
||||
q_.push(el);
|
||||
}
|
||||
|
||||
~TestThread()
|
||||
{
|
||||
for (auto& t : threads_) {
|
||||
if (t.joinable())
|
||||
t.join();
|
||||
}
|
||||
}
|
||||
|
||||
void
|
||||
run(std::size_t numThreads)
|
||||
{
|
||||
std::latch completion{numThreads};
|
||||
for (std::size_t i = 0; i < numThreads; ++i) {
|
||||
q_.push(std::nullopt);
|
||||
threads_.emplace_back([this, &completion]() { process(completion); });
|
||||
}
|
||||
|
||||
completion.wait();
|
||||
}
|
||||
|
||||
private:
|
||||
void
|
||||
process(std::latch& completion)
|
||||
{
|
||||
while (auto v = q_.pop()) {
|
||||
if (not v.has_value())
|
||||
break;
|
||||
|
||||
res_.push(v.value() * v.value());
|
||||
}
|
||||
|
||||
completion.count_down(1);
|
||||
}
|
||||
};
|
||||
|
||||
template <typename CtxType>
|
||||
class TestExecutionContextBatched {
|
||||
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
|
||||
etl::ThreadSafeQueue<uint64_t> res_;
|
||||
std::size_t batchSize_;
|
||||
|
||||
public:
|
||||
TestExecutionContextBatched(std::vector<uint64_t> const& data, std::size_t batchSize = 5000u)
|
||||
: q_(data.size()), res_(data.size()), batchSize_(batchSize)
|
||||
{
|
||||
for (auto el : data)
|
||||
q_.push(el);
|
||||
}
|
||||
|
||||
void
|
||||
run(std::size_t numThreads)
|
||||
{
|
||||
using OpType = typename CtxType::template StoppableOperation<void>;
|
||||
|
||||
CtxType ctx{numThreads};
|
||||
std::vector<OpType> operations;
|
||||
|
||||
for (std::size_t i = 0; i < numThreads; ++i) {
|
||||
q_.push(std::nullopt);
|
||||
|
||||
operations.push_back(ctx.execute(
|
||||
[this](auto stopRequested) {
|
||||
bool hasMore = true;
|
||||
auto doOne = [this] {
|
||||
auto v = q_.pop();
|
||||
if (not v.has_value())
|
||||
return false;
|
||||
|
||||
res_.push(v.value() * v.value());
|
||||
return true;
|
||||
};
|
||||
|
||||
while (not stopRequested and hasMore) {
|
||||
for (std::size_t i = 0; i < batchSize_ and hasMore; ++i)
|
||||
hasMore = doOne();
|
||||
}
|
||||
},
|
||||
std::chrono::seconds{5}
|
||||
));
|
||||
}
|
||||
|
||||
for (auto& op : operations)
|
||||
op.wait();
|
||||
}
|
||||
};
|
||||
|
||||
template <typename CtxType>
|
||||
class TestAnyExecutionContextBatched {
|
||||
etl::ThreadSafeQueue<std::optional<uint64_t>> q_;
|
||||
etl::ThreadSafeQueue<uint64_t> res_;
|
||||
std::size_t batchSize_;
|
||||
|
||||
public:
|
||||
TestAnyExecutionContextBatched(std::vector<uint64_t> const& data, std::size_t batchSize = 5000u)
|
||||
: q_(data.size()), res_(data.size()), batchSize_(batchSize)
|
||||
{
|
||||
for (auto el : data)
|
||||
q_.push(el);
|
||||
}
|
||||
|
||||
void
|
||||
run(std::size_t numThreads)
|
||||
{
|
||||
CtxType ctx{numThreads};
|
||||
AnyExecutionContext anyCtx{ctx};
|
||||
std::vector<AnyOperation<void>> operations;
|
||||
|
||||
for (std::size_t i = 0; i < numThreads; ++i) {
|
||||
q_.push(std::nullopt);
|
||||
|
||||
operations.push_back(anyCtx.execute(
|
||||
[this](auto stopRequested) {
|
||||
bool hasMore = true;
|
||||
auto doOne = [this] {
|
||||
auto v = q_.pop();
|
||||
if (not v.has_value())
|
||||
return false;
|
||||
|
||||
res_.push(v.value() * v.value());
|
||||
return true;
|
||||
};
|
||||
|
||||
while (not stopRequested and hasMore) {
|
||||
for (std::size_t i = 0; i < batchSize_ and hasMore; ++i)
|
||||
hasMore = doOne();
|
||||
}
|
||||
},
|
||||
std::chrono::seconds{5}
|
||||
));
|
||||
}
|
||||
|
||||
for (auto& op : operations)
|
||||
op.wait();
|
||||
}
|
||||
};
|
||||
|
||||
static auto
|
||||
generateData()
|
||||
{
|
||||
constexpr auto TOTAL = 10'000;
|
||||
std::vector<uint64_t> data;
|
||||
data.reserve(TOTAL);
|
||||
for (auto i = 0; i < TOTAL; ++i)
|
||||
data.push_back(util::Random::uniform(1, 100'000'000));
|
||||
|
||||
return data;
|
||||
}
|
||||
|
||||
static void
|
||||
benchmarkThreads(benchmark::State& state)
|
||||
{
|
||||
auto data = generateData();
|
||||
for (auto _ : state) {
|
||||
TestThread t{data};
|
||||
t.run(state.range(0));
|
||||
}
|
||||
}
|
||||
|
||||
template <typename CtxType>
|
||||
void
|
||||
benchmarkExecutionContextBatched(benchmark::State& state)
|
||||
{
|
||||
auto data = generateData();
|
||||
for (auto _ : state) {
|
||||
TestExecutionContextBatched<CtxType> t{data, state.range(1)};
|
||||
t.run(state.range(0));
|
||||
}
|
||||
}
|
||||
|
||||
template <typename CtxType>
|
||||
void
|
||||
benchmarkAnyExecutionContextBatched(benchmark::State& state)
|
||||
{
|
||||
auto data = generateData();
|
||||
for (auto _ : state) {
|
||||
TestAnyExecutionContextBatched<CtxType> t{data, state.range(1)};
|
||||
t.run(state.range(0));
|
||||
}
|
||||
}
|
||||
|
||||
// Simplest implementation using async queues and std::thread
|
||||
BENCHMARK(benchmarkThreads)->Arg(1)->Arg(2)->Arg(4)->Arg(8);
|
||||
|
||||
// Same implementation using each of the available execution contexts
|
||||
BENCHMARK(benchmarkExecutionContextBatched<PoolExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
BENCHMARK(benchmarkExecutionContextBatched<CoroExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
BENCHMARK(benchmarkExecutionContextBatched<SyncExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
|
||||
// Same implementations going thru AnyExecutionContext
|
||||
BENCHMARK(benchmarkAnyExecutionContextBatched<PoolExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
BENCHMARK(benchmarkAnyExecutionContextBatched<CoroExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
BENCHMARK(benchmarkAnyExecutionContextBatched<SyncExecutionContext>)
|
||||
->ArgsProduct({
|
||||
{1, 2, 4, 8}, // threads
|
||||
{500, 1000, 5000, 10000} // batch size
|
||||
});
|
||||
@@ -1,38 +0,0 @@
|
||||
{
|
||||
"database":
|
||||
{
|
||||
"type":"cassandra",
|
||||
"cassandra":
|
||||
{
|
||||
"secure_connect_bundle":"[path/to/zip. ignore if using contact_points]",
|
||||
"contact_points":"[ip. ignore if using secure_connect_bundle]",
|
||||
"port":"[port. ignore if using_secure_connect_bundle]",
|
||||
"keyspace":"clio",
|
||||
"username":"[username, if any]",
|
||||
"password":"[password, if any]",
|
||||
"max_requests_outstanding":25000,
|
||||
"threads":8
|
||||
}
|
||||
},
|
||||
"etl_sources":
|
||||
[
|
||||
{
|
||||
"ip":"[rippled ip]",
|
||||
"ws_port":"6006",
|
||||
"grpc_port":"50051"
|
||||
}
|
||||
],
|
||||
"dos_guard":
|
||||
{
|
||||
"whitelist":["127.0.0.1"]
|
||||
},
|
||||
"server":{
|
||||
"ip":"0.0.0.0",
|
||||
"port":8080
|
||||
},
|
||||
"log_level":"debug",
|
||||
"log_file":"./clio.log",
|
||||
"online_delete":0,
|
||||
"extractor_threads":8,
|
||||
"read_only":false
|
||||
}
|
||||
@@ -17,10 +17,12 @@
|
||||
*/
|
||||
//==============================================================================
|
||||
|
||||
#include <main/Build.h>
|
||||
#include "main/Build.hpp"
|
||||
|
||||
#include <string>
|
||||
|
||||
namespace Build {
|
||||
static constexpr char versionString[] = "@VERSION@";
|
||||
static constexpr char versionString[] = "@CLIO_VERSION@";
|
||||
|
||||
std::string const&
|
||||
getClioVersionString()
|
||||
5
cmake/Ccache.cmake
Normal file
@@ -0,0 +1,5 @@
|
||||
find_program(CCACHE_PATH "ccache")
|
||||
if (CCACHE_PATH)
|
||||
set(CMAKE_CXX_COMPILER_LAUNCHER "${CCACHE_PATH}")
|
||||
message(STATUS "Using ccache: ${CCACHE_PATH}")
|
||||
endif ()
|
||||
42
cmake/CheckCompiler.cmake
Normal file
@@ -0,0 +1,42 @@
|
||||
if (CMAKE_CXX_COMPILER_ID STREQUAL "Clang")
|
||||
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 14)
|
||||
message(FATAL_ERROR "Clang 14+ required for building clio")
|
||||
endif ()
|
||||
set(is_clang TRUE)
|
||||
elseif (CMAKE_CXX_COMPILER_ID STREQUAL "AppleClang")
|
||||
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 14)
|
||||
message(FATAL_ERROR "AppleClang 14+ required for building clio")
|
||||
endif ()
|
||||
set(is_appleclang TRUE)
|
||||
elseif (CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
|
||||
if (CMAKE_CXX_COMPILER_VERSION VERSION_LESS 11)
|
||||
message(FATAL_ERROR "GCC 11+ required for building clio")
|
||||
endif ()
|
||||
set(is_gcc TRUE)
|
||||
else ()
|
||||
message(FATAL_ERROR "Supported compilers: AppleClang 14+, Clang 14+, GCC 11+")
|
||||
endif ()
|
||||
|
||||
if (san)
|
||||
string(TOLOWER ${san} san)
|
||||
set(SAN_FLAG "-fsanitize=${san}")
|
||||
set(SAN_LIB "")
|
||||
if (is_gcc)
|
||||
if (san STREQUAL "address")
|
||||
set(SAN_LIB "asan")
|
||||
elseif (san STREQUAL "thread")
|
||||
set(SAN_LIB "tsan")
|
||||
elseif (san STREQUAL "memory")
|
||||
set(SAN_LIB "msan")
|
||||
elseif (san STREQUAL "undefined")
|
||||
set(SAN_LIB "ubsan")
|
||||
endif ()
|
||||
endif ()
|
||||
set(_saved_CRL ${CMAKE_REQUIRED_LIBRARIES})
|
||||
set(CMAKE_REQUIRED_LIBRARIES "${SAN_FLAG};${SAN_LIB}")
|
||||
check_cxx_compiler_flag(${SAN_FLAG} COMPILER_SUPPORTS_SAN)
|
||||
set(CMAKE_REQUIRED_LIBRARIES ${_saved_CRL})
|
||||
if (NOT COMPILER_SUPPORTS_SAN)
|
||||
message(FATAL_ERROR "${san} sanitizer does not seem to be supported by your compiler")
|
||||
endif ()
|
||||
endif ()
|
||||
33
cmake/ClangTidy.cmake
Normal file
@@ -0,0 +1,33 @@
|
||||
if (lint)
|
||||
|
||||
# Find clang-tidy binary
|
||||
if (DEFINED ENV{CLIO_CLANG_TIDY_BIN})
|
||||
set(_CLANG_TIDY_BIN $ENV{CLIO_CLANG_TIDY_BIN})
|
||||
if ((NOT EXISTS ${_CLANG_TIDY_BIN}) OR IS_DIRECTORY ${_CLANG_TIDY_BIN})
|
||||
message(FATAL_ERROR "$ENV{CLIO_CLANG_TIDY_BIN} no such file. Check CLIO_CLANG_TIDY_BIN env variable")
|
||||
endif ()
|
||||
message(STATUS "Using clang-tidy from CLIO_CLANG_TIDY_BIN")
|
||||
else ()
|
||||
find_program(_CLANG_TIDY_BIN NAMES "clang-tidy-17" "clang-tidy" REQUIRED)
|
||||
endif ()
|
||||
|
||||
if (NOT _CLANG_TIDY_BIN)
|
||||
message(
|
||||
FATAL_ERROR
|
||||
"clang-tidy binary not found. Please set the CLIO_CLANG_TIDY_BIN environment variable or install clang-tidy."
|
||||
)
|
||||
endif ()
|
||||
|
||||
# Support for https://github.com/matus-chochlik/ctcache
|
||||
find_program(CLANG_TIDY_CACHE_PATH NAMES "clang-tidy-cache")
|
||||
if (CLANG_TIDY_CACHE_PATH)
|
||||
set(_CLANG_TIDY_CMD "${CLANG_TIDY_CACHE_PATH};${_CLANG_TIDY_BIN}"
|
||||
CACHE STRING "A combined command to run clang-tidy with caching wrapper"
|
||||
)
|
||||
else ()
|
||||
set(_CLANG_TIDY_CMD "${_CLANG_TIDY_BIN}")
|
||||
endif ()
|
||||
|
||||
set(CMAKE_CXX_CLANG_TIDY "${_CLANG_TIDY_CMD};--quiet")
|
||||
message(STATUS "Using clang-tidy: ${CMAKE_CXX_CLANG_TIDY}")
|
||||
endif ()
|
||||
44
cmake/ClioVersion.cmake
Normal file
@@ -0,0 +1,44 @@
|
||||
#[===================================================================[
|
||||
write version to source
|
||||
#]===================================================================]
|
||||
|
||||
find_package(Git REQUIRED)
|
||||
|
||||
set(GIT_COMMAND rev-parse --short HEAD)
|
||||
execute_process(
|
||||
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE REV
|
||||
OUTPUT_STRIP_TRAILING_WHITESPACE
|
||||
)
|
||||
|
||||
set(GIT_COMMAND branch --show-current)
|
||||
execute_process(
|
||||
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE BRANCH
|
||||
OUTPUT_STRIP_TRAILING_WHITESPACE
|
||||
)
|
||||
|
||||
if (BRANCH STREQUAL "")
|
||||
set(BRANCH "dev")
|
||||
endif ()
|
||||
|
||||
if (NOT (BRANCH MATCHES master OR BRANCH MATCHES release/*)) # for develop and any other branch name
|
||||
# YYYYMMDDHMS-<branch>-<git-rev>
|
||||
execute_process(COMMAND date +%Y%m%d%H%M%S OUTPUT_VARIABLE DATE OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
set(CLIO_VERSION "${DATE}-${BRANCH}-${REV}")
|
||||
set(DOC_CLIO_VERSION "develop")
|
||||
else ()
|
||||
set(GIT_COMMAND describe --tags)
|
||||
execute_process(
|
||||
COMMAND ${GIT_EXECUTABLE} ${GIT_COMMAND} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR} OUTPUT_VARIABLE CLIO_TAG_VERSION
|
||||
OUTPUT_STRIP_TRAILING_WHITESPACE
|
||||
)
|
||||
set(CLIO_VERSION "${CLIO_TAG_VERSION}-${REV}")
|
||||
set(DOC_CLIO_VERSION "${CLIO_TAG_VERSION}")
|
||||
endif ()
|
||||
|
||||
if (CMAKE_BUILD_TYPE MATCHES Debug)
|
||||
set(CLIO_VERSION "${CLIO_VERSION}+DEBUG")
|
||||
endif ()
|
||||
|
||||
message(STATUS "Build version: ${CLIO_VERSION}")
|
||||
|
||||
configure_file(${CMAKE_CURRENT_LIST_DIR}/Build.cpp.in ${CMAKE_CURRENT_LIST_DIR}/../src/main/impl/Build.cpp)
|
||||
361
cmake/CodeCoverage.cmake
Normal file
@@ -0,0 +1,361 @@
|
||||
# Copyright (c) 2012 - 2017, Lars Bilke All rights reserved.
|
||||
#
|
||||
# Redistribution and use in source and binary forms, with or without modification, are permitted provided that the
|
||||
# following conditions are met:
|
||||
#
|
||||
# 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following
|
||||
# disclaimer.
|
||||
#
|
||||
# 1. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following
|
||||
# disclaimer in the documentation and/or other materials provided with the distribution.
|
||||
#
|
||||
# 1. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote
|
||||
# products derived from this software without specific prior written permission.
|
||||
#
|
||||
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
|
||||
# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
|
||||
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
|
||||
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
|
||||
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
|
||||
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
#
|
||||
# CHANGES:
|
||||
#
|
||||
# 2012-01-31, Lars Bilke - Enable Code Coverage
|
||||
#
|
||||
# 2013-09-17, Joakim Söderberg - Added support for Clang. - Some additional usage instructions.
|
||||
#
|
||||
# 2016-02-03, Lars Bilke - Refactored functions to use named parameters
|
||||
#
|
||||
# 2017-06-02, Lars Bilke - Merged with modified version from github.com/ufz/ogs
|
||||
#
|
||||
# 2019-05-06, Anatolii Kurotych - Remove unnecessary --coverage flag
|
||||
#
|
||||
# 2019-12-13, FeRD (Frank Dana) - Deprecate COVERAGE_LCOVR_EXCLUDES and COVERAGE_GCOVR_EXCLUDES lists in favor of
|
||||
# tool-agnostic COVERAGE_EXCLUDES variable, or EXCLUDE setup arguments. - CMake 3.4+: All excludes can be specified
|
||||
# relative to BASE_DIRECTORY - All setup functions: accept BASE_DIRECTORY, EXCLUDE list - Set lcov basedir with -b
|
||||
# argument - Add automatic --demangle-cpp in lcovr, if 'c++filt' is available (can be overridden with NO_DEMANGLE option
|
||||
# in setup_target_for_coverage_lcovr().) - Delete output dir, .info file on 'make clean' - Remove Python detection,
|
||||
# since version mismatches will break gcovr - Minor cleanup (lowercase function names, update examples...)
|
||||
#
|
||||
# 2019-12-19, FeRD (Frank Dana) - Rename Lcov outputs, make filtered file canonical, fix cleanup for targets
|
||||
#
|
||||
# 2020-01-19, Bob Apthorpe - Added gfortran support
|
||||
#
|
||||
# 2020-02-17, FeRD (Frank Dana) - Make all add_custom_target()s VERBATIM to auto-escape wildcard characters in EXCLUDEs,
|
||||
# and remove manual escaping from gcovr targets
|
||||
#
|
||||
# 2021-01-19, Robin Mueller - Add CODE_COVERAGE_VERBOSE option which will allow to print out commands which are run -
|
||||
# Added the option for users to set the GCOVR_ADDITIONAL_ARGS variable to supply additional flags to the gcovr command
|
||||
#
|
||||
# 2020-05-04, Mihchael Davis - Add -fprofile-abs-path to make gcno files contain absolute paths - Fix BASE_DIRECTORY not
|
||||
# working when defined - Change BYPRODUCT from folder to index.html to stop ninja from complaining about double defines
|
||||
#
|
||||
# 2021-05-10, Martin Stump - Check if the generator is multi-config before warning about non-Debug builds
|
||||
#
|
||||
# 2022-02-22, Marko Wehle - Change gcovr output from -o <filename> for --xml <filename> and --html <filename> output
|
||||
# respectively. This will allow for Multiple Output Formats at the same time by making use of GCOVR_ADDITIONAL_ARGS,
|
||||
# e.g. GCOVR_ADDITIONAL_ARGS "--txt".
|
||||
#
|
||||
# 2022-09-28, Sebastian Mueller - fix append_coverage_compiler_flags_to_target to correctly add flags - replace
|
||||
# "-fprofile-arcs -ftest-coverage" with "--coverage" (equivalent)
|
||||
#
|
||||
# 2023-12-15, Bronek Kozicki - remove setup_target_for_coverage_lcov (slow) and setup_target_for_coverage_fastcov (no
|
||||
# support for Clang) - fix Clang support by adding find_program( ... llvm-cov ) - add Apple Clang support by adding
|
||||
# execute_process( COMMAND xcrun -f llvm-cov ... ) - add CODE_COVERAGE_GCOV_TOOL to explicitly select gcov tool and
|
||||
# disable find_program - replace both functions setup_target_for_coverage_gcovr_* with single
|
||||
# setup_target_for_coverage_gcovr - add support for all gcovr output formats
|
||||
#
|
||||
# USAGE:
|
||||
#
|
||||
# 1. Copy this file into your cmake modules path.
|
||||
#
|
||||
# 1. Add the following line to your CMakeLists.txt (best inside an if-condition using a CMake option() to enable it just
|
||||
# optionally): include(CodeCoverage)
|
||||
#
|
||||
# 1. Append necessary compiler flags for all supported source files: append_coverage_compiler_flags() Or for specific
|
||||
# target: append_coverage_compiler_flags_to_target(YOUR_TARGET_NAME)
|
||||
#
|
||||
# 3.a (OPTIONAL) Set appropriate optimization flags, e.g. -O0, -O1 or -Og
|
||||
#
|
||||
# 1. If you need to exclude additional directories from the report, specify them using full paths in the
|
||||
# COVERAGE_EXCLUDES variable before calling setup_target_for_coverage_*(). Example: set(COVERAGE_EXCLUDES
|
||||
# '${PROJECT_SOURCE_DIR}/src/dir1/*'
|
||||
# '/path/to/my/src/dir2/*') Or, use the EXCLUDE argument to setup_target_for_coverage_*(). Example:
|
||||
# setup_target_for_coverage_gcovr( NAME coverage EXECUTABLE testrunner EXCLUDE "${PROJECT_SOURCE_DIR}/src/dir1/*"
|
||||
# "/path/to/my/src/dir2/*")
|
||||
#
|
||||
# 4.a NOTE: With CMake 3.4+, COVERAGE_EXCLUDES or EXCLUDE can also be set relative to the BASE_DIRECTORY (default:
|
||||
# PROJECT_SOURCE_DIR) Example: set(COVERAGE_EXCLUDES "dir1/*") setup_target_for_coverage_gcovr( NAME coverage EXECUTABLE
|
||||
# testrunner FORMAT html-details BASE_DIRECTORY "${PROJECT_SOURCE_DIR}/src" EXCLUDE "dir2/*")
|
||||
#
|
||||
# 4.b If you need to pass specific options to gcovr, specify them in GCOVR_ADDITIONAL_ARGS variable. Example: set
|
||||
# (GCOVR_ADDITIONAL_ARGS --exclude-throw-branches --exclude-noncode-lines -s) setup_target_for_coverage_gcovr( NAME
|
||||
# coverage EXECUTABLE testrunner EXCLUDE "src/dir1" "src/dir2")
|
||||
#
|
||||
# 1. Use the functions described below to create a custom make target which runs your test executable and produces a code
|
||||
# coverage report.
|
||||
#
|
||||
# 1. Build a Debug build: cmake -DCMAKE_BUILD_TYPE=Debug .. make make my_coverage_target
|
||||
|
||||
include(CMakeParseArguments)
|
||||
|
||||
option(CODE_COVERAGE_VERBOSE "Verbose information" FALSE)
|
||||
|
||||
# Check prereqs
|
||||
find_program(GCOVR_PATH gcovr PATHS ${CMAKE_SOURCE_DIR}/scripts/test)
|
||||
|
||||
if (DEFINED CODE_COVERAGE_GCOV_TOOL)
|
||||
set(GCOV_TOOL "${CODE_COVERAGE_GCOV_TOOL}")
|
||||
elseif (DEFINED ENV{CODE_COVERAGE_GCOV_TOOL})
|
||||
set(GCOV_TOOL "$ENV{CODE_COVERAGE_GCOV_TOOL}")
|
||||
elseif ("${CMAKE_CXX_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
|
||||
if (APPLE)
|
||||
execute_process(COMMAND xcrun -f llvm-cov OUTPUT_VARIABLE LLVMCOV_PATH OUTPUT_STRIP_TRAILING_WHITESPACE)
|
||||
else ()
|
||||
find_program(LLVMCOV_PATH llvm-cov)
|
||||
endif ()
|
||||
if (LLVMCOV_PATH)
|
||||
set(GCOV_TOOL "${LLVMCOV_PATH} gcov")
|
||||
endif ()
|
||||
elseif ("${CMAKE_CXX_COMPILER_ID}" MATCHES "GNU")
|
||||
find_program(GCOV_PATH gcov)
|
||||
set(GCOV_TOOL "${GCOV_PATH}")
|
||||
endif ()
|
||||
|
||||
# Check supported compiler (Clang, GNU and Flang)
|
||||
get_property(LANGUAGES GLOBAL PROPERTY ENABLED_LANGUAGES)
|
||||
foreach (LANG ${LANGUAGES})
|
||||
if ("${CMAKE_${LANG}_COMPILER_ID}" MATCHES "(Apple)?[Cc]lang")
|
||||
if ("${CMAKE_${LANG}_COMPILER_VERSION}" VERSION_LESS 3)
|
||||
message(FATAL_ERROR "Clang version must be 3.0.0 or greater! Aborting...")
|
||||
endif ()
|
||||
elseif (NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES "GNU" AND NOT "${CMAKE_${LANG}_COMPILER_ID}" MATCHES
|
||||
"(LLVM)?[Ff]lang"
|
||||
)
|
||||
message(FATAL_ERROR "Compiler is not GNU or Flang! Aborting...")
|
||||
endif ()
|
||||
endforeach ()
|
||||
|
||||
set(COVERAGE_COMPILER_FLAGS "-g --coverage" CACHE INTERNAL "")
|
||||
if (CMAKE_CXX_COMPILER_ID MATCHES "(GNU|Clang)")
|
||||
include(CheckCXXCompilerFlag)
|
||||
check_cxx_compiler_flag(-fprofile-abs-path HAVE_cxx_fprofile_abs_path)
|
||||
if (HAVE_cxx_fprofile_abs_path)
|
||||
set(COVERAGE_CXX_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
|
||||
endif ()
|
||||
include(CheckCCompilerFlag)
|
||||
check_c_compiler_flag(-fprofile-abs-path HAVE_c_fprofile_abs_path)
|
||||
if (HAVE_c_fprofile_abs_path)
|
||||
set(COVERAGE_C_COMPILER_FLAGS "${COVERAGE_COMPILER_FLAGS} -fprofile-abs-path")
|
||||
endif ()
|
||||
endif ()
|
||||
|
||||
set(CMAKE_Fortran_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
|
||||
CACHE STRING "Flags used by the Fortran compiler during coverage builds." FORCE
|
||||
)
|
||||
set(CMAKE_CXX_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
|
||||
CACHE STRING "Flags used by the C++ compiler during coverage builds." FORCE
|
||||
)
|
||||
set(CMAKE_C_FLAGS_COVERAGE ${COVERAGE_COMPILER_FLAGS}
|
||||
CACHE STRING "Flags used by the C compiler during coverage builds." FORCE
|
||||
)
|
||||
set(CMAKE_EXE_LINKER_FLAGS_COVERAGE "" CACHE STRING "Flags used for linking binaries during coverage builds." FORCE)
|
||||
set(CMAKE_SHARED_LINKER_FLAGS_COVERAGE ""
|
||||
CACHE STRING "Flags used by the shared libraries linker during coverage builds." FORCE
|
||||
)
|
||||
mark_as_advanced(
|
||||
CMAKE_Fortran_FLAGS_COVERAGE CMAKE_CXX_FLAGS_COVERAGE CMAKE_C_FLAGS_COVERAGE CMAKE_EXE_LINKER_FLAGS_COVERAGE
|
||||
CMAKE_SHARED_LINKER_FLAGS_COVERAGE
|
||||
)
|
||||
|
||||
get_property(GENERATOR_IS_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG)
|
||||
if (NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG))
|
||||
message(WARNING "Code coverage results with an optimised (non-Debug) build may be misleading")
|
||||
endif () # NOT (CMAKE_BUILD_TYPE STREQUAL "Debug" OR GENERATOR_IS_MULTI_CONFIG)
|
||||
|
||||
if (CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
|
||||
link_libraries(gcov)
|
||||
endif ()
|
||||
|
||||
# Defines a target for running and collection code coverage information Builds dependencies, runs the given executable
|
||||
# and outputs reports. NOTE! The executable should always have a ZERO as exit code otherwise the coverage generation
|
||||
# will not complete.
|
||||
#
|
||||
# setup_target_for_coverage_gcovr( NAME ctest_coverage # New target name EXECUTABLE ctest -j
|
||||
# ${PROCESSOR_COUNT} # Executable in PROJECT_BINARY_DIR DEPENDENCIES executable_target # Dependencies to build
|
||||
# first BASE_DIRECTORY "../" # Base directory for report # (defaults to PROJECT_SOURCE_DIR) FORMAT
|
||||
# "cobertura" # Output format, one of: # xml cobertura sonarqube json-summary # json-details
|
||||
# coveralls csv txt # html-single html-nested html-details # (xml is an alias to cobertura; # if no format is set,
|
||||
# defaults to xml) EXCLUDE "src/dir1/*" "src/dir2/*" # Patterns to exclude (can be relative # to BASE_DIRECTORY,
|
||||
# with CMake 3.4+) ) The user can set the variable GCOVR_ADDITIONAL_ARGS to supply additional flags to the GCVOR
|
||||
# command.
|
||||
function (setup_target_for_coverage_gcovr)
|
||||
set(options NONE)
|
||||
set(oneValueArgs BASE_DIRECTORY NAME FORMAT)
|
||||
set(multiValueArgs EXCLUDE EXECUTABLE EXECUTABLE_ARGS DEPENDENCIES)
|
||||
cmake_parse_arguments(Coverage "${options}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN})
|
||||
|
||||
if (NOT GCOV_TOOL)
|
||||
message(FATAL_ERROR "Could not find gcov or llvm-cov tool! Aborting...")
|
||||
endif ()
|
||||
|
||||
if (NOT GCOVR_PATH)
|
||||
message(FATAL_ERROR "Could not find gcovr tool! Aborting...")
|
||||
endif ()
|
||||
|
||||
# Set base directory (as absolute path), or default to PROJECT_SOURCE_DIR
|
||||
if (DEFINED Coverage_BASE_DIRECTORY)
|
||||
get_filename_component(BASEDIR ${Coverage_BASE_DIRECTORY} ABSOLUTE)
|
||||
else ()
|
||||
set(BASEDIR ${PROJECT_SOURCE_DIR})
|
||||
endif ()
|
||||
|
||||
if (NOT DEFINED Coverage_FORMAT)
|
||||
set(Coverage_FORMAT xml)
|
||||
endif ()
|
||||
|
||||
if ("--output" IN_LIST GCOVR_ADDITIONAL_ARGS)
|
||||
message(FATAL_ERROR "Unsupported --output option detected in GCOVR_ADDITIONAL_ARGS! Aborting...")
|
||||
else ()
|
||||
if ((Coverage_FORMAT STREQUAL "html-details") OR (Coverage_FORMAT STREQUAL "html-nested"))
|
||||
set(GCOVR_OUTPUT_FILE ${PROJECT_BINARY_DIR}/${Coverage_NAME}/index.html)
|
||||
set(GCOVR_CREATE_FOLDER ${PROJECT_BINARY_DIR}/${Coverage_NAME})
|
||||
elseif (Coverage_FORMAT STREQUAL "html-single")
|
||||
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.html)
|
||||
elseif ((Coverage_FORMAT STREQUAL "json-summary") OR (Coverage_FORMAT STREQUAL "json-details")
|
||||
OR (Coverage_FORMAT STREQUAL "coveralls")
|
||||
)
|
||||
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.json)
|
||||
elseif (Coverage_FORMAT STREQUAL "txt")
|
||||
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.txt)
|
||||
elseif (Coverage_FORMAT STREQUAL "csv")
|
||||
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.csv)
|
||||
else ()
|
||||
set(GCOVR_OUTPUT_FILE ${Coverage_NAME}.xml)
|
||||
endif ()
|
||||
endif ()
|
||||
|
||||
if ((Coverage_FORMAT STREQUAL "cobertura") OR (Coverage_FORMAT STREQUAL "xml"))
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura "${GCOVR_OUTPUT_FILE}")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --cobertura-pretty)
|
||||
set(Coverage_FORMAT cobertura) # overwrite xml
|
||||
elseif (Coverage_FORMAT STREQUAL "sonarqube")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --sonarqube "${GCOVR_OUTPUT_FILE}")
|
||||
elseif (Coverage_FORMAT STREQUAL "json-summary")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary "${GCOVR_OUTPUT_FILE}")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --json-summary-pretty)
|
||||
elseif (Coverage_FORMAT STREQUAL "json-details")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --json "${GCOVR_OUTPUT_FILE}")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --json-pretty)
|
||||
elseif (Coverage_FORMAT STREQUAL "coveralls")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls "${GCOVR_OUTPUT_FILE}")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --coveralls-pretty)
|
||||
elseif (Coverage_FORMAT STREQUAL "csv")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --csv "${GCOVR_OUTPUT_FILE}")
|
||||
elseif (Coverage_FORMAT STREQUAL "txt")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --txt "${GCOVR_OUTPUT_FILE}")
|
||||
elseif (Coverage_FORMAT STREQUAL "html-single")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --html "${GCOVR_OUTPUT_FILE}")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --html-self-contained)
|
||||
elseif (Coverage_FORMAT STREQUAL "html-nested")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --html-nested "${GCOVR_OUTPUT_FILE}")
|
||||
elseif (Coverage_FORMAT STREQUAL "html-details")
|
||||
list(APPEND GCOVR_ADDITIONAL_ARGS --html-details "${GCOVR_OUTPUT_FILE}")
|
||||
else ()
|
||||
message(FATAL_ERROR "Unsupported output style ${Coverage_FORMAT}! Aborting...")
|
||||
endif ()
|
||||
|
||||
# Collect excludes (CMake 3.4+: Also compute absolute paths)
|
||||
set(GCOVR_EXCLUDES "")
|
||||
foreach (EXCLUDE ${Coverage_EXCLUDE} ${COVERAGE_EXCLUDES} ${COVERAGE_GCOVR_EXCLUDES})
|
||||
if (CMAKE_VERSION VERSION_GREATER 3.4)
|
||||
get_filename_component(EXCLUDE ${EXCLUDE} ABSOLUTE BASE_DIR ${BASEDIR})
|
||||
endif ()
|
||||
list(APPEND GCOVR_EXCLUDES "${EXCLUDE}")
|
||||
endforeach ()
|
||||
list(REMOVE_DUPLICATES GCOVR_EXCLUDES)
|
||||
|
||||
# Combine excludes to several -e arguments
|
||||
set(GCOVR_EXCLUDE_ARGS "")
|
||||
foreach (EXCLUDE ${GCOVR_EXCLUDES})
|
||||
list(APPEND GCOVR_EXCLUDE_ARGS "-e")
|
||||
list(APPEND GCOVR_EXCLUDE_ARGS "${EXCLUDE}")
|
||||
endforeach ()
|
||||
|
||||
# Set up commands which will be run to generate coverage data Run tests
|
||||
set(GCOVR_EXEC_TESTS_CMD ${Coverage_EXECUTABLE} ${Coverage_EXECUTABLE_ARGS})
|
||||
|
||||
# Create folder
|
||||
if (DEFINED GCOVR_CREATE_FOLDER)
|
||||
set(GCOVR_FOLDER_CMD ${CMAKE_COMMAND} -E make_directory ${GCOVR_CREATE_FOLDER})
|
||||
else ()
|
||||
set(GCOVR_FOLDER_CMD echo) # dummy
|
||||
endif ()
|
||||
|
||||
# Running gcovr
|
||||
set(GCOVR_CMD
|
||||
${GCOVR_PATH}
|
||||
--gcov-executable
|
||||
${GCOV_TOOL}
|
||||
--gcov-ignore-parse-errors=negative_hits.warn_once_per_file
|
||||
-r
|
||||
${BASEDIR}
|
||||
${GCOVR_ADDITIONAL_ARGS}
|
||||
${GCOVR_EXCLUDE_ARGS}
|
||||
--object-directory=${PROJECT_BINARY_DIR}
|
||||
)
|
||||
|
||||
if (CODE_COVERAGE_VERBOSE)
|
||||
message(STATUS "Executed command report")
|
||||
|
||||
message(STATUS "Command to run tests: ")
|
||||
string(REPLACE ";" " " GCOVR_EXEC_TESTS_CMD_SPACED "${GCOVR_EXEC_TESTS_CMD}")
|
||||
message(STATUS "${GCOVR_EXEC_TESTS_CMD_SPACED}")
|
||||
|
||||
if (NOT GCOVR_FOLDER_CMD STREQUAL "echo")
|
||||
message(STATUS "Command to create a folder: ")
|
||||
string(REPLACE ";" " " GCOVR_FOLDER_CMD_SPACED "${GCOVR_FOLDER_CMD}")
|
||||
message(STATUS "${GCOVR_FOLDER_CMD_SPACED}")
|
||||
endif ()
|
||||
|
||||
message(STATUS "Command to generate gcovr coverage data: ")
|
||||
string(REPLACE ";" " " GCOVR_CMD_SPACED "${GCOVR_CMD}")
|
||||
message(STATUS "${GCOVR_CMD_SPACED}")
|
||||
endif ()
|
||||
|
||||
add_custom_target(
|
||||
${Coverage_NAME}
|
||||
COMMAND ${GCOVR_EXEC_TESTS_CMD}
|
||||
COMMAND ${GCOVR_FOLDER_CMD}
|
||||
COMMAND ${GCOVR_CMD}
|
||||
BYPRODUCTS ${GCOVR_OUTPUT_FILE}
|
||||
WORKING_DIRECTORY ${PROJECT_BINARY_DIR}
|
||||
DEPENDS ${Coverage_DEPENDENCIES}
|
||||
VERBATIM # Protect arguments to commands
|
||||
COMMENT "Running gcovr to produce code coverage report."
|
||||
)
|
||||
|
||||
# Show info where to find the report
|
||||
add_custom_command(
|
||||
TARGET ${Coverage_NAME} POST_BUILD COMMAND ;
|
||||
COMMENT "Code coverage report saved in ${GCOVR_OUTPUT_FILE} formatted as ${Coverage_FORMAT}"
|
||||
)
|
||||
endfunction () # setup_target_for_coverage_gcovr
|
||||
|
||||
function (append_coverage_compiler_flags)
|
||||
set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
|
||||
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
|
||||
set(CMAKE_Fortran_FLAGS "${CMAKE_Fortran_FLAGS} ${COVERAGE_COMPILER_FLAGS}" PARENT_SCOPE)
|
||||
message(STATUS "Appending code coverage compiler flags: ${COVERAGE_COMPILER_FLAGS}")
|
||||
endfunction () # append_coverage_compiler_flags
|
||||
|
||||
# Setup coverage for specific library
|
||||
function (append_coverage_compiler_flags_to_target name mode)
|
||||
separate_arguments(_flag_list NATIVE_COMMAND "${COVERAGE_COMPILER_FLAGS}")
|
||||
target_compile_options(${name} ${mode} ${_flag_list})
|
||||
if (CMAKE_C_COMPILER_ID STREQUAL "GNU" OR CMAKE_Fortran_COMPILER_ID STREQUAL "GNU")
|
||||
target_link_libraries(${name} ${mode} gcov)
|
||||
endif ()
|
||||
endfunction ()
|
||||
20
cmake/Docs.cmake
Normal file
@@ -0,0 +1,20 @@
|
||||
find_package(Doxygen REQUIRED)
|
||||
|
||||
# See Doxyfile for these settings:
|
||||
set(SOURCE ${CMAKE_CURRENT_SOURCE_DIR}/..)
|
||||
set(USE_DOT "YES")
|
||||
set(LINT "NO")
|
||||
set(EXCLUDES "")
|
||||
# ---
|
||||
|
||||
set(DOXYGEN_IN ${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile)
|
||||
set(DOXYGEN_OUT ${CMAKE_CURRENT_BINARY_DIR}/Doxyfile)
|
||||
|
||||
configure_file(${DOXYGEN_IN} ${DOXYGEN_OUT})
|
||||
add_custom_target(
|
||||
docs
|
||||
COMMAND ${DOXYGEN_EXECUTABLE} ${DOXYGEN_OUT}
|
||||
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
|
||||
COMMENT "Generating API documentation with Doxygen"
|
||||
VERBATIM
|
||||
)
|
||||
38
cmake/Settings.cmake
Normal file
@@ -0,0 +1,38 @@
|
||||
set(COMPILER_FLAGS
|
||||
-Wall
|
||||
-Wcast-align
|
||||
-Wdouble-promotion
|
||||
-Wextra
|
||||
-Werror
|
||||
-Wformat=2
|
||||
-Wimplicit-fallthrough
|
||||
-Wmisleading-indentation
|
||||
-Wno-narrowing
|
||||
-Wno-deprecated-declarations
|
||||
-Wno-dangling-else
|
||||
-Wno-unused-but-set-variable
|
||||
-Wnon-virtual-dtor
|
||||
-Wnull-dereference
|
||||
-Wold-style-cast
|
||||
-pedantic
|
||||
-Wpedantic
|
||||
-Wunused
|
||||
)
|
||||
|
||||
# TODO: reenable when we change CI #884 if (is_gcc AND NOT lint) list(APPEND COMPILER_FLAGS -Wduplicated-branches
|
||||
# -Wduplicated-cond -Wlogical-op -Wuseless-cast ) endif ()
|
||||
|
||||
if (is_clang)
|
||||
list(APPEND COMPILER_FLAGS -Wshadow # gcc is to aggressive with shadowing
|
||||
# https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78147
|
||||
)
|
||||
endif ()
|
||||
|
||||
if (is_appleclang)
|
||||
list(APPEND COMPILER_FLAGS -Wreorder-init-list)
|
||||
endif ()
|
||||
|
||||
# See https://github.com/cpp-best-practices/cppbestpractices/blob/master/02-Use_the_Tools_Available.md#gcc--clang for
|
||||
# the flags description
|
||||
|
||||
target_compile_options(clio_options INTERFACE ${COMPILER_FLAGS})
|
||||
11
cmake/SourceLocation.cmake
Normal file
@@ -0,0 +1,11 @@
|
||||
include(CheckIncludeFileCXX)
|
||||
|
||||
check_include_file_cxx("source_location" SOURCE_LOCATION_AVAILABLE)
|
||||
if (SOURCE_LOCATION_AVAILABLE)
|
||||
target_compile_definitions(clio_options INTERFACE "HAS_SOURCE_LOCATION")
|
||||
endif ()
|
||||
|
||||
check_include_file_cxx("experimental/source_location" EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
|
||||
if (EXPERIMENTAL_SOURCE_LOCATION_AVAILABLE)
|
||||
target_compile_definitions(clio_options INTERFACE "HAS_EXPERIMENTAL_SOURCE_LOCATION")
|
||||
endif ()
|
||||
4
cmake/deps/Boost.cmake
Normal file
@@ -0,0 +1,4 @@
|
||||
set(Boost_USE_STATIC_LIBS ON)
|
||||
set(Boost_USE_STATIC_RUNTIME ON)
|
||||
|
||||
find_package(Boost 1.82 REQUIRED CONFIG COMPONENTS program_options coroutine system log log_setup)
|
||||
3
cmake/deps/OpenSSL.cmake
Normal file
@@ -0,0 +1,3 @@
|
||||
find_package(OpenSSL 1.1.1 REQUIRED CONFIG)
|
||||
|
||||
set_target_properties(OpenSSL::SSL PROPERTIES INTERFACE_COMPILE_DEFINITIONS OPENSSL_NO_SSL2)
|
||||
2
cmake/deps/Threads.cmake
Normal file
@@ -0,0 +1,2 @@
|
||||
set(THREADS_PREFER_PTHREAD_FLAG ON)
|
||||
find_package(Threads REQUIRED)
|
||||
1
cmake/deps/cassandra.cmake
Normal file
@@ -0,0 +1 @@
|
||||
find_package(cassandra-cpp-driver REQUIRED CONFIG)
|
||||
1
cmake/deps/gbench.cmake
Normal file
@@ -0,0 +1 @@
|
||||
find_package(benchmark REQUIRED CONFIG)
|
||||
4
cmake/deps/gtest.cmake
Normal file
@@ -0,0 +1,4 @@
|
||||
find_package(GTest REQUIRED)
|
||||
|
||||
enable_testing()
|
||||
include(GoogleTest)
|
||||
3
cmake/deps/libbacktrace.cmake
Normal file
@@ -0,0 +1,3 @@
|
||||
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_LINK)
|
||||
target_compile_definitions(clio_options INTERFACE BOOST_STACKTRACE_USE_BACKTRACE)
|
||||
find_package(libbacktrace REQUIRED CONFIG)
|
||||
1
cmake/deps/libfmt.cmake
Normal file
@@ -0,0 +1 @@
|
||||
find_package(fmt REQUIRED CONFIG)
|
||||
1
cmake/deps/libxrpl.cmake
Normal file
@@ -0,0 +1 @@
|
||||
find_package(xrpl REQUIRED CONFIG)
|
||||
@@ -2,15 +2,12 @@ set(CLIO_INSTALL_DIR "/opt/clio")
|
||||
set(CMAKE_INSTALL_PREFIX ${CLIO_INSTALL_DIR})
|
||||
|
||||
install(TARGETS clio_server DESTINATION bin)
|
||||
# install(TARGETS clio_tests DESTINATION bin) # NOTE: Do we want to install the tests?
|
||||
|
||||
#install(FILES example-config.json DESTINATION etc RENAME config.json)
|
||||
file(READ example-config.json config)
|
||||
file(READ docs/examples/config/example-config.json config)
|
||||
string(REGEX REPLACE "./clio_log" "/var/log/clio/" config "${config}")
|
||||
file(WRITE ${CMAKE_BINARY_DIR}/install-config.json "${config}")
|
||||
install(FILES ${CMAKE_BINARY_DIR}/install-config.json DESTINATION etc RENAME config.json)
|
||||
|
||||
configure_file("${CMAKE_SOURCE_DIR}/CMake/install/clio.service.in" "${CMAKE_BINARY_DIR}/clio.service")
|
||||
configure_file("${CMAKE_SOURCE_DIR}/cmake/install/clio.service.in" "${CMAKE_BINARY_DIR}/clio.service")
|
||||
|
||||
install(FILES "${CMAKE_BINARY_DIR}/clio.service" DESTINATION /lib/systemd/system)
|
||||
|
||||
97
conanfile.py
Normal file
@@ -0,0 +1,97 @@
|
||||
from conan import ConanFile
|
||||
from conan.tools.cmake import CMake, CMakeToolchain, cmake_layout
|
||||
|
||||
|
||||
class Clio(ConanFile):
|
||||
name = 'clio'
|
||||
license = 'ISC'
|
||||
author = 'Alex Kremer <akremer@ripple.com>, John Freeman <jfreeman@ripple.com>'
|
||||
url = 'https://github.com/xrplf/clio'
|
||||
description = 'Clio RPC server'
|
||||
settings = 'os', 'compiler', 'build_type', 'arch'
|
||||
options = {
|
||||
'fPIC': [True, False],
|
||||
'verbose': [True, False],
|
||||
'tests': [True, False], # build unit tests; create `clio_tests` binary
|
||||
'benchmark': [True, False], # build benchmarks; create `clio_benchmarks` binary
|
||||
'docs': [True, False], # doxygen API docs; create custom target 'docs'
|
||||
'packaging': [True, False], # create distribution packages
|
||||
'coverage': [True, False], # build for test coverage report; create custom target `clio_tests-ccov`
|
||||
'lint': [True, False], # run clang-tidy checks during compilation
|
||||
}
|
||||
|
||||
requires = [
|
||||
'boost/1.82.0',
|
||||
'cassandra-cpp-driver/2.17.0',
|
||||
'fmt/10.1.1',
|
||||
'protobuf/3.21.12',
|
||||
'grpc/1.50.1',
|
||||
'openssl/1.1.1u',
|
||||
'xrpl/2.2.0-b1',
|
||||
'libbacktrace/cci.20210118'
|
||||
]
|
||||
|
||||
default_options = {
|
||||
'fPIC': True,
|
||||
'verbose': False,
|
||||
'tests': False,
|
||||
'benchmark': False,
|
||||
'packaging': False,
|
||||
'coverage': False,
|
||||
'lint': False,
|
||||
'docs': False,
|
||||
|
||||
'xrpl/*:tests': False,
|
||||
'cassandra-cpp-driver/*:shared': False,
|
||||
'date/*:header_only': True,
|
||||
'grpc/*:shared': False,
|
||||
'grpc/*:secure': True,
|
||||
'libpq/*:shared': False,
|
||||
'lz4/*:shared': False,
|
||||
'openssl/*:shared': False,
|
||||
'protobuf/*:shared': False,
|
||||
'protobuf/*:with_zlib': True,
|
||||
'snappy/*:shared': False,
|
||||
'gtest/*:no_main': True,
|
||||
}
|
||||
|
||||
exports_sources = (
|
||||
'CMakeLists.txt', 'cmake/*', 'src/*'
|
||||
)
|
||||
|
||||
def requirements(self):
|
||||
if self.options.tests:
|
||||
self.requires('gtest/1.14.0')
|
||||
if self.options.benchmark:
|
||||
self.requires('benchmark/1.8.3')
|
||||
|
||||
def configure(self):
|
||||
if self.settings.compiler == 'apple-clang':
|
||||
self.options['boost'].visibility = 'global'
|
||||
|
||||
def layout(self):
|
||||
cmake_layout(self)
|
||||
# Fix this setting to follow the default introduced in Conan 1.48
|
||||
# to align with our build instructions.
|
||||
self.folders.generators = 'build/generators'
|
||||
|
||||
generators = 'CMakeDeps'
|
||||
def generate(self):
|
||||
tc = CMakeToolchain(self)
|
||||
tc.variables['verbose'] = self.options.verbose
|
||||
tc.variables['tests'] = self.options.tests
|
||||
tc.variables['coverage'] = self.options.coverage
|
||||
tc.variables['lint'] = self.options.lint
|
||||
tc.variables['docs'] = self.options.docs
|
||||
tc.variables['packaging'] = self.options.packaging
|
||||
tc.variables['benchmark'] = self.options.benchmark
|
||||
tc.generate()
|
||||
|
||||
def build(self):
|
||||
cmake = CMake(self)
|
||||
cmake.configure()
|
||||
cmake.build()
|
||||
|
||||
def package(self):
|
||||
cmake = CMake(self)
|
||||
cmake.install()
|
||||
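The options declared in this recipe map directly to `-o` flags on the Conan command line; for example, the build instructions later in this changeset resolve dependencies like so:

```sh
# Same command as in docs/build-clio.md: enables unit tests, disables clang-tidy during compilation.
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False
```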
@@ -40,10 +40,10 @@ RUN source /opt/rh/devtoolset-11/enable && cd /tmp/clio && \
RUN mkdir output
RUN strip clio/build/clio_server && strip clio/build/clio_tests
RUN cp clio/build/clio_tests output/ && cp clio/build/clio_server output/
RUN cp clio/example-config.json output/example-config.json
RUN cp clio/docs/examples/config/example-config.json output/example-config.json

FROM centos:7
COPY --from=build /tmp/output /clio
RUN mkdir -p /opt/clio/etc && mv /clio/example-config.json /opt/clio/etc/config.json
RUN mkdir -p /opt/clio/etc && mv /clio/docs/examples/config/example-config.json /opt/clio/etc/config.json

CMD ["/clio/clio_server", "/opt/clio/etc/config.json"]
79 docker/ci/dockerfile Normal file
@@ -0,0 +1,79 @@
|
||||
FROM ubuntu:focal
|
||||
ARG DEBIAN_FRONTEND=noninteractive
|
||||
ARG TARGETARCH
|
||||
|
||||
SHELL ["/bin/bash", "-c"]
|
||||
USER root
|
||||
WORKDIR /root/
|
||||
|
||||
ENV GCC_VERSION=11 \
|
||||
CCACHE_VERSION=4.8.3 \
|
||||
LLVM_TOOLS_VERSION=17 \
|
||||
GH_VERSION=2.40.0 \
|
||||
DOXYGEN_VERSION=1.10.0
|
||||
|
||||
# Add repositories
|
||||
RUN apt-get -qq update \
|
||||
&& apt-get -qq install -y --no-install-recommends --no-install-suggests gnupg wget curl software-properties-common \
|
||||
&& add-apt-repository -y ppa:ubuntu-toolchain-r/test \
|
||||
&& wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | apt-key add - \
|
||||
&& apt-add-repository 'deb https://apt.kitware.com/ubuntu/ focal main' \
|
||||
&& echo "deb http://apt.llvm.org/focal/ llvm-toolchain-focal-${LLVM_TOOLS_VERSION} main" >> /etc/apt/sources.list \
|
||||
&& wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | apt-key add -
|
||||
|
||||
# Install packages
|
||||
RUN apt update -qq \
|
||||
&& apt install -y --no-install-recommends --no-install-suggests cmake python3 python3-pip sudo git \
|
||||
ninja-build make pkg-config libzstd-dev libzstd1 g++-${GCC_VERSION} flex bison jq graphviz \
|
||||
clang-format-${LLVM_TOOLS_VERSION} clang-tidy-${LLVM_TOOLS_VERSION} clang-tools-${LLVM_TOOLS_VERSION} \
|
||||
&& update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/c++ c++ /usr/bin/g++-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/cc cc /usr/bin/gcc-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/gcov gcov /usr/bin/gcov-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/gcov-dump gcov-dump /usr/bin/gcov-dump-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/gcov-tool gcov-tool /usr/bin/gcov-tool-${GCC_VERSION} 100 \
|
||||
&& update-alternatives --install /usr/bin/clang-format clang-format /usr/bin/clang-format-${LLVM_TOOLS_VERSION} 100 \
|
||||
&& apt-get clean && apt remove -y software-properties-common \
|
||||
&& pip3 install -q --upgrade --no-cache-dir pip \
|
||||
&& pip3 install -q --no-cache-dir conan==1.62 gcovr cmake-format
|
||||
|
||||
WORKDIR /tmp
|
||||
|
||||
# Install ccache from source
|
||||
RUN wget "https://github.com/ccache/ccache/releases/download/v${CCACHE_VERSION}/ccache-${CCACHE_VERSION}.tar.gz" \
|
||||
&& tar xf "ccache-${CCACHE_VERSION}.tar.gz" \
|
||||
&& cd "ccache-${CCACHE_VERSION}" \
|
||||
&& mkdir build && cd build \
|
||||
&& cmake -GNinja -DCMAKE_BUILD_TYPE=Release .. \
|
||||
&& cmake --build . --target install
|
||||
|
||||
# Install doxygen from source
|
||||
RUN wget "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.src.tar.gz" \
|
||||
&& tar xf "doxygen-${DOXYGEN_VERSION}.src.tar.gz" \
|
||||
&& cd "doxygen-${DOXYGEN_VERSION}" \
|
||||
&& mkdir build && cd build \
|
||||
&& cmake -GNinja -DCMAKE_BUILD_TYPE=Release .. \
|
||||
&& cmake --build . --target install
|
||||
|
||||
# Install gh
|
||||
RUN wget https://github.com/cli/cli/releases/download/v${GH_VERSION}/gh_${GH_VERSION}_linux_${TARGETARCH}.tar.gz \
|
||||
&& tar xf gh_${GH_VERSION}_linux_${TARGETARCH}.tar.gz \
|
||||
&& mv gh_${GH_VERSION}_linux_${TARGETARCH}/bin/gh /usr/bin/gh
|
||||
|
||||
# Clean up
|
||||
RUN rm -rf /tmp/* /var/tmp/*
|
||||
|
||||
WORKDIR /root/
|
||||
# Using root by default is not very secure, but the GitHub checkout action doesn't work with any other user
|
||||
# https://github.com/actions/checkout/issues/956
|
||||
# And the GitHub Actions docs recommend using root
|
||||
# https://docs.github.com/en/actions/creating-actions/dockerfile-support-for-github-actions#user
|
||||
|
||||
# Setup conan
|
||||
RUN conan profile new default --detect \
|
||||
&& conan profile update settings.compiler.cppstd=20 default \
|
||||
&& conan profile update settings.compiler.libcxx=libstdc++11 default \
|
||||
&& conan remote add --insert 0 conan-non-prod http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
|
||||
|
||||
|
||||
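A possible way to build this CI image locally is sketched below. The tag matches the one referenced by `docker/develop/compose.yaml`; using `buildx` (which supplies the `TARGETARCH` build argument automatically) and the chosen platform are assumptions:

```sh
docker buildx build \
  --platform linux/amd64 \
  -t rippleci/clio_ci:latest \
  -f docker/ci/dockerfile \
  docker/ci
```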
16 docker/develop/compose.yaml Normal file
@@ -0,0 +1,16 @@
version: '3.7'
services:
  clio_develop:
    image: rippleci/clio_ci:latest
    volumes:
      - clio_develop_conan_data:/root/.conan/data
      - clio_develop_ccache:/root/.ccache
      - ../../:/root/clio
      - clio_develop_build:/root/clio/build_docker
    working_dir: /root/clio/build_docker
    tty: true

volumes:
  clio_develop_conan_data:
  clio_develop_ccache:
  clio_develop_build:
62 docker/develop/run Executable file
@@ -0,0 +1,62 @@
|
||||
#!/bin/bash
|
||||
|
||||
script_dir=$(dirname $0)
|
||||
|
||||
pushd $script_dir > /dev/null
|
||||
|
||||
function start_container {
|
||||
if [ -z "$(docker ps -q -f name=clio_develop)" ]; then
|
||||
docker compose up -d
|
||||
fi
|
||||
}
|
||||
|
||||
function run {
|
||||
start_container
|
||||
docker compose exec clio_develop "$@"
|
||||
}
|
||||
|
||||
function stop_container {
|
||||
docker compose down
|
||||
}
|
||||
|
||||
function open_terminal {
|
||||
start_container
|
||||
docker compose exec clio_develop /bin/bash
|
||||
}
|
||||
|
||||
function print_help {
|
||||
cat <<EOF
|
||||
run: Run a command inside the development container.
|
||||
|
||||
Usage:
|
||||
run [options or command]
|
||||
|
||||
If no options are provided, the command will be executed inside the container.
|
||||
|
||||
Options:
|
||||
-h, --help Show this help message and exit.
|
||||
-t, --terminal Open a terminal inside the container.
|
||||
-s, --stop Stop the container.
|
||||
EOF
|
||||
}
|
||||
|
||||
case $1 in
|
||||
-h|--help)
|
||||
print_help ;;
|
||||
|
||||
-t|--terminal)
|
||||
open_terminal ;;
|
||||
|
||||
-s|--stop)
|
||||
stop_container ;;
|
||||
|
||||
-*)
|
||||
echo "Unknown option: $1"
|
||||
print_help ;;
|
||||
|
||||
*)
|
||||
run "$@" ;;
|
||||
esac
|
||||
|
||||
popd > /dev/null
|
||||
|
||||
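A few illustrative invocations of the helper script above, assuming it is called from the repository root and the `rippleci/clio_ci` image is available; the build commands mirror those in `docs/build-clio.md`:

```sh
./docker/develop/run -t    # open a shell inside the development container
./docker/develop/run conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True
./docker/develop/run cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
./docker/develop/run cmake --build . --parallel 8
./docker/develop/run -s    # stop the container when done
```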
8 docs/CMakeLists.txt Normal file
@@ -0,0 +1,8 @@
cmake_minimum_required(VERSION 3.16.3)
project(docs)

include(${CMAKE_CURRENT_SOURCE_DIR}/../cmake/ClioVersion.cmake)

# Generate `docs` target for doxygen documentation
# Note: use `cmake --build . --target docs` from your `build` directory to generate the documentation
include(${CMAKE_CURRENT_SOURCE_DIR}/../cmake/Docs.cmake)
44 docs/Doxyfile Normal file
@@ -0,0 +1,44 @@
|
||||
PROJECT_NAME = "Clio"
|
||||
PROJECT_LOGO = ${SOURCE}/docs/img/xrpl-logo.svg
|
||||
PROJECT_NUMBER = ${DOC_CLIO_VERSION}
|
||||
PROJECT_BRIEF = The XRP Ledger API server.
|
||||
|
||||
EXTRACT_ALL = NO
|
||||
EXTRACT_PRIVATE = NO
|
||||
EXTRACT_PACKAGE = YES
|
||||
EXTRACT_STATIC = YES
|
||||
EXTRACT_LOCAL_CLASSES = NO
|
||||
EXTRACT_ANON_NSPACES = NO
|
||||
|
||||
SORT_MEMBERS_CTORS_1ST = YES
|
||||
|
||||
INPUT = ${SOURCE}/src
|
||||
EXCLUDE_SYMBOLS = ${EXCLUDES}
|
||||
RECURSIVE = YES
|
||||
HAVE_DOT = ${USE_DOT}
|
||||
|
||||
QUIET = YES
|
||||
WARNINGS = ${LINT}
|
||||
WARN_NO_PARAMDOC = ${LINT}
|
||||
WARN_IF_INCOMPLETE_DOC = ${LINT}
|
||||
WARN_IF_UNDOCUMENTED = ${LINT}
|
||||
|
||||
GENERATE_LATEX = NO
|
||||
GENERATE_HTML = YES
|
||||
|
||||
SORT_MEMBERS_CTORS_1ST = YES
|
||||
|
||||
GENERATE_TREEVIEW = YES
|
||||
DISABLE_INDEX = NO
|
||||
FULL_SIDEBAR = NO
|
||||
HTML_HEADER = ${SOURCE}/docs/doxygen-awesome-theme/header.html
|
||||
HTML_EXTRA_STYLESHEET = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome.css \
|
||||
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only.css \
|
||||
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only-darkmode-toggle.css
|
||||
HTML_EXTRA_FILES = ${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-darkmode-toggle.js \
|
||||
${SOURCE}/docs/doxygen-awesome-theme/doxygen-awesome-interactive-toc.js
|
||||
|
||||
HTML_COLORSTYLE = LIGHT
|
||||
HTML_COLORSTYLE_HUE = 209
|
||||
HTML_COLORSTYLE_SAT = 255
|
||||
HTML_COLORSTYLE_GAMMA = 113
|
||||
109 docs/build-clio.md Normal file
@@ -0,0 +1,109 @@
# How to build Clio

Clio is built with [CMake](https://cmake.org/) and uses [Conan](https://conan.io/) for managing dependencies. It is written in C++20 and therefore requires a modern compiler.

## Minimum Requirements

- [Python 3.7](https://www.python.org/downloads/)
- [Conan 1.55](https://conan.io/downloads.html)
- [CMake 3.16](https://cmake.org/download/)
- [**Optional**] [GCovr](https://gcc.gnu.org/onlinedocs/gcc/Gcov.html): needed for code coverage generation
- [**Optional**] [CCache](https://ccache.dev/): speeds up compilation if you are going to compile Clio often

| Compiler    | Version |
|-------------|---------|
| GCC         | 11      |
| Clang       | 14      |
| Apple Clang | 14.0.3  |

### Conan Configuration

Clio does not require anything but default settings in your Conan profile (`~/.conan/profiles/default`). It's best to have no extra flags specified.

> Mac example:

```
[settings]
os=Macos
os_build=Macos
arch=armv8
arch_build=armv8
compiler=apple-clang
compiler.version=14
compiler.libcxx=libc++
build_type=Release
compiler.cppstd=20
```

> Linux example:

```
[settings]
os=Linux
os_build=Linux
arch=x86_64
arch_build=x86_64
compiler=gcc
compiler.version=11
compiler.libcxx=libstdc++11
build_type=Release
compiler.cppstd=20
```

#### Artifactory

Make sure Artifactory is set up with Conan.

```sh
conan remote add --insert 0 conan-non-prod http://18.143.149.228:8081/artifactory/api/conan/conan-non-prod
```

Now you should be able to download the prebuilt `xrpl` package on some platforms.

> [!NOTE]
> You may need to edit the `~/.conan/remotes.json` file to ensure that this newly added Artifactory is listed last. Otherwise, you could see compilation errors when building the project with gcc version 13 (or newer).

Remove any old packages you may have cached.

```sh
conan remove -f xrpl
```

## Building Clio

Navigate to Clio's root directory and run:

```sh
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --parallel 8 # or without the number if you feel extra adventurous
```

> [!TIP]
> You can omit the `-o tests=True` if you don't want to build `clio_tests`.

If successful, `conan install` will find the required packages and `cmake` will do the rest. You should see `clio_server` and `clio_tests` in the `build` directory (the current directory).

> [!TIP]
> To generate a Code Coverage report, include `-o coverage=True` in the `conan install` command above, along with `-o tests=True` to enable tests. After running the `cmake` commands, execute `make clio_tests-ccov`. The coverage report will be found at `clio_tests-llvm-cov/index.html`.

## Building Clio with Docker

It is also possible to build Clio using [Docker](https://www.docker.com/) if you don't want to install all the dependencies on your machine.

```sh
docker run -it rippleci/clio_ci:latest
git clone https://github.com/XRPLF/clio && cd clio
mkdir build && cd build
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=False
cmake -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release ..
cmake --build . --parallel 8 # or without the number if you feel extra adventurous
```

## Developing against `rippled` in standalone mode

If you wish to develop against a `rippled` instance running in standalone mode, there are a few quirks of both Clio and `rippled` that you need to keep in mind. You must:

1. Advance the `rippled` ledger to at least ledger 256 (a sketch of one way to do this follows below).
2. Wait 10 minutes before first starting Clio against this standalone node.
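A minimal sketch of the first step, assuming a local `rippled` built and running in standalone mode (the binary and config paths are placeholders); `ledger_accept` is the admin command that manually closes and advances the ledger in standalone mode:

```sh
# Placeholders; adjust to your rippled build and configuration.
RIPPLED=./rippled
CONF=/etc/opt/ripple/rippled.cfg

# Advance the standalone ledger past sequence 256.
for _ in $(seq 1 256); do
    "$RIPPLED" --conf "$CONF" ledger_accept > /dev/null
done
```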
99 docs/configure-clio.md Normal file
@@ -0,0 +1,99 @@
# How to configure Clio and `rippled`

## Ports

Clio needs access to a `rippled` server in order to work. The following configurations are required for Clio and `rippled` to communicate:

1. In the Clio config file, provide the following:

   - The IP of the `rippled` server

   - The port on which `rippled` is accepting unencrypted WebSocket connections

   - The port on which `rippled` is handling gRPC requests

2. In the `rippled` config file, you need to open:

   - A port to accept unencrypted WebSocket connections

   - A port to handle gRPC requests, with the IP(s) of Clio specified in the `secure_gateway` entry

The example configs of [rippled](https://github.com/XRPLF/rippled/blob/develop/cfg/rippled-example.cfg) and [Clio](../docs/examples/config/example-config.json) are set up so that only minimal changes are required.
When running locally, the only change needed is to uncomment the `port_grpc` section of the `rippled` config.

If you're running Clio and `rippled` on separate machines, in addition to uncommenting the `port_grpc` section, a few other steps must be taken:

1. Change the `ip` in `etl_sources` to the IP where your `rippled` server is running.

2. Open a public, unencrypted WebSocket port on your `rippled` server.

3. In the `rippled` config, change the IP specified for `secure_gateway`, under the `port_grpc` section, to the IP of your Clio server. This entry can take the form of a comma-separated list if you are running multiple Clio nodes.

## Ledger sequence

The parameter `start_sequence` can be included and configured within the top level of the config file. This parameter specifies the sequence of the first ledger to extract if the database is empty.

Note that ETL extracts ledgers in order, and backfilling functionality currently doesn't exist. This means Clio does not retroactively learn ledgers older than the one you specify. Whether or not you specify this setting yields the following behavior:

- If this setting is absent and the database is empty, ETL will start with the next ledger validated by the network.

- If this setting is present and the database is not empty, an exception is thrown.

In addition, the optional parameter `finish_sequence` can be added to the JSON file as well, specifying the ledger at which extraction stops.

To add `start_sequence` and/or `finish_sequence` to the `config.json` file appropriately, they must be at the same top level of precedence as other parameters (e.g., `database`, `etl_sources`, `read_only`) and be specified with an integer.

Here is an example snippet from the config file:

```json
"start_sequence": 12345,
"finish_sequence": 54321
```

## SSL

The parameters `ssl_cert_file` and `ssl_key_file` can also be added at the top level of precedence of the Clio config. The `ssl_cert_file` field specifies the filepath of your SSL certificate, while `ssl_key_file` specifies the filepath of your SSL key. It is up to you how to make these files readable by your designated Clio user.

Your options include (a minimal sketch of the first option follows this list):

- Copying the two files as root somewhere that's accessible by the Clio user, then running `sudo chown` to your user
- Changing the permissions directly so it's readable by your Clio user
- Running Clio as root (strongly discouraged)
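A minimal sketch of the first option, assuming a dedicated `clio` user and purely hypothetical file locations:

```sh
# Hypothetical paths and user; adjust to your deployment.
sudo mkdir -p /opt/clio/etc
sudo cp /path/to/cert.file /opt/clio/etc/cert.pem
sudo cp /path/to/key.file  /opt/clio/etc/key.pem
sudo chown clio:clio /opt/clio/etc/cert.pem /opt/clio/etc/key.pem
sudo chmod 600 /opt/clio/etc/key.pem
```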
Here is an example of how to specify `ssl_cert_file` and `ssl_key_file` in the config:

```json
"server": {
    "ip": "0.0.0.0",
    "port": 51233
},
"ssl_cert_file": "/full/path/to/cert.file",
"ssl_key_file": "/full/path/to/key.file"
```

## Admin rights for requests

By default, Clio checks admin privileges by the IP address of the request (only `127.0.0.1` is considered an admin). This is not very secure because the IP could be spoofed. For better security, an `admin_password` can be provided in the `server` section of Clio's config:

```json
"server": {
    "admin_password": "secret"
}
```

If the password is present in the config, Clio checks the Authorization header (if any) of each request for the password. The Authorization header should contain the type `Password` followed by the password from the config (e.g. `Password secret`).
Only an exactly matching password grants admin rights to the request or WebSocket connection.

## ETL sources forwarding cache

Clio can cache requests to ETL sources to reduce the load on the ETL source.
Only the following commands are cached: `server_info`, `server_state`, `server_definitions`, `fee`, `ledger_closed`.
By default, the forwarding cache is off.
To enable caching for a source, add a `forwarding_cache_timeout` value to the configuration file, e.g.:

```json
"forwarding_cache_timeout": 0.250,
```

`forwarding_cache_timeout` defines how long (in seconds) a cache entry remains valid after being placed into the cache.
A value of zero turns the cache off.
42 docs/coverage-report.md Normal file
@@ -0,0 +1,42 @@
# Coverage report

The coverage report is intended for developers using GCC or Clang (including Apple Clang) as their compiler. It is generated by the build target `coverage_report`, which is only enabled when both the `tests` and `coverage` options are set (e.g., with `-o coverage=True -o tests=True` in `conan`).

## Prerequisites

To generate the coverage report you need:

- [gcovr tool](https://gcovr.com/en/stable/getting-started.html) (can be installed e.g. with `pip install gcovr`)
- `gcov` for GCC (installed with the compiler by default)
- `llvm-cov` for Clang (installed with the compiler by default, also on Apple)
- `Debug` build type

## Creating the coverage report

The coverage report is created when the following steps are completed, in order:

1. The `clio_tests` binary is built with instrumentation data, enabled by the `coverage` option mentioned above.
2. A complete run of the unit tests populates the coverage capture data.
3. A run of the `gcovr` tool, which internally invokes either `gcov` or `llvm-cov`, assembles both the instrumentation data and the coverage capture data into a coverage report.

The above steps are automated into a single target, `coverage_report`. The instrumented `clio_tests` binary can also be used for running regular unit tests.

In case of a spurious failure of unit tests, it is possible to re-run the `coverage_report` target without rebuilding the `clio_tests` binary (since it is simply a dependency of the coverage report target).

The default coverage report format is `html-details`, but developers can override it to any of the formats listed in `cmake/CodeCoverage.cmake` by setting the `CODE_COVERAGE_REPORT_FORMAT` variable in `cmake`. For example, CI sets this parameter to `xml` for the [codecov](https://codecov.io) integration.

If some unit tests predictably fail (e.g., due to the absence of a Cassandra database), it is possible to pass unit test options via the `CODE_COVERAGE_TESTS_ARGS` cmake variable, as demonstrated below:

```sh
cd .build
conan install .. --output-folder . --build missing --settings build_type=Debug -o tests=True -o coverage=True
cmake -DCODE_COVERAGE_REPORT_FORMAT=json-details -DCMAKE_BUILD_TYPE=Debug -DCODE_COVERAGE_TESTS_ARGS="--gtest_filter=-BackendCassandra*" -DCMAKE_TOOLCHAIN_FILE:FILEPATH=build/generators/conan_toolchain.cmake ..
cmake --build . --target coverage_report
```

After the `coverage_report` target is completed, the generated coverage report will be stored inside the build directory as either:

- A file named `coverage_report.*`, with a suitable extension for the report format.
- A directory named `coverage_report`, containing `index.html` and other files, for the `html-details` or `html-nested` report formats.
21 docs/doxygen-awesome-theme/LICENSE Normal file
@@ -0,0 +1,21 @@
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2021 - 2023 jothepro
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
157 docs/doxygen-awesome-theme/doxygen-awesome-darkmode-toggle.js Normal file
@@ -0,0 +1,157 @@
|
||||
/**
|
||||
|
||||
Doxygen Awesome
|
||||
https://github.com/jothepro/doxygen-awesome-css
|
||||
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2021 - 2023 jothepro
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
|
||||
*/
|
||||
|
||||
class DoxygenAwesomeDarkModeToggle extends HTMLElement {
|
||||
// SVG icons from https://fonts.google.com/icons
|
||||
// Licensed under the Apache 2.0 license:
|
||||
// https://www.apache.org/licenses/LICENSE-2.0.html
|
||||
static lightModeIcon = `<svg xmlns="http://www.w3.org/2000/svg" enable-background="new 0 0 24 24" height="24px" viewBox="0 0 24 24" width="24px" fill="#FCBF00"><rect fill="none" height="24" width="24"/><circle cx="12" cy="12" opacity=".3" r="3"/><path d="M12,9c1.65,0,3,1.35,3,3s-1.35,3-3,3s-3-1.35-3-3S10.35,9,12,9 M12,7c-2.76,0-5,2.24-5,5s2.24,5,5,5s5-2.24,5-5 S14.76,7,12,7L12,7z M2,13l2,0c0.55,0,1-0.45,1-1s-0.45-1-1-1l-2,0c-0.55,0-1,0.45-1,1S1.45,13,2,13z M20,13l2,0c0.55,0,1-0.45,1-1 s-0.45-1-1-1l-2,0c-0.55,0-1,0.45-1,1S19.45,13,20,13z M11,2v2c0,0.55,0.45,1,1,1s1-0.45,1-1V2c0-0.55-0.45-1-1-1S11,1.45,11,2z M11,20v2c0,0.55,0.45,1,1,1s1-0.45,1-1v-2c0-0.55-0.45-1-1-1C11.45,19,11,19.45,11,20z M5.99,4.58c-0.39-0.39-1.03-0.39-1.41,0 c-0.39,0.39-0.39,1.03,0,1.41l1.06,1.06c0.39,0.39,1.03,0.39,1.41,0s0.39-1.03,0-1.41L5.99,4.58z M18.36,16.95 c-0.39-0.39-1.03-0.39-1.41,0c-0.39,0.39-0.39,1.03,0,1.41l1.06,1.06c0.39,0.39,1.03,0.39,1.41,0c0.39-0.39,0.39-1.03,0-1.41 L18.36,16.95z M19.42,5.99c0.39-0.39,0.39-1.03,0-1.41c-0.39-0.39-1.03-0.39-1.41,0l-1.06,1.06c-0.39,0.39-0.39,1.03,0,1.41 s1.03,0.39,1.41,0L19.42,5.99z M7.05,18.36c0.39-0.39,0.39-1.03,0-1.41c-0.39-0.39-1.03-0.39-1.41,0l-1.06,1.06 c-0.39,0.39-0.39,1.03,0,1.41s1.03,0.39,1.41,0L7.05,18.36z"/></svg>`
|
||||
static darkModeIcon = `<svg xmlns="http://www.w3.org/2000/svg" enable-background="new 0 0 24 24" height="24px" viewBox="0 0 24 24" width="24px" fill="#FE9700"><rect fill="none" height="24" width="24"/><path d="M9.37,5.51C9.19,6.15,9.1,6.82,9.1,7.5c0,4.08,3.32,7.4,7.4,7.4c0.68,0,1.35-0.09,1.99-0.27 C17.45,17.19,14.93,19,12,19c-3.86,0-7-3.14-7-7C5,9.07,6.81,6.55,9.37,5.51z" opacity=".3"/><path d="M9.37,5.51C9.19,6.15,9.1,6.82,9.1,7.5c0,4.08,3.32,7.4,7.4,7.4c0.68,0,1.35-0.09,1.99-0.27C17.45,17.19,14.93,19,12,19 c-3.86,0-7-3.14-7-7C5,9.07,6.81,6.55,9.37,5.51z M12,3c-4.97,0-9,4.03-9,9s4.03,9,9,9s9-4.03,9-9c0-0.46-0.04-0.92-0.1-1.36 c-0.98,1.37-2.58,2.26-4.4,2.26c-2.98,0-5.4-2.42-5.4-5.4c0-1.81,0.89-3.42,2.26-4.4C12.92,3.04,12.46,3,12,3L12,3z"/></svg>`
|
||||
static title = "Toggle Light/Dark Mode"
|
||||
|
||||
static prefersLightModeInDarkModeKey = "prefers-light-mode-in-dark-mode"
|
||||
static prefersDarkModeInLightModeKey = "prefers-dark-mode-in-light-mode"
|
||||
|
||||
static _staticConstructor = function() {
|
||||
DoxygenAwesomeDarkModeToggle.enableDarkMode(DoxygenAwesomeDarkModeToggle.userPreference)
|
||||
// Update the color scheme when the browsers preference changes
|
||||
// without user interaction on the website.
|
||||
window.matchMedia('(prefers-color-scheme: dark)').addEventListener('change', event => {
|
||||
DoxygenAwesomeDarkModeToggle.onSystemPreferenceChanged()
|
||||
})
|
||||
// Update the color scheme when the tab is made visible again.
|
||||
// It is possible that the appearance was changed in another tab
|
||||
// while this tab was in the background.
|
||||
document.addEventListener("visibilitychange", visibilityState => {
|
||||
if (document.visibilityState === 'visible') {
|
||||
DoxygenAwesomeDarkModeToggle.onSystemPreferenceChanged()
|
||||
}
|
||||
});
|
||||
}()
|
||||
|
||||
static init() {
|
||||
$(function() {
|
||||
$(document).ready(function() {
|
||||
const toggleButton = document.createElement('doxygen-awesome-dark-mode-toggle')
|
||||
toggleButton.title = DoxygenAwesomeDarkModeToggle.title
|
||||
toggleButton.updateIcon()
|
||||
|
||||
window.matchMedia('(prefers-color-scheme: dark)').addEventListener('change', event => {
|
||||
toggleButton.updateIcon()
|
||||
})
|
||||
document.addEventListener("visibilitychange", visibilityState => {
|
||||
if (document.visibilityState === 'visible') {
|
||||
toggleButton.updateIcon()
|
||||
}
|
||||
});
|
||||
|
||||
$(document).ready(function(){
|
||||
document.getElementById("MSearchBox").parentNode.appendChild(toggleButton)
|
||||
})
|
||||
$(window).resize(function(){
|
||||
document.getElementById("MSearchBox").parentNode.appendChild(toggleButton)
|
||||
})
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
constructor() {
|
||||
super();
|
||||
this.onclick=this.toggleDarkMode
|
||||
}
|
||||
|
||||
/**
|
||||
* @returns `true` for dark-mode, `false` for light-mode system preference
|
||||
*/
|
||||
static get systemPreference() {
|
||||
return window.matchMedia('(prefers-color-scheme: dark)').matches
|
||||
}
|
||||
|
||||
/**
|
||||
* @returns `true` for dark-mode, `false` for light-mode user preference
|
||||
*/
|
||||
static get userPreference() {
|
||||
return (!DoxygenAwesomeDarkModeToggle.systemPreference && localStorage.getItem(DoxygenAwesomeDarkModeToggle.prefersDarkModeInLightModeKey)) ||
|
||||
(DoxygenAwesomeDarkModeToggle.systemPreference && !localStorage.getItem(DoxygenAwesomeDarkModeToggle.prefersLightModeInDarkModeKey))
|
||||
}
|
||||
|
||||
static set userPreference(userPreference) {
|
||||
DoxygenAwesomeDarkModeToggle.darkModeEnabled = userPreference
|
||||
if(!userPreference) {
|
||||
if(DoxygenAwesomeDarkModeToggle.systemPreference) {
|
||||
localStorage.setItem(DoxygenAwesomeDarkModeToggle.prefersLightModeInDarkModeKey, true)
|
||||
} else {
|
||||
localStorage.removeItem(DoxygenAwesomeDarkModeToggle.prefersDarkModeInLightModeKey)
|
||||
}
|
||||
} else {
|
||||
if(!DoxygenAwesomeDarkModeToggle.systemPreference) {
|
||||
localStorage.setItem(DoxygenAwesomeDarkModeToggle.prefersDarkModeInLightModeKey, true)
|
||||
} else {
|
||||
localStorage.removeItem(DoxygenAwesomeDarkModeToggle.prefersLightModeInDarkModeKey)
|
||||
}
|
||||
}
|
||||
DoxygenAwesomeDarkModeToggle.onUserPreferenceChanged()
|
||||
}
|
||||
|
||||
static enableDarkMode(enable) {
|
||||
if(enable) {
|
||||
DoxygenAwesomeDarkModeToggle.darkModeEnabled = true
|
||||
document.documentElement.classList.add("dark-mode")
|
||||
document.documentElement.classList.remove("light-mode")
|
||||
} else {
|
||||
DoxygenAwesomeDarkModeToggle.darkModeEnabled = false
|
||||
document.documentElement.classList.remove("dark-mode")
|
||||
document.documentElement.classList.add("light-mode")
|
||||
}
|
||||
}
|
||||
|
||||
static onSystemPreferenceChanged() {
|
||||
DoxygenAwesomeDarkModeToggle.darkModeEnabled = DoxygenAwesomeDarkModeToggle.userPreference
|
||||
DoxygenAwesomeDarkModeToggle.enableDarkMode(DoxygenAwesomeDarkModeToggle.darkModeEnabled)
|
||||
}
|
||||
|
||||
static onUserPreferenceChanged() {
|
||||
DoxygenAwesomeDarkModeToggle.enableDarkMode(DoxygenAwesomeDarkModeToggle.darkModeEnabled)
|
||||
}
|
||||
|
||||
toggleDarkMode() {
|
||||
DoxygenAwesomeDarkModeToggle.userPreference = !DoxygenAwesomeDarkModeToggle.userPreference
|
||||
this.updateIcon()
|
||||
}
|
||||
|
||||
updateIcon() {
|
||||
if(DoxygenAwesomeDarkModeToggle.darkModeEnabled) {
|
||||
this.innerHTML = DoxygenAwesomeDarkModeToggle.darkModeIcon
|
||||
} else {
|
||||
this.innerHTML = DoxygenAwesomeDarkModeToggle.lightModeIcon
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
customElements.define("doxygen-awesome-dark-mode-toggle", DoxygenAwesomeDarkModeToggle);
|
||||
@@ -0,0 +1,81 @@
|
||||
/**
|
||||
|
||||
Doxygen Awesome
|
||||
https://github.com/jothepro/doxygen-awesome-css
|
||||
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2022 - 2023 jothepro
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
|
||||
*/
|
||||
|
||||
class DoxygenAwesomeInteractiveToc {
|
||||
static topOffset = 38
|
||||
static hideMobileMenu = true
|
||||
static headers = []
|
||||
|
||||
static init() {
|
||||
window.addEventListener("load", () => {
|
||||
let toc = document.querySelector(".contents > .toc")
|
||||
if(toc) {
|
||||
toc.classList.add("interactive")
|
||||
if(!DoxygenAwesomeInteractiveToc.hideMobileMenu) {
|
||||
toc.classList.add("open")
|
||||
}
|
||||
document.querySelector(".contents > .toc > h3")?.addEventListener("click", () => {
|
||||
if(toc.classList.contains("open")) {
|
||||
toc.classList.remove("open")
|
||||
} else {
|
||||
toc.classList.add("open")
|
||||
}
|
||||
})
|
||||
|
||||
document.querySelectorAll(".contents > .toc > ul a").forEach((node) => {
|
||||
let id = node.getAttribute("href").substring(1)
|
||||
DoxygenAwesomeInteractiveToc.headers.push({
|
||||
node: node,
|
||||
headerNode: document.getElementById(id)
|
||||
})
|
||||
|
||||
document.getElementById("doc-content")?.addEventListener("scroll", () => {
|
||||
DoxygenAwesomeInteractiveToc.update()
|
||||
})
|
||||
})
|
||||
DoxygenAwesomeInteractiveToc.update()
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
static update() {
|
||||
let active = DoxygenAwesomeInteractiveToc.headers[0]?.node
|
||||
DoxygenAwesomeInteractiveToc.headers.forEach((header) => {
|
||||
let position = header.headerNode.getBoundingClientRect().top
|
||||
header.node.classList.remove("active")
|
||||
header.node.classList.remove("aboveActive")
|
||||
if(position < DoxygenAwesomeInteractiveToc.topOffset) {
|
||||
active = header.node
|
||||
active?.classList.add("aboveActive")
|
||||
}
|
||||
})
|
||||
active?.classList.add("active")
|
||||
active?.classList.remove("aboveActive")
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,40 @@
|
||||
|
||||
/**
|
||||
|
||||
Doxygen Awesome
|
||||
https://github.com/jothepro/doxygen-awesome-css
|
||||
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2021 - 2023 jothepro
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
|
||||
*/
|
||||
|
||||
@media screen and (min-width: 768px) {
|
||||
|
||||
#MSearchBox {
|
||||
width: calc(var(--side-nav-fixed-width) - calc(2 * var(--spacing-medium)) - var(--searchbar-height) - 1px);
|
||||
}
|
||||
|
||||
#MSearchField {
|
||||
width: calc(var(--side-nav-fixed-width) - calc(2 * var(--spacing-medium)) - 66px - var(--searchbar-height));
|
||||
}
|
||||
}
|
||||
116 docs/doxygen-awesome-theme/doxygen-awesome-sidebar-only.css Normal file
@@ -0,0 +1,116 @@
|
||||
/**
|
||||
|
||||
Doxygen Awesome
|
||||
https://github.com/jothepro/doxygen-awesome-css
|
||||
|
||||
MIT License
|
||||
|
||||
Copyright (c) 2021 - 2023 jothepro
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
|
||||
*/
|
||||
|
||||
html {
|
||||
/* side nav width. MUST be = `TREEVIEW_WIDTH`.
|
||||
* Make sure it is wide enough to contain the page title (logo + title + version)
|
||||
*/
|
||||
--side-nav-fixed-width: 335px;
|
||||
--menu-display: none;
|
||||
|
||||
--top-height: 120px;
|
||||
--toc-sticky-top: -25px;
|
||||
--toc-max-height: calc(100vh - 2 * var(--spacing-medium) - 25px);
|
||||
}
|
||||
|
||||
#projectname {
|
||||
white-space: nowrap;
|
||||
}
|
||||
|
||||
|
||||
@media screen and (min-width: 768px) {
|
||||
html {
|
||||
--searchbar-background: var(--page-background-color);
|
||||
}
|
||||
|
||||
#side-nav {
|
||||
min-width: var(--side-nav-fixed-width);
|
||||
max-width: var(--side-nav-fixed-width);
|
||||
top: var(--top-height);
|
||||
overflow: visible;
|
||||
}
|
||||
|
||||
#nav-tree, #side-nav {
|
||||
height: calc(100vh - var(--top-height)) !important;
|
||||
}
|
||||
|
||||
#nav-tree {
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
#top {
|
||||
display: block;
|
||||
border-bottom: none;
|
||||
height: var(--top-height);
|
||||
margin-bottom: calc(0px - var(--top-height));
|
||||
max-width: var(--side-nav-fixed-width);
|
||||
overflow: hidden;
|
||||
background: var(--side-nav-background);
|
||||
}
|
||||
#main-nav {
|
||||
float: left;
|
||||
padding-right: 0;
|
||||
}
|
||||
|
||||
.ui-resizable-handle {
|
||||
cursor: default;
|
||||
width: 1px !important;
|
||||
background: var(--separator-color);
|
||||
box-shadow: 0 calc(-2 * var(--top-height)) 0 0 var(--separator-color);
|
||||
}
|
||||
|
||||
#nav-path {
|
||||
position: fixed;
|
||||
right: 0;
|
||||
left: var(--side-nav-fixed-width);
|
||||
bottom: 0;
|
||||
width: auto;
|
||||
}
|
||||
|
||||
#doc-content {
|
||||
height: calc(100vh - 31px) !important;
|
||||
padding-bottom: calc(3 * var(--spacing-large));
|
||||
padding-top: calc(var(--top-height) - 80px);
|
||||
box-sizing: border-box;
|
||||
margin-left: var(--side-nav-fixed-width) !important;
|
||||
}
|
||||
|
||||
#MSearchBox {
|
||||
width: calc(var(--side-nav-fixed-width) - calc(2 * var(--spacing-medium)));
|
||||
}
|
||||
|
||||
#MSearchField {
|
||||
width: calc(var(--side-nav-fixed-width) - calc(2 * var(--spacing-medium)) - 65px);
|
||||
}
|
||||
|
||||
#MSearchResultsWindow {
|
||||
left: var(--spacing-medium) !important;
|
||||
right: auto;
|
||||
}
|
||||
}
|
||||
2669 docs/doxygen-awesome-theme/doxygen-awesome.css Normal file
File diff suppressed because it is too large
82 docs/doxygen-awesome-theme/header.html Normal file
@@ -0,0 +1,82 @@
|
||||
<!-- HTML header for doxygen 1.9.7-->
|
||||
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "https://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
|
||||
<html xmlns="http://www.w3.org/1999/xhtml" lang="$langISO">
|
||||
<head>
|
||||
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
|
||||
<meta http-equiv="X-UA-Compatible" content="IE=11"/>
|
||||
<meta name="generator" content="Doxygen $doxygenversion"/>
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1"/>
|
||||
<!--BEGIN PROJECT_NAME--><title>$projectname: $title</title><!--END PROJECT_NAME-->
|
||||
<!--BEGIN !PROJECT_NAME--><title>$title</title><!--END !PROJECT_NAME-->
|
||||
<link href="$relpath^tabs.css" rel="stylesheet" type="text/css"/>
|
||||
<!--BEGIN DISABLE_INDEX-->
|
||||
<!--BEGIN FULL_SIDEBAR-->
|
||||
<script type="text/javascript">var page_layout=1;</script>
|
||||
<!--END FULL_SIDEBAR-->
|
||||
<!--END DISABLE_INDEX-->
|
||||
<script type="text/javascript" src="$relpath^jquery.js"></script>
|
||||
<script type="text/javascript" src="$relpath^dynsections.js"></script>
|
||||
$treeview
|
||||
$search
|
||||
$mathjax
|
||||
$darkmode
|
||||
<link href="$relpath^$stylesheet" rel="stylesheet" type="text/css" />
|
||||
$extrastylesheet
|
||||
<script type="text/javascript" src="$relpath^doxygen-awesome-darkmode-toggle.js"></script>
|
||||
<script type="text/javascript">
|
||||
DoxygenAwesomeDarkModeToggle.init()
|
||||
</script>
|
||||
<script type="text/javascript" src="$relpath^doxygen-awesome-interactive-toc.js"></script>
|
||||
<script type="text/javascript">
|
||||
DoxygenAwesomeInteractiveToc.init()
|
||||
</script>
|
||||
</head>
|
||||
<body>
|
||||
<!--BEGIN DISABLE_INDEX-->
|
||||
<!--BEGIN FULL_SIDEBAR-->
|
||||
<div id="side-nav" class="ui-resizable side-nav-resizable"><!-- do not remove this div, it is closed by doxygen! -->
|
||||
<!--END FULL_SIDEBAR-->
|
||||
<!--END DISABLE_INDEX-->
|
||||
|
||||
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
|
||||
|
||||
<!--BEGIN TITLEAREA-->
|
||||
<div id="titlearea">
|
||||
<table cellspacing="0" cellpadding="0">
|
||||
<tbody>
|
||||
<tr id="projectrow">
|
||||
<!--BEGIN PROJECT_LOGO-->
|
||||
<td id="projectlogo"><img alt="Logo" src="$relpath^$projectlogo"/></td>
|
||||
<!--END PROJECT_LOGO-->
|
||||
<!--BEGIN PROJECT_NAME-->
|
||||
<td id="projectalign">
|
||||
<div id="projectname">$projectname<!--BEGIN PROJECT_NUMBER--><span id="projectnumber"> $projectnumber</span><!--END PROJECT_NUMBER-->
|
||||
</div>
|
||||
<!--BEGIN PROJECT_BRIEF--><div id="projectbrief">$projectbrief</div><!--END PROJECT_BRIEF-->
|
||||
</td>
|
||||
<!--END PROJECT_NAME-->
|
||||
<!--BEGIN !PROJECT_NAME-->
|
||||
<!--BEGIN PROJECT_BRIEF-->
|
||||
<td>
|
||||
<div id="projectbrief">$projectbrief</div>
|
||||
</td>
|
||||
<!--END PROJECT_BRIEF-->
|
||||
<!--END !PROJECT_NAME-->
|
||||
<!--BEGIN DISABLE_INDEX-->
|
||||
<!--BEGIN SEARCHENGINE-->
|
||||
<!--BEGIN !FULL_SIDEBAR-->
|
||||
<td>$searchbox</td>
|
||||
<!--END !FULL_SIDEBAR-->
|
||||
<!--END SEARCHENGINE-->
|
||||
<!--END DISABLE_INDEX-->
|
||||
</tr>
|
||||
<!--BEGIN SEARCHENGINE-->
|
||||
<!--BEGIN FULL_SIDEBAR-->
|
||||
<tr><td colspan="2">$searchbox</td></tr>
|
||||
<!--END FULL_SIDEBAR-->
|
||||
<!--END SEARCHENGINE-->
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
<!--END TITLEAREA-->
|
||||
<!-- end header part -->
|
||||
43 docs/examples/config/cloud-example-config.json Normal file
@@ -0,0 +1,43 @@
|
||||
/*
|
||||
* This is an example configuration file. Please do not use without modifying to suit your needs.
|
||||
*/
|
||||
{
|
||||
"database": {
|
||||
"type": "cassandra",
|
||||
"cassandra": {
|
||||
// This option can be used to setup a secure connect bundle connection
|
||||
"secure_connect_bundle": "[path/to/zip. ignore if using contact_points]",
|
||||
// The following options are used only if using contact_points
|
||||
"contact_points": "[ip. ignore if using secure_connect_bundle]",
|
||||
"port": "[port. ignore if using_secure_connect_bundle]",
|
||||
// Authentication settings
|
||||
"username": "[username, if any]",
|
||||
"password": "[password, if any]",
|
||||
// Other common settings
|
||||
"keyspace": "clio",
|
||||
"max_write_requests_outstanding": 25000,
|
||||
"max_read_requests_outstanding": 30000,
|
||||
"threads": 8
|
||||
}
|
||||
},
|
||||
"etl_sources": [
|
||||
{
|
||||
"ip": "[rippled ip]",
|
||||
"ws_port": "6006",
|
||||
"grpc_port": "50051"
|
||||
}
|
||||
],
|
||||
"dos_guard": {
|
||||
"whitelist": [
|
||||
"127.0.0.1"
|
||||
]
|
||||
},
|
||||
"server": {
|
||||
"ip": "0.0.0.0",
|
||||
"port": 8080
|
||||
},
|
||||
"log_level": "debug",
|
||||
"log_file": "./clio.log",
|
||||
"extractor_threads": 8,
|
||||
"read_only": false
|
||||
}
|
||||
122 docs/examples/config/example-config.json Normal file
@@ -0,0 +1,122 @@
|
||||
/*
|
||||
* This is an example configuration file. Please do not use without modifying to suit your needs.
|
||||
*/
|
||||
{
|
||||
"database": {
|
||||
"type": "cassandra",
|
||||
"cassandra": {
|
||||
"contact_points": "127.0.0.1",
|
||||
"port": 9042,
|
||||
"keyspace": "clio",
|
||||
"replication_factor": 1,
|
||||
"table_prefix": "",
|
||||
"max_write_requests_outstanding": 25000,
|
||||
"max_read_requests_outstanding": 30000,
|
||||
"threads": 8,
|
||||
//
|
||||
// Advanced options. USE AT OWN RISK:
|
||||
// ---
|
||||
"core_connections_per_host": 1, // Defaults to 1
|
||||
"write_batch_size": 20 // Defaults to 20
|
||||
//
|
||||
// Below options will use defaults from cassandra driver if left unspecified.
|
||||
// See https://docs.datastax.com/en/developer/cpp-driver/2.17/api/struct.CassCluster/ for details.
|
||||
//
|
||||
// "queue_size_io": 2
|
||||
//
|
||||
// ---
|
||||
}
|
||||
},
|
||||
"allow_no_etl": false, // Allow Clio to run without valid ETL source, otherwise Clio will stop if ETL check fails
|
||||
"etl_sources": [
|
||||
{
|
||||
"ip": "127.0.0.1",
|
||||
"ws_port": "6006",
|
||||
"grpc_port": "50051"
|
||||
}
|
||||
],
|
||||
"forwarding_cache_timeout": 0.250, // in seconds, could be 0, which means no cache
|
||||
"dos_guard": {
|
||||
// Comma-separated list of IPs to exclude from rate limiting
|
||||
"whitelist": [
|
||||
"127.0.0.1"
|
||||
],
|
||||
//
|
||||
// The below values are the default values and are only specified here
|
||||
// for documentation purposes. The rate limiter currently limits
|
||||
// connections and bandwidth per IP. The rate limiter looks at the raw
|
||||
// IP of a client connection, and so requests routed through a load
|
||||
// balancer will all have the same IP and be treated as a single client.
|
||||
//
|
||||
"max_fetches": 1000000, // Max bytes per IP per sweep interval
|
||||
"max_connections": 20, // Max connections per IP
|
||||
"max_requests": 20, // Max connections per IP per sweep interval
|
||||
"sweep_interval": 1 // Time in seconds before resetting max_fetches and max_requests
|
||||
},
|
||||
"server": {
|
||||
"ip": "0.0.0.0",
|
||||
"port": 51233,
|
||||
// Max number of requests to queue up before rejecting further requests.
|
||||
// Defaults to 0, which disables the limit.
|
||||
"max_queue_size": 500,
|
||||
// If a request contains an Authorization header, Clio will check whether it matches 'Password ' + the sha256 hash of this value
|
||||
// If it matches, the request is treated as an admin request
|
||||
"admin_password": "xrp",
|
||||
// If local_admin is true, Clio will consider requests coming from 127.0.0.1 as admin requests
|
||||
// It's true by default unless admin_password is set; 'local_admin': true and 'admin_password' cannot be set at the same time
|
||||
"local_amdin": false
|
||||
},
|
||||
// Overrides log level on a per logging channel.
|
||||
// Defaults to global "log_level" for each unspecified channel.
|
||||
"log_channels": [
|
||||
{
|
||||
"channel": "Backend",
|
||||
"log_level": "fatal"
|
||||
},
|
||||
{
|
||||
"channel": "WebServer",
|
||||
"log_level": "info"
|
||||
},
|
||||
{
|
||||
"channel": "Subscriptions",
|
||||
"log_level": "info"
|
||||
},
|
||||
{
|
||||
"channel": "RPC",
|
||||
"log_level": "error"
|
||||
},
|
||||
{
|
||||
"channel": "ETL",
|
||||
"log_level": "debug"
|
||||
},
|
||||
{
|
||||
"channel": "Performance",
|
||||
"log_level": "trace"
|
||||
}
|
||||
],
|
||||
"prometheus": {
|
||||
"enabled": true,
|
||||
"compress_reply": true
|
||||
},
|
||||
"log_level": "info",
|
||||
// Log format (this is the default format)
|
||||
"log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%",
|
||||
"log_to_console": true,
|
||||
// Clio logs to file in the specified directory only if "log_directory" is set
|
||||
// "log_directory": "./clio_log",
|
||||
"log_rotation_size": 2048,
|
||||
"log_directory_max_size": 51200,
|
||||
"log_rotation_hour_interval": 12,
|
||||
"log_tag_style": "uint",
|
||||
"extractor_threads": 8,
|
||||
"read_only": false,
|
||||
// "start_sequence": [integer] the ledger index to start from,
|
||||
// "finish_sequence": [integer] the ledger index to finish at,
|
||||
// "ssl_cert_file" : "/full/path/to/cert.file",
|
||||
// "ssl_key_file" : "/full/path/to/key.file"
|
||||
"api_version": {
|
||||
"min": 1, // Minimum API version supported (could be 1 or 2)
|
||||
"max": 2, // Maximum API version supported (could be 1 or 2, but >= min)
|
||||
"default": 1 // Clio behaves the same as rippled by default
|
||||
}
|
||||
}
|
||||
25 docs/examples/infrastructure/README.md Normal file
@@ -0,0 +1,25 @@
# Example of clio monitoring infrastructure

This directory contains an example of Docker-based infrastructure to collect and visualise metrics from clio.

The structure of the directory:
- `compose.yaml`
  Docker Compose file with Prometheus and Grafana set up.
- `prometheus.yaml`
  Defines metrics collection from Clio and Prometheus itself.
  Demonstrates how to set up the Clio target and Clio's admin authorisation in Prometheus.
- `grafana/clio_dashboard.json`
  JSON file containing a preconfigured dashboard in Grafana format.
- `grafana/dashboard_local.yaml`
  Grafana configuration file defining the directory to search for dashboard JSON files.
- `grafana/datasources.yaml`
  Grafana configuration file defining Prometheus as a data source for Grafana.

## How to try

1. Make sure you have `docker` and `docker-compose` installed.
2. Run `docker-compose up -d` from this directory. It will start Docker containers with Prometheus and Grafana.
3. Open [http://localhost:3000/dashboards](http://localhost:3000/dashboards). Grafana login `admin`, password `grafana`.
   A preconfigured Clio dashboard will be there.

If Clio is not running yet, launch it to see metrics. Some of the metrics may appear only after requests to Clio.
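The steps above, condensed into a copy-pasteable form (paths are relative to the repository root; the newer `docker compose` plugin syntax should work just as well):

```sh
cd docs/examples/infrastructure
docker-compose up -d
# Grafana:    http://localhost:3000/dashboards  (login: admin / grafana)
# Prometheus: http://localhost:9090
```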
20 docs/examples/infrastructure/compose.yaml Normal file
@@ -0,0 +1,20 @@
|
||||
services:
|
||||
prometheus:
|
||||
image: prom/prometheus
|
||||
ports:
|
||||
- 9090:9090
|
||||
volumes:
|
||||
- ./prometheus.yaml:/etc/prometheus/prometheus.yml
|
||||
command:
|
||||
- '--config.file=/etc/prometheus/prometheus.yml'
|
||||
grafana:
|
||||
image: grafana/grafana
|
||||
ports:
|
||||
- 3000:3000
|
||||
environment:
|
||||
- GF_SECURITY_ADMIN_USER=admin
|
||||
- GF_SECURITY_ADMIN_PASSWORD=grafana
|
||||
volumes:
|
||||
- ./grafana/datasources.yaml:/etc/grafana/provisioning/datasources/datasources.yaml
|
||||
- ./grafana/dashboard_local.yaml:/etc/grafana/provisioning/dashboards/local.yaml
|
||||
- ./grafana/clio_dashboard.json:/var/lib/grafana/dashboards/clio_dashboard.json
|
||||
1240 docs/examples/infrastructure/grafana/clio_dashboard.json Normal file
File diff suppressed because it is too large
23 docs/examples/infrastructure/grafana/dashboard_local.yaml Normal file
@@ -0,0 +1,23 @@
|
||||
apiVersion: 1
|
||||
|
||||
providers:
|
||||
- name: 'Clio dashboard'
|
||||
# <int> Org id. Default to 1
|
||||
orgId: 1
|
||||
# <string> name of the dashboard folder.
|
||||
folder: ''
|
||||
# <string> folder UID. will be automatically generated if not specified
|
||||
folderUid: ''
|
||||
# <string> provider type. Default to 'file'
|
||||
type: file
|
||||
# <bool> disable dashboard deletion
|
||||
disableDeletion: false
|
||||
# <int> how often Grafana will scan for changed dashboards
|
||||
updateIntervalSeconds: 10
|
||||
# <bool> allow updating provisioned dashboards from the UI
|
||||
allowUiUpdates: false
|
||||
options:
|
||||
# <string, required> path to dashboard files on disk. Required when using the 'file' type
|
||||
path: /var/lib/grafana/dashboards
|
||||
# <bool> use folder names from filesystem to create folders in Grafana
|
||||
foldersFromFilesStructure: true
|
||||
8 docs/examples/infrastructure/grafana/datasources.yaml Normal file
@@ -0,0 +1,8 @@
|
||||
apiVersion: 1
|
||||
|
||||
datasources:
|
||||
- name: Prometheus
|
||||
type: prometheus
|
||||
url: http://prometheus:9090
|
||||
isDefault: true
|
||||
access: proxy
|
||||
19 docs/examples/infrastructure/prometheus.yaml Normal file
@@ -0,0 +1,19 @@
scrape_configs:
  - job_name: clio
    scrape_interval: 5s
    scrape_timeout: 5s
    authorization:
      type: Password
      # sha256sum from password `xrp`
      # use echo -n 'your_password' | shasum -a 256 to get hash
      credentials: 0e1dcf1ff020cceabf8f4a60a32e814b5b46ee0bb8cd4af5c814e4071bd86a18
    static_configs:
      - targets:
          - host.docker.internal:51233
  - job_name: prometheus
    honor_timestamps: true
    scrape_interval: 15s
    scrape_timeout: 10s
    static_configs:
      - targets:
          - localhost:9090
117 docs/img/xrpl-logo.svg Normal file
File diff suppressed because one or more lines are too long
After Width: | Height: | Size: 13 KiB
76
docs/logging.md
Normal file
76
docs/logging.md
Normal file
@@ -0,0 +1,76 @@
# Logging

Clio provides several logging options, all of which are configurable via the config file. These are detailed in the following sections.

## `log_level`

The minimum severity at which log messages are output by default. Severity options are `trace`, `debug`, `info`, `warning`, `error` and `fatal`. Defaults to `info`.

## `log_format`

The format of log lines produced by Clio. Defaults to `"%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"`.

Each of the variables expands like so:

- `TimeStamp`: The full date and time of the log entry
- `SourceLocation`: A partial path to the C++ file and the line number in said file (`source/file/path:linenumber`)
- `ThreadID`: The ID of the thread the log entry is written from
- `Channel`: The channel that this log entry was sent to
- `Severity`: The severity (aka log level) the entry was sent at
- `Message`: The actual log message
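
For instance, these two options might appear together in Clio's config file roughly like this (a minimal sketch; the format string is just the default quoted above, and the `debug` level is only illustrative):

```json
{
  "log_level": "debug",
  "log_format": "%TimeStamp% (%SourceLocation%) [%ThreadID%] %Channel%:%Severity% %Message%"
}
```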
## `log_channels`

An array of JSON objects, each overriding properties for a logging `channel`.

> [!IMPORTANT]
> At the time of writing, only `log_level` can be overridden using this mechanism.

Each object is of this format:

```json
{
  "channel": "Backend",
  "log_level": "fatal"
}
```

If no override is present for a given channel, that channel will log at the severity specified by the global `log_level`.

The log channels that can be overridden are: `Backend`, `WebServer`, `Subscriptions`, `RPC`, `ETL` and `Performance`.
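
As a sketch, a `log_channels` array that quiets the `Backend` channel while turning up `RPC` logging might look like this (the channel names come from the list above; the chosen levels are only illustrative):

```json
"log_channels": [
  {
    "channel": "Backend",
    "log_level": "fatal"
  },
  {
    "channel": "RPC",
    "log_level": "debug"
  }
]
```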
> [!NOTE]
> See [example-config.json](../docs/examples/config/example-config.json) for more details.

## `log_to_console`

Enable or disable log output to the console. Options are `true`/`false`. Defaults to `true`.

## `log_directory`

Path to the directory where log files are stored. If the directory doesn't exist, Clio will create it.

If this option is not specified, logs are not written to a file.

## `log_rotation_size`

The maximum size of the log file in **megabytes** before it is rotated into a new file. Defaults to 2GB.

## `log_directory_max_size`

The maximum size of the log directory in **megabytes** before old log files are deleted to free up space. Defaults to 50GB.

## `log_rotation_hour_interval`

The time interval in **hours** after the last log rotation at which the current log file is automatically rotated. Defaults to 12 hours.

> [!NOTE]
> Time-based log rotation works in conjunction with size-based log rotation. For example, if a size-based rotation occurs, the timer for the time-based rotation is reset.

## `log_tag_style`

Tag implementation to use. Must be one of:

- `uint`: Lock-free and thread-safe, but outputs just a simple unsigned integer
- `uuid`: Thread-safe and outputs a UUID tag
- `none`: Doesn't use tagging at all
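
Putting the file-related options above together, a hedged example of these settings might look like the following (the path and sizes are placeholders, not recommendations):

```json
{
  "log_to_console": true,
  "log_directory": "./clio_log",
  "log_rotation_size": 2048,
  "log_directory_max_size": 51200,
  "log_rotation_hour_interval": 12,
  "log_tag_style": "uint"
}
```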
30 docs/metrics-and-static-analysis.md (Normal file)
@@ -0,0 +1,30 @@
# Metrics and static analysis

## Prometheus metrics collection

Clio natively supports [Prometheus](https://prometheus.io/) metrics collection. It accepts Prometheus requests on the port configured in the `server` section of the config.

Prometheus metrics are enabled by default, and replies to `/metrics` are compressed. To disable compression and get human-readable metrics, add `"prometheus": { "enabled": true, "compress_reply": false }` to Clio's config.

To completely disable Prometheus metrics, add `"prometheus": { "enabled": false }` to Clio's config.
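
For example, a config fragment combining the server port with an uncompressed Prometheus endpoint might look roughly like this (the `server` values are placeholders in the style of typical example configs, so verify them against your own config):

```json
{
  "server": {
    "ip": "0.0.0.0",
    "port": 51233
  },
  "prometheus": {
    "enabled": true,
    "compress_reply": false
  }
}
```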
Note that Clio responds to Prometheus requests only if they are admin requests. If you are using the admin password feature, the same password should be provided in the `Authorization` header of Prometheus requests.

You can find an example docker-compose file, along with Prometheus and Grafana configs, in [examples/infrastructure](../docs/examples/infrastructure/).
## Using `clang-tidy` for static analysis

The minimum [clang-tidy](https://clang.llvm.org/extra/clang-tidy/) version required is 17.0.

Clang-tidy can be run by CMake when building the project. To achieve this, you just need to provide the option `-o lint=True` to the `conan install` command:

```sh
conan install .. --output-folder . --build missing --settings build_type=Release -o tests=True -o lint=True
```

By default, CMake will try to find `clang-tidy` automatically on your system.
To force CMake to use a specific binary, set the `CLIO_CLANG_TIDY_BIN` environment variable to the path of the `clang-tidy` binary. For example:

```sh
export CLIO_CLANG_TIDY_BIN=/opt/homebrew/opt/llvm@17/bin/clang-tidy
```
82 docs/run-clio.md (Normal file)
@@ -0,0 +1,82 @@
# How to run Clio

## Prerequisites

- Access to a Cassandra or ScyllaDB cluster. Can be local or remote.

> [!IMPORTANT]
> There are some key considerations when using **ScyllaDB**. By default, Scylla reserves all free RAM on a machine for itself. If you are running `rippled` or other services on the same machine, restrict its memory usage using the `--memory` argument.
>
> See [ScyllaDB in a Shared Environment](https://docs.scylladb.com/getting-started/scylla-in-a-shared-environment/) to learn more.

- Access to one or more `rippled` nodes. Can be local or remote.

## Starting `rippled` and Clio

To run Clio you must first make the necessary changes to your configuration file, `config.json`. See [How to configure Clio and rippled](./configure-clio.md) to learn more.

Once your config files are ready, start `rippled` and Clio.

> [!TIP]
> It doesn't matter which one you start first, and it's fine to stop one or the other and restart it at any given time.

To start Clio, simply run:

```sh
./clio_server config.json
```

Clio will wait for `rippled` to sync before extracting any ledgers. If there is already data in Clio's database, Clio will begin extraction with the ledger whose sequence is one greater than the greatest sequence currently in the database, and will wait for that ledger to become available.
## Extracting ledgers from `rippled`

Be aware that `rippled` syncs to the most recent ledger on the network first, and then backfills. If Clio is extracting ledgers from `rippled`, and `rippled` is stopped for a significant amount of time and then restarted, `rippled` will take time to backfill to the next ledger that Clio wants.

The time this takes is proportional to how long `rippled` was offline. Additionally, how far `rippled` backfills depends on the `online_delete` and `ledger_history` config values. If these values are small and `rippled` is stopped for a significant amount of time, `rippled` may never backfill to the ledger that Clio wants. To avoid this situation, keep history proportional to the amount of time that you expect `rippled` to be offline. For example, if you expect `rippled` to be offline for a few days from time to time, you should keep at least a few days of history. If you expect `rippled` to never be offline, then you can keep a very small amount of history.

Clio can use multiple `rippled` servers as data sources. Simply add more entries to the `etl_sources` section, as sketched below, and Clio will load balance requests across the servers specified in this list. As long as one `rippled` server is up and synced, Clio will continue extracting ledgers.
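
As an illustration, an `etl_sources` list with two servers might look something like the following (host addresses and ports are placeholders; check [example-config.json](../docs/examples/config/example-config.json) for the exact field names):

```json
"etl_sources": [
  {
    "ip": "127.0.0.1",
    "ws_port": "6006",
    "grpc_port": "50051"
  },
  {
    "ip": "10.0.0.2",
    "ws_port": "6006",
    "grpc_port": "50051"
  }
]
```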
In contrast to `rippled`, Clio answers RPC requests for the data already in the database as soon as the server starts. Clio does not wait to sync to the network, or for `rippled` to sync.

## Starting Clio with a fresh database

When starting Clio with a fresh database, Clio needs to download a ledger in full. This can take some time, and depends on database throughput. With a moderately fast database, this should take less than 10 minutes. If you did not properly set `secure_gateway` in the `port_grpc` section of `rippled`, this step will fail.

Once the first ledger is fully downloaded, Clio only needs to extract the changed data for each ledger, so extraction is much faster and Clio can keep up with `rippled` in real time. Even under intense load, Clio should not lag behind the network, as Clio is not processing the data, and is simply writing to a database. The throughput of Clio is dependent on the throughput of your database, but a standard Cassandra or Scylla deployment can handle the write load of the XRP Ledger without any trouble.

> [!IMPORTANT]
> Generally the performance considerations come on the read side, and depend on the number of RPC requests your Clio nodes are serving. Be aware that very heavy read traffic can impact write throughput. Again, this is on the database side, so if you are seeing this, upgrade your database.

## Running multiple Clio nodes

It is possible to run multiple Clio nodes that share access to the same database. The Clio nodes don't need to know about each other. You can simply spin up more Clio nodes pointing to the same database, and shut them down as you wish.

On startup, each Clio node queries the database for the latest ledger. If this latest ledger does not change for some time, the Clio node begins extracting ledgers and writing to the database. If a Clio node detects that a ledger it is trying to write has already been written, it backs off and stops writing. If a node does not see a ledger written for some time, it starts writing again. This algorithm ensures that at any given time, one and only one Clio node is writing to the database.

### Configuring read-only Clio nodes

It is possible to force Clio to only read data, and never become a writer. To do this, set `read_only: true` in the config, as in the snippet below. One common setup is to have a small number of writer nodes that are inaccessible to clients, with several read-only nodes handling client requests. The number of read-only nodes can be scaled up or down in response to request volume.
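
A minimal sketch of the relevant config fragment for a read-only node (the rest of the config is omitted; the flag is the only point of the example):

```json
{
  "read_only": true
}
```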
### Running multiple `rippled` servers

When using multiple `rippled` servers as data sources and multiple Clio nodes, each Clio node should use the same set of `rippled` servers as sources. The order doesn't matter. The only reason not to do this is if you are running servers in different regions and want the Clio nodes to extract from servers in their own region. However, if you are doing this, be aware that database traffic will be flowing across regions, which can cause high latencies. A possible alternative is to deploy a database in each region and have the Clio nodes in each region use their region's database. This is effectively two systems.

## API versioning

Clio supports API versioning as [described here](https://xrpl.org/request-formatting.html#api-versioning).
It's possible to configure the `minimum`, `maximum` and `default` versions like so:

```json
"api_version": {
  "min": 1,
  "max": 2,
  "default": 1
}
```

All of the above are optional.

Clio will fall back to hardcoded defaults when these values are not specified in the config file, or if the configured values are outside of the minimum and maximum supported versions hardcoded in [src/rpc/common/APIVersion.hpp](../src/rpc/common/APIVersion.hpp).

> [!TIP]
> See the [example-config.json](../docs/examples/config/example-config.json) for more details.
Some files were not shown because too many files have changed in this diff